As radiology practices around the country become increasingly reliant on business analytics and intelligence for decision-making support, the time is ripe to begin devoting additional attention to refining the processes by which their measurements are generated, according to Bill Pickart, CEO of Integrated Medical Partners (IMP). “It is imperative that you understand the quality of the underlying data you are getting, and there are degrees,” he says. “If a database is constructed properly, there should be very little additional effort or cost associated with quantifying and qualifying the data for use by practice decision makers.”
Pickart outlines three key considerations for improved database architecture: data sourcing and origin, data integrity in acceptance and handling, and presentation of analytics.
“Many groups focus on data presentation or dashboarding, but if thought isn’t given to the structure and logic of your overall analytics strategy, the dashboards become less relevant and useful over time,” he says. “When this is handled properly, however, you can have confidence that you’re taking the right course of action.”
Data Sourcing
For comprehensive decision support, today’s radiology practices must utilize data from a variety of sources, both internal and external, Pickart says. Internal sources include RIS, PACS, revenue cycle management (RCM) and practice-management systems, utilization-management or appropriateness-criteria systems, precertification and preauthorization programs, critical-results alert programs, and financial costing systems, to name a few. “The challenge is that these sources tend to be highly vertical and oftentimes closed,” Pickart notes. “For instance, you might get some dynamite reports from the PACS, but they won’t be correlated with information from the RCM or financial system.”
External data sources might include hospital or departmental information systems, Pickart says. “Typically, a hospital-based practice will want to draw from the information system where the patient demographics were originally captured,” he explains. “You might also want to include studies or benchmarks from third-party research houses or data from your community’s health information exchange.”
Data Acceptance
The next step is for a practice’s database architecture to facilitate acceptance of these data. Having such a wide array of sources makes reconciliation and cross-referencing particularly important. “You need matrix capabilities—the opportunity to create internal data points relevant to any of the information you are bringing in, so you can design valid and relevant practice benchmarks,” Pickart says. “Internal benchmarking and cross-field referencing are major components in elevating the quality of your analytics and decision support.”
For instance, he says, when comparing radiologist utilization against external benchmarks, a radiology practice could decide that in order to qualify as full-time equivalent (FTE), its radiologists each need to log 45 hours of work time a week. To ensure the quality of the measure, “You send that data point through a series of filters such that when it gets to the analytics data warehouse from which the analytics draw, it has been broken down to its most basic element,” Pickart says.
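To make the idea concrete, the following is a minimal sketch, not IMP’s actual system, of what such a filtering step might look like: a raw work-time record (with hypothetical field names) is reduced to its most basic elements before it reaches the analytics warehouse, and the practice’s own 45-hour FTE definition is then derived from those elements rather than stored as an opaque, self-reported number.

```python
# Minimal sketch (hypothetical field names): reduce raw work-time entries to
# their most basic elements before they reach the analytics warehouse, then
# derive the FTE measure from those elements rather than from self-reports.
from dataclasses import dataclass
from datetime import date

FTE_HOURS_PER_WEEK = 45.0  # the practice's own FTE definition

@dataclass
class WorkTimeElement:
    radiologist_id: str
    work_date: date
    hours: float
    source_system: str  # e.g. "RIS" or "PACS" -- captured by a system, not self-reported

def accept(raw: dict) -> WorkTimeElement | None:
    """Filter a raw record down to its basic element; reject anything that
    fails the acceptance rules (missing fields, implausible hours)."""
    try:
        hours = float(raw["hours"])
    except (KeyError, ValueError):
        return None
    if not (0.0 < hours <= 24.0) or not raw.get("radiologist_id"):
        return None
    return WorkTimeElement(
        radiologist_id=raw["radiologist_id"],
        work_date=date.fromisoformat(raw["work_date"]),
        hours=hours,
        source_system=raw.get("source_system", "unknown"),
    )

def weekly_fte(elements: list[WorkTimeElement], weeks: int) -> float:
    """Derive the FTE measure from stored elements against the 45-hour definition."""
    total_hours = sum(e.hours for e in elements)
    return (total_hours / weeks) / FTE_HOURS_PER_WEEK
```

Because the warehouse holds the atomic records rather than each practice’s pre-digested totals, the same FTE definition can be recomputed or revised later without re-collecting the data.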
Comparing that measure to an external benchmark has its own challenges, Pickart notes. “Typical external benchmarks rely on self-reported data, but that data is subject to inconsistencies because it comes from different practices, each with different definitions of an FTE,” he says. By establishing rules for how data enters the data warehouse or repository from which analytics are produced, practices can avoid the “garbage in, garbage out” trap that leaves them with unreliable information, Pickart says. “When you are relying solely on third-party, self-reported data, you’re introducing ambiguity to the data itself, and unless you account for that, you’re doing a disservice to the leaders of your practice when they make decisions on it,” he says. “By understanding and setting rules for data acceptance, by the time the information reaches the data pool, it is very clean and crisp. There’s no variation to account for.”
Reconciliation and cross-referencing, if properly managed, allow the practice to improve the quality of the data being used for comparison, Pickart explains. “You can then run all kinds of analytics on basic data points without being subjected to the interpretation of what somebody did in terms of self-reporting,” he says.
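As an illustration of the cross-referencing Pickart describes, the sketch below (with hypothetical identifiers, not a specific vendor’s schema) matches studies captured in the RIS against charges in the RCM system by accession number, so that internal benchmarks draw on reconciled records rather than on either source alone.

```python
# Minimal sketch (hypothetical identifiers): cross-reference RIS studies
# against RCM charges by accession number; unmatched records surface for
# reconciliation instead of silently skewing the benchmarks.
def reconcile(ris_studies: list[dict], rcm_charges: list[dict]):
    charges_by_accession = {c["accession"]: c for c in rcm_charges}
    matched, unmatched = [], []
    for study in ris_studies:
        charge = charges_by_accession.get(study["accession"])
        if charge is None:
            unmatched.append(study)  # needs reconciliation before use
        else:
            matched.append({**study, "billed_amount": charge["amount"]})
    return matched, unmatched

# Example: a study with no corresponding charge is flagged for follow-up
# rather than feeding per-radiologist or per-modality benchmarks unchecked.
studies = [{"accession": "A100", "modality": "CT", "radiologist_id": "R1"}]
charges = [{"accession": "A100", "amount": 412.00}]
matched, unmatched = reconcile(studies, charges)
```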
Data Presentation
Presentation of data is the final element to be considered, Pickart says. Here, he aligns himself philosophically with thought leaders in imaging informatics such as Paul Chang, MD, professor of radiology and vice chair of radiology informatics at the University of Chicago. Chang makes the distinction between two terms that practice leaders have a tendency to use interchangeably—dashboards and scorecards—and Pickart agrees that this distinction is critical. “With valid scorecards, many practices would be in a position to optimize their performance,” Pickart says. “Our belief, at IMP, is that scorecards drive results, while the typical pretty graphs and nice-looking pie charts just indicate where the practice is at a given point in time.”
With clean data from a wide array of sources, practices can develop scorecards for financial, operational, and clinical measures, which Pickart defines as report cards for the practice. “Scorecards contain immense strategic value,” he notes. “You get a timed series of practice-performance indicators measured against your target or benchmark, and then dashboards allow you to see how you are doing that day, relative to the predetermined study period.”
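The distinction Pickart draws can be sketched in a few lines; the numbers and the turnaround-time indicator below are purely illustrative, not IMP benchmarks. The scorecard is the timed series measured against a target; the dashboard reading is simply the latest value viewed against that predetermined period.

```python
# Minimal sketch (illustrative numbers): a scorecard as a timed series of an
# indicator measured against its target, and a dashboard view as the latest
# value read relative to the predetermined study period.
from statistics import mean

target_tat_hours = 24.0  # hypothetical target: report turnaround time
monthly_tat = {"2024-01": 27.1, "2024-02": 25.4, "2024-03": 23.8}  # scorecard series

scorecard = {
    month: {"value": value, "variance_vs_target": round(value - target_tat_hours, 1)}
    for month, value in monthly_tat.items()
}

# Dashboard-style reading: where is the practice today, relative to the period?
today_tat = 22.9
period_mean = mean(monthly_tat.values())
print(f"Today: {today_tat} h vs period mean {period_mean:.1f} h "
      f"(target {target_tat_hours} h)")
```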
The potential of business analytics for the practice is not yet being fully realized by many radiology groups, Pickart concludes. “Many of the entities out there thinking of driving their practices through analytics and informatics are giving thought to issues pertaining to presentation,” he says, “but a lot more serious thought has to go into where your data are truly coming from and the architecture you have developed to receive, manage and store them. Practices that want to remain independent and thrive have to be efficient, watch costs, and positively respond to declining reimbursement—and well-structured analytics and decision support systems are key facilitators. The better practices can manage their informatics capabilities, the more long-term, independent success they will experience.”
Cat Vasko is editor of RadAnalytics and associate editor of Radiology Business Journal.