BERN MEDICAL
Survey on Radiology Business Analytics

10/31/2012

 
Declining reimbursements are the new normal in radiology. Other industries have tackled shrinking margins by making adjustments based on discoveries in their data. With this survey, I want to collect, and share with readers, views on where improvements will be made, where the bottlenecks in analyzing the data are, and what prevents us from moving forward on getting better tools. After the poll closes, I will write up a summary, discuss the results, and explore the future based on the responses.

http://www.surveymonkey.com/s/QGLRVWQ

40-Minute Crash Course in Design Thinking

10/24/2012

 
Below is an overview of design thinking by Inge Druckery, "Teaching to See." Check it out below:

New Blog Post on Diagnostic Imaging

10/17/2012

 
How the Right Operations Data Leads to the Wrong Analysis

Architecture and Data Integrity Are Critical to Analytics Success

10/17/2012

 
As radiology practices around the country become increasingly reliant on business analytics and intelligence for decision-making support, the time is ripe to begin devoting additional attention to refining the processes by which their measurements are generated, according to Bill Pickart, CEO of Integrated Medical Partners (IMP). “It is imperative that you understand the quality of the underlying data you are getting, and there are degrees,” he says. “If a database is constructed properly, there should be very little additional effort or cost associated with quantifying and qualifying the data for use by practice decision makers.”

Pickart outlines three key considerations for improved database architecture: data sourcing and origin, data integrity in acceptance and handling, and presentation of analytics.

“Many groups focus on data presentation or dashboarding, but if thought isn’t given to the structure and logic of your overall analytics strategy, the dashboards become less relevant and useful over time,” he says. “When this is handled properly, however, you can have confidence that you’re taking the right course of action.”

Data Sourcing

For comprehensive decision support, today’s radiology practices must utilize data from a variety of sources, both internal and external, Pickart says. Internal sources include RIS, PACS, revenue cycle management (RCM) and practice-management systems, utilization-management or appropriateness-criteria systems, precertification and preauthorization programs, critical results alert programs and financial costing systems, to name a few. “The challenge is that these sources tend to be highly vertical and oftentimes closed,” Pickart notes. “For instance, you might get some dynamite reports from the PACS, but they won’t be correlated with information from the RCM or financial system.”
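Pickart's point about vertical, closed systems can be made concrete with a small sketch. The snippet below (an illustration only, not IMP's method) joins a hypothetical PACS exam extract to a hypothetical revenue cycle extract on accession number so that turnaround data and payment data can finally be viewed side by side; the file names and column names are invented for the example.

```python
# Minimal sketch: correlating a PACS extract with an RCM extract on accession number.
# File names and column names are hypothetical; real systems will differ.
import csv

def load_rows(path, key_field):
    """Index a CSV extract by its accession number."""
    index = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            index[row[key_field]] = row
    return index

def correlate(pacs_path="pacs_extract.csv", rcm_path="rcm_extract.csv"):
    pacs = load_rows(pacs_path, "accession_number")
    rcm = load_rows(rcm_path, "accession_number")
    combined = []
    for acc, exam in pacs.items():
        claim = rcm.get(acc)
        combined.append({
            "accession_number": acc,
            "modality": exam.get("modality"),
            "report_turnaround_min": exam.get("report_turnaround_min"),
            # Exams with no matching claim surface as potentially unbilled work.
            "paid_amount": claim.get("paid_amount") if claim else None,
        })
    return combined

if __name__ == "__main__":
    for row in correlate():
        print(row)
```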

External data sources might include hospital or departmental information systems, Pickart says. “Typically, a hospital-based practice will want to draw from the information system where the patient demographics were originally captured,” he explains. “You might also want to include studies or benchmarks from third-party research houses or data from your community’s health information exchange.”

Data Acceptance

The next step is for a practice’s database architecture to facilitate acceptance of these data. Having such a wide array of sources makes reconciliation and cross-referencing particularly important. “You need matrix capabilities—the opportunity to create internal data points relevant to any of the information you are bringing in, so you can design valid and relevant practice benchmarks,” Pickart says. “Internal benchmarking and cross-field referencing are major components in elevating the quality of your analytics and decision support.”

For instance, he says, when comparing radiologist utilization against external benchmarks, a radiology practice could decide that in order to qualify as full-time equivalent (FTE), its radiologists each need to log 45 hours of work time a week. To ensure the quality of the measure, “You send that data point through a series of filters such that when it gets to the analytics data warehouse from which the analytics draw, it has been broken down to its most basic element,” Pickart says.
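As a loose illustration of that "series of filters" idea, the sketch below reduces a reported weekly-hours figure to its most basic element, an FTE fraction, using the 45-hour definition from the example above. The validation thresholds and function names are assumptions for the sketch, not IMP's actual pipeline.

```python
# Sketch of a filter chain that reduces reported weekly hours to an FTE fraction.
# The 45-hour FTE definition comes from the example above; the validation bounds are assumed.

FTE_HOURS_PER_WEEK = 45.0

def parse_hours(raw):
    """Filter 1: coerce the raw value to a number or reject it."""
    try:
        return float(raw)
    except (TypeError, ValueError):
        raise ValueError(f"unparseable hours value: {raw!r}")

def validate_hours(hours):
    """Filter 2: sanity-check the range before it reaches the warehouse."""
    if not 0 <= hours <= 100:          # assumed plausibility bounds
        raise ValueError(f"hours out of plausible range: {hours}")
    return hours

def to_fte(hours):
    """Filter 3: express the value in its most basic element, an FTE fraction."""
    return round(hours / FTE_HOURS_PER_WEEK, 3)

def filter_chain(raw):
    return to_fte(validate_hours(parse_hours(raw)))

print(filter_chain("47.5"))   # -> 1.056
```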

Comparing that measure to an external benchmark has its own challenges, Pickart notes. “Typical external benchmarks rely on self-reported data, but that data is subject to inconsistencies because it comes from different practices, each with different definitions of an FTE,” he says. By establishing rules for how data enters the data warehouse or repository from which analytics are produced, practices can avoid the “garbage in, garbage out” trap that leaves them with unreliable information, Pickart says. “When you are relying solely on third-party, self-reported data, you’re introducing ambiguity to the data itself, and unless you account for that, you’re doing a disservice to the leaders of your practice when they make decisions on it,” he says. “By understanding and setting rules for data acceptance, by the time the information reaches the data pool, it is very clean and crisp. There’s no variation to account for.”
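One simplified way to picture such acceptance rules, purely as an assumed illustration, is a checklist applied at the warehouse boundary: records that fail a rule, such as an external benchmark that never states which FTE definition it used, are quarantined instead of silently joining the data pool. The field names and rules below are invented.

```python
# Sketch of acceptance rules applied before external benchmark records enter the data pool.
# The specific rules and field names are assumptions for illustration.

ACCEPTANCE_RULES = [
    ("practice_id",    lambda r: bool(r.get("practice_id"))),
    ("fte_definition", lambda r: r.get("fte_definition") in {"45_hours_week", "40_hours_week"}),
    ("wrvu_per_fte",   lambda r: isinstance(r.get("wrvu_per_fte"), (int, float)) and r["wrvu_per_fte"] > 0),
]

def accept(record):
    """Return (accepted, reasons): reasons lists every rule the record failed."""
    reasons = [name for name, rule in ACCEPTANCE_RULES if not rule(record)]
    return (not reasons, reasons)

incoming = [
    {"practice_id": "P001", "fte_definition": "45_hours_week", "wrvu_per_fte": 9800},
    {"practice_id": "P002", "fte_definition": None, "wrvu_per_fte": 10400},  # ambiguous FTE basis
]

clean_pool, quarantine = [], []
for rec in incoming:
    ok, reasons = accept(rec)
    (clean_pool if ok else quarantine).append((rec, reasons))

print(f"accepted: {len(clean_pool)}, quarantined: {len(quarantine)}")
```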

Reconciliation and cross-referencing, if properly managed, allow the practice to improve the quality of the data being used for comparison, Pickart explains. “You can then run all kinds of analytics on basic data points without being subjected to the interpretation of what somebody did in terms of self-reporting,” he says.

Data Presentation

Presentation of data is the final element to be considered, Pickart says. Here, he aligns himself philosophically with thought leaders in imaging informatics such as Paul Chang, MD, professor of radiology and vice chair of radiology informatics at the University of Chicago. Chang makes the distinction between two terms that practice leaders have a tendency to use interchangeably—dashboards and scorecards—and Pickart agrees that this distinction is critical. “With valid scorecards, many practices would be in a position to optimize their performance,” Pickart says. “Our belief, at IMP, is that scorecards drive results, while the typical pretty graphs and nice-looking pie charts just indicate where the practice is at a given point in time.”

With clean data from a wide array of sources, practices can develop scorecards for financial, operational, and clinical measures, which Pickart defines as report cards for the practice. “Scorecards contain immense strategic value,” he notes. “You get a timed series of practice-performance indicators measured against your target or benchmark, and then dashboards allow you to see how you are doing that day, relative to the predetermined study period.”
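The distinction can be restated in a few lines of code: a scorecard is a timed series of an indicator measured against a target, while a dashboard reports only where the practice stands right now. The metric (days in accounts receivable), the target, and the monthly numbers below are invented for illustration.

```python
# Sketch of the scorecard vs. dashboard distinction using an invented monthly metric.

TARGET_DAYS_IN_AR = 40          # assumed practice benchmark

# Hypothetical monthly observations of days in accounts receivable.
monthly_days_in_ar = {
    "2012-07": 47, "2012-08": 45, "2012-09": 42, "2012-10": 41,
}

def scorecard(series, target):
    """Timed series of the indicator measured against the target."""
    return [(month, value, value - target) for month, value in sorted(series.items())]

def dashboard(series):
    """Where the practice stands today: the most recent value only."""
    latest_month = max(series)
    return latest_month, series[latest_month]

for month, value, variance in scorecard(monthly_days_in_ar, TARGET_DAYS_IN_AR):
    print(f"{month}: {value} days ({variance:+} vs target)")
print("today:", dashboard(monthly_days_in_ar))
```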

The potential of business analytics for the practice is not yet being fully realized by many radiology groups, Pickart concludes. “Many of the entities out there thinking of driving their practices through analytics and informatics are giving thought to issues pertaining to presentation,” he says, “but a lot more serious thought has to go into where your data are truly coming from and the architecture you have developed to receive, manage and store them. Practices that want to remain independent and thrive have to be efficient, watch costs, and positively respond to declining reimbursement—and well-structured analytics and decision support systems are key facilitators. The better practices can manage their informatics capabilities, the more long-term, independent success they will experience.”

Cat Vasko is editor of RadAnalytics and associate editor of Radiology Business Journal.

RBMA

10/9/2012

 
I have spent the past couple of days at the RBMA Fall Education Conference. It has been good to visit with people and talk about what they are doing in radiology, what some of the problems are, and where the market is going.

I hope this picture from the iPhone turns out OK. (This post was written on my phone, too.)

Sometimes I undervalue the real insight that can be gained from meeting with people who are all doing the same thing. But it has been a great couple of days.


In the Tech Jobs Market, Data Analysis Is Tops

10/5/2012

 
By Jon Swartz

Like a coveted free agent in sports, Kelly Halfin had a multitude of choices when she decided to take a job in tech in the U.S. The Belgian had five American companies lined up, eager to sign her on to lead their data analysis team. She chose Livestream, where, as head of business intelligence, she reviews data to help the New York-based live-events site make product decisions.

"It was somewhat surprising when entering a new market, but not in how companies are putting a focus on data," says Halfin, 26, who has three years of experience in the field.

Data analysts are as important as the best engineers and designers. Job recruiters would say they're more important.

A recent McKinsey Global Institute study called data analytics "the next frontier for innovation, competition and productivity."

Experts are crucial for diving into and parsing more than 250 billion publicly available likes, follows, and other social relationships between people and things, products, or brands, a pool growing at the rate of 2 billion data points a day, according to social-technology company 140 Proof.

"It's never been a better time to be a data scientist," known in the industry as quantitative jocks, says John Manoogian III, co-founder and chief technology officer at 140 Proof. "Companies want to turn this data into insights about what people like and what might be relevant to them, but they need very specialized analytical talent to do this."

Hence, the recruiting tug-of-war over Halfin and thousands like her.

The field has "exploded" the last 18 months, yet there is a dearth of talent because the job requires math skills that college graduates often lack, says Jim Zimmermann, director of SkillSoft, which provides online learning and training.

Lacking potential recruits, companies are "forced to home-grow their own talent through online training," Zimmermann says.

And the job pays well -- whether in San Francisco (an average annual salary of $104,000), New York ($102,000) or Chicago ($86,000), according to Indeed.com. The average salary is $74,000, says site Simply Hired.

Jobs site Glassdoor lists 17,699 jobs in Big Data. To be sure, demand for tech jobs remains high across the board: engineers, designers, ethical hackers and apps makers.

Career site Dice lists more than 84,000 tech jobs in North America, up 2% from a year ago. The fastest-growing areas from a year ago are iPhone-related skills (78%), cloud computing (68%), mobile applications (42%) and Android-related skills (42%).

Demand for tech workers is expected to grow at a 19% clip through 2020 -- in line with an insatiable need for college graduates with degrees in the field -- according to the U.S. Department of Labor. And the annual pay is good, at an average $88,909, says job-search site Indeed.com.

Even manufacturing is thirsting for workers; listings there have nearly doubled to more than 170,000 from two years ago, based on CareerBuilder and Demand Portal data. In demand: programmers, computer-assisted technicians and machinists.

    BERN BLOG

    Blog written by:
    David Fuhriman, CEO

    Other Useful blogs

    Occam's Razor
    Six Pixels
    Junk Charts
    Kaizen Analytics
    Zoom Metrix
