RSNA 2017

Abstract Archives of the RSNA, 2017


QS131-ED-THB2

Transitioning from Peer Review to a Peer Learning System in a Multi-Centric Group Practice

Thursday, Nov. 30 12:45PM - 1:15PM Room: QS Community, Learning Center Station #2



Participants
Upma K. Rawal, MD, Medinah, IL (Presenter) Nothing to Disclose
Richard E. Heller III, MD, Chicago, IL (Abstract Co-Author) Nothing to Disclose
Abraham J. Bronner, MD, Harvey, IL (Abstract Co-Author) Nothing to Disclose
Nina E. Kottler, MD, MS, Sydney, Australia (Abstract Co-Author) Nothing to Disclose
Syed Furqan Zaidi, MD, El Segundo, CA (Abstract Co-Author) Nothing to Disclose
Aaron Huang, BA, Los Angeles, CA (Abstract Co-Author) Employee, Radiology Partners, Inc
Joyce Liang, El Segundo, CA (Abstract Co-Author) Nothing to Disclose
Jason Mitsky, Portland, OR (Abstract Co-Author) Nothing to Disclose

For information about this presentation, contact:

richard.heller@radpartners.com

PURPOSE

The 2015 Institute of Medicine report, Improving Diagnosis in Health Care, highlighted the impact of diagnostic errors in medicine and suggested several strategies for improvement. At the core of these strategies is a cultural transformation to a system based on the ideal of learning from errors and near misses in a safe and protected space. Popular peer review programs, which focus on error detection, numerical scoring of errors, and radiologist-specific error rates, do not meet this charge. The effectiveness of these peer review programs, considered by some necessary to meet accreditation obligations (such as Ongoing Professional Practice Evaluation), has also been questioned. Additionally, variances (errors) and other opportunities for learning detected by radiologists during their daily work outside of these programs are lost. To address these limitations, our national practice of 350+ radiologists from multiple locally led practices across several states developed a second, separate internal Variance Program. This program emphasizes collaborative learning in a legally protected, safe space.

METHODS

Our radiologist leaders, in consultation with our legal team, developed a structure of peer review committees (PRCs), consisting of local radiologists, for each locale where we practice. The arrangement was created such that the peer review data is protected from outside groups, including hospitals, and from legal discovery under each state's statutes. In collaboration with our IT team, a methodology was developed to store peer review data in a secure format accessible only to local PRC members. Simple, secure online web forms were devised to enable radiologists to enter cases with minimal disruption to workflow. Sites using our cloud-based PACS system are able to flag cases and enter a brief description. On-site and web-based meetings were held to familiarize our radiologists with the non-punitive, non-judgmental, and educational goals and operations of this Variance Program. Radiologists are encouraged, not mandated, to enter any variances, great catches, and teaching cases they encounter during their routine work, as well as any variances reported by a referring physician or other outside source. The entries are then reviewed in a timely manner by members of the local PRC. The PRCs ensure educational feedback to the interpreting radiologist(s) and appropriate follow-up with referring physicians. The data is analyzed by our support staff to create monthly rolling scorecards of the prior six months' data, providing feedback to the local PRCs on the frequency of types of variances entered, local practice and radiologist participation in the Variance Program, and an aging analysis of cases pending review [Figure]. Measures of the program include individual and practice participation rates. Consistent with the educational goals of the program, submitted cases are used to create local and practice-wide learning case conferences, which are also posted on our practice's internal online portal for review at the radiologists' convenience.
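The rolling-scorecard logic described above can be sketched roughly as follows. The abstract does not specify the actual implementation, so all field names, case-type categories, and the trailing-six-month windowing details here are illustrative assumptions.

```python
# Hypothetical sketch of a monthly rolling scorecard: case-type counts,
# participation rate, and an aging analysis of cases pending PRC review.
# Field names and categories are assumptions, not the practice's actual system.
from collections import Counter
from datetime import date


def scorecard(entries, radiologists, as_of, window_months=6):
    """Summarize the prior `window_months` of variance-program entries.

    entries: list of dicts with keys 'entered' (date), 'type'
             ('variance' | 'great catch' | 'teaching case'),
             'submitted_by' (str), and 'reviewed' (bool).
    radiologists: all radiologists in the local practice (for participation rate).
    """
    # Trailing calendar-month window, compared via a simple month index.
    def month_index(d):
        return d.year * 12 + d.month

    cutoff = month_index(as_of) - window_months
    recent = [e for e in entries if month_index(e["entered"]) > cutoff]

    # Frequency of each entry type within the window.
    type_counts = Counter(e["type"] for e in recent)

    # Participation: fraction of local radiologists who submitted at least one case.
    participants = {e["submitted_by"] for e in recent}
    participation = len(participants) / len(radiologists) if radiologists else 0.0

    # Aging analysis: days that still-unreviewed cases have waited.
    pending_days = sorted(
        (as_of - e["entered"]).days for e in recent if not e["reviewed"]
    )
    return {
        "type_counts": dict(type_counts),
        "participation_rate": participation,
        "pending_review": len(pending_days),
        "oldest_pending_days": pending_days[-1] if pending_days else 0,
    }
```

In this sketch, each monthly scorecard is recomputed over the trailing window rather than incrementally updated, which keeps the logic simple at the modest case volumes reported here.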

RESULTS

Seven of our local practices in six states implemented the Variance Program in early 2015. Over a period of 24 months (April 2015 through March 2017), a total of 1482 cases were entered into the program across all seven sites, of which 88% (1304/1482) were reported as variances, 7% (98/1482) as great catches, and 5% (80/1482) as teaching cases. In 11% (148/1304) of the reported variances, the local PRC members agreed with the reading radiologist, determining upon review that the reported variance was not in fact a variance. Interestingly, 24% (35/148) of these had been brought to the reporting radiologist by a referring physician who incorrectly believed a variance had occurred. Multiple practice-wide learning conferences have been held using cases entered into the Variance Program, and one of our practices regularly holds its own local learning conference. The learning conferences have included general and subspecialty themes, sharing variances resulting from failures of system operations (faulty or incomplete history, incorrect technique or protocol), unexpected findings, and incorrect interpretations, as well as great catches and teaching cases.
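The case-mix percentages above follow directly from the reported counts; as a quick arithmetic check (counts taken from the text, rounded to the nearest whole percent):

```python
# Worked check of the reported case mix. Counts are from the abstract;
# percentages round to the nearest whole percent.
total = 1482
variances, great_catches, teaching = 1304, 98, 80
assert variances + great_catches + teaching == total  # three types partition all entries


def pct(n, d):
    return round(100 * n / d)


assert pct(variances, total) == 88      # reported as variances
assert pct(great_catches, total) == 7   # great catches
assert pct(teaching, total) == 5        # teaching cases
assert pct(148, variances) == 11        # judged "not a variance" on PRC review
assert pct(35, 148) == 24               # of those, raised by a referring physician
```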

CONCLUSION

Recognizing the deficiencies of traditional peer review systems, and consistent with the goals outlined in the IOM report, we used the resources of our practice to develop a voluntary internal peer review Variance Program focused primarily on education through collaborative learning: sharing our mistakes, great catches, and teaching cases. The non-punitive and legally protected structure of this program provides a safe environment to accomplish these goals and supports our practice's broader aim of prioritizing safety and enhancing clinical value for our patients and clients.