RSNA 2009 

Abstract Archives of the RSNA, 2009


SSJ12-04

A Computer Generated Randomized, Sub-Specialized Over-Reads System for Quality Assurance: Raising the Bar of the Peer Review Process 

Scientific Papers

Presented on December 1, 2009
Presented as part of SSJ12: Informatics (Quality)

Participants

Jonathan S. Movson MBChB, Presenter: Nothing to Disclose
Thomas K. Egglin MD, Abstract Co-Author: Nothing to Disclose
Wendy J. Smith RT(R), Abstract Co-Author: Nothing to Disclose
Cynthia M. Cobb, Abstract Co-Author: Nothing to Disclose

CONCLUSION

A commercially available software system has been implemented in a large academic practice and automatically assigns exams selected for review to sub-specialty over-read groups.

BACKGROUND

Our academic radiology practice has 28 residents and 53 attendings and performs 500,000 exams a year at two institutions. To comply with ACR standards and to fulfill the requirements of institutional credentialing, our department mandates that radiologists over-read at least 2% of all exams performed and grade them on a 4-point scale. The results are submitted to our QA officer and are incorporated into the hospital re-credentialing process. For these data to be meaningful, we believe two requirements must be met: cases must be randomly selected, and the second reader must be a sub-specialist in interpreting the type of exam being reviewed. A minimal sketch of this sampling and grading step appears below.
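
The following is a minimal sketch, in Python, of the 2% random sampling and 4-point grading requirement described above. It is not the commercial system; the names Exam, sample_for_overread, and record_overread are illustrative assumptions introduced here for clarity.

import random
from dataclasses import dataclass

@dataclass
class Exam:
    accession: str
    exam_code: str
    original_reader: str

def sample_for_overread(exams, fraction=0.02, seed=None):
    """Return a random subset covering at least `fraction` of all exams."""
    rng = random.Random(seed)
    n = max(1, round(len(exams) * fraction))
    return rng.sample(exams, n)

VALID_SCORES = {1, 2, 3, 4}  # 4-point scale used to grade each over-read

def record_overread(exam, reviewer, score):
    """Record a reviewer's score for one over-read, enforcing the 4-point scale."""
    if score not in VALID_SCORES:
        raise ValueError("score must be on the 4-point scale (1-4)")
    return {"accession": exam.accession, "reviewer": reviewer, "score": score}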

EVALUATION

Between 11/1/08 and 4/1/09, our academic sub-specialized department instituted and evaluated a commercially available peer review system, Radiology Insight (Insight Health Solutions, East Providence, RI). We selected 341 exam codes, each assigned to 1 of 9 sub-specialty categories; these exam codes account for approximately 90% of all exams performed by our department. Forty-three radiologists were each assigned to 1 of 9 sub-specialty groups. During this period, 1437 cases were randomly selected and distributed by the computer algorithm to the reviewing radiologists. Initially, adjustments had to be made to the categories and groups to distribute cases evenly among over-readers and to match procedure codes to the correct over-reader group. After these changes were made, every case was assigned to the correct group. Of the cases assigned, 1143 over-reads were completed by the radiologists. A sketch of this sub-specialty routing step follows.
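
To illustrate the routing step, the sketch below maps exam codes to sub-specialty categories and assigns each sampled case to a randomly chosen reviewer in the matching group, excluding the original reader. The mappings and function names (EXAM_CODE_TO_SUBSPECIALTY, SUBSPECIALTY_TO_REVIEWERS, assign_overreader) are hypothetical; the actual Radiology Insight assignment algorithm is not described in this abstract.

import random

# Hypothetical mappings: exam code -> sub-specialty, sub-specialty -> reviewers.
# In the actual deployment, 341 exam codes were mapped to 9 categories and
# 43 radiologists were assigned to 9 groups.
EXAM_CODE_TO_SUBSPECIALTY = {
    "CT-HEAD": "neuro",
    "MR-KNEE": "msk",
    "US-ABD": "body",
}
SUBSPECIALTY_TO_REVIEWERS = {
    "neuro": ["radA", "radB"],
    "msk": ["radC", "radD"],
    "body": ["radE", "radF"],
}

def assign_overreader(exam_code, original_reader, rng=random):
    """Route a sampled case to a random reviewer in the matching sub-specialty
    group, never back to the radiologist who originally read it."""
    group = EXAM_CODE_TO_SUBSPECIALTY.get(exam_code)
    if group is None:
        raise KeyError(f"exam code {exam_code!r} is not mapped to a sub-specialty")
    candidates = [r for r in SUBSPECIALTY_TO_REVIEWERS[group] if r != original_reader]
    if not candidates:
        raise ValueError(f"no eligible reviewer in group {group!r}")
    return rng.choice(candidates)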

DISCUSSION

The Radiology Insight application was successful at assigning cases for sub-specialized peer review. In addition, the application allows radiologists to identify technical, communication, and protocol issues within the department and to bring these to the attention of the QA officer. The interface provides a mechanism for the QA officer to give feedback to all parties involved and to close the QA loop.

Cite This Abstract

Movson J, Egglin T, Smith W, Cobb C. A Computer Generated Randomized, Sub-Specialized Over-Reads System for Quality Assurance: Raising the Bar of the Peer Review Process. Radiological Society of North America 2009 Scientific Assembly and Annual Meeting, November 29 - December 4, 2009, Chicago, IL. http://archive.rsna.org/2009/8006218.html