Abstract Archives of the RSNA, 2013
SSQ11-05
Automatic Integration of Joint Commission-required Critical Results Auditing into Institutional Peer Review Using a Software Tool
Scientific Formal (Paper) Presentations
Presented on December 5, 2013
Presented as part of SSQ11: ISP: Informatics (Results and Reporting)
Tarik K. Alkasab MD, PhD, Presenter: Nothing to Disclose
H. Benjamin Harvey MD, JD, Abstract Co-Author: Nothing to Disclose
Gloria Maria Martinez Salazar MD, Abstract Co-Author: Nothing to Disclose
Daniel Ira Rosenthal MD, Abstract Co-Author: Nothing to Disclose
G. Scott Gazelle MD, PhD, Abstract Co-Author: Consultant, General Electric Company; Consultant, Marval Biosciences Inc
We have created and deployed a tool to integrate critical results auditing into the peer review efforts in our large, academic department. The process, as implemented, meets the regulatory requirements of the Joint Commission.
To address compliance requirements and ensure the quality and consistency of non-routine communication of critical results in our department, we sought to integrate critical results auditing into our department's ongoing peer review process. We created and deployed add-ons to the COGR software tool to seamlessly integrate routine critical results auditing.
Our departmental peer review process, known as consensus-oriented group review (COGR), involves groups of radiologists regularly meeting to review randomly selected cases and record consensus on the acceptability of the issued report, supported by a software tool. The COGR software tool accesses data from the department's radiology information system (Centricity; GE Healthcare) and PACS workstations (IMPAX; Agfa HealthCare). We extended the COGR software tool to pose additional case-specific questions regarding critical results reporting, including whether a critical result was present in the report and, if so, whether institutional guidelines for critical results communication were followed. Department administrators can generate automated reports as needed to document compliance with critical results auditing requirements.
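The audit workflow described above can be sketched in miniature: each reviewed case carries a flag for whether a critical result was present and, if so, whether communication guidelines were followed, and a compliance report aggregates those answers. This is a hypothetical illustration only; the class and field names are invented and do not reflect the actual COGR schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CriticalResultsAudit:
    """One peer-reviewed case's answers to the critical results questions.

    Hypothetical structure; field names are illustrative, not the COGR schema.
    """
    accession: str
    critical_result_present: bool
    # Only asked when a critical result is present.
    guidelines_followed: Optional[bool] = None

def compliance_summary(audits):
    """Aggregate audit answers into counts for a compliance report."""
    with_result = [a for a in audits if a.critical_result_present]
    compliant = [a for a in with_result if a.guidelines_followed]
    return {
        "cases_audited": len(audits),
        "critical_results": len(with_result),
        "guidelines_followed": len(compliant),
    }

audits = [
    CriticalResultsAudit("A1", critical_result_present=False),
    CriticalResultsAudit("A2", critical_result_present=True, guidelines_followed=True),
    CriticalResultsAudit("A3", critical_result_present=True, guidelines_followed=False),
]
print(compliance_summary(audits))
# {'cases_audited': 3, 'critical_results': 2, 'guidelines_followed': 1}
```

The design point is that the audit questions ride along with the existing peer review record, so compliance reporting is a simple aggregation rather than a separate data-collection effort.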
The software tool has enabled our department to perform regular critical results auditing as an automatic component of group peer review. Under the described model, the critical results questions were posed for every case undergoing peer review from July 2012 through March 2013, resulting in over 5,000 cases audited across all divisions. The current system engages radiologists to detect critical results and assess the appropriateness of the timing and method of non-routine communication, per Joint Commission requirements. Recognizing the unique value of a group of radiologists engaged in peer review, we hope to use this model to implement other types of auditing questions without unduly burdening the peer review process.
Alkasab T, Harvey H, Salazar G, Rosenthal D, Gazelle G. Automatic Integration of Joint Commission-required Critical Results Auditing into Institutional Peer Review Using a Software Tool. Radiological Society of North America 2013 Scientific Assembly and Annual Meeting, December 1 - December 6, 2013, Chicago IL.
http://archive.rsna.org/2013/13017811.html