RSNA 2011 

Abstract Archives of the RSNA, 2011


LL-INS-SU4A

Assisted Peer Review Quality Assurance Tool

Scientific Informal (Poster) Presentations

Presented on November 27, 2011
Presented as part of LL-INS-SU: Informatics

Participants

Zahir A. Momin MD, Presenter: Nothing to Disclose
Douglas D. Robertson MD, PhD, Abstract Co-Author: Nothing to Disclose
Walter A. Carpenter MD, PhD, Abstract Co-Author: Nothing to Disclose

CONCLUSION

Integration of peer review, PACS, and the EMR significantly reduces the effort needed to perform peer review effectively and efficiently and to provide quality assurance. aPRQA peer reviews are performed when the radiologist chooses, without interrupting workflow or requiring ancillary staff. Initial evaluation indicates that the tool significantly increases ease of use, and we expect to demonstrate efficacy. It also eliminates the need for dedicated personnel support and could be run regularly on recent cases, potentially improving patient care.

BACKGROUND

Peer review and quality assurance are important yet cumbersome processes to execute. Our goal was to create a tool that simplifies peer review and reduces the time needed to perform it. The tool runs alongside PACS and integrates ACR RADPEER, PACS, and the EMR. Truly randomized review worklists can be created using default settings or customized by radiologist, subspecialty, imaging modality, facility, and time period.
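
As a minimal sketch only (not the authors' implementation), randomized worklist creation within these constraints could look like the following Python snippet; the field names and the case data source are assumptions for illustration.

```python
import random

def build_worklist(cases, n_cases=10, radiologists=None, subspecialties=None,
                   modalities=None, facilities=None, date_range=None):
    """Return a randomized review worklist drawn from `cases` that satisfies
    the selected constraints; any filter left as None is ignored."""
    def matches(c):
        if radiologists and c["radiologist"] not in radiologists:
            return False
        if subspecialties and c["subspecialty"] not in subspecialties:
            return False
        if modalities and c["modality"] not in modalities:
            return False
        if facilities and c["facility"] not in facilities:
            return False
        if date_range and not (date_range[0] <= c["study_date"] <= date_range[1]):
            return False
        return True

    eligible = [c for c in cases if matches(c)]
    # Random sampling without replacement yields the randomized worklist.
    return random.sample(eligible, min(n_cases, len(eligible)))
```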

EVALUATION

The assisted peer review quality assurance tool (aPRQA) enables radiologist-specific or subspecialty-specific reviews. Once logged in, the reviewer may select the radiologists to review, subspecialties, modalities, locations, and date ranges. Cases are selected at random within the specified constraints. The study to be reviewed is displayed on PACS, and the dictated report is displayed within the aPRQA window. The reviewer performs the review, assigns a score, and may add comments or mark the study for the teaching file. Data are automatically entered into ACR RADPEER and a separate local database, from which analyses and reports may be created. The number of cases peer reviewed, time per case, and ease of the process (modified Wong-Baker pain score) were compared for 5 radiologists, each first not using and then using the aPRQA tool for 2 months. Results were compared using Student's paired t-test, with significance set at p<0.05.
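
The paired comparison can be reproduced with a standard statistics package. The sketch below assumes SciPy is available and uses illustrative per-radiologist values rather than the study data.

```python
from scipy import stats

# Illustrative time-per-case values in seconds (one entry per radiologist);
# these are NOT the study data.
time_without_tool = [160, 175, 180, 165, 170]
time_with_tool = [28, 32, 30, 31, 29]

# Paired (dependent-samples) t-test: each radiologist is measured both
# without and with the aPRQA tool.
t_stat, p_value = stats.ttest_rel(time_without_tool, time_with_tool)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # significant if p < 0.05
```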

DISCUSSION

The aPRQA tool increased the total monthly cases peer reviewed by the 5 radiologists from 11 to 200 (p<0.01), a nearly twentyfold increase. aPRQA also decreased the average time per peer review case from 170 seconds to 30 seconds (p<0.01). Radiologist-rated ease of use improved from 1 of 5 to 5 of 5 (p<0.01).

Cite This Abstract

Momin, Z, Robertson, D, Carpenter, W. Assisted Peer Review Quality Assurance Tool. Radiological Society of North America 2011 Scientific Assembly and Annual Meeting, November 26 - December 2, 2011, Chicago IL. http://archive.rsna.org/2011/11013181.html