Abstract Archives of the RSNA, 2013
Olga Rachel Brook MD, Presenter: Research Grant, Guerbet SA
Janneth Yolanda Romero MD, Abstract Co-Author: Nothing to Disclose
Alexander Brook PhD, Abstract Co-Author: Spouse, Research Grant, Guerbet SA
Jonathan B. Kruskal MD, PhD, Abstract Co-Author: Author, UpToDate, Inc
Deborah Levine MD, Abstract Co-Author: Editor with royalties, UpToDate, Inc; Editor with royalties, Amirsys, Inc; Editor with royalties, Reed Elsevier
Our quality assurance (QA) database is a voluntary learning system through which radiologists submit technical and clinical QA errors, complications, and related events. Submissions to peer review (PR) are mandatory entries made by radiologists through a process of retrospective case review. The purpose of this study was to evaluate patterns of submissions into the PR and QA databases involving Ob/Gyn imaging.
Submissions to the departmental QA (9/2004-11/2012) and PR (3/2007-11/2012) databases were searched for Ob/Gyn-related keywords. After exclusion of duplicates, there were 202 cases in the QA database and 73 in the PR database. Review and grading of cases were performed independently by two ultrasonologists. Cases were categorized into perceptual, interpretive, communication, and procedural errors. Impact of the errors was assessed based on clinical and radiological follow-up, and the probability of error occurrence was estimated. By consensus agreement, 17 cases from the QA database and 9 cases from the PR database were not true QA issues and were excluded from further analysis. The final study group included 185 cases in the QA database and 64 in the PR database.
There was no significant difference between the PR and QA databases in patient age (44 ± 18 vs. 42 ± 16 years, p=0.41) or in the time between the study and error reporting (298 ± 584 vs. 152 ± 368 days, p=0.10), respectively. The majority of submissions were for outpatient studies (37/64, 58% in PR and 139/185, 75% in QA). More emergency room studies were submitted to PR (25/64, 39%) than to QA (26/185, 14%; p<0.001). The distribution of error categories differed significantly between the PR and QA databases (p<0.001). Procedural entries were submitted almost exclusively through the QA database: 62/185 (34%) in QA compared to 2/64 (3%) in PR. The QA database had significantly more entries with moderate and major events (60/185, 32%) than the PR database (10/64, 16%; p<0.001). No catastrophic events were reported. Cases submitted through PR were of a type felt to recur more frequently than those submitted through the QA database (p<0.001).
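As an illustration only, the emergency-room comparison above can be reproduced with a standard test for comparing two proportions; the abstract does not name the statistical test used, so this sketch assumes Pearson's chi-square on a 2x2 contingency table (Python/SciPy):

```python
# Minimal sketch: chi-square test comparing the proportion of emergency
# room studies submitted to PR (25/64) vs. QA (26/185). The abstract does
# not state which test the authors used; Pearson's chi-square is assumed.
from scipy.stats import chi2_contingency

table = [
    [25, 64 - 25],    # PR: ER studies, non-ER studies
    [26, 185 - 26],   # QA: ER studies, non-ER studies
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.1e}")  # p is well below 0.001
```

With or without the Yates continuity correction, the resulting p-value is on the order of 1e-5, consistent with the reported p<0.001.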
More clinically relevant, but less frequently recurring, cases are submitted through the voluntary quality assurance reporting mechanism than through the peer review process.
Our results suggest that efforts to improve quality (by increasing the reporting of adverse events and diagnostic errors) should continue to encourage voluntary entry of all QA cases.
Brook O, Romero J, Brook A, Kruskal J, Levine D. Peer Review (Retrospective Sampling) vs. Quality Assurance Database (Voluntary Data Entry) in Ob/Gyn Imaging. Radiological Society of North America 2013 Scientific Assembly and Annual Meeting, December 1-6, 2013, Chicago, IL.
http://archive.rsna.org/2013/13044169.html