RSNA 2010 

Abstract Archives of the RSNA, 2010


SSA11-05

Calibration of a Semantic Search Engine to Assess the Concordance between Head CT Exam Indications and the Supporting Evidence for These Indications in the Electronic Medical Record (EMR): An Analysis of 500 Cases

Scientific Formal (Paper) Presentations

Presented on November 28, 2010
Presented as part of SSA11: ISP: Informatics (Clinical Decision Support)

Participants

Supriya Gupta MBBS, MD, Presenter: Nothing to Disclose
Jeffrey B. Weilburg MD, Abstract Co-Author: Nothing to Disclose
Mitchell A. Harris PhD, Abstract Co-Author: Nothing to Disclose
Keith J. Dreyer DO, PhD, Abstract Co-Author: Medical Advisor, General Electric Company; Medical Advisor, Siemens AG; Medical Advisor, Nuance Communications, Inc; Medical Advisor, Carestream Health, Inc; Medical Advisor, Vital Images, Inc; Medical Advisor, Amirsys, Inc; Medical Advisor, Life Image Inc; Medical Advisor, McKesson Corporation
Michael Ethan Zalis MD, Abstract Co-Author: Research grant, General Electric Company

PURPOSE

Discordance between the symptom, sign, and diagnosis (SS&D) indications communicated with an imaging exam order and the evidence for those SS&D in the EMR can reflect inappropriate ordering (“gaming”) as well as limitations of Radiology Order Entry (ROE) systems. A semantic EMR search engine (Queriable Patient Inference Dossier, QPID) was used to determine the discordance rate between the two for outpatient head CT.

METHOD AND MATERIALS

A retrospective review of 500 outpatient head CT exams ordered through ROE was performed. For each of the 36 SS&D options available in ROE for head CT, a separate topic-specific QPID search module was created to identify EMR data supporting that SS&D. Following the QPID search, concordance was assessed on a per-case and per-SS&D-option basis and classified as: true positive (TP; ROE+, QPID+), true negative (TN; ROE−, QPID−), false positive (FP; ROE+, QPID−), and false negative (FN; ROE−, QPID+). FPs were manually reviewed for each SS&D by a domain expert and sub-classified as: gaming, no EMR evidence (Gm); clinically appropriate exam based on medical evidence (Ea); ROE error, SS&D present in the EMR but no relevant option in ROE (Rc); and QPID error, QPID failed to detect justifying EMR data for the SS&D (Qp).
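As a minimal sketch of the per-case, per-SS&D-option classification described above, the following Python snippet labels each option as TP, FP, FN, or TN; the function name, option names, and inputs are hypothetical illustrations and do not reflect the actual QPID or ROE implementation.

from collections import Counter

# Stand-ins for the 36 SS&D options offered by ROE for head CT (hypothetical subset).
SSD_OPTIONS = ["headache", "head trauma", "seizure"]

def classify_concordance(roe_selected, qpid_detected, options=SSD_OPTIONS):
    """Label each SS&D option for one exam as TP, FP, FN, or TN."""
    labels = {}
    for option in options:
        in_roe = option in roe_selected    # indication chosen at order entry
        in_emr = option in qpid_detected   # supporting evidence found by the EMR search
        if in_roe and in_emr:
            labels[option] = "TP"
        elif in_roe:
            labels[option] = "FP"          # candidate for expert sub-classification (Gm/Ea/Rc/Qp)
        elif in_emr:
            labels[option] = "FN"
        else:
            labels[option] = "TN"
    return labels

# One exam ordered for headache and seizure, with EMR evidence found for headache only.
counts = Counter(classify_concordance({"headache", "seizure"}, {"headache"}).values())
print(counts)   # e.g. Counter({'TP': 1, 'TN': 1, 'FP': 1})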

RESULTS

For each SS&D option among the 500 cases, we initially observed: TP=349, FP=91, TN=9716, FN=7344. Expert review adjudicated the 91 FP SS&D as: Gm=6, Ea=26, Rc=10, Qp=28; no records were available for 8 cases. Based on the feedback obtained from the Rc and Qp terms, we modified the QPID search criteria to enhance detection performance (TP=357, FP=74, TN=9340, FN=7412) and observed final results of: Gm=7, Ea=29, Rc=10, Qp=20. Of discordant SS&D, 48.6% (Gm + Ea, 36/74) reflected a combination of potential ordering-physician errors or gaming by ordering physicians, and 13.5% (10/74) represented limitations in the existing ROE user options. The 7 instances of ‘gaming’ occurred in 7 different patients, yielding an estimated gaming rate of 1.4% (7/500).
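The proportions above follow directly from the final adjudicated false-positive counts; the short Python sketch below reproduces that arithmetic using the figures reported in this section (variable names are illustrative only).

# Final adjudicated counts for the 74 discordant (FP) SS&D reported above.
fp_total = 74
final_counts = {"Gm": 7, "Ea": 29, "Rc": 10, "Qp": 20}
total_exams = 500

ordering_related = final_counts["Gm"] + final_counts["Ea"]   # gaming plus ordering-physician issues
print(f"Ordering-related discordance: {ordering_related}/{fp_total} = {ordering_related / fp_total:.1%}")   # 48.6%
print(f"ROE option limitations: {final_counts['Rc']}/{fp_total} = {final_counts['Rc'] / fp_total:.1%}")     # 13.5%
print(f"Estimated gaming rate: {final_counts['Gm']}/{total_exams} = {final_counts['Gm'] / total_exams:.1%}")  # 1.4%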

CONCLUSION

In the majority of cases, ordering physicians chose SS&D options that were supported by EMR data and readily identifiable by QPID (high specificity). Both ‘gaming’ and ROE limitations are evident, and QPID could serve as a tool for detecting and addressing them.

CLINICAL RELEVANCE/APPLICATION

Automated auditing of exam ordering can provide important feedback for improving appropriateness of exam ordering and can guide improvements in both ordering software (ROE) and auditing mechanisms. 

Cite This Abstract

Gupta, S, Weilburg, J, Harris, M, Dreyer, K, Zalis, M, Calibration of a Semantic Search Engine to Assess the Concordance between Head CT Exam Indications and the Supporting Evidence for These Indications in the Electronic Medical Record (EMR): An Analysis of 500 Cases. Radiological Society of North America 2010 Scientific Assembly and Annual Meeting, November 28 - December 3, 2010, Chicago IL. http://archive.rsna.org/2010/9009220.html