RSNA 2014 

Abstract Archives of the RSNA, 2014


SSJ13-04

Department Data Depot: Merging Databases for Composite Wait Time Performance Metrics

Scientific Papers

Presented on December 2, 2014
Presented as part of SSJ13: Informatics (Business Analytics)

Participants

Christine M. Zwart PhD, Presenter: Nothing to Disclose
Kun Wang MS, Abstract Co-Author: Nothing to Disclose
Ellen Ermer MS, Abstract Co-Author: Nothing to Disclose
Amy Kiyo Hara MD, Abstract Co-Author: Nothing to Disclose
Clinton V. Wellnitz MD, Abstract Co-Author: Nothing to Disclose
J. Ross Mitchell PhD, Abstract Co-Author: Co-founder, Calgary Scientific, Inc; Intellectual property, Calgary Scientific, Inc; Shareholder, Calgary Scientific, Inc

CONCLUSION

By having a single repository from which we could pull alternative timestamps, we were able to select the most appropriate indicator of exam start time. This type of unified framework for department databases allows easier analysis and comparison of radiology exam workflow.

BACKGROUND

Most radiology departments make use of multiple databases for storing exam information: a PACS for images, a RIS for reports, and an EMR for patient data. These systems are complex, often from different vendors, and may not work well together. This makes measurement of critical performance metrics difficult. Furthermore, it is challenging to identify the best source for data when sources overlap. To address this, we developed a Department Data Depot (DDD) – a system that pulls data from department systems and then uses cross-database relationships to integrate and validate the data into a single Microsoft SQL Server database. We used FileMaker (FileMaker, Inc., Santa Clara, California) as an abstraction layer over the DDD, allowing simple report generation and delivery to web and mobile clients.
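As an illustration only (not the authors' implementation), a consolidated depot might key each source system's timestamps on the exam accession number and expose them side by side for downstream reporting. The sketch below uses Python's built-in sqlite3 as a stand-in for Microsoft SQL Server; all table and column names are hypothetical.

```python
# Illustrative sketch only: a cross-database merge keyed on accession number.
# sqlite3 stands in for Microsoft SQL Server; table and column names
# (ris_events, pacs_arrivals, dicom_headers, accession) are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One staging table per source system, each filled by its own extract job.
cur.executescript("""
CREATE TABLE ris_events    (accession TEXT PRIMARY KEY, tech_scan_time TEXT);
CREATE TABLE pacs_arrivals (accession TEXT PRIMARY KEY, arrival_time   TEXT);
CREATE TABLE dicom_headers (accession TEXT PRIMARY KEY, acq_time       TEXT);

-- Consolidated view exposing every candidate scan-start time for one exam.
CREATE VIEW exam_start_times AS
SELECT r.accession,
       r.tech_scan_time,
       p.arrival_time,
       d.acq_time
FROM ris_events r
LEFT JOIN pacs_arrivals p ON p.accession = r.accession
LEFT JOIN dicom_headers d ON d.accession = r.accession;
""")
```

A view of this kind lets reports choose whichever timestamp is most reliable for a given modality without touching the source systems.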

EVALUATION

The pre-scan time interval is an important metric of department efficiency. It encompasses the times between appointment, patient check-in, and scan start. We used our EMR for appointment and check-in times. We explored three options for scan-start time: 1) the technologist-recorded scan time (the previous reporting standard), 2) the time at which the exam ‘arrived’ in PACS, and 3) the time recorded in the DICOM header of the first image. Each scan-start option was retrieved from a different database: respectively, 1) the RIS, 2) PACSHealth PACS monitoring (PACSHealth, LLC, Scottsdale, AZ), and 3) custom DICOM header parsing software (designed for dose monitoring).
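Once a scan-start indicator is chosen, the interval itself is simple arithmetic. The sketch below (hypothetical function and field names, not the study's software) computes the pre-scan interval from whichever indicator is available, ordered by the accuracy ranking discussed below.

```python
# Illustrative sketch, not the study's software: compute the pre-scan interval
# from whichever scan-start indicator is available, in order of preference.
from datetime import datetime
from typing import Optional

def pre_scan_minutes(check_in: datetime,
                     dicom_acq: Optional[datetime] = None,
                     pacs_arrival: Optional[datetime] = None,
                     ris_scan: Optional[datetime] = None) -> Optional[float]:
    """Minutes from patient check-in to the best available scan-start time."""
    # Preference order: DICOM header first, then PACS arrival, then the
    # technologist-recorded RIS time.
    start = dicom_acq or pacs_arrival or ris_scan
    if start is None:
        return None
    return (start - check_in).total_seconds() / 60.0

# Example: check-in at 08:05, first DICOM image acquired at 08:32.
print(pre_scan_minutes(datetime(2014, 6, 1, 8, 5),
                       dicom_acq=datetime(2014, 6, 1, 8, 32)))  # -> 27.0
```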

DISCUSSION

The least accurate indicator of start time was the RIS, as the recorded scan time could mark either the beginning or the end of the scan. The most accurate indicator was the DICOM header. For CT, study arrival in PACS was similar to the DICOM-indicated start time (because of the initial scout scan). Study arrival, however, was not a good start-time indicator for MR, where collection and processing time may delay the arrival of the first images significantly.
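For reference, one way to recover a DICOM-indicated start time from the first image is sketched below. It assumes the pydicom library and the standard Acquisition Date/Time tags; it is not the custom dose-monitoring parser used in this work, and some images may lack these tags.

```python
# Sketch only: read the acquisition timestamp from a single DICOM file.
# Assumes pydicom and that AcquisitionDate/AcquisitionTime are present.
from datetime import datetime
import pydicom

def dicom_start_time(path: str) -> datetime:
    """Return the acquisition timestamp of one DICOM image."""
    ds = pydicom.dcmread(path, stop_before_pixels=True)  # headers only
    date = ds.AcquisitionDate                      # e.g. "20140601"
    time = ds.AcquisitionTime.split(".")[0]        # drop fractional seconds
    return datetime.strptime(date + time, "%Y%m%d%H%M%S")
```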

Cite This Abstract

Zwart, C, Wang, K, Ermer, E, Hara, A, Wellnitz, C, Mitchell, J. Department Data Depot: Merging Databases for Composite Wait Time Performance Metrics. Radiological Society of North America 2014 Scientific Assembly and Annual Meeting, Chicago, IL. http://archive.rsna.org/2014/14010961.html