RSNA 2005 

Abstract Archives of the RSNA, 2005


SSE17-05

MoniQA (Monitor Quality Assurance): Initial Experience with an Integrated Software Tool of Daily Renewed Test Images to Monitor the Quality of Viewing Stations

Scientific Papers

Presented on November 28, 2005
Presented as part of SSE17: Physics (Image Displays, Interfaces)

Participants

Jurgen Jacobs, Presenter: Nothing to Disclose
Tom Deprez, Abstract Co-Author: Nothing to Disclose
Guy Marchal MD, PhD, Abstract Co-Author: Nothing to Disclose
Hilde Bosmans PhD, Abstract Co-Author: Nothing to Disclose

PURPOSE

We have developed a complete environment (MoniQA) for quality control of all types of radiological screen devices, including dedicated viewing stations for digital mammography. We report on the practical implementation of the software tools and on a first validation on a large series of different monitors in a fully digital, filmless radiology department.

METHOD AND MATERIALS

MoniQA was developed in Java. It uses a modular approach in which test patterns are packaged as plugins. This allows MoniQA to present any available test pattern, including a dedicated pattern that we designed for daily quality control. Our pattern allows evaluation of luminance, gradient, resolution, and geometric distortion. Certain parts of the image are generated randomly to prevent memorization; examples are the positions of the different checks and the characters that must be discriminated from the background. Validation of the software tool was performed in our radiology department on 68 medical screen devices with different monitor setups. Radiologists scored the patterns on a daily basis, and we logged the evaluation time for every part of the test pattern. All scores are automatically sent to a central computer, where two other newly developed software tools, QAMPR and QASPR, analyze them as a function of time for centralized quality control of radiological data. We compared the scores of different monitors and assessed the learning curve and the reproducibility of the scores. An evaluation of the MoniQA pattern against the AAPM TG18 QC pattern is ongoing.
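The plugin approach and the randomized pattern elements described above can be sketched as follows. This is a minimal illustrative sketch, not the actual MoniQA code: the interface and class names (`TestPatternPlugin`, `RandomCharPattern`) and the character-grid representation are assumptions chosen for clarity.

```java
import java.util.Arrays;
import java.util.Random;

// Hypothetical sketch of the plugin idea: each test pattern implements a
// common interface, and parts of the daily pattern are randomized so that
// observers cannot score from memory.
interface TestPatternPlugin {
    String name();
    char[][] render(Random rng);  // pattern rendered as a character grid for illustration
}

class RandomCharPattern implements TestPatternPlugin {
    private static final String CANDIDATES = "CEFHLP";  // characters to discriminate
    private final int size;

    RandomCharPattern(int size) { this.size = size; }

    public String name() { return "random-characters"; }

    // Place one character at a random position; a fresh seed each day
    // yields a fresh layout, so yesterday's answer is useless today.
    public char[][] render(Random rng) {
        char[][] grid = new char[size][size];
        for (char[] row : grid) Arrays.fill(row, '.');
        int r = rng.nextInt(size), c = rng.nextInt(size);
        grid[r][c] = CANDIDATES.charAt(rng.nextInt(CANDIDATES.length()));
        return grid;
    }
}

public class MoniQASketch {
    public static void main(String[] args) {
        TestPatternPlugin plugin = new RandomCharPattern(8);
        char[][] today = plugin.render(new Random());  // new random layout per run
        for (char[] row : today) System.out.println(new String(row));
    }
}
```

In this sketch, adding a new check (e.g., a luminance ramp or a resolution target) would mean adding another `TestPatternPlugin` implementation, without changing the presentation framework.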

RESULTS

Our software tools were successfully tested in a medical environment and proved to be stable. After a short learning curve (< 1 week), the evaluation procedure was well accepted by the personnel. On average, radiologists needed approximately 300 s to evaluate two screen devices. As expected, the results differed between monitor types, and known artefacts were detected in the data.

CONCLUSION

Our preliminary results demonstrate that we can appropriately monitor the quality of medical screens on a daily basis for a large series of monitors spread over different locations.

Cite This Abstract

Jacobs, J, Deprez, T, Marchal, G, Bosmans, H, MoniQA (Monitor Quality Assurance): Initial Experience with an Integrated Software Tool of Daily Renewed Test Images to Monitor the Quality of Viewing Stations. Radiological Society of North America 2005 Scientific Assembly and Annual Meeting, November 27 - December 2, 2005, Chicago, IL. http://archive.rsna.org/2005/4411930.html