RSNA 2013 

Abstract Archives of the RSNA, 2013


SSE19-01

Integration of Automated Quality Control Using Image Classification into a CAD System for Bone Scan Lesion Detection

Scientific Formal (Paper) Presentations

Presented on December 2, 2013
Presented as part of SSE19: Nuclear Medicine (Quantitative Imaging)

Participants

Keith Woodrow Henkel MS, BEng, Presenter: Employee, MedQIA Imaging Core Laboratory
Matthew Sherman Brown PhD, Abstract Co-Author: Director, MedQIA Imaging Core Laboratory
Jonathan G. Goldin MBChB, PhD, Abstract Co-Author: Nothing to Disclose
Grace Kim MD, Abstract Co-Author: Nothing to Disclose
Katherine Yang, Abstract Co-Author: Employee, MedQIA Imaging Core Laboratory
Bharath Ramakrishna, Abstract Co-Author: Nothing to Disclose
Greg Chu, Abstract Co-Author: Nothing to Disclose
Richard Pais, Abstract Co-Author: Nothing to Disclose

PURPOSE

The aim of this research is to develop an automated bone scan image classifier for quality control as a pre-processing step prior to applying a CAD lesion detection system. As quantitative image analysis of bone scans becomes increasingly useful in clinical trials, so does the need to define bone scan quality in a way that predicts an image's usability for an automated lesion detection system.

RESULTS

Based on review of the CAD segmentation, 35.5% of the 833 images were usable. The test data set was split into images acquired on ADAC machines (n=30) and images acquired on other machines (n=803). To confirm the algorithm identified in the training set, its sensitivity and specificity were compared against the usability predictive power of the individual parameters.

Overall, manufacturer (ADAC vs. not ADAC) appeared to have low classification accuracy, but there was not enough data in the ADAC dataset to reach a firm conclusion. In the other-machine group, pixel spacing also showed low classification accuracy (sensitivity 0.889, specificity 0.702). The image type most commonly associated with usable images (ORIGINAL / PRIMARY / WHOLE BODY / EMISSION; Image Type is defined in DICOM Part 3: Information Object Definitions) had very high classification accuracy, with a sensitivity of 0.918 and a perfect specificity of 1.000 (no false positives). Image size (256 by 1024 pixels classified as usable, all others as requiring manual review) had a sensitivity of 0.968 and a specificity of 0.985 in predicting image usability, the highest sensitivity of any individual parameter.

The combination of manufacturer, image type, and image size provided the best criteria for identifying quality bone scans: a sensitivity of 0.973 and a perfect specificity of 1.000. Further classification by pixel spacing (the last step of the algorithm identified in the training data set) had no additional effect on sensitivity or specificity. Images incorrectly identified as usable (n=7) were in fact unusable because extravasation hindered anatomic segmentation or because of missing anatomy, with two notable exceptions: a pair of blood pool NM images.

Including all images performed on ADAC scanners, for which the sample size was too small to identify an association, 56 of 833 images (6.7%) would require review via a non-automated process to determine image usability.
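For reference, the sensitivity and specificity figures above follow the standard definitions over a 2×2 confusion table, with "positive" taken as CAD-usable. A minimal sketch; the counts in the usage line are illustrative only and are not the study's data:

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int) -> tuple:
    """Standard definitions: sensitivity = TP / (TP + FN),
    specificity = TN / (TN + FP). 'Positive' here means CAD-usable."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Illustrative counts only (not taken from the study):
sens, spec = sensitivity_specificity(tp=290, fn=9, tn=530, fp=0)
# A perfect specificity of 1.000 corresponds to zero false positives,
# i.e., no unusable image classified as usable.
```

Under these definitions, the "perfect specificity" reported for the image type criterion means the rule never labeled an unusable image as usable.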

CONCLUSION

The question of how to approach quality control of medical images for use in automated systems appears to have an answer in traditional image classification. Close regulation of consistent scanner use, of delay times from radiotracer injection to image acquisition, and of varying doses across time points, though not completely irrelevant, is less significant for identifying quality images than simple DICOM header values such as image type, image size, and manufacturer. Given that including pixel spacing as a classification step had no effect in the test set, further evidence will be sought to determine whether that parameter is redundant.

METHODS

Acceptable bone scan image quality was defined in terms of usability for processing by a CAD lesion detection system currently in use in clinical trials (see Brown et al. Computer-aided quantitative bone scan assessment of prostate cancer treatment response. Nuclear Medicine Communications. 33(4):384-394, April 2012). The CAD system's atlas-based segmentation of anatomic landmarks and normal bone has been observed to fail when non-standard and/or non-whole-body DICOM images are acquired, i.e., secondary screen captures, spot views, tomographic images, key images, etc. Such images are unusable by the CAD processing required in clinical trials and are therefore considered of unacceptable quality. From a training set of over 3,000 images (a Phase 2 multi-center clinical trial of a VEGFR-2 inhibitor in prostate cancer), four technical imaging parameters from the DICOM header were identified as features for classifying image quality as acceptable or not: (1) image size, (2) image type, (3) pixel spacing, and (4) manufacturer. In the training data set, the best correlations with usability were found by differentiating by manufacturer first, then by a combination of image type, image size, and pixel spacing. Additional factors such as radiotracer dosage and timing may affect bone scan quality, but they are not consistently available in the DICOM header and are prone to manual-entry error, and were therefore excluded from analysis. To test the imaging parameter features, 833 images from 25 patients across 23 sites were analyzed from a different multi-center Phase 2 prostate cancer clinical trial. Each image was processed by the CAD lesion detection system, and usability was determined as defined above. Sensitivity and specificity are reported to test the association between classified image quality and CAD usability.
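The header-based decision procedure described above can be sketched as a simple rule chain. The code below is an illustrative reconstruction from the parameters named in the abstract, not the authors' implementation; the exact matching values, the dictionary-based header representation, and the routing of ADAC images to manual review are assumptions:

```python
# Illustrative sketch of a DICOM-header usability classifier, assuming the
# parameters reported in the abstract (manufacturer checked first, then
# image type and image size). Keys follow DICOM attribute keywords.

USABLE_IMAGE_TYPE = ("ORIGINAL", "PRIMARY", "WHOLE BODY", "EMISSION")
USABLE_IMAGE_SIZE = (256, 1024)  # columns x rows, per the abstract

def classify_usability(header: dict) -> str:
    """Return 'usable', 'unusable', or 'manual review'."""
    # Manufacturer first, per the training-set algorithm; the ADAC sample
    # was too small for a firm rule, so those images go to manual review.
    if "ADAC" in header.get("Manufacturer", "").upper():
        return "manual review"
    # Image type: only the whole-body emission type was associated
    # with usable images.
    if tuple(header.get("ImageType", ())) != USABLE_IMAGE_TYPE:
        return "unusable"
    # Image size: 256 by 1024 pixels usable; anything else is flagged
    # for manual review.
    if (header.get("Columns"), header.get("Rows")) != USABLE_IMAGE_SIZE:
        return "manual review"
    return "usable"
```

In practice such a `header` dict could be populated from a DICOM dataset's `Manufacturer`, `ImageType`, `Columns`, and `Rows` attributes (e.g., via pydicom). Pixel spacing is omitted here because adding it changed neither sensitivity nor specificity in the test set.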

Cite This Abstract

Henkel, K, Brown, M, Goldin, J, Kim, G, Yang, K, Ramakrishna, B, Chu, G, Pais, R, Integration of Automated Quality Control Using Image Classification into a CAD System for Bone Scan Lesion Detection. Radiological Society of North America 2013 Scientific Assembly and Annual Meeting, December 1 - December 6, 2013, Chicago, IL. http://archive.rsna.org/2013/13016105.html