Abstract Archives of the RSNA, 2014
SSE13-01
Reducing RECIST (Response Evaluation Criteria in Solid Tumors) Index Lesion Selection Variability through Radiology-driven Candidate Selection
Scientific Papers
Presented on December 1, 2014
Presented as part of SSE13: Informatics (Workflow and Displays)
Merlijn Sevenster PhD, Abstract Co-Author: Employee, Koninklijke Philips NV
Paul J. Chang MD, Presenter: Co-founder, Stentor/Koninklijke Philips Electronics NV; Technical Advisory Board, Amirsys, Inc; Research Contracts, Koninklijke Philips NV; Medical Advisory Board, lifeIMAGE Inc; Medical Advisory Board, Merge Healthcare Incorporated
Jeffrey Bozeman BA, Abstract Co-Author: Nothing to Disclose
Andrea Cowhy BS, Abstract Co-Author: Nothing to Disclose
Piotr Roman Obara MD, Abstract Co-Author: Nothing to Disclose
Joost Peters, Abstract Co-Author: Employee, Koninklijke Philips NV
Manish Sharma MD, Abstract Co-Author: Nothing to Disclose
Will Trost BA, Abstract Co-Author: Nothing to Disclose
Lauren Wall MS, Abstract Co-Author: Nothing to Disclose
Charles William Westin MD, Abstract Co-Author: Nothing to Disclose
RECIST guidelines standardize treatment response assessment in oncology trial patients by standardizing comparison of index lesions. Clinical research associates (CRAs) typically select index lesions described in a narrative radiology report. We describe a Lesion Dashboard (LD) tool that presents radiologist-identified index candidates as selectable items, leveraging electronic workflow orchestration. The LD tool presents these index lesions in a longitudinal view of patient disease progression. We measured agreement in index lesion selection, comparing the traditional manual workflow with the LD workflow.
RECIST worksheets of 9 trial patients were obtained under IRB 13-0397. Images and reports of each worksheet's first two cycles were obtained and deidentified.
Part 1: Mimicking the traditional workflow, 4 CRAs reviewed the 2 deidentified radiology reports for each patient, selected index lesions from the candidates described, and transcribed these into a blank RECIST worksheet.
Part 2: Radiologists measured and annotated on the images all candidates discussed in the reports. Annotation data were persisted in an AIM-consistent (Annotation and Image Markup) format and electronically sent to LD. 3 oncologists and 3 CRAs subsequently selected and marked candidates as index lesions in LD and completed RECIST response assessment consistent with the anticipated use of LD.
The F-measure statistic was used to quantify agreement between selections.
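The pairwise F-measure between two readers' selections can be sketched as follows (a minimal illustration, not the study's code; the lesion identifiers are hypothetical):

```python
# Illustrative sketch: pairwise F-measure agreement between two readers'
# index lesion selections, computed as the harmonic mean of precision
# and recall with one reader's selection treated as the reference.

def f_measure(selection_a, selection_b):
    """F-measure (harmonic mean of precision and recall) between two sets."""
    a, b = set(selection_a), set(selection_b)
    overlap = len(a & b)
    if overlap == 0:
        return 0.0
    precision = overlap / len(b)   # fraction of b's selections also in a
    recall = overlap / len(a)      # fraction of a's selections also in b
    return 2 * precision * recall / (precision + recall)

# Example: two readers agree on 3 of 4 candidate lesions (hypothetical IDs).
reader_1 = {"liver-seg2", "lung-RUL", "node-paraaortic", "adrenal-L"}
reader_2 = {"liver-seg2", "lung-RUL", "node-paraaortic"}
print(round(f_measure(reader_1, reader_2), 3))  # 0.857
```

With 3 of 4 lesions shared, precision is 1.0 and recall is 0.75, giving an F-measure of about 0.857; full agreement yields 1.0.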
Part 1: Agreement between CRAs for selecting index lesions was 75%. Part 2: LD contained 42 candidates. Participants selected 94% of LD candidates as index lesions (oncologists: 92%; CRAs: 97%). 22% of CRA-selected index lesions in part 1 were not candidate lesions in LD. Agreement was 98% among CRAs, 94% among oncologists, and 96% across all participants.
Substantial variability exists in index lesion selection by end users in the traditional workflow; it is nearly eliminated with LD, as users generally comply with the radiologist's index lesion suggestions. By consuming computer-interpretable data generated upstream, LD leverages the radiologist's expertise in the oncology workflow.
Oncology index lesion selection variability is significant under traditional methods. Electronic workflow orchestration can reduce this variability and better leverage the radiologist's expertise and relevance in oncologic patient management.
Sevenster M, Chang P, Bozeman J, Cowhy A, Obara P, Peters J, Sharma M, Trost W, Wall L, Westin C. Reducing RECIST (Response Evaluation Criteria in Solid Tumors) Index Lesion Selection Variability through Radiology-driven Candidate Selection. Radiological Society of North America 2014 Scientific Assembly and Annual Meeting, Chicago IL.
http://archive.rsna.org/2014/14019368.html