Abstract Archives of the RSNA, 2011
David Joseph Vining MD, Presenter: Royalties, Bracco Group
Naveen Garg MD, Abstract Co-Author: Nothing to Disclose
Mia Kathleen Markey PhD, Abstract Co-Author: Researcher, Hologic, Inc
Tejaswini Ganapathi MS, Abstract Co-Author: Nothing to Disclose
Radu Rosu, Abstract Co-Author: Nothing to Disclose
Mihai Jurca, Abstract Co-Author: Nothing to Disclose
Iulian Aghenitei, Abstract Co-Author: Nothing to Disclose
Understanding and exploiting gaze-speech relationships to capture a radiologist's points-of-interest and corresponding structured data shows promise for structured reporting that will accommodate a radiologist's natural workflow.
We previously developed a multimedia structured reporting system, known as ViSion, which captures key images from a PACS and requires a radiologist to manually label image findings with anatomy and pathology terms. To improve efficiency, we explored the use of eye-tracking technology to capture a radiologist's points-of-interest and natural language processing (NLP) to label the findings with the appropriate metadata.
We replaced manual data input with eye-tracking technology to unobtrusively record a radiologist's gaze points and dwell times, and coupled that with NLP of voice dictation. We found that deliberate fixations corresponding to areas-of-interest have longer dwell times than random fixations. We also discovered that deliberate fixations occur earlier than the corresponding speech but with a significant time overlap between the two in our controlled setting. Hence, we exploited the time overlap to create a data input mechanism for our structured reporting system.
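The matching procedure described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes fixations and dictated terms arrive with timestamps, filters out short (random) fixations with a hypothetical dwell-time threshold, and pairs each spoken term with the deliberate fixation whose time window overlaps it most, extending the fixation window forward because gaze was found to lead speech. All names, thresholds, and data structures here are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float          # gaze position on the image (pixels)
    y: float
    start: float      # fixation onset (seconds)
    end: float        # fixation offset (seconds)

    @property
    def dwell(self) -> float:
        return self.end - self.start

@dataclass
class SpokenTerm:
    text: str         # anatomy/pathology term extracted by NLP
    start: float      # utterance onset (seconds)
    end: float        # utterance offset (seconds)

# Hypothetical parameters; the actual values would be tuned empirically.
DWELL_THRESHOLD = 0.4   # minimum dwell time for a "deliberate" fixation
MAX_LEAD = 2.0          # how far gaze may precede the matching speech

def overlap(f: Fixation, t: SpokenTerm) -> float:
    """Shared time between a fixation and a spoken term, with the
    fixation window extended forward since gaze tends to lead speech."""
    return max(0.0, min(f.end + MAX_LEAD, t.end) - max(f.start, t.start))

def label_points_of_interest(fixations, terms):
    """Pair each dictated term with the deliberate fixation it overlaps most,
    yielding (gaze point, term) tuples for the structured report."""
    deliberate = [f for f in fixations if f.dwell >= DWELL_THRESHOLD]
    pairs = []
    for t in terms:
        best = max(deliberate, key=lambda f: overlap(f, t), default=None)
        if best is not None and overlap(best, t) > 0:
            pairs.append(((best.x, best.y), t.text))
    return pairs
```

In this sketch, a short glance is discarded by the dwell-time filter, while a long fixation followed shortly by a dictated term is linked to that term, producing a labeled point-of-interest without any manual input.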
A major hurdle to structured reporting is the reliance upon manual input for the selection of data elements. Radiologists are accustomed to simply looking at images and describing findings. A man-machine interface utilizing eye-tracking and NLP can fit more naturally into a radiologist's workflow. Both technologies are nascent and will undergo significant advances in the future, but the potential for this man-machine interface is demonstrated in our application.
Use of Eye-Tracking Technology and Natural Language Processing for Data Input into a Multimedia Structured Report. Radiological Society of North America 2011 Scientific Assembly and Annual Meeting, November 26 – December 2, 2011, Chicago, IL. http://archive.rsna.org/2011/11008390.html