RSNA 2016

Abstract Archives of the RSNA, 2016


SSM12-04

Collaborative Robotic Ultrasound: Towards Clinical Application

Wednesday, Nov. 30 3:30PM - 3:40PM Room: S403A



FDA Discussions may include off-label uses.

Benjamin Frisch, PhD, Munich, Germany (Presenter) Nothing to Disclose
Oliver Zettinig, Munich, Germany (Abstract Co-Author) Nothing to Disclose
Bernhard Furst, Baltimore, MD (Abstract Co-Author) Nothing to Disclose
Salvatore Virga, Garching, Germany (Abstract Co-Author) Nothing to Disclose
Christoph Hennersperger, Munich, Germany (Abstract Co-Author) Nothing to Disclose
Nassir Navab, PhD, Garching, Germany (Abstract Co-Author) Nothing to Disclose
CONCLUSION

Collaborative medical robotics paves the way for advanced and complex diagnostic US examinations and US-guided interventions, whose clinical value must still be demonstrated in patient trials.

Background

Ultrasound (US) is a compact, widely available and customizable non-radiative imaging modality. However, image quality and interpretation depend on the operator's skill, leading to high interobserver variability, and 3D acquisitions are limited to a comparatively narrow field of view. We introduce a US transducer mounted on a lightweight robotic arm that collaborates with the operator for high-quality 3D image acquisition and interventional support. The arm is equipped with torque sensors in all joints to maintain a constant contact force of the transducer on the skin and to detect collisions with the environment. External structured light cameras (SLCs) monitor the positions of the patient, operators and tools to plan or dynamically replan the arm's trajectory. The inclusion of confidence maps allows for real-time, automatic optimization of US image quality.
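The abstract does not specify the control law used to hold the contact force; as a rough illustration only, one step of a simple proportional force servo along the probe axis could look like the sketch below (the target force, gain, and step clamp are hypothetical values, not taken from this work):

```python
import numpy as np

def force_servo_step(measured_force, target_force=5.0, gain=0.002, max_step=0.001):
    """Return a small displacement (in meters) of the probe along its axis
    that drives the measured contact force toward the target.

    measured_force, target_force: contact force in newtons (from joint torques).
    gain: proportional gain in m/N (hypothetical tuning).
    max_step: per-cycle displacement clamp (1 mm) for safety.
    """
    error = target_force - measured_force
    # Positive error (too little force) pushes the probe toward the skin;
    # the clamp bounds how far the arm may move in a single control cycle.
    dz = np.clip(gain * error, -max_step, max_step)
    return float(dz)
```

In practice such behavior would typically be realized through the arm's built-in impedance control rather than an explicit outer loop; this sketch only illustrates the idea of servoing on the sensed force.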

Evaluation

The first demonstrator, for vascular imaging, recognizes the patient's position with the SLCs to register an anatomical atlas and automatically guides the US transducer to the aorta. Confidence maps optimize the acquisition of 3D US images, enabling successful aortic diameter measurements in healthy volunteers. A second demonstrator addresses a neurosurgical application in which the spine is monitored for targeted needle insertion. We acquire 3D images of the region of interest on a spine model embedded in a dedicated gelatin-agar phantom. A needle guide rigidly mounted to the US transducer allows automatic alignment of the needle with a pre-interventional target trajectory defined in a registered CT volume. After insertion, correct placement in the facet joint was validated with cone-beam CT in four phantom experiments.
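The needle alignment amounts to chaining rigid transforms between the registered CT volume, the robot base, and the calibrated needle guide. A minimal sketch with 4x4 homogeneous matrices follows; the frame names and the calibration chain are illustrative assumptions, not details given in the abstract:

```python
import numpy as np

def target_probe_pose(T_world_ct, T_ct_traj, T_probe_guide):
    """Compute the transducer pose that aligns the needle guide with a
    planned trajectory.

    T_world_ct:    registration of the CT volume in the robot (world) frame.
    T_ct_traj:     planned needle trajectory expressed in CT coordinates.
    T_probe_guide: fixed calibration of the guide w.r.t. the transducer.

    All inputs are 4x4 homogeneous transforms. We solve
        T_world_traj = T_world_probe @ T_probe_guide
    for T_world_probe, so that the guide frame coincides with the trajectory.
    """
    T_world_traj = T_world_ct @ T_ct_traj
    return T_world_traj @ np.linalg.inv(T_probe_guide)
```

The returned pose would then be sent to the arm's motion planner; by construction, composing it with the guide calibration reproduces the planned trajectory in the world frame.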

Discussion

Both demonstrators confirm the feasibility of robotic US. The combination of situational awareness, provided by joint sensors and external cameras, with real-time image optimization yields a system that supports the operator during both imaging and intervention.