Purpose: Real-time volumetric ultrasound (4D-US) has shown high potential for motion compensation tasks. However, a major perceived drawback of US guidance to date is the reliance on manual probe positioning before and during treatment. Robotic assistance can potentially overcome this limitation. The aim of this study was to assess the feasibility of robotic probe placement and dynamic probe adjustment in vivo during prolonged imaging sessions.
Methods: The system consists of a force-sensitive robot with 7 degrees of freedom (iiwa, KUKA) and a 4D-US system capable of real-time data access. In this study, an Epiq7 station (Philips) was used with a matrix array probe (X6-1) fixed to the robot’s end-effector. Five healthy volunteers (male, age 27-38y) received liver and transabdominal prostate scans during free breathing, each lasting 30min. The probe was initially positioned via real-time remote control with a predefined contact force of 10N. Afterwards, the probe position was automatically adjusted to follow body surface motion via force control. In case of large motion or target drift, the probe was repositioned using force-compensated remote control to simulate a realistic treatment scenario.
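To illustrate the force-control behavior described above, the following is a minimal sketch of an axial force-regulation loop in Python. The setpoint matches the 10N contact force from the Methods; the gain, velocity limit, loop rate, and the robot/sensor calls (read_axial_force, command_axial_velocity) are hypothetical placeholders for illustration only, not the actual KUKA iiwa interface or the controller used in this study.

    import time

    FORCE_SETPOINT_N = 10.0   # target contact force along the probe axis (from Methods)
    GAIN_MM_PER_N = 0.5       # proportional gain, mm/s per N of force error (assumed)
    MAX_SPEED_MM_S = 20.0     # safety limit on axial probe velocity (assumed)
    CYCLE_S = 0.01            # control loop period, 100 Hz (assumed)

    def force_control_step(measured_force_n):
        """Return an axial probe velocity that drives contact force toward the setpoint."""
        error = FORCE_SETPOINT_N - measured_force_n   # positive error -> press further in
        velocity = GAIN_MM_PER_N * error
        return max(-MAX_SPEED_MM_S, min(MAX_SPEED_MM_S, velocity))

    # Hypothetical usage with placeholder robot/sensor calls:
    # while scanning:
    #     f = robot.read_axial_force()                        # placeholder call
    #     robot.command_axial_velocity(force_control_step(f)) # placeholder call
    #     time.sleep(CYCLE_S)

In such a proportional scheme, the probe advances when the measured force falls below the setpoint and retracts when it exceeds it, which is one simple way to keep contact stable under breathing-induced surface motion.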
Results: Probe contact was maintained without interruption over the entire scan duration in all 10 sessions; the system compensated for slow breathing motion as well as fast, erratic motion of the body surface. The mean force measured along the probe’s main axis was 10.0±0.3N over all trials, with values ranging from 3.9-13.2N (liver) and 4.4-21.3N (prostate). Forces >11N occurred for only 0.3% of the scan time. Relative probe motion was more pronounced in the liver, with a mean variability of 2.4mm and a mean maximum excursion of 11.2mm, compared to 0.6mm and 7.0mm for the prostate.
Conclusion: Robotic US with dynamic force control enables stable, long-term imaging of regions affected by motion. The system is suitable for motion compensation as well as diagnostic tasks.
Funding Support, Disclosures, and Conflict of Interest: Parts of this work were supported by the German Federal Ministry of Education and Research (grant no. 13GW0228B).