An AI-Assisted, Motion-Aware Robotic Framework for Adaptive Ultrasound-Guided High-Intensity Focused Ultrasound Therapy
Advisor: Kwon, Hyock Ju
Publisher: University of Waterloo
Abstract
High-intensity focused ultrasound (HIFU) is a non-invasive therapeutic modality capable
of inducing localized tissue ablation through the precise delivery of focused acoustic energy.
Ultrasound-guided HIFU (USgHIFU) offers real-time imaging, portability, and broad
clinical accessibility; however, its effective deployment remains challenged by limited interpretability
of ultrasound images, patient motion, and the difficulty of tightly integrating
perception, planning, and robotic execution within a unified treatment workflow.
This thesis presents an integrated, artificial intelligence–assisted (AI-assisted) and robot-enabled
framework for adaptive USgHIFU therapy. While HIFU has broader clinical applicability
across multiple solid tumors, the present work focuses specifically on breast
cancer therapy and therefore leverages breast ultrasound data for framework development
and evaluation. In the first stage, open-access breast ultrasound images from the Breast
Ultrasound Images (BUSI) dataset are processed using a foundation model–based segmentation
approach, namely Segment Anything Model 2 (SAM2), to automatically delineate
tumor regions. The resulting segmentation masks are converted into geometric boundary
representations, from which interior target points are generated at fixed spatial resolution
and ordered using a nearest-neighbour (NN) strategy. These targets are exported as trajectory
files that are continuously monitored by a robotic control system. A Franka Emika
Panda robot executes the planned trajectories using a position-based Cartesian controller,
while a geometric safety projection mechanism enforces boundary-aware safety constraints
derived directly from the segmented tumor geometry. Motion-aware logic is incorporated
into the segmentation pipeline such that detected displacements exceeding a predefined
threshold trigger re-segmentation and trajectory regeneration, enabling safe interruption
and resumption of robotic execution.
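The target-planning step described above — sampling interior points at a fixed spatial resolution inside the segmented boundary and ordering them with a greedy nearest-neighbour strategy — can be sketched as follows. This is a minimal illustration using a toy polygonal boundary and a ray-casting inside test; the actual pipeline derives its boundary from SAM2 masks, and all function names here are hypothetical.

```python
import math

def point_in_polygon(pt, poly):
    """Ray-casting test: is pt strictly inside the closed polygon poly?"""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def plan_targets(boundary, spacing):
    """Generate interior targets on a fixed-resolution grid, then order
    them greedily by nearest neighbour starting from the first target."""
    xs = [p[0] for p in boundary]
    ys = [p[1] for p in boundary]
    # Offset the grid by half a spacing so samples avoid the boundary itself.
    targets = []
    x = min(xs) + spacing / 2
    while x < max(xs):
        y = min(ys) + spacing / 2
        while y < max(ys):
            if point_in_polygon((x, y), boundary):
                targets.append((x, y))
            y += spacing
        x += spacing
    if not targets:
        return []
    ordered = [targets.pop(0)]
    while targets:  # greedy nearest-neighbour ordering
        nxt = min(targets, key=lambda p: math.dist(ordered[-1], p))
        targets.remove(nxt)
        ordered.append(nxt)
    return ordered

# Toy square "tumor" boundary, 10 mm on a side, sampled at 2.5 mm resolution.
boundary = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]
path = plan_targets(boundary, spacing=2.5)
```

On this square the planner yields a 4×4 interior grid and the greedy ordering snakes through it, so consecutive targets stay one grid step apart — the property that keeps robot travel between sonication points short.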
In the second stage, the framework is extended to physics-informed ablation planning
and experimental validation using a Verasonics ultrasound system. Tumor boundary
information obtained from the segmentation pipeline is used as input to high-intensity
therapeutic ultrasound (HITU) acoustic simulations to determine optimal focal locations,
lesion dimensions, focal depths, and sonication parameters that maximize coverage while
minimizing spill-out. These simulation-derived parameters are exported as structured
comma-separated values (CSV) files and used to dynamically configure Verasonics sonication
settings during experiments. Real-time ultrasound frames acquired during ablation
are continuously monitored by the segmentation system, enabling online assessment of
ablation extent relative to the tumor boundary and providing a mechanism for early termination
in the presence of boundary violations or excessive spill-out. Robotic execution is synchronized with sonication through dwell-based targeting, and volumetric tumor coverage
is achieved by iteratively advancing the robot across multiple planar slices.
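The handoff from simulation to experiment — exporting simulation-derived sonication parameters as CSV and using them to configure each dwell — can be sketched as below. The column names and values are illustrative assumptions only; the actual schema and the Verasonics programming interface are not reproduced here.

```python
import csv
import io

# Hypothetical CSV schema for simulation-derived sonication parameters;
# the real export format used with the Verasonics system may differ.
csv_text = """focus_x_mm,focus_y_mm,focal_depth_mm,dwell_s,voltage_v
-4.0,2.5,35.0,3.0,20.0
0.0,0.0,35.0,3.0,20.0
4.0,-2.5,35.0,3.0,20.0
"""

def load_sonication_plan(stream):
    """Parse one planar slice of sonication parameters into typed records."""
    return [{k: float(v) for k, v in row.items()}
            for row in csv.DictReader(stream)]

plan = load_sonication_plan(io.StringIO(csv_text))
for shot in plan:
    # In the real workflow these values would configure the Verasonics
    # sequence, and the robot would hold (dwell) at the focal location
    # for shot["dwell_s"] seconds before advancing to the next target.
    pass
```

Iterating such a plan per planar slice, and advancing the robot between slices, mirrors the dwell-based volumetric coverage strategy described above.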
Experimental results demonstrate accurate segmentation of ultrasound tumor images,
reliable geometric boundary extraction, uniform interior coverage, and high-fidelity robotic
trajectory tracking under safety constraints. Quantitative analysis of coverage and spill-out
shows strong agreement between simulation predictions and experimental outcomes. Collectively,
this work establishes a closed-loop, motion-aware, and experimentally validated
framework that tightly couples AI-based ultrasound image segmentation, robotic control,
and acoustic simulation, advancing toward practical autonomous and adaptive USgHIFU
therapy systems.