An AI-Assisted, Motion-Aware Robotic Framework for Adaptive Ultrasound-Guided High-Intensity Focused Ultrasound Therapy
| dc.contributor.author | Taghipour, Alaleh | |
| dc.date.accessioned | 2026-04-21T19:58:03Z | |
| dc.date.available | 2026-04-21T19:58:03Z | |
| dc.date.issued | 2026-04-21 | |
| dc.date.submitted | 2026-04-15 | |
| dc.description.abstract | High-intensity focused ultrasound (HIFU) is a non-invasive therapeutic modality capable of inducing localized tissue ablation through the precise delivery of focused acoustic energy. Ultrasound-guided HIFU (USgHIFU) offers real-time imaging, portability, and broad clinical accessibility; however, its effective deployment remains challenged by limited interpretability of ultrasound images, patient motion, and the difficulty of tightly integrating perception, planning, and robotic execution within a unified treatment workflow. This thesis presents an integrated, artificial intelligence-assisted (AI-assisted), robot-enabled framework for adaptive USgHIFU therapy. While HIFU has broader clinical applicability across multiple solid tumors, the present work focuses specifically on breast cancer therapy and therefore leverages breast ultrasound data for framework development and evaluation. In the first stage, open-access breast ultrasound images from the Breast Ultrasound Images (BUSI) dataset are processed using a foundation model–based segmentation approach, namely Segment Anything Model 2 (SAM2), to automatically delineate tumor regions. The resulting segmentation masks are converted into geometric boundary representations, from which interior target points are generated at fixed spatial resolution and ordered using a nearest-neighbour (NN) strategy. These targets are exported as trajectory files that are continuously monitored by a robotic control system. A Franka Emika Panda robot executes the planned trajectories using a position-based Cartesian controller, while a geometric safety projection mechanism enforces boundary-aware safety constraints derived directly from the segmented tumor geometry. 
Motion-aware logic is incorporated into the segmentation pipeline such that detected displacements exceeding a predefined threshold trigger re-segmentation and trajectory regeneration, enabling safe interruption and resumption of robotic execution. In the second stage, the framework is extended to physics-informed ablation planning and experimental validation using a Verasonics ultrasound system. Tumor boundary information obtained from the segmentation pipeline is used as input to high-intensity therapeutic ultrasound (HITU) acoustic simulations to determine optimal focal locations, lesion dimensions, focal depths, and sonication parameters that maximize coverage while minimizing spill-out. These simulation-derived parameters are exported as structured comma-separated values (CSV) files and used to dynamically configure Verasonics sonication settings during experiments. Real-time ultrasound frames acquired during ablation are continuously monitored by the segmentation system, enabling online assessment of ablation extent relative to the tumor boundary and providing a mechanism for early termination in the presence of boundary violations or excessive spill-out. Robotic execution is synchronized with sonication through dwell-based targeting, and volumetric tumor coverage is achieved by iteratively advancing the robot across multiple planar slices. Experimental results demonstrate accurate segmentation of ultrasound tumor images, reliable geometric boundary extraction, uniform interior coverage, and high-fidelity robotic trajectory tracking under safety constraints. Quantitative analysis of coverage and spill-out shows strong agreement between simulation predictions and experimental outcomes. 
Collectively, this work establishes a closed-loop, motion-aware, and experimentally validated framework that tightly couples AI-based ultrasound image segmentation, robotic control, and acoustic simulation, advancing toward practical autonomous and adaptive USgHIFU therapy systems. | |
| dc.identifier.uri | https://hdl.handle.net/10012/23028 | |
| dc.language.iso | en | |
| dc.pending | false | |
| dc.publisher | University of Waterloo | en |
| dc.title | An AI-Assisted, Motion-Aware Robotic Framework for Adaptive Ultrasound-Guided High-Intensity Focused Ultrasound Therapy | |
| dc.type | Master Thesis | |
| uws-etd.degree | Master of Applied Science | |
| uws-etd.degree.department | Mechanical and Mechatronics Engineering | |
| uws-etd.degree.discipline | Mechanical Engineering | |
| uws-etd.degree.grantor | University of Waterloo | en |
| uws-etd.embargo.terms | 1 year | |
| uws.contributor.advisor | Kwon, Hyock Ju | |
| uws.contributor.affiliation1 | Faculty of Engineering | |
| uws.peerReviewStatus | Unreviewed | en |
| uws.published.city | Waterloo | en |
| uws.published.country | Canada | en |
| uws.published.province | Ontario | en |
| uws.scholarLevel | Graduate | en |
| uws.typeOfResource | Text | en |
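
The abstract describes generating interior target points from the segmented tumor mask at a fixed spatial resolution and ordering them with a nearest-neighbour strategy before export as a trajectory file. A minimal sketch of that idea is below; it is an illustrative re-implementation, not the thesis code, and the grid spacing, mask shape, and function name `plan_targets` are assumptions for the demo.

```python
import numpy as np

def plan_targets(mask, step=8):
    """Sample interior points of a binary tumor mask on a regular grid
    of spacing `step` pixels, then order them with a greedy
    nearest-neighbour pass (illustrative sketch, not the thesis code)."""
    ys, xs = np.nonzero(mask)
    # Keep only interior pixels that fall on the sampling grid.
    pts = [(int(y), int(x)) for y, x in zip(ys, xs)
           if y % step == 0 and x % step == 0]
    if not pts:
        return []
    # Greedy nearest-neighbour ordering, starting from the first point.
    ordered = [pts.pop(0)]
    while pts:
        cy, cx = ordered[-1]
        i = min(range(len(pts)),
                key=lambda k: (pts[k][0] - cy) ** 2 + (pts[k][1] - cx) ** 2)
        ordered.append(pts.pop(i))
    return ordered

# Toy demo: a small square "tumor" region in a 32x32 mask.
mask = np.zeros((32, 32), dtype=bool)
mask[8:24, 8:24] = True
traj = plan_targets(mask, step=8)  # ordered (row, col) target points
```

An exported trajectory file would then serialize `traj` (e.g. as CSV waypoints) for the robotic control system to monitor and execute.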
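The abstract also describes motion-aware logic in which displacements exceeding a predefined threshold trigger re-segmentation, trajectory regeneration, and safe interruption of robotic execution. The sketch below shows one simple way such a check could work, using the centroid shift between successive segmentation masks; the threshold value, centroid-based displacement measure, and function names are assumptions, not details from the thesis.

```python
import numpy as np

# Illustrative displacement threshold in pixels; the thesis specifies a
# predefined threshold whose value is not stated in the abstract.
MOTION_THRESHOLD = 3.0

def centroid(mask):
    """Centroid of a binary segmentation mask in (row, col) pixels."""
    ys, xs = np.nonzero(mask)
    return np.array([ys.mean(), xs.mean()])

def motion_exceeded(prev_mask, new_mask, threshold=MOTION_THRESHOLD):
    """Return True when the segmented tumor centroid has shifted farther
    than `threshold`, signalling that re-segmentation and trajectory
    regeneration should run and robotic execution should pause."""
    disp = np.linalg.norm(centroid(new_mask) - centroid(prev_mask))
    return disp > threshold

# Toy demo: a mask displaced by 5 px should trip a 3 px threshold.
frame_a = np.zeros((16, 16), dtype=bool)
frame_a[4:8, 4:8] = True
frame_b = np.roll(frame_a, 5, axis=0)  # simulate patient motion
moved = motion_exceeded(frame_a, frame_b)
```

In a closed-loop deployment, a `True` result would pause the Cartesian controller, re-run segmentation on the latest frame, and regenerate the trajectory before resuming.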