An AI-Assisted, Motion-Aware Robotic Framework for Adaptive Ultrasound-Guided High-Intensity Focused Ultrasound Therapy

dc.contributor.author: Taghipour, Alaleh
dc.date.accessioned: 2026-04-21T19:58:03Z
dc.date.available: 2026-04-21T19:58:03Z
dc.date.issued: 2026-04-21
dc.date.submitted: 2026-04-15
dc.description.abstract: High-intensity focused ultrasound (HIFU) is a non-invasive therapeutic modality capable of inducing localized tissue ablation through the precise delivery of focused acoustic energy. Ultrasound-guided HIFU (USgHIFU) offers real-time imaging, portability, and broad clinical accessibility; however, its effective deployment remains challenged by the limited interpretability of ultrasound images, patient motion, and the difficulty of tightly integrating perception, planning, and robotic execution within a unified treatment workflow. This thesis presents an integrated, Artificial Intelligence-assisted (AI-assisted) and robot-enabled framework for adaptive USgHIFU therapy. While HIFU has broader clinical applicability across multiple solid tumors, the present work focuses specifically on breast cancer therapy and therefore leverages breast ultrasound data for framework development and evaluation. In the first stage, open-access breast ultrasound images from the Breast Ultrasound Images (BUSI) dataset are processed using a foundation model–based segmentation approach, namely the Segment Anything Model 2 (SAM2), to automatically delineate tumor regions. The resulting segmentation masks are converted into geometric boundary representations, from which interior target points are generated at a fixed spatial resolution and ordered using a nearest-neighbour (NN) strategy. These targets are exported as trajectory files that are continuously monitored by a robotic control system. A Franka Emika Panda robot executes the planned trajectories using a position-based Cartesian controller, while a geometric safety projection mechanism enforces boundary-aware safety constraints derived directly from the segmented tumor geometry. Motion-aware logic is incorporated into the segmentation pipeline such that detected displacements exceeding a predefined threshold trigger re-segmentation and trajectory regeneration, enabling safe interruption and resumption of robotic execution.
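The mask-to-trajectory step described above (interior target points on a fixed-resolution grid, ordered by a greedy nearest-neighbour pass) can be sketched as follows. This is a minimal illustration, not the thesis implementation: the function name `plan_targets`, the grid `spacing` parameter, and the greedy NN variant are assumptions made for the example.

```python
import numpy as np

def plan_targets(mask: np.ndarray, spacing: int = 4) -> np.ndarray:
    """Generate interior target points at a fixed grid spacing inside a
    binary segmentation mask and order them with a greedy NN pass.
    (Illustrative sketch; not the thesis implementation.)"""
    # Sample the mask on a regular grid of the given spacing and keep
    # only points falling inside the segmented region.
    ys, xs = np.mgrid[0:mask.shape[0]:spacing, 0:mask.shape[1]:spacing]
    inside = mask[ys, xs] > 0
    pts = np.stack([ys[inside], xs[inside]], axis=1).astype(float)

    # Greedy nearest-neighbour ordering: start at the first point and
    # repeatedly hop to the closest unvisited target.
    order = [0]
    remaining = set(range(1, len(pts)))
    while remaining:
        cur = pts[order[-1]]
        nxt = min(remaining, key=lambda i: float(np.sum((pts[i] - cur) ** 2)))
        order.append(nxt)
        remaining.remove(nxt)
    return pts[order]
```

The ordered array would then be written out as a trajectory file (e.g., CSV of waypoints) for the robot controller to monitor, as the abstract describes.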
In the second stage, the framework is extended to physics-informed ablation planning and experimental validation using a Verasonics ultrasound system. Tumor boundary information obtained from the segmentation pipeline is used as input to high-intensity therapeutic ultrasound (HITU) acoustic simulations to determine optimal focal locations, lesion dimensions, focal depths, and sonication parameters that maximize coverage while minimizing spill-out. These simulation-derived parameters are exported as structured comma-separated values (CSV) files and used to dynamically configure Verasonics sonication settings during experiments. Real-time ultrasound frames acquired during ablation are continuously monitored by the segmentation system, enabling online assessment of ablation extent relative to the tumor boundary and providing a mechanism for early termination in the presence of boundary violations or excessive spill-out. Robotic execution is synchronized with sonication through dwell-based targeting, and volumetric tumor coverage is achieved by iteratively advancing the robot across multiple planar slices. Experimental results demonstrate accurate segmentation of ultrasound tumor images, reliable geometric boundary extraction, uniform interior coverage, and high-fidelity robotic trajectory tracking under safety constraints. Quantitative analysis of coverage and spill-out shows strong agreement between simulation predictions and experimental outcomes. Collectively, this work establishes a closed-loop, motion-aware, and experimentally validated framework that tightly couples AI-based ultrasound image segmentation, robotic control, and acoustic simulation, advancing toward practical autonomous and adaptive USgHIFU therapy systems.
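The coverage and spill-out quantities analyzed above admit a simple mask-based formulation. The definitions below (coverage as the ablated fraction of the tumor region, spill-out as the ablated fraction falling outside the tumor boundary) are plausible assumptions for illustration; the thesis's exact metric definitions may differ.

```python
import numpy as np

def coverage_and_spillout(tumor: np.ndarray, ablated: np.ndarray):
    """Compute coverage and spill-out from two binary masks.
    coverage  = |ablated ∩ tumor| / |tumor|
    spill-out = |ablated \ tumor| / |ablated|
    (Assumed definitions for illustration only.)"""
    tumor = tumor > 0
    ablated = ablated > 0
    coverage = np.logical_and(tumor, ablated).sum() / max(tumor.sum(), 1)
    spillout = np.logical_and(~tumor, ablated).sum() / max(ablated.sum(), 1)
    return float(coverage), float(spillout)
```

Under these definitions, an ideal treatment drives coverage toward 1 while holding spill-out near 0, which matches the planning objective stated in the abstract (maximize coverage while minimizing spill-out).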
dc.identifier.uri: https://hdl.handle.net/10012/23028
dc.language.iso: en
dc.pending: false
dc.publisher: University of Waterloo
dc.title: An AI-Assisted, Motion-Aware Robotic Framework for Adaptive Ultrasound-Guided High-Intensity Focused Ultrasound Therapy
dc.type: Master Thesis
uws-etd.degree: Master of Applied Science
uws-etd.degree.department: Mechanical and Mechatronics Engineering
uws-etd.degree.discipline: Mechanical Engineering
uws-etd.degree.grantor: University of Waterloo
uws-etd.embargo.terms: 1 year
uws.contributor.advisor: Kwon, Hyock Ju
uws.contributor.affiliation1: Faculty of Engineering
uws.peerReviewStatus: Unreviewed
uws.published.city: Waterloo
uws.published.country: Canada
uws.published.province: Ontario
uws.scholarLevel: Graduate
uws.typeOfResource: Text

Files

Original bundle
Name: Taghipour_Alaleh.pdf
Size: 40.44 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 6.4 KB
Description: Item-specific license agreed upon to submission