Show simple item record

dc.contributor.author: Laschowski, Brokoslaw
dc.date.accessioned: 2021-12-23 21:23:17 (GMT)
dc.date.available: 2021-12-23 21:23:17 (GMT)
dc.date.issued: 2021-12-23
dc.date.submitted: 2021-12-16
dc.identifier.uri: http://hdl.handle.net/10012/17816
dc.description.abstract: Robotic leg prostheses and exoskeletons can provide powered locomotor assistance to older adults and/or persons with physical disabilities. However, limitations in automated control and energy-efficient actuation have impeded their transition from research laboratories to real-world environments.

With regard to control, current automated locomotion mode recognition systems rely on mechanical, inertial, and/or neuromuscular sensors, which inherently have limited prediction horizons (i.e., analogous to walking blindfolded). Inspired by the human vision-locomotor control system, here a multi-generation environment sensing and classification system powered by computer vision and deep learning was developed to predict the oncoming walking environments prior to physical interaction, thereby allowing for more accurate and robust high-level control decisions. To support this initiative, the "ExoNet" database was developed: the largest and most diverse open-source dataset of wearable camera images of indoor and outdoor real-world walking environments, annotated using a novel hierarchical labelling architecture. Over a dozen state-of-the-art deep convolutional neural networks were trained and tested on ExoNet for large-scale image classification and automatic feature engineering. The benchmarked CNN architectures and their environment classification predictions were then quantitatively evaluated and compared using an operational metric called "NetScore", which balances classification accuracy against architectural and computational complexity (i.e., important for onboard real-time inference on mobile computing devices). Of the benchmarked CNN architectures, the EfficientNetB0 network achieved the highest test accuracy, VGG16 the fastest inference time, and MobileNetV2 the best NetScore. These comparative results can inform the optimal architecture design or selection depending on the desired performance of an environment classification system.

With regard to energetics, backdriveable actuators with energy regeneration can improve energy efficiency and extend battery-powered operating durations by converting some of the energy otherwise dissipated during negative mechanical work into electrical energy. However, the evaluation and control of these regenerative actuators have focused on steady-state level-ground walking. To encompass real-world community mobility more broadly, here an energy regeneration system, featuring mathematical and computational models of human and wearable robotic systems, was developed to simulate energy regeneration and storage during other locomotor activities of daily living, specifically stand-to-sit movements. Parameter identification and inverse dynamic simulations of subject-specific optimized biomechanical models were used to calculate the negative joint mechanical work and power while sitting down (i.e., the mechanical energy theoretically available for electrical energy regeneration). These joint mechanical energetics were then used to simulate a robotic exoskeleton being backdriven and regenerating energy. An empirical characterization of an exoskeleton was carried out using a joint dynamometer system and an electromechanical motor model to calculate actuator efficiency and to simulate energy regeneration and storage with the exoskeleton parameters. The performance calculations showed that regenerating electrical energy during stand-to-sit movements provides small improvements in energy efficiency and battery-powered operating durations.

In summary, this research involved the development and evaluation of environment classification and energy regeneration systems to improve the automated control and energy-efficient actuation of next-generation robotic leg prostheses and exoskeletons for real-world locomotor assistance.
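The abstract compares the benchmarked CNNs using the "NetScore" metric, which trades off classification accuracy against architectural and computational complexity. Below is a minimal sketch of one published formulation of that trade-off, Ω = 20·log10(a^κ / (p^β · m^γ)) with κ = 2 and β = γ = 0.5; the choice of units (accuracy in percent, parameters and multiply-accumulate operations in millions) and the example numbers are assumptions for illustration, not values reported in the thesis.

```python
import math

def netscore(accuracy, params_millions, macs_millions,
             kappa=2.0, beta=0.5, gamma=0.5):
    """NetScore-style metric: rewards accuracy, penalizes model size and compute.

    accuracy        -- top-1 test accuracy in percent (0-100)
    params_millions -- number of trainable parameters, in millions
    macs_millions   -- multiply-accumulate operations per inference, in millions
    """
    return 20.0 * math.log10(
        accuracy ** kappa / (params_millions ** beta * macs_millions ** gamma)
    )

# Hypothetical example values for illustration (not results from the thesis):
print(f"NetScore: {netscore(accuracy=73.0, params_millions=5.3, macs_millions=390.0):.1f}")
```

A higher NetScore indicates a more favourable balance of accuracy against parameter count and compute, which is why a compact network can outrank a slightly more accurate but much heavier one.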
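The energetics analysis described above computes the negative joint mechanical work available while sitting down and converts it to electrical energy through an actuator efficiency. The sketch below illustrates that general calculation under assumed inputs; the function name, the 60% conversion efficiency, and the synthetic joint trajectory are hypothetical placeholders, not parameters or results from the thesis.

```python
import numpy as np

def regenerated_energy(torque, angular_velocity, time, efficiency=0.6):
    """Estimate the electrical energy recovered while an actuator is backdriven.

    torque           -- joint torque samples (N*m)
    angular_velocity -- joint angular velocity samples (rad/s)
    time             -- sample times (s), monotonically increasing
    efficiency       -- assumed mechanical-to-electrical conversion efficiency

    Joint mechanical power is torque * angular velocity; only intervals of
    negative power (energy absorbed at the joint) are available for regeneration.
    """
    torque = np.asarray(torque, dtype=float)
    angular_velocity = np.asarray(angular_velocity, dtype=float)
    time = np.asarray(time, dtype=float)

    power = torque * angular_velocity
    negative_power = np.minimum(power, 0.0)  # keep only the negative-work portion

    # Trapezoidal integration of negative power over time gives negative work (J).
    negative_work = np.sum(0.5 * (negative_power[1:] + negative_power[:-1]) * np.diff(time))

    return efficiency * abs(negative_work)  # electrical energy recovered (J)

# Hypothetical stand-to-sit joint trajectory (illustrative values only).
t = np.linspace(0.0, 2.0, 200)
tau = -40.0 * np.sin(np.pi * t / 2.0)    # joint torque (N*m), illustrative
omega = 0.8 * np.sin(np.pi * t / 2.0)    # joint angular velocity (rad/s), illustrative
print(f"Recovered energy: {regenerated_energy(tau, omega, t):.1f} J")
```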
dc.language.iso: en
dc.publisher: University of Waterloo
dc.relation.uri: https://ieee-dataport.org/open-access/exonet-database-wearable-camera-images-human-locomotion-environments
dc.subject: rehabilitation
dc.subject: biomechatronics
dc.subject: computer vision
dc.subject: deep learning
dc.subject: assistive technology
dc.subject: exoskeletons
dc.subject: artificial intelligence
dc.subject: machine learning
dc.subject: prosthetics
dc.subject: rehabilitation robotics
dc.subject: actuators
dc.subject: legged locomotion
dc.subject: multibody dynamics
dc.subject: biomechanics
dc.subject: modelling and simulation
dc.subject: wearables
dc.subject: optimization
dc.subject: neural networks
dc.subject: rehabilitation engineering
dc.subject: robotics
dc.subject: wearable robotics
dc.title: Energy Regeneration and Environment Sensing for Robotic Leg Prostheses and Exoskeletons
dc.type: Doctoral Thesis
dc.pending: false
uws-etd.degree.department: Systems Design Engineering
uws-etd.degree.discipline: Systems Design Engineering
uws-etd.degree.grantor: University of Waterloo
uws-etd.degree: Doctor of Philosophy
uws-etd.embargo.terms: 0
uws.contributor.advisor: McPhee, John
uws.contributor.affiliation1: Faculty of Engineering
uws.published.city: Waterloo
uws.published.country: Canada
uws.published.province: Ontario
uws.typeOfResource: Text
uws.peerReviewStatus: Unreviewed
uws.scholarLevel: Graduate

