Show simple item record

dc.contributor.author: Choi, Christopher
dc.date.accessioned: 2018-11-30 18:27:36 (GMT)
dc.date.available: 2018-11-30 18:27:36 (GMT)
dc.date.issued: 2018-11-30
dc.date.submitted: 2018-08
dc.identifier.uri: http://hdl.handle.net/10012/14192
dc.description.abstract: As micro aerial vehicles (MAVs) become increasingly common as platforms for aerial inspection, monitoring and tracking, the need for robust automated landing methods increases, for both static and dynamic landing targets. Precision MAV landings are difficult, even for experienced human pilots. While semi-autonomous MAV landings have proven effective, they require multiple skilled operators, which in turn increases operational costs. This is not always practical, and keeping a human in the loop rules out more efficient robotic teams that operate without human operators. As such, automated landing systems have become a growing topic of interest in both industry and academia. This thesis aims to address three issues. First, for a MAV to land autonomously on a moving target, a complete tracking and landing system for MAVs is needed. An end-to-end system termed ATL is introduced. Results show that ATL is able to track a moving landing target and execute a planned trajectory onto it at speeds of 10 m/s in simulation. Second, to enable autonomous MAV landings in GPS-denied environments, multiple cameras are needed to simultaneously track the landing target and perform state estimation. With the prevalence of gimbal cameras on commercially available MAVs for applications such as cinematography, it is advantageous to use the gimbal camera, along with the other on-board cameras, for state estimation. An encoder-less gimbal calibration method is introduced to enable gimbal cameras to be used with state estimation algorithms. The method was validated by modifying OKVIS to jointly optimize for the gimbal joint angle. Finally, to achieve full MAV autonomy, all on-board software components must run in real time on a computer with limited resources. To address this issue and to take advantage of a gimbal camera, the Multi-State Constraint Kalman Filter (MSCKF) algorithm is extended to incorporate a gimbal camera. The method was validated in simulation and on a KITTI raw dataset; both show promising results.
dc.language.iso: en
dc.publisher: University of Waterloo
dc.subject: Quadrotor
dc.subject: Micro-Aerial Vehicles
dc.subject: Perception
dc.subject: Visual-Inertial Odometry
dc.title: Towards Robust Autonomous MAV Landing with a Gimbal Camera
dc.type: Master Thesis
dc.pending: false
uws-etd.degree.department: Mechanical and Mechatronics Engineering
uws-etd.degree.discipline: Mechanical Engineering
uws-etd.degree.grantor: University of Waterloo
uws-etd.degree: Master of Science
uws.contributor.advisor: Waslander, Steven
uws.contributor.affiliation1: Faculty of Engineering
uws.published.city: Waterloo
uws.published.country: Canada
uws.published.province: Ontario
uws.typeOfResource: Text
uws.peerReviewStatus: Unreviewed
uws.scholarLevel: Graduate

