Towards Robust Autonomous MAV Landing with a Gimbal Camera
As micro aerial vehicles (MAVs) become increasingly common platforms for aerial inspection, monitoring, and tracking, the need for robust automated landing methods grows, for both static and dynamic landing targets. Precision MAV landings are difficult even for experienced human pilots. While semi-autonomous MAV landings have proven effective, they require multiple skilled operators, which in turn increases operational costs. This is not always practical, and the human in the loop precludes more efficient robotic teams that operate without human operators. As such, automated landing systems have become a growing topic of interest in both industry and academia.

This thesis addresses three issues. First, for a MAV to land autonomously onto a moving target, a complete tracking and landing system is needed. An end-to-end system termed ATL is introduced; results show that ATL is able to track and execute a planned trajectory onto a moving landing target at speeds of 10 m/s in simulation.

Second, to enable autonomous MAV landings in GPS-denied environments, multiple cameras are needed to simultaneously track the landing target and perform state estimation. With the prevalence of gimbal cameras on commercially available MAVs for applications such as cinematography, it is advantageous to use the gimbal camera, along with other on-board cameras, for state estimation. An encoder-less gimbal calibration method is introduced to enable gimbal cameras to be used with state estimation algorithms. The method was validated by modifying OKVIS to jointly optimize for the gimbal joint angle.

Finally, to achieve full MAV autonomy, all on-board software components must run in real time on a computer with limited resources. To address this issue, and to take advantage of a gimbal camera, the Multi-State Constraint Kalman Filter (MSCKF) algorithm is extended to incorporate a gimbal camera. The method was validated in simulation and on a KITTI raw dataset, both showing promising results.
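The encoder-less calibration described above treats the gimbal's joint angles as unknowns in the transform chain between the MAV body and the gimbal camera. The following is a minimal illustrative sketch of such a kinematic chain; the 2-DOF roll-pitch layout, the transform names, and the function names are assumptions chosen for illustration, not the thesis's actual gimbal model or notation.

```python
import numpy as np

def rot_x(theta):
    """4x4 homogeneous rotation about the x-axis (hypothetical roll joint)."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[1:3, 1:3] = [[c, -s], [s, c]]
    return T

def rot_y(theta):
    """4x4 homogeneous rotation about the y-axis (hypothetical pitch joint)."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[0, 0], T[0, 2] = c, s
    T[2, 0], T[2, 2] = -s, c
    return T

def gimbal_camera_pose(T_body_base, theta_roll, theta_pitch, T_end_cam):
    """Compose static extrinsics with time-varying joint rotations.

    T_body_base and T_end_cam are fixed extrinsic transforms estimated
    once by calibration; theta_roll and theta_pitch are the joint angles
    an encoder-less method would estimate jointly with the vehicle state.
    """
    return T_body_base @ rot_x(theta_roll) @ rot_y(theta_pitch) @ T_end_cam
```

In an optimization-based estimator such as the modified OKVIS mentioned above, the joint angles would enter the reprojection error through a chain like this and be refined alongside the other state variables; the sketch only shows the forward kinematics, not the optimization itself.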
Cite this version of the work
Christopher Choi (2018). Towards Robust Autonomous MAV Landing with a Gimbal Camera. UWSpace. http://hdl.handle.net/10012/14192