Enabling Human-Machine Collaborative Inspections through Smart Infrastructure Metaverse

Date

2024-02-08

Authors

Al-Sabbag, Zaid

Advisor

Yeum, Chul Min
Narasimhan, Sriram

Publisher

University of Waterloo

Abstract

Over the last ten years, novel Artificial Intelligence (AI) based structural assessment methods and tools have been proposed to identify and quantify structural damage indicators (e.g., cracking, spalling, corrosion) from visual data (e.g., images, LiDAR). However, despite an urgent need for such technology in infrastructure management, widespread field adoption has been limited. One of the main obstacles is the lack of real-time communication and interaction between human inspectors, on- and off-site, and the technologies deployed to support data collection, processing, and decision-making. This thesis focuses on enabling real-time remote collaborative structural inspections by integrating on-site inspectors, remote experts (e.g., engineers, stakeholders) responsible for making critical decisions, advanced data collection platforms (e.g., ground robots, drones), and AI algorithms that can rapidly interpret data, into an automated inspection system that supports human-machine collaboration. The motivation is to solve the technical and scientific challenges that prevent real-time collaboration between human users (on-site inspectors, remote experts) and machine agents (robots, AI).

This thesis proposes a system called the Smart Infrastructure Metaverse (SIM) to enable human users and machine agents to collaborate in real time using Mixed Reality (MR) and Virtual Reality (VR) headsets, which allow humans to interact with each other and with machine agents remotely in an immersive environment. The SIM system integrates robotic data collection platforms to collect visual data of the site, with critical guidance from human users on how to collect the best data. The data is then analyzed in real time by AI computer vision algorithms that utilize input from the human users to localize and quantify structural damage. The on-site inspectors and remote experts can then collaborate on reviewing the results in a spatially aware context through an immersive environment supported by MR/VR technology, and can direct machine agents to collect more data from the site and/or re-analyze previous data based on the human users' expert judgement.

Several scientific challenges are addressed in this thesis as part of creating SIM; each deals with facilitating collaboration between human users and machine agents for a different component of the system. First, input from the on-site inspector must be incorporated into the data analysis step to minimize the gap between automated analysis and human verification and to ensure high-quality results. The approach is to utilize human-AI collaboration for quantifying the sizes of structural damage regions, accomplished by integrating an AI-based interactive image segmentation algorithm with the MR headset, which allows the segmentation results to be refined interactively through user feedback. Second, accurate spatial alignment between separated devices with heterogeneous sensing and processing capabilities (e.g., MR headsets, robots) remains an open problem that is critical for spatially aware human-robot collaboration. The approach was to develop an image-based localization algorithm that spatially aligns the MR headset and the robot in real time, facilitating human-robot collaboration and enhancing the reliability of data collection by engaging the MR-equipped inspector directly with the data collection platform.
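As a rough illustration of the first challenge only, and not the thesis's actual algorithm, the sketch below shows the shape of a human-in-the-loop refinement loop: an inspector's taps in the MR headset become positive/negative click prompts, the segmentation is re-estimated, and the resulting mask is converted to a physical damage size. The `interactive_segment` function, its click-prompt interface, and the pixel scale are assumptions; a real system would call a click-guided segmentation network here.

```python
import numpy as np

# Hypothetical interface: a click-prompted interactive segmentation model.
# A real deployment would run a click-guided neural network here; the disc
# painting below is only a stand-in so the loop runs end to end.
def interactive_segment(image, pos_clicks, neg_clicks, radius=50):
    h, w = image.shape[:2]
    rr, cc = np.ogrid[:h, :w]
    mask = np.zeros((h, w), dtype=bool)
    for r, c in pos_clicks:                      # user marks damage
        mask |= (rr - r) ** 2 + (cc - c) ** 2 < radius ** 2
    for r, c in neg_clicks:                      # user marks background
        mask &= (rr - r) ** 2 + (cc - c) ** 2 >= radius ** 2
    return mask

def damage_area_mm2(mask, mm_per_pixel):
    """Convert a pixel-level damage mask into a physical area estimate."""
    return float(mask.sum()) * mm_per_pixel ** 2

# Human-in-the-loop refinement: each MR-headset tap becomes a click prompt,
# and the mask is re-estimated until the inspector accepts the result.
image = np.zeros((480, 640, 3), dtype=np.uint8)   # stand-in site image
pos_clicks, neg_clicks = [(240, 320)], []          # initial inspector input
mask = interactive_segment(image, pos_clicks, neg_clicks)
neg_clicks.append((240, 360))                      # correction from inspector
mask = interactive_segment(image, pos_clicks, neg_clicks)
print("estimated damage area (mm^2):", damage_area_mm2(mask, mm_per_pixel=0.5))
```

The key design point this illustrates is that user corrections are fed back into the same model call rather than applied as manual post-edits, which is what keeps analysis and human verification in a single loop.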
Third, seamless integration of VR users into SIM is required for distributed collaboration between remote VR users and on-site MR users. This includes solving the technical challenges of spatial alignment between VR and MR users, as well as determining how VR users can interact with other components of SIM such as robots and AI. The approach uses panoramic images to allow VR users to inspect the site remotely, and a novel image-based localization algorithm was developed to spatially align the panoramic images with their real on-site locations. Distributed collaboration also includes integrating all of these components into a unified system as part of SIM, with the goal of enabling on-site and remote inspectors to collaborate with each other and with robots and AI through MR/VR. Experimental results are presented for each component of SIM individually, including laboratory and field evaluations of the accuracy of the proposed MR/VR and robotic implementations.
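Both alignment challenges (headset-to-robot and panorama-to-site) ultimately require expressing different devices' measurements in a common coordinate frame. Once image-based matching has produced corresponding 3D points in two frames, a standard least-squares rigid alignment (Kabsch/SVD) recovers the rotation and translation between them. The sketch below shows that generic step under assumed, noise-free correspondences; it is not the thesis's specific localization algorithm, and the frame names and synthetic data are illustrative.

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst points.

    src, dst: (N, 3) arrays of corresponding 3D points in two frames.
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                     # correct an improper rotation
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

# Example: express robot-frame points in an assumed MR-headset frame.
rng = np.random.default_rng(0)
pts_robot = rng.random((10, 3))
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
t_true = np.array([1.0, 2.0, 0.5])
pts_headset = pts_robot @ R_true.T + t_true
R, t = rigid_align(pts_robot, pts_headset)
print(np.allclose(R, R_true), np.allclose(t, t_true))   # True True
```

In a system like SIM, the correspondences themselves would come from image-based feature matching between views captured by the headset, robot, or panoramic camera, and the recovered transform is what lets damage annotations and robot goals appear at the correct physical locations for every user.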

Keywords

visual inspection, human-machine collaboration, mixed reality, virtual reality, metaverse, vision-based inspection, computer vision, structural health monitoring
