Enabling Human-Machine Collaborative Inspections through Smart Infrastructure Metaverse

dc.contributor.author: Al-Sabbag, Zaid
dc.date.accessioned: 2024-02-08T19:07:47Z
dc.date.available: 2024-06-08T04:50:04Z
dc.date.issued: 2024-02-08
dc.date.submitted: 2024-02-06
dc.description.abstract: Over the last ten years, novel Artificial Intelligence (AI)-based structural assessment methods and tools have been proposed to identify and quantify structural damage indicators (e.g., cracking, spalling, corrosion) from visual data (e.g., images, LiDAR). However, despite an urgent need for such technology in managing infrastructure, its widespread adoption in the field has been limited. One of the main obstacles is the lack of real-time communication and interaction between human inspectors, on- and off-site, and the technologies deployed to support data collection, processing, and decision-making. This thesis focuses on enabling real-time remote collaborative structural inspections by integrating on-site inspectors, remote experts (e.g., engineers, stakeholders) responsible for making critical decisions, advanced data collection platforms (e.g., ground robots, drones), and AI algorithms that can rapidly interpret data, into an automated inspection system that supports human-machine collaboration. The motivation is to solve the technical and scientific challenges that prevent real-time collaboration between human users (on-site inspectors, remote experts) and machine agents (robots, AI). This thesis proposes a system called the Smart Infrastructure Metaverse (SIM) that enables human users and machine agents to collaborate in real time through Mixed Reality (MR) and Virtual Reality (VR) headsets, which allow humans to interact with each other and with machine agents remotely in an immersive environment. The SIM system integrates robotic data collection platforms that collect visual data of the site, with critical guidance from human users on how to collect the best data. The data are then analyzed in real time by AI computer vision algorithms that use input from the human users to localize and quantify structural damage.
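As a toy illustration of the quantification step described above, a segmentation mask can be converted into a physical damage area once the real-world size of a pixel is known. This is a minimal sketch only; the function name and the use of a single ground sampling distance are assumptions, not the thesis's actual method (which fuses visual data with user input):

```python
import numpy as np

def damage_area_m2(mask, gsd_m):
    """Physical area covered by a damage segmentation mask.

    mask:  2D boolean array, True where damage was segmented.
    gsd_m: ground sampling distance, i.e. the real-world width of one
           pixel in metres (a hypothetical value that would come from
           depth sensing or camera calibration in practice).
    """
    # Each True pixel contributes one square of gsd_m x gsd_m metres.
    return int(np.count_nonzero(mask)) * gsd_m ** 2
```

For example, a spall mask of 50 pixels at a 2 mm/pixel resolution corresponds to 50 × (0.002 m)² = 2 × 10⁻⁴ m².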
The on-site inspectors and remote experts can then collaborate on reviewing the results in a spatially aware context through an immersive environment supported by MR/VR technology, and can direct machine agents to collect more data from the site and/or re-analyze previous data based on the human users' expert judgement. Several scientific challenges are addressed in this thesis as part of creating SIM; each deals with facilitating collaboration between human users and machine agents for a different component of the system. First, input from the on-site inspector must be incorporated into the data analysis step to minimize the gap between data analysis and human verification and to ensure high-quality results. The approach is to use human-AI collaboration to quantify the sizes of structural damage regions, accomplished by integrating an AI-based interactive image segmentation algorithm with the MR headset, which allows the segmentation results to be refined interactively through user feedback. Second, accurate spatial alignment between separate devices with heterogeneous sensing and processing capabilities (e.g., MR headsets, robots) is still an open problem that is critical for spatially aware human-robot collaboration. The approach was to develop an image-based localization algorithm that spatially aligns the MR headset and robot in real time, facilitating human-robot collaboration that enhances the reliability of data collection by engaging the MR-equipped inspector with the data collection platform. Third, seamless integration of VR users into SIM is required for distributed collaboration between remote VR users and on-site MR users. This includes solving the technical challenges of spatial alignment between VR and MR users, as well as how VR users can interact with other components of SIM such as robots and AI.
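The interactive-refinement idea above, where user feedback corrects an AI-predicted mask, can be sketched with a deliberately simple stand-in: a user click on a missed region grows that region into the mask by intensity similarity. The thesis integrates an AI-based interactive segmentation algorithm; this region-growing heuristic and the function name are illustrative assumptions only:

```python
import numpy as np
from collections import deque

def refine_mask(image, mask, click, tol=10):
    """Add a user-clicked region to a predicted mask by region growing.

    image: 2D intensity array of the inspected surface.
    mask:  boolean array, the AI-predicted segmentation.
    click: (row, col) of a user click marking a missed damage pixel.
    tol:   intensity tolerance for growing the clicked region.
    """
    mask = mask.copy()
    seed_val = float(image[click])
    queue, seen = deque([click]), {click}
    while queue:
        r, c = queue.popleft()
        mask[r, c] = True  # fold the grown region into the prediction
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < image.shape[0] and 0 <= nc < image.shape[1]
                    and (nr, nc) not in seen
                    and abs(float(image[nr, nc]) - seed_val) <= tol):
                seen.add((nr, nc))
                queue.append((nr, nc))
    return mask
```

The loop structure (predict, click, update, repeat) mirrors the interactive workflow; in the actual system the update step would be the AI model's re-inference conditioned on the click, not a fixed intensity rule.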
The approach is to use panoramic images to allow VR users to inspect the site remotely; a novel image-based localization algorithm was developed to spatially align panoramic images with their real locations on-site. Distributed collaboration also includes integrating all of these components into a unified system as part of SIM, with the goal of enabling on-site and remote inspectors to collaborate with each other and with robots and AI through MR/VR. Experimental results are presented for evaluating each component of SIM individually, including lab and field results assessing the accuracy of the proposed MR/VR and robotic implementations.
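One building block of panorama-based remote inspection is mapping a panorama pixel to a viewing direction, so that annotations made in VR can be related to 3D directions on-site. A minimal sketch for equirectangular panoramas follows; the axis convention and function name are assumptions for illustration, not the thesis's localization algorithm:

```python
import numpy as np

def pixel_to_ray(u, v, width, height):
    """Map an equirectangular panorama pixel (u, v) to a unit viewing ray.

    Assumes the image spans 360 degrees of yaw horizontally and
    180 degrees of pitch vertically, with "forward" at the image centre
    (a common but not universal convention).
    """
    yaw = (u / width - 0.5) * 2.0 * np.pi    # left-right angle
    pitch = (0.5 - v / height) * np.pi       # up-down angle
    return np.array([
        np.cos(pitch) * np.sin(yaw),  # x: right
        np.sin(pitch),                # y: up
        np.cos(pitch) * np.cos(yaw),  # z: forward
    ])
```

Once each panorama's capture pose is recovered by image-based localization, such rays can be expressed in the site coordinate frame shared with MR users and robots.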
dc.identifier.uri: http://hdl.handle.net/10012/20338
dc.language.iso: en
dc.pending: false
dc.publisher: University of Waterloo
dc.subject: visual inspection
dc.subject: human-machine collaboration
dc.subject: mixed reality
dc.subject: virtual reality
dc.subject: metaverse
dc.subject: vision-based inspection
dc.subject: computer vision
dc.subject: structural health monitoring
dc.title: Enabling Human-Machine Collaborative Inspections through Smart Infrastructure Metaverse
dc.type: Doctoral Thesis
uws-etd.degree: Doctor of Philosophy
uws-etd.degree.department: Civil and Environmental Engineering
uws-etd.degree.discipline: Civil Engineering
uws-etd.degree.grantor: University of Waterloo
uws-etd.embargo.terms: 4 months
uws.contributor.advisor: Yeum, Chul Min
uws.contributor.advisor: Narasimhan, Sriram
uws.contributor.affiliation1: Faculty of Engineering
uws.peerReviewStatus: Unreviewed
uws.published.city: Waterloo
uws.published.country: Canada
uws.published.province: Ontario
uws.scholarLevel: Graduate
uws.typeOfResource: Text

Files

Original bundle
Name: alsabbag_zaid.pdf
Size: 48.39 MB
Format: Adobe Portable Document Format
Description: thesis rev 1

License bundle
Name: license.txt
Size: 6.4 KB
Format: Item-specific license agreed upon to submission
Description: