Extraction of Digital Terrain Models from Airborne Laser Scanning Data based on Transfer-Learning

dc.contributor.advisor: Li, Jonathan
dc.contributor.author: Ye, Weiya
dc.date.accessioned: 2019-04-30T19:38:43Z
dc.date.available: 2019-04-30T19:38:43Z
dc.date.issued: 2019-04-30
dc.date.submitted: 2019-04-29
dc.description.abstract: With rapid urbanization, timely and comprehensive urban thematic and topographic information is in high demand. Digital Terrain Models (DTMs), a unique form of urban topographic information, directly affect subsequent urban applications such as smart cities, urban microclimate studies, and emergency and disaster management. Both the accuracy and the resolution of DTMs therefore define the quality of these downstream tasks. Current workflows for DTM extraction vary in accuracy and resolution due to the complexity of terrain and off-terrain objects. Traditional filters, which rely on assumptions about surface morphology, generalize poorly to complex terrain. Recent developments in the semantic labeling of point clouds have shed light on this problem: in the semantic labeling context, DTM extraction can be viewed as a binary classification task. This study aims to develop a workflow for automated point-wise DTM extraction from Airborne Laser Scanning (ALS) point clouds using a transfer-learning approach on ResNet. The workflow consists of three parts: feature image generation, transfer learning using ResNet, and accuracy assessment. First, each point is transformed into a feature image based on its elevation differences with neighbouring points. The feature images are then classified as ground or non-ground by ResNet models, and the ground points are extracted by remapping each feature image to its corresponding point. Lastly, the proposed workflow is compared with two traditional filters, the Progressive Morphological Filter (PMF) and Progressive TIN Densification (PTD). Results show that the proposed workflow achieves favourable DTM extraction accuracy, yielding only 0.522% Type I error, 4.84% Type II error, and 2.43% total error. In comparison, the Type I, Type II, and total errors are 7.82%, 11.6%, and 9.48% for PMF, and 1.55%, 5.37%, and 3.22% for PTD, respectively. The root mean squared error of the interpolated 1 m resolution DTM is only 7.3 cm. Moreover, the use of pre-trained weights greatly accelerated the training process and enabled the network to reach this accuracy even with a small training set. A qualitative analysis is further conducted to investigate the reliability and limitations of the proposed workflow.
dc.identifier.uri: http://hdl.handle.net/10012/14602
dc.language.iso: en
dc.pending: false
dc.publisher: University of Waterloo
dc.subject: digital terrain model
dc.subject: deep learning
dc.subject: airborne lidar
dc.title: Extraction of Digital Terrain Models from Airborne Laser Scanning Data based on Transfer-Learning
dc.type: Master Thesis
uws-etd.degree: Master of Science
uws-etd.degree.department: Geography and Environmental Management
uws-etd.degree.discipline: Geography
uws-etd.degree.grantor: University of Waterloo
uws.contributor.advisor: Li, Jonathan
uws.contributor.affiliation1: Faculty of Environment
uws.peerReviewStatus: Unreviewed
uws.published.city: Waterloo
uws.published.country: Canada
uws.published.province: Ontario
uws.scholarLevel: Graduate
uws.typeOfResource: Text
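
The abstract above describes turning each ALS point into a "feature image" built from its elevation differences with neighbouring points. The exact window size, neighbour selection, and normalization are not given in this record, so the following Python sketch is only an illustration of the idea under assumed parameters (a 16 x 16 grid over a 20 m square window); it is not the author's published implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def feature_image(points, index, grid=16, half_window=10.0):
    """Illustrative sketch: a grid x grid image of elevation differences
    around one ALS point. Grid size and window extent are assumptions."""
    cx, cy, cz = points[index]
    xs = np.linspace(cx - half_window, cx + half_window, grid)
    ys = np.linspace(cy - half_window, cy + half_window, grid)
    tree = cKDTree(points[:, :2])          # planimetric (x, y) lookup
    img = np.zeros((grid, grid), dtype=np.float32)
    for i, y in enumerate(ys):
        for j, x in enumerate(xs):
            # Elevation of the point nearest each cell centre, expressed
            # as a difference to the query point's own elevation.
            _, nn = tree.query([x, y])
            img[i, j] = points[nn, 2] - cz
    return img

# Toy usage with random (x, y, z) points.
pts = np.random.rand(1000, 3) * [100.0, 100.0, 20.0]
print(feature_image(pts, 0).shape)  # (16, 16)
```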
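
A minimal PyTorch sketch of the transfer-learning step described in the abstract is shown below: a ResNet with ImageNet pre-trained weights whose final layer is replaced by a two-class (ground / non-ground) head. The choice of ResNet-18, the optimizer, the learning rate, and the 3-channel 224 x 224 input shape are assumptions for illustration, not the thesis configuration.

```python
import torch
import torch.nn as nn
from torchvision import models

# Start from ImageNet pre-trained weights, as the abstract describes.
model = models.resnet18(pretrained=True)
# Replace the 1000-class head with a 2-class (ground / non-ground) head.
model.fc = nn.Linear(model.fc.in_features, 2)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# One illustrative fine-tuning step on a random batch standing in for
# feature images (assumed replicated to 3 channels to fit ResNet's input).
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))   # 0 = non-ground, 1 = ground
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```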
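
The accuracy assessment reports Type I, Type II, and total errors. Using the definitions customary in ground-filtering studies (Type I: ground points rejected as non-ground; Type II: non-ground points accepted as ground), the sketch below shows how those percentages can be computed from point-wise labels; the example labels are hypothetical.

```python
import numpy as np

def filtering_errors(true_ground, pred_ground):
    """Type I, Type II, and total error (%) for ground / non-ground labels."""
    true_ground = np.asarray(true_ground, dtype=bool)
    pred_ground = np.asarray(pred_ground, dtype=bool)
    type1 = np.mean(~pred_ground[true_ground]) * 100    # ground rejected
    type2 = np.mean(pred_ground[~true_ground]) * 100    # non-ground accepted
    total = np.mean(pred_ground != true_ground) * 100   # any misclassification
    return type1, type2, total

# Hypothetical labels for six points (1 = ground, 0 = non-ground).
print(filtering_errors([1, 1, 1, 0, 0, 0], [1, 1, 0, 0, 0, 1]))
# (33.33..., 33.33..., 33.33...)
```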

Files

Original bundle

Name: Ye_Weiya.pdf
Size: 3.45 MB
Format: Adobe Portable Document Format
Description: Master thesis

License bundle

Name: license.txt
Size: 6.08 KB
Format: Item-specific license agreed upon to submission