Author: Tang, Mei Qi
Dates: 2025-01-28; 2025-01-27
URI: https://hdl.handle.net/10012/21442

Abstract: Lidar sensors enable precise 3D object detection for autonomous driving under clear weather but face significant challenges in snowy conditions due to signal attenuation and backscattering. While prior studies have explored the effects of snowfall on lidar returns, its impact on 3D object detection performance remains underexplored. Conducting such an evaluation objectively requires a dataset with abundant labelled data from both weather conditions, ideally captured in the same driving environment. Current driving datasets with lidar data either do not provide enough labelled data in both snowy and clear weather conditions, or rely on simulation to generate data for the underrepresented weather domain. Simulations, however, often lack realism, introducing an additional domain shift that impedes accurate evaluation. This thesis presents our work in creating CADC++, a paired weather domain adaptation dataset that extends the existing snowy dataset, CADC, with clear weather data. Our CADC++ clear weather data were recorded on the same roads and around the same dates as CADC. We pair each CADC sequence with a clear weather one as closely as possible, both spatially and temporally. Our curated CADC++ achieves object distributions similar to those of CADC, minimizing domain shift in environmental factors beyond the presence of snow. Additionally, we propose track-based auto-labelling methods to overcome a limited labelling budget. Our approach, evaluated on the Waymo Open Dataset, achieves balanced performance across stationary and dynamic objects and still surpasses a standard 3D object detector when using as little as 0.5% of human-annotated ground-truth labels.

Language: en
Keywords: autonomous driving; perception; 3D object detection; auto-labelling; adverse weather; dataset; winter conditions; domain adaptation; LiDAR
Title: CADC++: Extending CADC with a Paired Weather Domain Adaptation Dataset for 3D Object Detection in Autonomous Driving
Type: Master Thesis