Camera Calibration from Out-of-Focus Images

dc.contributor.author: Schmalenberg, Ryan
dc.date.accessioned: 2024-10-01T15:51:24Z
dc.date.available: 2024-10-01T15:51:24Z
dc.date.issued: 2024-10-01
dc.date.submitted: 2024-09-24
dc.description.abstract: For many 3D computer vision applications, accurate camera calibration is a necessary prerequisite. Generally, the objective is to find a camera's intrinsic parameters, such as its focal lengths, its extrinsic parameters, such as its pose in 3D space, or both. Camera calibration using structured calibration targets relies on special patterns containing features that are used to localize control points with sub-pixel accuracy. The most frequently used patterns are checkerboards and circle grids, and in well-constrained environments these patterns are known to provide accurate feature correspondences and, in turn, accurate calibration results. One challenging case is calibrating a long focal length camera: the focal plane can be too far away, and the only practical solution is to capture images of the calibration pattern out of focus while it is closer to the camera. Due to the radial distribution of out-of-focus blur, biases created by a lack of distance preservation, and changes in spatial blur with perspective, checkerboard patterns have been shown to lose accuracy in out-of-focus images and, with increased blur, can fail to provide feature correspondences altogether. To address this, phase-shift circular gradient (PCG) patterns have been proposed as a method to encode control point positions into phase distributions rather than pixel intensities. Our work aims to validate previous authors' claims of out-of-focus blur invariance and accuracy when using PCG patterns. Using PCG, small-circle, and concentric-circle grid patterns, we compared the retrieved focal lengths (in pixels) and the percentage differences between in-focus and out-of-focus results. Initial comparisons showed that PCGs were largely invariant to blur.
However, their accuracy was marginally worse than that of comparable small circles once real-world noise was introduced. In this real case, a 7-DOF robot arm was used for repeatable calibration target positioning, and the recorded set of poses was also used to mirror those conditions in a further synthetic experiment. PCGs initially showed mixed results, but when pushed beyond real-world conditions they were the only pattern that worked under the most severe levels of out-of-focus blur. This validated their improved detectability under extreme blur and their theoretical suitability for use with long focal length cameras. From these results, this study discusses the trade-offs in calibration pattern selection for respective use cases, highlights the importance of ellipse-fitting techniques, and acknowledges the role of learned methods. Finally, it outlines the benefits observed when using robotic target positioning and our synthetic validation pipeline for experimenting with calibration patterns under various conditions.
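As a hypothetical illustration of the phase-shifting idea the abstract describes (a minimal sketch of the standard N-step arctangent decoder, not the thesis's actual implementation), the wrapped phase at a pixel can be recovered from N phase-shifted intensity samples. Because symmetric defocus blur mainly attenuates the modulation amplitude of a sinusoidal pattern while largely preserving its phase, the decoded value changes little under blur, which is the intuition behind PCG blur invariance:

```python
import numpy as np

def decode_phase(samples):
    """Recover the wrapped phase from N >= 3 phase-shifted intensity samples.

    Assumes the standard N-step model I_k = A + B*cos(phi + 2*pi*k/N),
    where A is the background intensity and B the modulation amplitude.
    Returns phi wrapped to (-pi, pi].
    """
    N = len(samples)
    deltas = 2.0 * np.pi * np.arange(N) / N
    # Least-squares phase estimate: phi = atan2(-sum I_k*sin(d_k), sum I_k*cos(d_k))
    num = -sum(I * np.sin(d) for I, d in zip(samples, deltas))
    den = sum(I * np.cos(d) for I, d in zip(samples, deltas))
    return np.arctan2(num, den)

# Synthetic check at a single pixel (hypothetical values): defocus-like
# attenuation of the modulation amplitude B leaves the decoded phase intact.
phi_true = 1.2  # assumed true phase in radians
sharp = [0.5 + 0.4 * np.cos(phi_true + 2 * np.pi * k / 4) for k in range(4)]
blurred = [0.5 + 0.1 * np.cos(phi_true + 2 * np.pi * k / 4) for k in range(4)]
```

In a PCG pattern, such a phase value varies radially around each control point, so the control point center is read from the phase field rather than from raw pixel intensities.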
dc.identifier.uri: https://hdl.handle.net/10012/21120
dc.language.iso: en
dc.pending: false
dc.publisher: University of Waterloo
dc.subject: out-of-focus camera calibration
dc.subject: defocus blur
dc.subject: perspective blur
dc.subject: robotic camera calibration
dc.subject: simulated camera calibration
dc.subject: synthetic rendering
dc.subject: phase-shift circular gradients
dc.subject: PCG calibration patterns
dc.subject: circle grid calibration patterns
dc.subject: long focal length camera calibration
dc.subject: deep learning camera calibration
dc.subject: center point correction
dc.title: Camera Calibration from Out-of-Focus Images
dc.type: Master Thesis
uws-etd.degree: Master of Science
uws-etd.degree.department: Systems Design Engineering
uws-etd.degree.discipline: Systems Design Engineering
uws-etd.degree.grantor: University of Waterloo
uws-etd.embargo.terms: 0
uws.comment.hidden: In reference to IEEE copyrighted material which is used with permission in this thesis, the IEEE does not endorse any of the University of Waterloo's products or services. Internal or personal use of this material is permitted. If interested in reprinting/republishing IEEE copyrighted material for advertising or promotional purposes or for creating new collective works for resale or redistribution, please go to http://www.ieee.org/publications_standards/publications/rights/rights_link.html to learn how to obtain a License from RightsLink.
uws.contributor.advisor: Zelek, John
uws.contributor.affiliation1: Faculty of Engineering
uws.peerReviewStatus: Unreviewed
uws.published.city: Waterloo
uws.published.country: Canada
uws.published.province: Ontario
uws.scholarLevel: Graduate
uws.typeOfResource: Text

Files

Original bundle

Name: Schmalenberg_Ryan.pdf
Size: 31.64 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 6.4 KB
Description: Item-specific license agreed upon to submission