Camera Calibration from Out-of-Focus Images
Date
2024-10-01
Advisor
Zelek, John
Publisher
University of Waterloo
Abstract
For many 3D computer vision applications, accurate camera calibration is a necessary prerequisite. Generally, the objective is to find a camera’s intrinsic parameters, such as its focal lengths, its extrinsic parameters, such as the camera’s pose in 3D space, or both. Camera calibration using structured calibration targets relies on special patterns containing features that can be used to localize control points with sub-pixel accuracy. The most frequently used patterns are checkerboards and circle grids, and in well-constrained environments these patterns are known to provide accurate feature correspondences and, in turn, accurate calibration results.
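For reference, the sketch below shows a minimal target-based calibration of this kind using OpenCV. It is an illustrative assumption of a typical setup, not the pipeline used in this thesis; the pattern size, square length, and filenames are hypothetical.

```python
# Minimal checkerboard calibration sketch using OpenCV (assumed setup).
import cv2
import numpy as np

PATTERN = (9, 6)   # inner corners per row/column (hypothetical)
SQUARE = 0.025     # square side length in metres (hypothetical)

# 3D control points of the board in its own frame (Z = 0 plane).
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

obj_pts, img_pts, size = [], [], None
for i in range(20):                                  # hypothetical views
    img = cv2.imread("view_%02d.png" % i, cv2.IMREAD_GRAYSCALE)
    if img is None:
        continue
    found, corners = cv2.findChessboardCorners(img, PATTERN)
    if not found:
        continue                                     # defocused views often fail here
    # Refine corner locations to sub-pixel accuracy.
    corners = cv2.cornerSubPix(
        img, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_pts.append(objp)
    img_pts.append(corners)
    size = img.shape[::-1]

# Intrinsics (camera matrix K, distortion) and per-view extrinsics.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, size, None, None)
print("RMS reprojection error:", rms)
print("focal lengths fx, fy (px):", K[0, 0], K[1, 1])
```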
One challenging case for camera calibration is calibrating a long focal length camera. Here, the focal plane can lie impractically far from the camera, and the only practical solution is to capture images of the calibration pattern out of focus, with the target closer to the camera. Due to the radial distribution of out-of-focus blur, biases created by a lack of distance preservation, and changes in spatial blur with perspective, checkerboard patterns have been shown to lose accuracy when captured in out-of-focus images, and with increased blur they can fail to provide feature correspondences altogether.
To address this, phase-shift circular gradient (PCG) patterns have been proposed as a method to encode control point positions into phase distributions, rather than into pixel intensities. Our work aims to validate previous authors’ claims of out-of-focus blur invariance and accuracy when using PCG patterns. Using PCG, small circle, and concentric circle grid patterns, we compared the focal lengths (in pixels) recovered with each pattern, along with the percentage differences between in-focus and out-of-focus results.
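To illustrate the idea, the sketch below shows standard N-step phase retrieval, the building block behind phase-encoded patterns such as PCGs. It is a simplified illustration under an assumed sinusoidal intensity model; the pattern-specific step of localizing control points from the decoded phase map is omitted.

```python
# N-step phase-shift decoding sketch (assumed intensity model, N >= 3):
#   I_n = A + B * cos(phi - 2*pi*n/N)
# Defocus blur mainly attenuates the modulation B while leaving the
# recovered phase phi largely intact, which is the claimed source of
# blur invariance for phase-encoded patterns.
import numpy as np

def wrapped_phase(images):
    """Recover the wrapped phase map from N phase-shifted images."""
    n = len(images)
    shifts = 2.0 * np.pi * np.arange(n) / n
    num = sum(I * np.sin(s) for I, s in zip(images, shifts))
    den = sum(I * np.cos(s) for I, s in zip(images, shifts))
    return np.arctan2(num, den)   # wrapped to (-pi, pi]
```

For a circular gradient, the decoded phase varies with distance from the pattern centre, so the control point can be recovered from the phase map rather than from intensity edges.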
Initial comparisons showed that PCGs were largely invariant to blur; however, their accuracy was marginally worse than that of comparable small circles once real-world noise was introduced. In this real-world experiment, a 7-DOF robot arm was used for repeatable calibration target positioning, and the recorded set of poses was reused to mirror the same conditions in a further synthetic experiment. PCGs initially showed mixed results, but when pushed beyond real-world conditions they were the only pattern that still worked under the most severe levels of out-of-focus blur. This validated their improved detectability under extreme blur and their theoretical effectiveness for use with long focal length cameras.
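A simplified version of this kind of in-focus vs. out-of-focus comparison can be sketched as follows. The setup is assumed, not the thesis code: defocus is approximated with a Gaussian kernel (real defocus is closer to a disc-shaped point spread function), a symmetric circle grid is used, and the filenames are hypothetical.

```python
# Sketch: compare focal length recovered from sharp vs. blurred views
# of a symmetric circle grid (assumed setup; Gaussian blur stands in
# for true defocus).
import cv2
import numpy as np

PATTERN = (4, 5)   # circle grid size (hypothetical)
SPACING = 0.03     # circle-centre spacing in metres (hypothetical)

objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SPACING

def focal_px(images):
    """Calibrate from circle-grid views; return fx in pixels, or None."""
    obj_pts, img_pts, size = [], [], None
    for img in images:
        found, centers = cv2.findCirclesGrid(
            img, PATTERN, flags=cv2.CALIB_CB_SYMMETRIC_GRID)
        if not found:
            continue              # heavy blur makes detection fail here
        obj_pts.append(objp)
        img_pts.append(centers)
        size = img.shape[::-1]
    if len(obj_pts) < 3:
        return None
    _, K, *_ = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
    return K[0, 0]

sharp = [cv2.imread("view_%02d.png" % i, cv2.IMREAD_GRAYSCALE)
         for i in range(20)]                        # hypothetical views
sharp = [im for im in sharp if im is not None]
blurred = [cv2.GaussianBlur(im, (0, 0), sigmaX=8) for im in sharp]

fx_sharp, fx_blur = focal_px(sharp), focal_px(blurred)
if fx_sharp is not None and fx_blur is not None:
    print("in/out-of-focus fx difference: %.2f%%"
          % (100.0 * abs(fx_blur - fx_sharp) / fx_sharp))
```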
From these results, this study identifies the trade-offs in calibration pattern selection for respective use cases. It also highlights the importance of ellipse fitting techniques and the role of learned methods. Finally, it outlines the benefits observed when using robotic target positioning, along with our synthetic validation pipeline for experimenting with calibration patterns under various conditions.
Keywords
out-of-focus camera calibration, defocus blur, perspective blur, robotic camera calibration, simulated camera calibration, synthetic rendering, phase-shift circular gradients, PCG calibration patterns, circle grid calibration patterns, long focal length camera calibration, deep learning camera calibration, center point correction