Abstract: In the infrared spectrum, the atmosphere transmits different wavelengths with different efficiencies, and a wavelength range of high transmittance is called an atmospheric window. To detect target radiation in the long-wave infrared band over a large field of view, and to compensate for the inability of conventional visible-light cameras to detect targets in complex environments, the ultra-wide-angle long-wave infrared camera has emerged. Compared with a traditional visible-light camera, an ultra-wide-angle long-wave infrared camera covers a larger field of view, works in complex environments such as night-time and smoke, and offers a degree of penetration. Binocular ultra-wide-angle long-wave infrared stereo vision can be applied to night-time driver assistance for vehicles, all-weather information reconnaissance for unmanned military combat platforms, and similar tasks. Calibration is the first step in realizing stereo vision, and its accuracy directly determines the accuracy of three-dimensional reconstruction of objects; improving calibration accuracy is therefore a key issue in stereo vision research. The purpose of calibration is to determine the intrinsic and extrinsic parameters of the stereo imaging system: the intrinsic parameters describe the imaging relationship of the camera lens, and the extrinsic parameters describe the relative pose between the two cameras. However, the ultra-wide-angle long-wave infrared camera suffers from severe imaging distortion, low resolution, and low image contrast, which makes stereo vision calibration extremely difficult. To calibrate the extrinsic parameters of ultra-wide-angle long-wave infrared stereo vision accurately, this paper proposes a least-squares extrinsic parameter calibration method built on the Scaramuzza universal camera model.
To evaluate the accuracy of the intrinsic and extrinsic parameters, a method for evaluating the binocular corner reprojection error is proposed; it builds on the commonly used monocular corner reprojection error, which evaluates only the intrinsic parameters, by additionally introducing the extrinsic parameters. To verify the validity and accuracy of the method, an actively radiating infrared calibration board is used to generate corner points, and two binocular ultra-wide-angle long-wave infrared camera sets, with fields of view (FOV) of 180° and 210° respectively, are calibrated at different positions. The experimental results show that the commonly used Bouguet method yields a binocular mean reprojection error (BMRE) of 0.782~0.943 pixels, while the proposed least-squares optimization method yields a BMRE of 0.620~0.754 pixels. The experimental data show that the proposed method effectively reduces the binocular corner reprojection error and improves the accuracy of extrinsic parameter calibration. In addition, the evaluation method is simple, objective, and accurate; it avoids the additional errors introduced by three-dimensional reconstruction of object points during evaluation, and it requires no high-precision three-dimensional coordinate measuring equipment.
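The least-squares extrinsic calibration and the BMRE metric described above can be sketched as a nonlinear least-squares problem over the relative rotation and translation between the two cameras. The sketch below is a minimal illustration only, assuming a simple pinhole projection as a stand-in for the Scaramuzza omnidirectional model, synthetic corner points, and a hypothetical camera matrix; it is not the paper's implementation. The BMRE is computed as the mean Euclidean distance between reprojected and detected corners.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project(points_cam, K):
    # Pinhole projection used here as a stand-in for the Scaramuzza model.
    uv = points_cam[:, :2] / points_cam[:, 2:3]
    return uv @ K[:2, :2].T + K[:2, 2]

def residuals(params, pts_left_cam, uv_right, K_right):
    # params = [rotation vector (3), translation (3)] mapping left frame to right frame.
    rvec, t = params[:3], params[3:]
    R = Rotation.from_rotvec(rvec).as_matrix()
    pts_right_cam = pts_left_cam @ R.T + t
    return (project(pts_right_cam, K_right) - uv_right).ravel()

# Synthetic corner points and a hypothetical right-camera matrix (illustrative values).
rng = np.random.default_rng(0)
pts = rng.uniform([-1, -1, 3], [1, 1, 6], (50, 3))     # 3D corners in the left-camera frame
K = np.array([[400., 0., 320.], [0., 400., 240.], [0., 0., 1.]])
R_true = Rotation.from_rotvec([0.01, 0.3, 0.0])
t_true = np.array([-0.2, 0.0, 0.01])
uv_right = project(pts @ R_true.as_matrix().T + t_true, K)  # "detected" right-image corners

# Least-squares refinement of the extrinsic parameters from an identity initial guess.
sol = least_squares(residuals, np.zeros(6), args=(pts, uv_right, K))

# Binocular mean reprojection error: mean point-wise Euclidean distance in pixels.
bmre = np.sqrt((sol.fun.reshape(-1, 2) ** 2).sum(axis=1)).mean()
```

On noise-free synthetic data the optimizer recovers the true relative pose and the BMRE drops to numerical zero; with real detected corners, the residual BMRE is exactly the accuracy figure reported in the experiments.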
[1] WANG Jiang-an, XIAO Wei-an, SHEN Lin, et al(王江安, 肖伟岸, 申 林,等). Journal of Naval University of Engineering(海军工程大学学报), 2001, 13(4): 29.
[2] Zhou Y. Optical & Quantum Electronics, 2012, 44: 741.
[3] Komatsu S, Markman A, Mahalanobis A, et al. Applied Optics, 2017, 56: D120.
[4] Zhang L, Zhu F, Hao Y, et al. Applied Optics, 2017, 56: 4522.
[5] Pagel F. Proc. SPIE, 2014, 9026: 902606.
[6] Engel J, Stückler J, Cremers D. Large-Scale Direct SLAM With Stereo Cameras. IEEE/RSJ International Conference on Intelligent Robots and Systems, 2015: 1935.
[7] Guan B, Shang Y, Yu Q, et al. Proc. SPIE, 2015, 9528: 95280Y.
[8] Anjum N, Cavallaro A. IEEE Intell. Syst., 2012, 27: 10.
[9] Gong Z, Liu Z, Zhang G. Applied Optics, 2017, 56: 3122.
[10] Li W, Shan S, Liu H. Applied Optics, 2017, 56: 2368.
[11] Liu Z, Yin Y, Liu S, et al. Applied Optics, 2016, 55: 7098.
[12] Zhen L, Yang Y, Shaopeng L, et al. Applied Optics, 2016, 55(25): 7098.
[13] Lei Y, Wang X, Ni Y, et al. Remote Sens., 2018, 10(1298): 1.
[14] Scaramuzza D, Martinelli A, Siegwart R. A Flexible Technique for Accurate Omnidirectional Camera Calibration and Structure from Motion. 4th IEEE International Conference on Computer Vision Systems (ICVS), 2006.
[15] Bouguet J Y. Camera Calibration Toolbox for Matlab [EB/OL][2019-05-05]. http://www.vision.caltech.edu/bouguetj/calib_doc/.