Calibration and orientation of industrial online photogrammetry systems in situ

This study proposes an analytical method for calibrating and orienting online photogrammetric systems in situ using a scale bar. The core idea is to build control lengths in the measurement volume by moving and rotating the scale bar, and then to conduct a self-calibration bundle adjustment using images of the control lengths. The five-point relative orientation method is used to provide initial estimates of the exterior camera parameters. Additionally, this study elaborates the model of the length-constrained self-calibration bundle adjustment, the block computation algorithm and the accuracy assessment. The method obtains both the interior and the exterior orientation parameters of the two cameras, and provides the spatial bar length reconstruction precision as a performance evaluation. Simulations are conducted to analyse the calibration accuracy, and the results show that the method achieves accuracy and precision comparable to those of the state-of-the-art method. However, the method does not need steady and well distributed 3D points and is easy to operate. Real data experimental results show that the maximum length measurement error of this method in a volume of 5 m × 4 m × 2 m is less than 0.4 mm.


Introduction
Photogrammetry is a technique for measuring spatial geometric quantities by taking, measuring and analysing images of targeted or featured points. Based on the sensor configuration, photogrammetric systems are categorised into offline and online systems [1]. Generally, an offline system uses a single camera to take multiple images from different positions and orientations, and its typical measurement accuracy lies between 1/50,000 and 1/100,000. This accuracy results substantially from the 1/20–1/30 pixel image target measurement precision and from the multi-image self-calibration bundle adjustment.
Unlike offline systems, online systems use two or more cameras to capture photos synchronously and can reconstruct space points at any moment. They are therefore generally applied to movement and deformation inspection over a period of time. Online systems are reported in industrial applications such as foil material deformation investigation in burst tests [2], concrete probe crack detection and analysis in civil engineering material tests [3], aircraft wing and rotor blade deformation measurement in wind tunnels [4], wind turbine blade rotation measurement and dynamic strain analyses [5,6], concrete beam deflection measurement during destructive tests [7], bridge deformation measurement during loading and earthquake tests [8], vibration and acceleration measurement [9,10], membrane roof structure deformation analysis in different weather conditions [11], building structural vibration and collapse measurement [12,13] and heated steel beam deformation [14]. This non-contact and precise technique is still finding more potential applications.
In addition, online systems can be applied to medical positioning and navigation. These systems are generally composed of a rigid two-camera tracking system, active light emitting diode (LED) or passive sphere retro-reflective targets (RRTs), hand-held probes and measurement software. They are calibrated and assessed using a reference point field generated by a coordinate measuring machine (CMM) [15] and can be integrated into computer-assisted therapy, dental implantology, medical robotics and other applications that need real-time, non-contact and accurate positioning and navigation. The reported positioning accuracy is 0.25 mm and the length measurement accuracy is better than 1.00 mm. Online systems equipped with high frame rate cameras are also employed in 3D motion tracking, where the cameras are calibrated by moving and imaging a 'T' or 'L' shaped wand in the measuring volume.
The intrinsic and extrinsic orientation of an online multi-camera system is crucial for high accuracy 3D reconstruction. Generally, online system cameras are calibrated and oriented in a test field that consists of points, bars or both. For example, a fixed station triple camera system is calibrated in a rotatory test field composed of points and scale bars [16,17]. Interior and exterior orientations of the cameras are calculated through bundle adjustment using the test field image sets taken at different rotation angles. Hådem [18] investigates the calibration precision of stereo and triple camera systems under different station numbers, setups, test fields, relative orientation constraints, and interior orientation configurations. Specifically, camera calibration in a pure length test field is simulated and experimented, and it is mentioned that a moving scale bar can be easily adapted to stereo camera system calibration. Moreover, incorporating an exterior orientation constraint between cameras in the bundle adjustment strengthens the network and improves accuracy. Pettersen [19,20] describes the reference bar calibration method and accuracy assessment of a stereo camera light pen system. Intrinsic orientations of the cameras are calibrated before measurement and the relative orientations are calibrated by a moving reference bar. The accuracy of the method is assessed by length measurement uncertainty according to the Verein Deutscher Ingenieure/Verband der Elektrotechnik Elektronik Informationstechnik (VDI/VDE) CMM norm. Additionally, the author mentions that the length measurement of the orientation scale bar provides on-site accuracy verification. Maas [21] investigates the moving reference bar method on a triple camera system and concludes that the reference bar method is able to determine interior orientations and is reliable for calibrating multi-camera systems.
However, these studies require that the cameras be fully or partially calibrated through an offline process [22] and that the measurement volume be relatively small, e.g. 1.0 m × 1.0 m × 1.0 m [18,19] and 1.5 m × 1.7 m × 1.0 m [21].
Self-calibration bundle adjustment is the most widely used method. Using convergent and rotated images of well distributed, stable 3D points, bundle adjustment simultaneously recovers the intrinsic and extrinsic parameters of a camera or a stereo camera rig, as well as the 3D object coordinates, without requiring any prior information about the 3D point coordinates. The self-calibration bundle adjustment method prevails in the calibration and orientation of online systems. However, in practice, there are applications in which the method is difficult to apply, such as measurement environments that cannot provide sufficient, well distributed and stable 3D points.
This study proposes a calibration and orientation method for stereo camera systems. Unlike previous moving scale bar methods in medical tool navigation and motion tracking applications, this method aims to calibrate and orient camera systems used in large volume measurement by extending the moving range and diversifying the poses of the scale bar. Moreover, the proposed method enables full-parameter self-calibration in situ. This study describes the building of the control length field, the block bundle adjustment algorithm and the orientation accuracy assessment in detail, and presents simulation and experimental results to validate and evaluate the proposed method.

Methodology
In this research, a scale bar is used to calibrate and orient a camera pair for photogrammetric measurement in a large space volume. A scale bar is an alloy or carbon fibre bar that has two photogrammetric RRTs fixed at its two ends. The length between the two RRTs is precisely calibrated. In multi-image offline systems, it provides an accurate scale for the measurement results. Fig. 1 describes the key steps of the proposed method. Before being applied to 3D measurement, the two cameras are calibrated and orientated in four steps: control length field generation, bar RRT image processing, relative orientation and camera parameter optimisation. First, the bar is moved in the measurement volume to predefined positions and in predefined orientations to generate a control length test field while the camera pair synchronously captures its images. Second, centroids of the control bar RRTs are measured and matched as image observations for later relative orientation and bundle adjustment. Third, the five-point method and the root polish algorithm are used to obtain all the possible orientation solutions. Finally, a self-calibration bundle adjustment refines the camera interior and exterior orientations.
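The imaging geometry underlying the first step can be sketched with a simple pinhole model: the two cameras synchronously image the two endpoints of the bar at each pose. All numbers below (focal length, camera poses, bar length, bar position) are illustrative assumptions, not the paper's calibration values.

```python
import numpy as np

def project(X, f, R, t):
    """Project world point X into a camera with focal length f (mm),
    rotation R and translation t (world -> camera)."""
    Xc = R @ X + t
    return f * Xc[:2] / Xc[2]          # image coordinates in mm

# left camera at the origin, right camera with an assumed 1 m baseline
f = 20.0                               # 20 mm lens, as in the simulations
R_left, t_left = np.eye(3), np.zeros(3)
R_right, t_right = np.eye(3), np.array([-1.0, 0.0, 0.0])

# one pose of a 1 m scale bar, 4 m in front of the cameras
bar = np.array([[-0.5, 0.0, 4.0], [0.5, 0.0, 4.0]])

img_left  = np.array([project(X, f, R_left,  t_left)  for X in bar])
img_right = np.array([project(X, f, R_right, t_right) for X in bar])
```

Each bar pose contributes one such endpoint pair per camera; collecting these over all poses yields the image observations used in the later adjustment.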

Relative orientation of the stereo cameras
The five-point method is used to estimate the six relative extrinsic parameters between the two cameras. To minimise the possibility of degeneration, this work proposes an algorithm that automatically selects the five most suitable point pairs, maximising their spatial dispersion while avoiding collinearity. All the candidate solutions are then optimised by the root polish algorithm [23]. Moreover, the approximate equality of the reconstructed lengths is employed as a spatial constraint in determining the true solution, which is more robust than image error analysis.
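The point-pair selection described above can be sketched as a greedy procedure: pick points that are maximally dispersed while rejecting any candidate that is nearly collinear with two already-selected points. This is a hypothetical illustration of the selection criteria; the paper's exact algorithm and thresholds are not specified here.

```python
import numpy as np
from itertools import combinations

def triangle_area(a, b, c):
    """Area of the triangle spanned by three 2D points."""
    return 0.5 * abs((b[0] - a[0]) * (c[1] - a[1])
                     - (b[1] - a[1]) * (c[0] - a[0]))

def select_five(points, min_area=1e-3):
    """Greedily select five well-dispersed, non-collinear point indices."""
    pts = np.asarray(points, float)
    # start from the point farthest from the centroid
    sel = [int(np.argmax(np.linalg.norm(pts - pts.mean(0), axis=1)))]
    while len(sel) < 5:
        best, best_d = None, -1.0
        for i in range(len(pts)):
            if i in sel:
                continue
            # collinearity guard against every already-selected pair
            if any(triangle_area(pts[i], pts[a], pts[b]) < min_area
                   for a, b in combinations(sel, 2)):
                continue
            d = min(np.linalg.norm(pts[i] - pts[s]) for s in sel)
            if d > best_d:               # maximise dispersion
                best, best_d = i, d
        sel.append(best)
    return sel

pts = [(0, 0), (1, 0), (1, 1), (0, 1), (0.5, 0.5), (0.5, 0.1)]
sel = select_five(pts)
```

In this toy run the exact-centre point lies on the diagonal of two selected corners and is rejected by the collinearity guard.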

Self-calibration bundle adjustment of the camera pair
This section describes the model and the block computation method of the two-camera, multi-length constrained self-calibration bundle adjustment in detail. The n bars in the control field give 2n endpoints and n constrained lengths. Based on the relative orientation strategy, the exterior parameters of the left camera are set to zero and the relative exterior parameter vector E_r of the right camera is unknown. The implicit projective equations of the two cameras are as follows:

xy = f(I, E, X)  (1)

In (1), xy is the image coordinate vector; f is the modelled projective function [24,25]; I is the intrinsic parameter vector; E is the exterior parameter vector (zero for the left camera and E_r for the right camera); X is the space point coordinate vector. Linearisation of (1) gives the image observation equation

v = Aδ + Bδ˙ − l  (2)

where v is the residual vector of an image point; l is the disparity vector between the observed and the computed image coordinates; A and B are the partial derivative matrices of the projective function f with respect to the camera parameters and the space point coordinates; δ and δ˙ are the corrections of the camera parameters and the spatial coordinates, respectively. Considering the length constraint

L_m = |X_m1 − X_m2|  (3)

where the subscript m denotes the mth scale length, and m1 and m2 denote the two endpoints, linearisation of the length constraint equation is

v_s = Cδ˙ − l_s  (4)

where C is the partial derivative matrix of (3) with respect to the bar end coordinates and l_s is the disparity between the calibrated and the computed bar lengths. The scale bar lengths are introduced into the bundle adjustment to avoid a rank-defect normal equation, and the distance observation equation of the mth bar is

v_s_m = C_m δ˙_m − l_s_m  (5)

For a two-camera system imaging n scale bars, the image coordinate observation equations and the length observation equations are combined through the extended Jacobian matrix

[v_ijk; v_s_j] = [A_ijk, B_ijk; 0, C_j] [δ; δ˙] − [l_ijk; l_s_j]  (6)

where the subscripts i, j, k denote the ith (i = 1, 2) image, the jth (j = 1, 2, …, n) scale length and the kth (k = 1, 2) endpoint, respectively. The normal equation is

[N11, N12; N12ᵀ, N22] [δ; δ˙] = [W1; W2]  (7)
Items in (7) are determined by block computation. Solving (7) by eliminating the endpoint block, we obtain the corrections for the camera parameters and the endpoint coordinates

δ = (N11 − N12 N22⁻¹ N12ᵀ)⁻¹ (W1 − N12 N22⁻¹ W2),  δ˙ = N22⁻¹ (W2 − N12ᵀ δ)

Again, using the block diagonal character of N22, δ can be computed camera by camera and δ˙ can be computed length by length.
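The reduced normal-equation solve can be illustrated numerically: eliminate the endpoint-coordinate block N22 (block diagonal in the real problem, one small block per bar) and solve the small camera-parameter system first. The sizes below, and the dense random Jacobian standing in for the real sparse one, are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
u, p = 8, 12                      # camera unknowns, endpoint unknowns (toy sizes)

J = rng.standard_normal((40, u + p))
N = J.T @ J                       # full normal matrix (symmetric positive definite)
W = rng.standard_normal(u + p)

N11, N12, N22 = N[:u, :u], N[:u, u:], N[u:, u:]
W1, W2 = W[:u], W[u:]

# Schur complement reduction: camera corrections first, then endpoints
N22_inv = np.linalg.inv(N22)      # block diagonal in practice -> cheap, per bar
delta = np.linalg.solve(N11 - N12 @ N22_inv @ N12.T,
                        W1 - N12 @ N22_inv @ W2)       # camera corrections
delta_dot = N22_inv @ (W2 - N12.T @ delta)             # endpoint corrections

# agrees with the direct dense solve of the full normal equation
assert np.allclose(np.concatenate([delta, delta_dot]), np.linalg.solve(N, W))
```

The block structure is what makes the method time efficient: only small per-bar blocks of N22 are ever inverted, never the full normal matrix.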
The proposed algorithm is time efficient because the block matrix computations eliminate the need for massive matrix inversion or pseudo-inversion. In addition, the algorithm is unaffected by invisible (missing) observations and allows gross observation errors to be detected during the adjustment iterations.
The covariance matrix of all the adjusted parameters and coordinates is given through error propagation as D = σ0² N⁻¹, where the unit-weight variance is estimated as σ0² = vᵀv/r; all the observations are considered to have uniform accuracy and are weighted equally; v denotes the residual vector whose size equals 8n for n scale bars; and r is the redundancy of the adjustment.
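The error-propagation step can be sketched on a small synthetic least-squares problem standing in for the adjustment: estimate the unit-weight variance from the residuals and scale it into the inverse normal matrix to obtain parameter SDs. The line-fit data below are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 50)
A = np.column_stack([x, np.ones_like(x)])            # design matrix
l = 2.0 * x + 1.0 + 0.01 * rng.standard_normal(50)   # noisy observations

params = np.linalg.solve(A.T @ A, A.T @ l)           # adjusted parameters
v = A @ params - l                                   # residual vector

r = len(l) - len(params)                             # redundancy
sigma0_sq = (v @ v) / r                              # unit-weight variance
cov = sigma0_sq * np.linalg.inv(A.T @ A)             # parameter covariance
sd = np.sqrt(np.diag(cov))                           # estimated SDs
```

In the actual adjustment the same scaling is applied to the inverse of the normal matrix in (7), yielding the SDs reported for the camera parameters and endpoint coordinates.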

Simulation results and discussions
A simulation system is built to verify the proposed single bar orientation method and evaluate its performance. The system consists of a control length field generation module, a stereo camera imaging module, a self-calibration orientation module and a 3D reconstruction module.

Simulation configurations
Bar positions are distributed evenly in the defined volume and are one scale length apart from each other. The scale bar is moved to each position and posed in different orientations. Fig. 2 illustrates the six poses of the bar in one position and Fig. 3 demonstrates one simulated control length field.
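A control length field generator along these lines can be sketched as follows. The grid spacing equals the bar length, as described above; the particular set of six per-position orientations (three axis-aligned and three diagonal poses) is an assumption for illustration, not the paper's exact pose set.

```python
import numpy as np

L = 1.0                                            # assumed bar length (m)
dirs = np.array([(1, 0, 0), (0, 1, 0), (0, 0, 1),
                 (1, 1, 0), (1, 0, 1), (0, 1, 1)], float)
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)  # unit bar directions

def control_field(nx, ny, nz, L=L):
    """Return an (n_bars, 2, 3) endpoint array for an nx*ny*nz grid of
    positions spaced one bar length apart, six bar poses per position."""
    bars = []
    for ix in range(nx):
        for iy in range(ny):
            for iz in range(nz):
                centre = L * np.array([ix, iy, iz], float)
                for d in dirs:
                    bars.append([centre - 0.5 * L * d, centre + 0.5 * L * d])
    return np.array(bars)

field = control_field(5, 4, 2)     # 5 x 4 x 2 positions, 6 poses each
```

Every generated bar spans exactly one calibrated length, so each entry contributes one length constraint to the adjustment.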
The stereo system consists of two industrial cameras with a resolution of 4872 × 3248 pixels and a pixel size of 7.4 µm × 7.4 µm. Each camera is equipped with a 20 mm fixed focus lens. The simulated camera interior parameters are listed in Table 1. The cameras are directed at the centre of the measurement volume. Random image errors of σ = 0.2 µm are added to the image coordinates; for the cameras used in this study, this corresponds to a 1/37 pixel image measurement precision. The cameras are calibrated and oriented using the algorithm proposed in Section 2. Table 2 shows the self-calibration performance of the method; the mean error, root-mean-square (RMS) error and maximum error are obtained from 1000 simulations. The standard deviation (SD) of the image errors of the two cameras is listed in Table 3. The image errors are at a reasonable level considering the added observation noise of σ = 0.2 µm.
The estimated SDs of the camera parameters and the endpoint coordinates after bundle adjustment are listed in Table 4.

Comparison of length measurement accuracy
The self-calibration bundle adjustment method is widely used in camera calibration and orientation. It takes multiple photos of a stable but arbitrary 3D point array and then conducts a bundle adjustment to solve for all the unknown parameters and coordinates.
In this simulation, 105 3D points are distributed evenly in a 12 m × 8 m × 4 m volume, as shown in Fig. 4. For each camera, 16 photos of the points are taken, one of which is taken from the camera station shown in Fig. 3. Image errors of σ = 0.2 µm are added to each image point coordinate. Then, a self-calibration bundle adjustment is conducted using these image data to solve the intrinsic and extrinsic orientations of the two cameras. The measurement errors of a 10 m length placed along the diagonal of the volume are analysed as the accuracy evaluation. The RMS and maximum errors of 1000 simulations by both methods are listed in Table 5. It can be seen that the proposed control length method achieves comparable, and slightly better, accuracy and precision compared with the self-calibration bundle adjustment method.

Real data experimental results
The stereo camera system is composed of two industrial cameras (GE4900, AVT, Stadtroda, Germany) equipped with two consumer-level lenses (Nikkor 20 mm F/2.8D, Nikon, Japan). The resolution of each camera is 4872 × 3248 pixels and the charge-coupled device sensor dimension is 36 mm × 24 mm. Two flashlights (YN 560 III, Yongnuo, China) provide illumination. A specially designed and manufactured carbon fibre bar is employed in the orientation experiments. Fig. 5 shows the carbon fibre scale bar and the spherical and planar RRTs. The bar has three symmetrically arranged bushing holes at each end, which hold the plug-in shafted RRTs. This makes it convenient to substitute RRTs of different sizes and types, and plugging a pair of RRTs into symmetrically different bushing holes produces three different bar lengths. The three lengths are measured on a granite linear rail using a laser interferometer for distance measurement and a micro imaging camera for target locating; the standard measurement uncertainty is <1.0 μm. Four experiments were carried out in the laboratory to verify the proposed method. The cameras are 4 m away from the measurement volume of 5 m × 4 m × 2 m. The centroid method is employed to measure the image RRTs.
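The centroid method mentioned above can be sketched as a background-subtracted, intensity-weighted mean of the pixel coordinates over an RRT blob. The synthetic Gaussian spot below, its sub-pixel position and the threshold value are illustrative assumptions.

```python
import numpy as np

def centroid(img, threshold=0.0):
    """Intensity-weighted centroid (x, y) after background subtraction."""
    w = np.clip(img - threshold, 0.0, None)
    ys, xs = np.mgrid[:img.shape[0], :img.shape[1]]
    s = w.sum()
    return np.array([(w * xs).sum() / s, (w * ys).sum() / s])

# synthetic RRT image: a Gaussian spot at a sub-pixel position
x0, y0, sigma = 15.3, 12.7, 2.0
ys, xs = np.mgrid[:32, :32]
spot = np.exp(-((xs - x0) ** 2 + (ys - y0) ** 2) / (2 * sigma ** 2))

c = centroid(spot, threshold=0.01)
```

For a symmetric, well-exposed spot this estimator recovers the target centre to a small fraction of a pixel, which is what enables the 1/20–1/30 pixel image measurement precision cited earlier.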

Comparison between spherical and planar targets
Two types of target are used: a 9 mm diameter planar circular target and a 6 mm diameter spherical target. The bar positions and orientations in the measurement volume are nearly the same for the two target types. The RMS and maximum errors of the scale bar lengths from five experiments are listed in Table 6. The spherical target is 24.5% better than the planar target in scale length reconstruction precision because its larger viewing range makes more bars visible in the measurement volume. All the data in the following subsections are obtained using spherical RRTs.

Scale bar length
The three length configurations of the bar are used, respectively, for camera orientation in the same volume. The RMS and maximum errors of five experiment results under each scale length are listed in Table 7. The nearly unchanged values indicate that calibration and orientation are almost unaffected by the scale bar length.

Intersection angle
By changing the baseline length, we obtain five different intersection angle configurations; the corresponding scale length reconstruction errors are plotted in Fig. 6. It can be seen that increasing the intersection angle improves the performance of this method.

Number of scale bar positions
Fig. 7 demonstrates that the RMS error remains almost unchanged and the maximum error decreases as the number of bar positions increases. From the above experimental results, it can be concluded that, when applying the proposed calibration and orientation method, using spherical RRTs, increasing the intersection angle and moving the scale bar to as many positions as possible will improve measurement accuracy and precision.

Conclusions
This study proposes a method for simultaneously calibrating and orienting multi-camera systems in large measurement volumes. The method only requires moving a scale bar in the volume to build a virtual control length field. After taking images of the moving scale bar, the two cameras are calibrated and orientated through a self-calibration bundle adjustment algorithm that is constrained by spatial lengths. After calibration and orientation, error analysis of the reconstructed scale lengths provides an evaluation of orientation accuracy and precision. Simulations validate the effectiveness of the method regarding the self-calibration of both interior and exterior camera parameters and test its accuracy and precision performance. Experiments are carried out to test the influence of the target type, scale bar length, intersection angle and bar position number on orientation precision. The proposed method does not require a steady 3D point distribution in the measurement volume. Additionally, it achieves comparable accuracy against the state-of-the-art self-calibration bundle adjustment method. When using this orientation method, it is recommended to use spherical targets and to move the bar to cover the entire measurement volume with in-plane and out-of-plane rotations in as many positions as possible. This method can be easily conducted in medium scale volumes within human arm's reach, and can be extended to some large scale measurement applications with the help of an unmanned aerial vehicle to carry and operate the scale bar automatically. It can also be used in small or even microscale stereo vision systems such as structured light scanners because, compared with planar patterns, scale bars are easier to measure, less restricted by the camera intersection angle and have higher image measurement accuracy.