Article

Linear Laser Scanning Measurement Method Tracking by a Binocular Vision

1 College of Metrology & Measurement Engineering, China Jiliang University, Hangzhou 310018, China
2 College of Information Engineering, China Jiliang University, Hangzhou 310018, China
* Author to whom correspondence should be addressed.
Sensors 2022, 22(9), 3572; https://doi.org/10.3390/s22093572
Submission received: 9 January 2022 / Revised: 28 April 2022 / Accepted: 5 May 2022 / Published: 7 May 2022
(This article belongs to the Section Optical Sensors)

Abstract

The 3D scanning of a freeform structure relies on the laser probe and the localization system. The localization system, which determines the quality of the point cloud reconstruction, generates positioning errors when the laser probe moves along complex paths at high speed. To reduce these errors, a linear laser scanning measurement method based on binocular vision calibration is proposed in this paper. A simple and effective eight-point positioning marker attached to the scanner is proposed to complete the positioning and tracking procedure. On this basis, a marked-point detection method based on image moments and the principle of global coordinate system calibration are introduced in detail. According to the invariance of spatial distance, a method for matching corresponding points between different coordinate systems is designed. The experimental results show that the binocular vision system can complete localization under different light intensities and in complex environments, that the repeated translation error of the binocular vision system is less than 0.22 mm, and that the rotation error is less than 0.15°. The repeated error of the measurement system is less than 0.36 mm, which can meet the requirements of the 3D shape measurement of complex workpieces.

1. Introduction

In recent years, 3D topography measurement technology has been widely used in cultural relic protection, the aerospace industry, reverse engineering, biomedicine, and other fields [1,2,3,4]. Non-contact scanning methods based on optics and vision have become increasingly popular due to their high accuracy, good flexibility, and fast speed [5].
According to the measurement principle, optical scanning methods mainly include the laser scanning method, the interferometry method, and the structured light method [6,7,8]. The linear laser scanning method, based on laser triangulation technology, is suitable for the measurement of complex hole surfaces [9]. This method generally requires a mobile platform to complete the scanning measurement. The interferometry method [10] relies on dual-beam, multi-beam, or holographic interference to generate interference fringes, and the geometric shape of the measured object can be determined from the variation of the fringes. The measurement stability of this method is easily affected by optical vibration, humidity, and other factors. By utilizing image coding techniques such as the Gray code or the step-by-step phase-shifting method [11], the surface structured light method [12] can recover the 3D coordinates of an object surface in the measurement system. Compared with the linear laser scanning method, this method is more complicated and time-consuming. Therefore, the objective of this paper is to achieve large-sized 3D measurement with the line laser technique.
Limited by the measurement principle, efficiency, and other issues, however, the field of view of optical scanning equipment cannot be made large enough, which prevents the probe from measuring the topography of a large-sized part directly. Therefore, it is necessary to adopt a hybrid measurement method to expand the measurement range. In addition, since local coordinate measurements alone cannot achieve global unification of the measured values, fusion and conversion methods are required for data measured under different measurement systems [13]. Three kinds of methods [14] are applied to 3D data stitching, as follows:
The first method is the multi-sensor perspective splicing method. For example, Liu et al. [15] proposed a 3D measurement system to improve the accuracy of point cloud stitching based on the Indoor Global Positioning System (iGPS) [16]. Du et al. [17] proposed a flexible large-scale 3D scanning system assembled by combining a robot, a binocular structured light scanner, and a laser tracker. This kind of method relies on high-accuracy instruments such as the laser tracker and the iGPS to directly measure the pose of the scanning device in the global coordinate system. Despite its high price, it achieves high measurement accuracy and speed. In addition, the relative position relationship between the sensors needs to be calibrated before use.
The second method is the mechanical splicing method. For example, Novak et al. [18] proposed a rotating 3D foot-shape scanning measurement system based on the laser multiple-line triangulation principle. Liu et al. [19] introduced an axis–eye calibration algorithm to design a line laser rotating scanning measurement system, aiming to complete the registration of the point clouds in the same coordinate system. The mechanical splicing method mainly uses a moving platform, such as a parallel guide rail or a high-precision rotating table, to complete data splicing with a known moving speed or angular velocity. The measurement accuracy and range of this type of method are limited by the motion mechanism.
The last method is the marker-assisted splicing method. For example, Barone et al. [20] developed a stereo vision system to efficiently align 3D point clouds, which allows for automatic alignment by detecting fiducial markers distributed on overlapping areas of adjacent images. Wang et al. [21] presented a mobile 3D scanning system based on known marked points. This kind of method relies on artificial markers to build the connection needed to convert from the local measurement coordinate system to the global coordinate system. However, the pasting process easily affects the surface characteristics of the measured workpiece, and it can also lead to error accumulation of the scanning probe.
In addition, the hybrid scanning measurement is improved by scholars for different application areas and work scenarios. For example, Chen et al. [22] proposed a two-stage binocular vision system to measure the relative pose between two components, aiming to reduce coordinate transformation errors. Yin et al. [23] proposed a free-moving surface reconstruction technique based on the binocular structured light. This method can eliminate the movement constraints of the parallel guide rail and reduce the accumulation of error but is not suitable for large-sized objects. Liu et al. [24] built a binocular structured light system combined with a wide field of view camera, and a plane target was used as an intermediary to realize the alignment of local point cloud data. Shi et al. [25] proposed a 3D scanning measurement method based on a stereoscopic tracker. The position alignment of the scanner in different views was calculated by tracking the LED markers fixed to the scanner. However, in the case of occlusion, LED markers may be seriously deformed, which will result in inaccurate positioning. Hu et al. [26] proposed a new real-time catadioptric stereo tracking method, which can realize stereo measurement under monocular vision. Huang et al. [27] designed a new 3D scanner with a zoom lens unit, which realized large-area scanning at a low magnification rate and high-precision detail scanning at a high magnification rate. The two scanning results complement each other to ensure that the system has both high reconstruction accuracy and a large scanning area. Jiang et al. [28] proposed a system calibration method to reduce measurement errors caused by scale differences, which is suitable for combined measurement systems with different scales and ranges.
In order to enhance the flexibility of hybrid measurement, a method for localized surface scanning measurement is proposed by using circular markers to position the laser scanner. The main contribution lies in the design of a new circular mark recognition method and a coordinate transformation calibration method in the combined system. The proposed circle detection method based on image moments takes the weighted mean of the centroid of multiple circles as observation points, which can reduce the center deviation error and improve positioning accuracy. Different from the existing work, the laser scanner can be accurately located without fully identifying all the marked points, which can overcome the limitation of local occlusion and enhance the robustness of the system to a certain extent. Finally, through the calibration of each coordinate system, the alignment of 3D point cloud data at different angles is completed.
The remainder of this paper is structured as follows. Section 2 introduces the structure of the measurement system. Section 3 introduces the measurement principle and detection algorithms in the positioning process. In Section 4, the proposed method is verified through calibration experiments and measurement experiments, and concluding remarks are provided in Section 5.

2. Overview of the Laser Scanning System

The laser scanning system is designed based on binocular vision and mainly consists of a laser scanner, a 6 DOF robot, and a binocular camera. The structure and composition of the system are shown in Figure 1a. The laser scanner is fixed on the end of the robotic arm through a flange. As a significant mark for the binocular camera, the circular marked points are randomly pasted on the surface of the laser scanner to determine the pose of the laser scanner. The binocular camera is installed at a suitable distance to ensure that the circular marked points can be recognized.
In order to facilitate the positioning of the scanner, eight circular marked points are randomly pasted on the scanner to achieve robust and precise positioning. The inner diameters of the eight circular marking points are 10 mm. With the first center point in the upper left corner as the starting point, each circle is numbered in a clockwise direction. In addition, the first center point of the upper left corner is used as the reference point to establish the marker coordinate system, as shown in Figure 1b. The x-axis is parallel to the upper surface of the scanner, and the y-axis is parallel to the right side of the scanner. The three axes are perpendicular to each other, which satisfies the right-hand rule.
In the process of scanning, the laser scanner moves in any direction under the drive of the mechanical arm to complete the scanning measurement of the workpiece. The point cloud data, based on the scanner measurement coordinate system, is constantly changing with the scanning process. In order to complete the splicing of all point cloud data, it is necessary to transform point cloud data to a fixed coordinate system. The process of unifying point cloud data to the same coordinate system is shown in Figure 2.
The scanner measurement coordinate system is taken as the laser coordinate system (LCS), i.e., the local coordinate system, denoted $O_s X_s Y_s Z_s$. The marker coordinate system (MCS) is denoted $O_b X_b Y_b Z_b$, and the camera coordinate system (CCS), which serves as the global coordinate system, is denoted $O_c X_c Y_c Z_c$. The point cloud data in the local coordinate system are unified to the global coordinate system through the following equation:
$$P_c = \begin{bmatrix} R_b^c & t_b^c \\ 0 & 1 \end{bmatrix} \begin{bmatrix} R_s^b & t_s^b \\ 0 & 1 \end{bmatrix} P_s \tag{1}$$
where $P_s$ is the 3D coordinate of a surface point of the workpiece in LCS and $P_c$ is the 3D coordinate of the corresponding point in CCS. $R_s^b$ and $t_s^b$ denote the rotation matrix and the translation vector from LCS to MCS, respectively, and $R_b^c$ and $t_b^c$ denote the rotation matrix and the translation vector from MCS to CCS, respectively.
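For illustration, the following Python sketch (not part of the original paper; the rotation matrices and translation vectors are assumed to be already known from the calibration steps described later) shows how Equation (1) chains the two homogeneous transforms:

```python
import numpy as np

def to_homogeneous(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def scanner_point_to_camera(p_s, R_sb, t_sb, R_bc, t_bc):
    """Map a point measured in LCS to CCS via the marker coordinate system (Equation (1))."""
    T = to_homogeneous(R_bc, t_bc) @ to_homogeneous(R_sb, t_sb)
    p_h = T @ np.append(p_s, 1.0)      # homogeneous coordinates
    return p_h[:3]
```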

3. Measurement Principle and Related Algorithms

3.1. Marker Detection

The laser scanner of the proposed measurement system is fixed at the end of the robot through a flange. However, the laser scanner has few textures, and the features on it are not distinctive. In order to facilitate the positioning of the scanner, eight circular marked points are randomly pasted on the scanner to achieve robust and precise positioning. As a significant mark for calculating the pose of the scanner, these marked points are rigidly connected to the scanner. The accuracy of the pose calculation depends on the detection of the marked points.
In this paper, the principle of image moments is applied to complete the center detection of the marked points. The specific methods are as follows:
1. For the original image after correction, the normalized template matching algorithm is used to find the region of interest (ROI) containing the eight marked points. The ROI is marked with a red rectangular box in the original image, as shown in Figure 3a. The pixel resolution of the ROI is 308 × 247; the position of the ROI changes during movement, but its size remains unchanged.
2. The region of interest (Figure 3b) is binarized with a threshold to obtain the binary image (Figure 3c). According to the edge contour detection algorithm [29], the edge contours are extracted, as shown in Figure 3d. Each closed connected region is regarded as a separate contour region. Due to the affine transformation of the image, a standard circle appears as an ellipse in the image. The contours of the circular marker points can be obtained by filtering on geometric characteristics such as area and circularity, as shown in Figure 3e.
3. For each individual contour region, the centroid of the contour is determined from the image moments as the initial center estimate:
$$\bar{x} = \frac{\sum_{i=1}^{M}\sum_{j=1}^{N} x_i \, f(x_i, y_j)}{\sum_{i=1}^{M}\sum_{j=1}^{N} f(x_i, y_j)}, \qquad \bar{y} = \frac{\sum_{i=1}^{M}\sum_{j=1}^{N} y_j \, f(x_i, y_j)}{\sum_{i=1}^{M}\sum_{j=1}^{N} f(x_i, y_j)} \tag{2}$$
where $f(x_i, y_j)$ is the gray value of the pixel in the i-th row and j-th column.
4. Since the center calculated from the image moments is strongly affected by the gray scale, the binarization threshold interval of the image is set as [80, 200] according to experience in order to enhance the stability and reliability of the results, and the threshold is successively incremented in steps of 10. Steps 2 and 3 are repeated K times, and the contours of the marked points are detected in each thresholded image. A confidence score is assigned to each detection according to the inertia ratio, and the final center position of the marker is determined by weighted summation:
$$\gamma = \frac{(\mu_{20} + \mu_{02}) - \sqrt{(\mu_{20} - \mu_{02})^2 + (2\mu_{11})^2}}{(\mu_{20} + \mu_{02}) + \sqrt{(\mu_{20} - \mu_{02})^2 + (2\mu_{11})^2}} \tag{3}$$
where $\gamma$ is the inertia ratio of the contour, $\mu_{20} = \sum_{i=1}^{M}\sum_{j=1}^{N} (x_i - \bar{x})^2 f(x_i, y_j)$, $\mu_{02} = \sum_{i=1}^{M}\sum_{j=1}^{N} (y_j - \bar{y})^2 f(x_i, y_j)$, and $\mu_{11} = \sum_{i=1}^{M}\sum_{j=1}^{N} (x_i - \bar{x})(y_j - \bar{y}) f(x_i, y_j)$.
For each circular contour, multiple center point detections are therefore obtained, one per threshold. $\alpha = \gamma^2$ is introduced as the confidence score of a detection, and the final center position is expressed as
$$P(\bar{x}, \bar{y}) = \frac{\sum_{i=1}^{K} P(\bar{x}_i, \bar{y}_i) \, \alpha_i}{\sum_{i=1}^{K} \alpha_i} \tag{4}$$
Therefore, the central position of each marked point on the image can be determined, as shown in Figure 3f.
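As a concrete illustration of this procedure, the following Python/OpenCV sketch (not the authors' implementation; the minimum contour area, circularity threshold, and center-grouping heuristic are illustrative assumptions) detects circular-marker centers over multiple binarization thresholds and combines them with the confidence weighting of Equations (2)–(4):

```python
import cv2
import numpy as np

def detect_marker_centers(roi_gray, t_min=80, t_max=200, t_step=10):
    """Return weighted centers of circular markers found in a grayscale ROI (OpenCV 4.x API)."""
    accum = {}  # grouping key -> list of (center, confidence)
    for thresh in range(t_min, t_max + 1, t_step):
        _, binary = cv2.threshold(roi_gray, thresh, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
        for cnt in contours:
            area = cv2.contourArea(cnt)
            perimeter = cv2.arcLength(cnt, True)
            if area < 50 or perimeter == 0:          # assumed minimum area filter
                continue
            circularity = 4.0 * np.pi * area / perimeter ** 2
            if circularity < 0.7:                    # keep only near-elliptical contours
                continue
            m = cv2.moments(cnt)
            cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]   # Equation (2)
            # Inertia ratio from central second moments, Equation (3)
            common = np.sqrt((m["mu20"] - m["mu02"]) ** 2 + 4.0 * m["mu11"] ** 2)
            gamma = (m["mu20"] + m["mu02"] - common) / (m["mu20"] + m["mu02"] + common)
            alpha = gamma ** 2                       # confidence score
            key = (round(cx / 5), round(cy / 5))     # crude grouping of detections of the same circle
            accum.setdefault(key, []).append((np.array([cx, cy]), alpha))
    centers = []
    for detections in accum.values():
        pts = np.array([p for p, _ in detections])
        w = np.array([a for _, a in detections])
        centers.append((pts * w[:, None]).sum(axis=0) / w.sum())   # Equation (4)
    return centers
```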

3.2. Binocular Parallel Stereo Vision Model

When an object is measured by the monocular camera model [30], its mapping relationship between the imaging plane coordinate system and the world coordinate system can be written as
$$z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K_{3\times3} [R, t]_{3\times4} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix} = \begin{bmatrix} f_\alpha & 0 & c_x \\ 0 & f_\beta & c_y \\ 0 & 0 & 1 \end{bmatrix} [R, t]_{3\times4} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix} \tag{5}$$
where $K_{3\times3}$ is the intrinsic parameter matrix of the camera, $f_\alpha = f/d_x$, $f_\beta = f/d_y$, and $f$ is the focal length of the camera. $d_x$ and $d_y$ are the physical sizes of each pixel in the x and y directions, respectively, $(c_x, c_y)$ is the optical center of the camera, $[R, t]_{3\times4}$ is the homogeneous transformation matrix from the world coordinate system to the camera coordinate system, $(u, v)$ are the pixel coordinates on the imaging plane, $(x_w, y_w, z_w)$ are the 3D coordinates of the corresponding point in world space, and $z_c$ is the depth factor.
In the binocular parallel stereo vision model, the optical axes of the left and right cameras are parallel and perpendicular to the baseline, satisfying the epipolar constraint, as shown in Figure 4. The distance between the optical centers of the two cameras is defined as the baseline, denoted $b$. $M(x_c, y_c, z_c)$ is the 3D coordinate of a feature point in the camera coordinate system, $(u_l, v_l)$ is its coordinate in the left image coordinate system, and $(u_r, v_r)$ is its coordinate in the right image coordinate system. According to the principle of similar triangles, the 3D coordinates in the new camera coordinate system can be obtained using Equation (6):
$$\begin{cases} x_c = \dfrac{b\,u_l}{u_l - u_r} \\[6pt] y_c = \dfrac{b\,v}{u_l - u_r} \\[6pt] z_c = \dfrac{b\,f}{u_l - u_r} \end{cases} \tag{6}$$
In Equation (6), $v = v_l = v_r$ is the y pixel coordinate of the feature point.
The internal parameters of the camera are obtained by the camera calibration, which transforms the object from the 3D coordinates on the world coordinate system to the image plane. The 3D coordinates are unified in the newly corrected camera coordinate system based on the binocular stereo vision model. From this, the 3D coordinates of the marked points in CCS can be obtained.
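A minimal sketch of the rectified stereo model of Equation (6) is given below (illustrative only; it assumes the matched pixel coordinates are already rectified and expressed relative to the principal point, and the baseline and focal length values in the example are placeholders):

```python
import numpy as np

def triangulate(u_l, v, u_r, baseline_mm, focal_px):
    """Reconstruct a 3D point in the rectified camera frame from a stereo match (Equation (6))."""
    disparity = u_l - u_r                     # horizontal pixel disparity
    if disparity <= 0:
        raise ValueError("Non-positive disparity: point is not in front of both cameras")
    z = baseline_mm * focal_px / disparity    # depth along the optical axis
    x = baseline_mm * u_l / disparity
    y = baseline_mm * v / disparity
    return np.array([x, y, z])

# Example with assumed values: b = 264.4 mm baseline, f = 1000 px focal length.
point = triangulate(u_l=652.3, v=410.8, u_r=598.1, baseline_mm=264.4, focal_px=1000.0)
```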

3.3. Homonymy Point Matching Algorithm

For the images containing the laser scanner captured by the cameras, the center of the circle is extracted as a feature point through the detection method in Section 3.1. The epipolar constraint is exploited to correlate the corresponding feature points in the left and right images. Finally, through the parallax model in Section 3.2, the 3D coordinates of the circular marked points in CCS can be obtained.
Completing homonymy point matching of MCS and CCS is a key step in calculating the position conversion of these two coordinate systems. According to the principle of space distance invariance, a method for homonymy point matching is designed. By utilizing this matching method, the location of the laser sensor can be efficiently and quickly determined at each moment.
There are four steps for the matching algorithm, and they are as follows:
1. Building the reference distance library. According to the measurement calibration system, the $n$ three-dimensional coordinates of the marked points in MCS can be calculated. These coordinates form a coordinate set as follows:
$$\Omega_p^w = \{P_1, P_2, \ldots, P_n\}, \quad P_i = (x_i^w, y_i^w, z_i^w), \; i = 1, 2, \ldots, n \tag{7}$$
The Euclidean distance relationships between these points are represented as shown in Figure 5a. The distance between every two marked points is calculated, and the reference distance library is composed as follows:
$$\Omega_{WD} = \{WD_1, WD_2, \ldots, WD_n\}, \quad WD_i = \{d_{ij}\}, \; j = 1, 2, \ldots, n \tag{8}$$
where $WD_i$ is the set of distances from the i-th point to the other points and $d_{ij}$ is the distance between the i-th point and the j-th point. The elements in each subset $WD_i$ are arranged in ascending order.
2. Building the sample distance library. After image processing of the images with marked points captured by the left camera and the right camera, the 3D coordinates of $m$ ($m \le n$) marked points in CCS are obtained as follows:
$$\Omega_p^c = \{Q_1, Q_2, \ldots, Q_m\}, \quad Q_t = (x_t^c, y_t^c, z_t^c), \; t = 1, 2, \ldots, m \tag{9}$$
The Euclidean distance representation between sample points is shown in Figure 5b. The distance between every two marked points is calculated, and the sample distance library is composed as follows:
$$\Omega_{CD} = \{CD_1, CD_2, \ldots, CD_m\}, \quad CD_t = \{d_{tk}\}, \; k = 1, 2, \ldots, m \tag{10}$$
Similarly, the elements in each subset $CD_t$ are also arranged in ascending order.
3. Matching the corresponding points. According to the principle of the invariance of spatial distance, the matching of points with the same name is realized. The Euclidean distance between $Q_1$ and $Q_2$ is denoted $\delta$. Considering that the distance between the points reconstructed by the camera may deviate from the true distance, $\varepsilon$ is taken as the distance error threshold. The elements in $\Omega_{WD}$ satisfying $d \in [\delta - \varepsilon, \delta + \varepsilon]$ are found and denoted $\{d_{ab}\}$, where $a$ and $b$ are integers from 1 to $n$. Then, the possible initial matching point set of $Q_1$ is $\{P_a, P_b\}$. To further determine the corresponding point of $Q_1$ in MCS, the number of entries in $WD_*$ that satisfy Equation (11) is counted and denoted $x$. If $x$ is greater than $m - 1$, the corresponding point of $Q_1$ in MCS is $P_*$. In the same way, the corresponding points of the remaining $m - 1$ marked points in MCS can be determined.
$$\left| CD_1(t) - WD_*(i) \right| \le \varepsilon, \quad * \in \{a, b\} \tag{11}$$
4. Calculating the conversion relationship. According to the previous steps, the 3D coordinates of the corresponding marked points in the two coordinate systems MCS and CCS can be obtained; they are expressed as $P_c = \{p_c^1, \ldots, p_c^m\}$ and $P_b = \{p_b^1, \ldots, p_b^m\}$, respectively. The problem is thus transformed into a point cloud registration problem with known correspondences, which is to find the optimal Euclidean transformation that minimizes the error:
$$\min_{R, t} \frac{1}{2} \sum_{i=1}^{m} \left\| p_c^i - (R\,p_b^i + t) \right\|_2^2 \tag{12}$$
There are many closed-form solutions for this type of optimization, and the principles of each solution are basically similar. Here, the SVD decomposition algorithm in the literature [31] is used to calculate the pose parameters.
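The following Python sketch (a simplified variant, not the authors' code; the distance threshold eps is an assumed value) illustrates the two stages of this section: voting-based matching by pairwise-distance invariance, and an SVD-based rigid transform estimation in the spirit of [31]:

```python
import numpy as np

def match_by_distances(ref_pts, obs_pts, eps=1.0):
    """For each observed point, find the reference point whose pairwise-distance pattern agrees."""
    matches = {}
    for t, q in enumerate(obs_pts):
        best, best_votes = None, -1
        for i, p in enumerate(ref_pts):
            d_ref = np.linalg.norm(ref_pts - p, axis=1)      # distances from p to all reference points
            votes = sum(
                np.any(np.abs(d_ref - np.linalg.norm(q - q2)) <= eps) for q2 in obs_pts
            )
            if votes > best_votes:
                best, best_votes = i, votes
        if best_votes >= len(obs_pts) - 1:                   # acceptance criterion, cf. Equation (11)
            matches[t] = best
    return matches

def rigid_transform_svd(src, dst):
    """Least-squares R, t with dst ≈ R @ src + t (Kabsch/Umeyama-style SVD solution)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)                      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])       # guard against reflection
    R = Vt.T @ D @ U.T
    t = c_dst - R @ c_src
    return R, t
```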

3.4. Hand–Eye Calibration of LCS and MCS

In the process of scanning, the laser scanner and the marked points move together without changing their relative positions. Therefore, the conversion of LCS and MCS does not change, which can be regarded as a rigid body transformation. The determination of the relationship between LCS and MCS is similar to the hand–eye calibration model [32]. It can be obtained by measuring the same point by changing the position of the line laser scanner multiple times. However, one disadvantage of the laser scanner is that it cannot directly and accurately obtain the coordinates of specific feature points. Combined with its measurement characteristics, a calibration model with the center of the standard sphere as the feature point is established. In addition to obtaining the transformation relationship, another parameter result of this calibration method is the coordinate value of the center of the standard sphere in the global coordinate system, which can be used to verify the quality of the calibration result.
When the laser scanner measures the standard sphere, the laser line is projected onto the spherical surface to obtain a profile of point cloud data, as shown in Figure 6. The x-axis value of these point cloud data is 0. From the point cloud data, the center coordinate $o_0 = (0, y_0, z_0)$ and the radius $r_0$ of this arc can be fitted. Combining the spatial geometric relationship between the circle center $o_0$, the arc radius $r_0$, and the sphere radius $R_0$, the 3D coordinates of the center of the sphere in LCS can be calculated as $P_s(x_s, y_s, z_s)$:
$$\begin{cases} x_s = \pm\sqrt{R_0^2 - r_0^2} \\ y_s = y_0 \\ z_s = z_0 \end{cases} \tag{13}$$
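As an illustration of this step, the sketch below (hypothetical, not the authors' code) fits the laser arc with an algebraic least-squares circle fit and applies Equation (13); the sign of $x_s$ cannot be resolved from a single profile, so both candidates are returned:

```python
import numpy as np

def sphere_center_from_profile(points_yz, sphere_radius):
    """points_yz: (n, 2) laser profile on the sphere in the scanner's y-z plane; returns candidate centers in LCS."""
    y, z = points_yz[:, 0], points_yz[:, 1]
    # Algebraic circle fit: y^2 + z^2 + D*y + E*z + F = 0, solved by least squares.
    A = np.column_stack([y, z, np.ones_like(y)])
    b = -(y ** 2 + z ** 2)
    D, E, F = np.linalg.lstsq(A, b, rcond=None)[0]
    y0, z0 = -D / 2.0, -E / 2.0
    r0 = np.sqrt(y0 ** 2 + z0 ** 2 - F)               # fitted arc radius
    xs = np.sqrt(max(sphere_radius ** 2 - r0 ** 2, 0.0))
    # The sign of x_s is ambiguous from a single profile; both candidates are returned.
    return np.array([[+xs, y0, z0], [-xs, y0, z0]])
```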
For the flexible 3D measurement system, $P_c(x_c, y_c, z_c)$ denotes the 3D coordinate of the center of the standard sphere in CCS and $P_s(x_s, y_s, z_s)$ denotes the 3D coordinate of the corresponding point in LCS. The local 3D coordinates measured by the laser scanner can be integrated into the camera coordinate system using Equation (14):
$$P_c = R_b^c \left( R_s^b P_s + t_s^b \right) + t_b^c \tag{14}$$
where $R_s^b$ and $t_s^b$ are the rotation matrix and the translation vector from LCS to MCS, respectively. Similarly, $R_b^c$ and $t_b^c$ are the rotation matrix and the translation vector from MCS to CCS.
During the calibration process, without changing the position of the standard sphere and binocular vision positioning system, the coordinate of the center of the standard sphere in CCS remains unchanged. However, with the help of the robot to drive the laser scanner to move to different positions, the coordinate of the center of the sphere in LCS is changed. Therefore, as shown in Figure 7, when the scanner is moved to a different position, in combination with Equation (14), multiple sets of conversion equations can be obtained as follows:
$$P_c = R_{b,i}^{c} \left( R_s^b P_i^s + t_s^b \right) + t_{b,i}^{c} \tag{15}$$
where $R_{b,i}^{c}$ and $t_{b,i}^{c}$ are the rotation matrix and the translation vector from MCS to CCS in the i-th measurement, and $P_i^s$ is the coordinate of the center of the standard sphere in LCS in the i-th measurement.
Let $R_s^b = [r_1, r_2, r_3]$, where $r_1$, $r_2$, $r_3$ are the column vectors of the rotation matrix. Defining $R_v = \mathrm{vec}(R_s^b) = [r_1^T, r_2^T, r_3^T]^T$, $R_v$ is a vector with nine rows and one column. Then, Equation (15) can be expressed as
$$\left( (P_i^s)^T \otimes R_{b,i}^{c} \right) R_v + R_{b,i}^{c}\, t_s^b - P_c = -t_{b,i}^{c} \tag{16}$$
where the symbol $\otimes$ denotes the Kronecker product of two matrices, and $(P_i^s)^T \otimes R_{b,i}^{c}$ is a matrix with three rows and nine columns.
Taking $R_v$, $t_s^b$, and $P_c$ as the variables to be solved, Equation (16) can be transformed into
$$A X = b, \qquad A_{3n\times15} = \begin{pmatrix} (P_1^s)^T \otimes R_{b,1}^{c} & R_{b,1}^{c} & -I_{3\times3} \\ \vdots & \vdots & \vdots \\ (P_n^s)^T \otimes R_{b,n}^{c} & R_{b,n}^{c} & -I_{3\times3} \end{pmatrix}, \quad X_{15\times1} = \begin{pmatrix} R_v \\ t_s^b \\ P_c \end{pmatrix}, \quad b_{3n\times1} = \begin{pmatrix} -t_{b,1}^{c} \\ \vdots \\ -t_{b,n}^{c} \end{pmatrix} \tag{17}$$
where $n$ is the number of measurements in the calibration process and $I_{3\times3}$ is the identity matrix with three rows and three columns.
For Equation (17), the result can be obtained by using the least squares method as
$$X = (A^T A)^{-1} A^T b \tag{18}$$
Finally, considering the rotation matrix and the translation vector from LCS to MCS, the 3D coordinate of the standard sphere in CCS can be obtained through this calibration model.
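A compact sketch of this linear solution is given below (illustrative only; it assumes the per-view MCS-to-CCS poses and the sphere centers in LCS have already been obtained, and it additionally projects the recovered rotation onto SO(3), a post-processing step not described in the paper):

```python
import numpy as np

def solve_hand_eye(R_list, t_list, P_s_list):
    """Return R_sb (LCS->MCS rotation), t_sb, and the sphere center P_c in CCS, via Equations (16)-(18)."""
    rows_A, rows_b = [], []
    for R_i, t_i, P_i in zip(R_list, t_list, P_s_list):
        block = np.hstack([np.kron(P_i.reshape(1, 3), R_i),   # (P_i^T kron R_i), 3x9
                           R_i,                                # coefficient of t_sb
                           -np.eye(3)])                        # coefficient of P_c
        rows_A.append(block)
        rows_b.append(-t_i)
    A = np.vstack(rows_A)                                      # 3n x 15
    b = np.concatenate(rows_b)                                 # 3n
    X, *_ = np.linalg.lstsq(A, b, rcond=None)                  # Equation (18)
    R_sb = X[:9].reshape(3, 3, order="F")                      # undo column-stacked vec()
    t_sb, P_c = X[9:12], X[12:15]
    # The linear solve does not enforce orthogonality; project R_sb onto SO(3).
    U, _, Vt = np.linalg.svd(R_sb)
    R_sb = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    return R_sb, t_sb, P_c
```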

3.5. Overall Unification of Point Cloud Data

The point cloud data in LCS is mainly unified to CCS through the following equation.
$$P_c = \begin{bmatrix} R_b^c & t_b^c \\ 0 & 1 \end{bmatrix} \begin{bmatrix} R_s^b & t_s^b \\ 0 & 1 \end{bmatrix} P_s \tag{19}$$
where $R_s^b$ and $t_s^b$ are obtained through the hand–eye calibration model in Section 3.4 and represent the conversion from LCS to MCS, while $R_b^c$ and $t_b^c$ are calculated by the matching method in Section 3.3 and represent the conversion from MCS to CCS. As a result, driven by the robot, the laser scanner scans the measured workpiece in all directions to complete the 3D topography measurement of the object.

4. Experiment and Analysis

The physical diagram of the whole system construction is shown in Figure 8. The binocular vision system consists of two industrial cameras. The model of the industrial cameras is JHSM130Bs, with a frame rate of 30 fps and a resolution of 1280 × 1024 pixels. The focal length of the industrial fixed-focus lens is 4 mm. The linear laser scanner is an nxSensor-I with a frame rate of 15 fps. A maximum of 960 points can be obtained in one scan measurement.
Regarding the laser scanner used in this paper, its internal placement structure is such that the optical axis of the laser is perpendicular to the surface of the measured object, and the optical axis of the camera is at an angle of 55° with the laser plane. This geometry is suitable for measuring objects with small surface height differences and can obtain a higher resolution [33]. For the laser measurement system used in this paper, the single measurement distance is not more than 200 mm, the scanning depth distance in the z-axis direction is 225–345 mm, the resolution in the depth direction can reach 0.0050 mm, the accuracy is 0.045 mm, and the minimum movement displacement of the driving system is 0.050 mm.

4.1. Accuracy Experiment for the Binocular Reconstruction

A checkerboard with 9 × 6 corner points is used to calibrate the internal and external parameters of the two cameras. The side length of each small square of the checkerboard is 30 mm. The internal and external parameters obtained by the calibration are shown in Table 1. It takes about two minutes to calibrate the binocular camera with the checkerboard.
The position conversion relationship from the right camera to the left camera is as follows. The unit of translation vector in the pose relationship is the millimeter.
$$R_{lr} = \begin{bmatrix} 0.999503 & 0.022021 & 0.022570 \\ 0.021810 & 0.999717 & 0.009537 \\ 0.022773 & 0.009040 & 0.999670 \end{bmatrix}, \quad t_{lr} = \begin{bmatrix} 264.449 \\ 7.38219 \\ 2.20070 \end{bmatrix}$$
After calibration, the reprojection error of the camera is 0.14 pixels, and the epipolar error is 0.18 pixels. According to the calibrated camera parameters, the eight marked points randomly pasted on the surface of the laser scanner can be accurately positioned. The marked points are ordered as follows: taking the first marked point in the upper left corner as the starting point, the remaining marked points are numbered sequentially in a clockwise direction. In the positioning experiment, the matching result of the binocular camera at a certain moment is shown in Figure 9. The corresponding left and right pixel coordinates of the marked points at that moment are listed in Table 2.
For the eight marked points, the TRITOP optical measurement system developed by the German Company GOM has high measurement accuracy and can directly measure the center coordinates of the circular marked points. The TRITOP system consists of one DSLR camera with the largest resolution of 4288 × 2824 pixels, a fixed focal length of 35 mm, contrast coded and uncoded points, and scale bars. In the measurement process, the scale bars are placed on both sides of the object to be measured, and the contrast coded points are pasted on the surface of the object to be measured. The digital camera is used to take continuous photos from different perspectives, and then the 3D coordinates of the circular marked points can be obtained by processing these photos with its own software. Considering the high measurement accuracy of the TRITOP optical measurement system, the system is used to measure the 3D coordinates of these eight marked points, and the Euclidean distance between the two marked points is calculated as the reference value. The binocular positioning system built in this paper is used to perform 71 random positioning experiments on the laser scanner. The results of the first 12 groups are shown in Table 3.
In these 71 experiments, the comparison curve between the positioning distance of the marked points and the actual distance is shown in Figure 10.
AVG denotes the average of the measured values, RMSE the root mean square error, AE the absolute error, D-AE the absolute error of the distance between two points, and $p_i$ the i-th marked point. It can be seen from Table 4 and Figure 10 that the reconstruction measurement of the binocular positioning system has good repeatability and reliability, and the reconstruction accuracy is high. The absolute error is less than 0.08 mm, and the root mean square error is better than 0.01 mm, which ensures high-precision positioning of the marked points. This provides the basis for the accuracy of the subsequent pose calculation.

4.2. Evaluation of Binocular Positioning Performance

For the laser scanning measurement system based on binocular positioning proposed in this paper, ensuring the accuracy of positioning in different environments is the premise for completing high-precision 3D reconstruction. Therefore, this part evaluates the positioning accuracy of the positioning system with respect to illumination, distance, viewing angle, occlusion, and other factors. A 6-DOF mechanical arm is used as the experimental mobile platform; the rotation accuracy of the arm is 0.003°, and the translation accuracy is 0.03 mm.

4.2.1. Localization Evaluation under Light Changes

In this part, the influence of different illumination levels on the positioning performance of the eight-point marker is analyzed. Different lighting conditions are simulated mainly by changing the exposure time of the camera. The binocular camera is placed 1.6 m away from the laser scanner, the exposure time range is set as 400–3200 µs, and the step interval is set as 100 µs. In total, 29 groups of data are collected in the experiment, with 30 pairs of images in each group. The average pose solved from these 30 image pairs is taken as the pose value under the current exposure time. The pose is expressed in the form of Euler angles and translations along the x, y, and z directions, represented by rx, ry, rz, tx, ty, and tz, respectively. Figure 11 shows images recorded at selected exposure times. In particular, in the first pair of images in Figure 11, taken in a relatively dark environment, the eight markers may not all be recognized, but the pose can still be calculated through the correct matching of some of the points.
Because the positions of the binocular camera and scanner have not moved, the pose obtained from the theoretical analysis should be relatively consistent. The average value of 29 groups of data within the exposure range is taken as the reference value to calculate the deviation between the solved pose and the reference value under each exposure condition. The experimental results are shown in Figure 12.
As can be seen in Figure 12, even under different exposure conditions, the disturbance to the pose solution is small. The rotation about the z direction is the most stable, with an error of less than 0.015°. The translation deviations in the y and z directions are also only slightly affected, with errors of less than 0.076 mm and 0.12 mm, respectively. The maximum rotation deviation in the x direction is 0.11°, and the maximum rotation deviation in the y direction is 0.16°, both of which appear under high exposure conditions. The lighting condition thus has a considerable influence on the pose calculation about the x direction, and the error increases with increasing brightness. A possible reason is that when the brightness increases, the extracted circular contours become smaller, leading to a deviation of the detected circle centers.

4.2.2. Positioning Accuracy at Different Distances

In this section, the positioning accuracy of the system at different distances is mainly studied. Referring to the experiment in the previous section, in order to have better recognition results, the exposure time of the camera is set as 1200 µs in subsequent experiments. The research range of positioning detection distance is set as 1.2–2.3 m. With the industrial mechanical arm as the moving platform, 30 frames of pictures are recorded every 0.1 m of movement along the optical axis of the camera, and the average value is taken as the current value. Figure 13 shows some experimental images. The first frame of each set of data is used as the initial pose, and then the relative poses of the other images are obtained. The Euclidean distance of the translation error is used as a comparison, and the calculation formula is as follows:
$$\Delta t = \sqrt{t_x^2 + t_y^2 + t_z^2} \tag{20}$$
As can be seen in Figure 14, the translation deviation of the positioning system is relatively stable within the range of 1.8 m, and the error is less than 0.50 mm. When the distance increases to 2.3 m, the translation error reaches 1.34 mm. The possible reason for this is that at a long distance, the size of the circular mark is too small to be easily identified, resulting in a large error.

4.2.3. Positioning Accuracy at Different Angles

The accuracy of the localization system under different angles and occlusions is examined in detail in this section. The binocular camera is placed 1.6 m away from the laser scanner. With the y-axis of the camera as the rotation axis, the scanner rotates from 0 to 40°. Ten pairs of consecutive images are recorded at each rotation of 5°. In order to compare the robustness to occlusion, two circular markers are artificially blocked at the same position, and ten images are also recorded. Figure 15 shows some experimental images at a rotation angle of 30°. The first frame of each set of data is used as the initial pose, and the relative poses of the other images are then obtained. The average errors for each degree of freedom are then calculated. The experimental results are shown in Figure 16.
In Figure 16, when there is no occlusion, the camera can fully recognize the eight points. The maximum rotation errors in the x, y, and z directions are −0.19°, −0.24°, and 0.23°, respectively, and the maximum translation errors are 0.28 mm, 0.35 mm, and 0.25 mm, respectively. In the case of artificial occlusion, the camera can only locate the scanner through six markers. The maximum average errors of rx, ry, and rz are −0.23°, −0.21°, and 0.26°, and the maximum average errors of tx, ty, and tz are 0.27 mm, 0.35 mm, and 0.41 mm, respectively. The experimental results show that the positioning accuracy does decrease when the marker is partially occluded. The most affected component is the translation error in the z direction, which increases from 0.25 mm to 0.41 mm. The rotation error is hardly affected and remains below 0.26° in both cases.

4.3. Hand–Eye Calibration Experiment

The hand–eye calibration experiment proceeds as follows:
(1) The standard sphere with a diameter of 30.0055 mm is placed in a suitable position. The binocular camera is installed at a suitable location to ensure that the laser scanner is within the field of view of the positioning system.
(2) The robot is controlled to drive the linear laser scanner to a position where the laser beam is effectively projected onto the surface of the standard sphere. The line point cloud data are then acquired, and the images taken by the left camera and the right camera are captured at the same time.
(3) Step (2) is repeated N times (N > 8).
(4) The line point cloud data on the surface of the standard sphere are extracted and fitted, and the coordinate of the center of the sphere is calculated in the current laser coordinate system. According to the matching algorithm in Section 3.3, the circular marked points are recognized, and the spatial position changes of the marked points are calculated.
(5) According to the standard sphere calibration model established in Section 3.4, the conversion relationship between LCS and MCS is calculated as follows:
$$R_s^b = \begin{bmatrix} 0.996745 & 0.00427368 & 0.00807998 \\ 0.0146129 & 0.00129409 & 0.989261 \\ 0.0293743 & 0.999739 & 0.0376670 \end{bmatrix}, \quad t_s^b = \begin{bmatrix} 89.5174 \\ 270.536 \\ 14.8889 \end{bmatrix}$$
At the same time, the 3D coordinate of the center of the standard sphere in CCS during the calibration process is as follows:
$$P_c = [35.9700, \; 216.343, \; 1303.39]^T$$
The whole calibration process takes about 10 min. Of this, about 9 min are spent moving the scanner to obtain point cloud data for the different poses, and 189 ms are needed to compute the transformation matrix in the program.
In order to verify the calibration accuracy of the system, without changing the position of the standard sphere, the standard sphere is scanned and measured using the calibrated position relationship. Combined with the known diameter of the standard sphere and the sphere center coordinate obtained by the calibration, the standard sphere model is established. The deviation analysis between the scanning point cloud and the sphere fitted by the standard sphere is shown in Figure 17.
In Figure 17, the parts from green to red represent positive deviation, and the parts from green to blue represent negative deviation. The result shows that the error of the point cloud near the edge of the standard sphere is relatively large, the maximum positive deviation is 0.38 mm, and the maximum negative deviation is −0.27 mm. The possible reason for this is that the data obtained at the edge of the scanner is relatively unstable due to the limitation of the scanner principle. The geometric standard deviation is 0.043 mm, showing that the overall deviation is more evenly distributed.

4.4. System Spatial Measurement Accuracy Experiment

In order to determine the spatial measurement accuracy of the system, several measurement experiments are carried out with the standard parts. The standard parts used mainly include a standard sphere with a diameter of 30.0055 mm and a standard gauge block with a length of 70 mm. The accuracy of standard parts is 0.0015 mm. To verify the reliability of the standard part data, ten measurement experiments were carried out on the standard ball and standard gauge block using a three-coordinate measuring machine (CMM). The model of the bridge-type CMM is Global 09.15.08 from Hexagon, and the indication error is 1.2 µm. Figure 18 shows photos of the process of using CMM to measure the standard part, and Table 5 records the data of the diameter of the standard ball and the length of the standard gauge block measured by CMM.
It can be determined from the data in Table 5 that the values measured by the CMM are relatively stable and reliable. The average diameter of the standard sphere is 30.002 mm, corresponding to an average radius of 15.001 mm, and the average length of the standard gauge block is 70.000 mm. These two average values are taken as the reference true values of the standard parts and are used for comparison with the measurement data of the system.
According to the calibrated position relationship, the position of the standard sphere and the standard gauge block is fixed, and the scanner measures the gauge block and the standard sphere in a variety of different postures. At the same time, the positioning of the 6-DOF robot itself is used to scan and measure the measuring block and the standard sphere for comparison [34]. The point cloud image and the rendering image of the standard sphere are shown in Figure 19. Similarly, the point cloud image and the rendering image of the standard gauge block are shown in Figure 20.
According to the laser point cloud data obtained at different positions each time, the radius of the fitted sphere is obtained. The scanner is moved to different positions to measure the radius of the standard sphere and the length of the standard gauge block. Twelve measurement results are recorded in Table 6. The data measured by these two methods are compared with the true value data measured by the CMM, and the standard deviation and root mean square error of the system measurement are calculated. The data are shown in Table 7.
The SD is the standard deviation. It can be seen from Table 7 that within the measurement range of 70 mm, the standard deviation is less than 0.17 mm, indicating that the measurement system has good stability. Compared with the splicing results using robot positioning, our average measurement error is smaller. Compared with the true value of the CMM, the root mean square error of the measurement system does not exceed 0.20 mm, the maximum measurement error for the standard sphere does not exceed 0.19 mm, and the maximum measurement error for the standard gauge block does not exceed 0.36 mm. In summary, referring to the maximum measurement deviation, the spatial measurement accuracy of the system can be determined to be 0.36 mm.
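For reference, a common way to obtain the fitted sphere radius from the merged point cloud is an algebraic least-squares sphere fit; the sketch below is illustrative and not necessarily the fitting routine used by the authors:

```python
import numpy as np

def fit_sphere(points):
    """points: (n, 3) array; returns (center, radius) of the best-fit sphere."""
    # Linearized model: x^2 + y^2 + z^2 = 2*cx*x + 2*cy*y + 2*cz*z + (r^2 - |c|^2)
    A = np.column_stack([2.0 * points, np.ones(len(points))])
    b = (points ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius
```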

4.5. Scanning Measurement Application Experiment

In order to verify the feasibility of the entire scanning measurement system, this system is applied to scan and reconstruct the ceramic bowl. The robot is used to drive the laser scanner to scan the ceramic bowl according to the planned route. In the entire scanning process, the linear laser scans 1194 times, and 977,292 points are obtained. The point cloud image of the scanned ceramic bowl is obtained as shown in Figure 21b. Due to the path setting, repeated measurements are made in some parts of the ceramic bowl, resulting in dense point clouds in some parts. With the distance tolerance of 0.5 mm as the constraint, the original point cloud image is downsampled to obtain the downsampled image as shown in Figure 21c. The downsampled image is rendered by “Imageware” software, and the 3D curved surface shape of the ceramic bowl is obtained as shown in Figure 21d.
It can be seen in Figure 21b,c that there are fewer point clouds at the side of the bowl with larger curvatures, indicating that more instances of measurement are required to obtain complete point cloud data at the curved location. In the repeated measurement area, the overlap degree of point cloud is relatively high, showing that the measurement system has good repeatability. In Figure 21d, the scanned point cloud data can fully recover the 3D shape of the measured object, and the overall stitching effect is good, indicating that the measurement system has the 3D measurement capability of a continuous surface.

5. Conclusions

This paper has introduced a laser scanning measurement system based on binocular vision positioning and its calibration method. Compared with traditional methods, the circular marker is designed so that marked points do not need to be pasted onto the measured workpiece. No complex calibration process is required, and the positioning accuracy is not limited by the motion mechanism. The proposed circle detection method based on image moments takes the weighted mean of multiple detected circle centroids as the observation points, which reduces the center deviation error and improves positioning accuracy. This method is also suitable for the rapid detection and recognition of other circular markers. Compared with a single specific target, the designed marker overcomes the limitation of being unable to be located under local occlusion, and the laser scanner can be accurately located without identifying all of the marked points. The measurement range of the positioning system is also wider. Therefore, the system can be conveniently applied to measure the 3D shape of large-sized workpieces with complex features on site.
Regarding the eight-point marker designed in this paper: (1) a marked-point detection method is designed based on image moment properties, which enhances the accuracy of detection; (2) a spatial point matching method is built from both the epipolar constraint and the principle of the invariance of spatial distance; (3) the mathematical relationship between the laser point cloud and the global coordinates is obtained through a standard sphere calibration model. The marker coordinate system is used as an intermediate bridge to complete the data unification. The system is used to measure a ceramic bowl to verify the practicability and reliability of the measurement system. The experimental results show that the binocular vision system can complete localization under different light intensities and in complex environments, that the repeated translation error of the binocular vision system is less than 0.22 mm, and that the rotation error is less than 0.15°. The repetition error of the measurement system is less than 0.36 mm, and the average measurement result is better than that of the splicing method relying on the self-positioning of the manipulator, which shows that the system can complete the precise positioning of the scanner and meet the requirements of 3D shape measurement.
The eight-point marking method in this paper also has some limitations. When the measurement distance is large, the marker becomes too small for reliable positioning; the size of the circular markers can be increased in follow-up studies. When the rotation angle of the scanner is too large and the number of marked points identified by the positioning system falls below five, inaccurate positioning easily occurs. In follow-up work, we will try to solve this problem by attaching circular markers to multiple surfaces of the scanner, so that the binocular positioning system can still recognize enough circular markers when the scanner rotates through a large angle, which will further enhance the flexibility of the system.

Author Contributions

Conceptualization, C.W., L.Y., Z.L. and W.J.; methodology, C.W., L.Y., Z.L. and W.J.; software, C.W.; validation, C.W. and L.Y.; investigation, C.W.; resources, Z.L.; data curation, C.W. and L.Y.; writing—original draft preparation, C.W., L.Y. and W.J.; project administration, C.W., L.Y., Z.L. and W.J.; supervision, funding acquisition, L.Y., Z.L. and W.J. All authors have read and agreed to the published version of the manuscript.

Funding

This work was financially supported by the National Natural Science Foundation of China (No. 52075511, No. 51927811, and No. 52005471), Natural Science Foundation of Zhejiang Province (No. LQ20E050017, No. LY19F030012), Science and Technology Project of State Administration for Market Regulation (No. 2021MK189), and Zhejiang University Student Science and Technology Innovation Activity Plan (No. 2022R409052).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data used to support the findings of this study are available from the first author upon request.

Conflicts of Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References

1. Sitnik, R.; Karaszewski, M.; Zaluski, W.; Bolewicki, P. Automated full-3D shape measurement of cultural heritage objects. In Proceedings of the SPIE Europe Optical Metrology, Munich, Germany, 14–18 June 2009.
2. Zhou, Z.; Liu, W.; Wu, Q.; Wang, Y.; Yu, B.; Yue, Y.; Zhang, J. A combined measurement method for large-size aerospace components. Sensors 2020, 20, 4843.
3. Barone, S.; Paoli, A.; Razionale, A.V. Optical tracking of a tactile probe for the reverse engineering of industrial impellers. J. Comput. Inf. Sci. Eng. 2017, 17, 041003.
4. Poredoš, P.; Čelan, D.; Možina, J.; Jezeršek, M. Determination of the human spine curve based on laser triangulation. BMC Med. Imaging 2015, 15, 2.
5. Zhang, S. Recent progresses on real-time 3D shape measurement using digital fringe projection techniques. Opt. Lasers Eng. 2010, 48, 149–158.
6. Zuo, C.; Feng, S.; Huang, L.; Tao, T.; Yin, W.; Chen, Q. Phase shifting algorithms for fringe projection profilometry: A review. Opt. Lasers Eng. 2018, 109, 23–59.
7. Brown, G.M.; Chen, F.; Song, M. Overview of three-dimensional shape measurement using optical methods. Opt. Eng. 2000, 39, 10–22.
8. Sioma, A.; Romaniuk, R.S.; Linczuk, M. 3D imaging methods in quality inspection systems. In Proceedings of the Photonics Applications in Astronomy, Communications, Industry, and High-Energy Physics Experiments, Wilga, Poland, 25 May–2 June 2019.
9. Auerswald, M.M.; von Freyberg, A.; Fischer, A. Laser line triangulation for fast 3D measurements on large gears. Int. J. Adv. Manuf. Technol. 2018, 100, 2423–2433.
10. Yang, L.; Xie, X.; Zhu, L.; Wu, S.; Wang, Y. Review of electronic speckle pattern interferometry (ESPI) for three dimensional displacement measurement. Chin. J. Mech. Eng. 2014, 27, 1–13.
11. Salvi, J.; Pagès, J.; Batlle, J. Pattern codification strategies in structured light systems. Pattern Recognit. 2004, 37, 827–849.
12. Marrugo, A.G.; Gao, F.; Zhang, S. State-of-the-art active optical techniques for three-dimensional surface metrology: A review [Invited]. J. Opt. Soc. Am. A 2020, 37, B60.
13. Yu, H.; Huang, Y.; Zheng, D.; Bai, L.; Han, J. Three-dimensional shape measurement technique for large-scale objects based on line structured light combined with industrial robot. Optik 2019, 202, 163656.
14. Wang, J.; Tao, B.; Gong, Z.; Yu, W.; Yin, Z. A mobile robotic 3-D measurement method based on point clouds alignment for large-scale complex surfaces. IEEE Trans. Instrum. Meas. 2021, 70, 7503011.
15. Liu, L.; Wang, W.; Li, W. Flexible measurement technology of complex curved surface three-dimensional shape robot based on iGPS. Chin. J. Lasers 2019, 46, 200–205.
16. Norman, A.R.; Schönberg, A.; Gorlach, I.A.; Schmitt, R. Validation of iGPS as an external measurement system for cooperative robot positioning. Int. J. Adv. Manuf. Technol. 2012, 64, 427–446.
17. Du, H.; Chen, X.; Xi, J.; Yu, C.; Zhao, B. Development and verification of a novel robot-integrated fringe projection 3D scanning system for large-scale metrology. Sensors 2017, 17, 2886.
18. Novak, B.; Babnik, A.; Možina, J.; Jezeršek, M. Three-dimensional foot scanning system with a rotational laser-based measuring head. Stroj. Vestn.-J. Mech. Eng. 2014, 60, 685–693.
19. Liu, T.; Wang, N.N.; Fu, Q.; Zhang, Y.; Wang, M.H. Research on 3D reconstruction method based on laser rotation scanning. In Proceedings of the 2019 IEEE International Conference on Mechatronics and Automation (ICMA), Tianjin, China, 4–7 August 2019; pp. 1600–1604.
20. Barone, S.; Paoli, A.; Razionale, A.V. Multiple alignments of range maps by active stereo imaging and global marker framing. Opt. Lasers Eng. 2013, 51, 116–127.
21. Wang, X.; Xie, Z.; Wang, K.; Zhou, L. Research on a handheld 3D laser scanning system for measuring large-sized objects. Sensors 2018, 18, 3567.
22. Chen, Y.; Zhou, F.; Zhou, M.; Zhang, W.; Li, X. Pose measurement approach based on two-stage binocular vision for docking large components. Meas. Sci. Technol. 2020, 31, 125002.
23. Yin, L.; Wang, X.; Ni, Y. Flexible three-dimensional reconstruction via structured-light-based visual positioning and global optimization. Sensors 2019, 19, 1583.
24. Liu, Z.; Li, X.; Li, F.; Wei, X.; Zhang, G. Fast and flexible movable vision measurement for the surface of a large-sized object. Sensors 2015, 15, 4643–4657.
25. Shi, J.; Sun, Z. Large-scale three-dimensional measurement based on LED marker tracking. Vis. Comput. 2015, 32, 179–190.
26. Hu, S.; Matsumoto, Y.; Takaki, T.; Ishii, I. Monocular stereo measurement using high-speed catadioptric tracking. Sensors 2017, 17, 1839.
27. Huang, J.-C.; Liu, C.-S.; Chiang, P.-J.; Hsu, W.-Y.; Liu, J.-L.; Huang, B.-H.; Lin, S.-R. Design and experimental validation of novel 3D optical scanner with zoom lens unit. Meas. Sci. Technol. 2017, 28, 105904.
28. Jiang, T.; Cui, H.; Cheng, X. Accurate calibration for large-scale tracking-based visual measurement system. IEEE Trans. Instrum. Meas. 2020, 70, 5003011.
29. Suzuki, S.; Abe, K. Topological structural analysis of digitized binary images by border following. Comput. Vis. Graph. Image Process. 1985, 29, 396.
30. Peng, J.; Chen, D.; Xu, W.; Liang, B. An efficient virtual stereo-vision measurement method of a space non-cooperative target. In Proceedings of the 2018 13th World Congress on Intelligent Control and Automation (WCICA), Changsha, China, 4–8 July 2018; pp. 7–12.
31. Umeyama, S. Least-squares estimation of transformation parameters between two point patterns. IEEE Trans. Pattern Anal. Mach. Intell. 1991, 13, 376–380.
32. Enebuse, I.; Foo, M.; Ibrahim, B.S.K.K.; Ahmed, H.; Supmak, F.; Eyobu, O.S. A comparative review of hand-eye calibration techniques for vision guided robots. IEEE Access 2021, 9, 113143–113155.
33. Sioma, A.; Romaniuk, R.S.; Linczuk, M. Geometry and resolution in triangulation vision systems. In Proceedings of the Photonics Applications in Astronomy, Communications, Industry, and High Energy Physics Experiments, Wilga, Poland, 31 August–2 September 2020.
34. Mu, N.; Wang, K.; Xie, Z.; Ren, P. Calibration of a flexible measurement system based on industrial articulated robot and structured light sensor. Opt. Eng. 2017, 56, 054103.
Figure 1. (a) System composition and structure diagram. (b) The markers and the marker coordinate system.
Figure 2. Flow chart of linear laser scanning measurement.
Figure 3. (a) Template matching image. (b) Region of interest image. (c) Binary image. (d) Edge detection image. (e) Contours image. (f) Center marked image.
Figure 4. Schematic diagram of binocular parallel stereo vision.
Figure 5. (a) Schematic diagram of the reference distance library. (b) Schematic diagram of the sample distance library.
Figure 6. Schematic diagram of standard sphere calibration.
Figure 7. Coordinate systems during calibration.
Figure 8. Physical diagram of system configuration.
Figure 9. The matching result of the left image and the right image at a certain moment.
Figure 10. Comparison of the reconstruction distance and the actual distance.
Figure 11. Positioning images with exposure times of 400 µs and 3200 µs.
Figure 12. Positioning errors under different brightness conditions. (a) Rotational errors on the x, y, and z axes. (b) Translation errors on the x, y, and z axes.
Figure 13. Left camera images at distances of 1.2 m and 2.3 m.
Figure 14. Translation errors at different distances.
Figure 15. Left camera image at a rotation angle of 30°. (a) In the case of no occlusion. (b) In the case of the occlusion of two marker points.
Figure 16. The pose errors based on eight markers and six markers at different angles. (a) Rotation error along the x-axis. (b) Rotation error along the y-axis. (c) Rotation error along the z-axis. (d) Translation error along the x-axis. (e) Translation error along the y-axis. (f) Translation error along the z-axis.
Figure 17. Deviation analysis between the standard sphere model and the scanned point cloud.
Figure 18. Images of CMM measuring standard parts. (a) Measuring standard sphere. (b) Measuring standard gauge block.
Figure 19. The related images of the standard sphere. (a) The original image. (b) The point cloud image based on binocular camera positioning. (c) The rendering image based on (b). (d) Point cloud image based on robot positioning.
Figure 20. The related images of the standard gauge block. (a) The original image. (b) The point cloud image based on binocular camera positioning. (c) The rendering image based on (b). (d) Point cloud image based on robot positioning.
Figure 21. The related images of the ceramic bowl. (a) The original image. (b) The original point cloud image. (c) The point cloud image after down-sampling. (d) The rendering image.
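The marked-point detection pipeline of Figure 3 (template matching, region-of-interest extraction, binarization, edge detection, contour extraction, and center marking) can be outlined with standard image-processing routines. The Python/OpenCV sketch below is an illustration only: the Otsu threshold, the area limits, and the function name are assumptions, not parameters from the paper. Contours are traced with the Suzuki–Abe border-following algorithm [29], and each marker center is taken from its image moments, in the spirit of the image-moment-based detection the paper describes.

```python
import cv2

def detect_marker_centers(roi_gray, min_area=30.0, max_area=5000.0):
    """Estimate marker centers in a grayscale ROI via image moments.

    Illustrative sketch of the Figure 3 pipeline; the threshold choice and
    the area bounds are assumptions, not the paper's parameters.
    """
    # Binarize the region of interest (Figure 3c).
    _, binary = cv2.threshold(roi_gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Extract contours with the Suzuki-Abe border-following algorithm [29] (Figure 3e).
    # The two-value return signature assumes OpenCV >= 4.
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    centers = []
    for contour in contours:
        area = cv2.contourArea(contour)
        if not (min_area <= area <= max_area):
            continue  # reject noise and oversized blobs
        m = cv2.moments(contour)
        if m["m00"] == 0:
            continue
        # Centroid from zeroth- and first-order image moments (Figure 3f).
        centers.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centers
```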
Table 1. Internal parameter coefficients of the two cameras.

                (fx, fy)                  (cx, cy)                k1        k2
Left camera     (1621.110, 1621.330)      (653.257, 516.830)      −0.042    0.286
Right camera    (1636.350, 1634.640)      (652.691, 504.998)      0.017     0.055
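The coefficients in Table 1 map directly onto the pinhole camera model with radial distortion. As a minimal sketch, assuming the usual OpenCV parameter ordering and zero tangential distortion (only k1 and k2 are reported), the left camera's intrinsic matrix and distortion vector could be assembled as follows; the test pixel is hypothetical.

```python
import numpy as np
import cv2

# Left-camera intrinsics from Table 1 (pixel units).
fx, fy = 1621.110, 1621.330
cx, cy = 653.257, 516.830
k1, k2 = -0.042, 0.286

K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])
dist = np.array([k1, k2, 0.0, 0.0])  # (k1, k2, p1, p2); tangential terms assumed zero

# Undistort a hypothetical pixel coordinate back to ideal pixel coordinates.
pixel = np.array([[[700.0, 450.0]]], dtype=np.float64)
ideal = cv2.undistortPoints(pixel, K, dist, P=K)
print(ideal)
```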
Table 2. The coordinate value of the marked points at a certain time.

Marked Point No.   Left Camera Pixel Coordinates   Right Camera Pixel Coordinates   Space Coordinates in CCS
1                  (697.376, 442.240)              (356.923, 442.371)               (29.000, −53.269, 1365.262)
2                  (762.921, 433.678)              (422.075, 433.779)               (79.841, −59.864, 1363.679)
3                  (737.057, 423.396)              (396.589, 423.487)               (59.832, −67.924, 1365.186)
4                  (807.995, 456.506)              (466.361, 456.598)               (114.562, −42.052, 1360.543)
5                  (791.867, 467.259)              (450.198, 467.327)               (102.062, −33.731, 1360.402)
6                  (765.867, 495.424)              (423.979, 495.553)               (81.877, −11.891, 1359.516)
7                  (739.902, 471.748)              (398.578, 471.839)               (61.887, −30.276, 1361.771)
8                  (707.569, 477.463)              (366.483, 477.565)               (36.851, −25.861, 1362.724)
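For the rectified parallel configuration of Figure 4, the space coordinates in Table 2 follow from the disparity between the left and right pixel coordinates of each marker (roughly 340 pixels in every row of the table). The sketch below states the standard triangulation relations Z = f·b/(uL − uR), X = (uL − cx)·Z/f, Y = (vL − cy)·Z/f. The focal length and principal point are taken from the left camera in Table 1 purely as placeholders for the rectified parameters, and the baseline is an assumed value, so the output is indicative only and will not reproduce Table 2 exactly.

```python
import numpy as np

# Placeholder rectified parameters: left-camera values from Table 1 and an
# assumed baseline; the paper's rectified stereo parameters are not listed here.
FX, FY = 1621.110, 1621.330
CX, CY = 653.257, 516.830
BASELINE_MM = 280.0  # assumed stereo baseline, illustrative only

def triangulate_parallel(u_left, v_left, u_right):
    """Textbook triangulation for a rectified parallel stereo pair (result in mm)."""
    disparity = u_left - u_right          # horizontal disparity in pixels
    z = FX * BASELINE_MM / disparity      # depth from Z = f * b / d
    x = (u_left - CX) * z / FX
    y = (v_left - CY) * z / FY
    return np.array([x, y, z])

# Marker 1 from Table 2; with the true rectified parameters this would agree
# with (29.000, -53.269, 1365.262) mm.
print(triangulate_parallel(697.376, 442.240, 356.923))
```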
Table 3. Measurement results of the distance between the marked points.

No.   d12/mm   d13/mm   d14/mm   d15/mm   d16/mm   d17/mm   d18/mm
1     34.139   51.292   86.424   75.786   67.388   40.280   28.624
2     34.136   51.280   86.415   75.777   67.392   40.258   28.624
3     34.137   51.288   86.423   75.789   67.400   40.278   28.631
4     34.145   51.289   86.422   75.784   67.400   40.275   28.631
5     34.138   51.282   86.417   75.780   67.394   40.266   28.625
6     34.137   51.284   86.417   75.781   67.397   40.259   28.626
7     34.134   51.282   86.412   75.772   67.391   40.259   28.621
8     34.134   51.285   86.424   75.783   67.394   40.266   28.632
9     34.133   51.275   86.407   75.769   67.382   40.253   28.616
10    34.141   51.288   86.422   75.787   67.399   40.271   28.631
11    34.136   51.283   86.416   75.772   67.381   40.264   28.623
12    34.140   51.283   86.418   75.776   67.389   40.263   28.625
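The distances in Table 3 are Euclidean distances between reconstructed marker centers; the same pairwise distances form the reference and sample distance libraries of Figure 5, which the paper matches across coordinate systems using the invariance of spatial distance under rigid motion. The sketch below shows one straightforward way to build such a library and to test a candidate correspondence; the tolerance value and the function names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from itertools import combinations

def distance_library(points):
    """Pairwise Euclidean distances between marker centers (points: (N, 3) array, mm)."""
    pts = np.asarray(points, dtype=float)
    return {(i, j): float(np.linalg.norm(pts[i] - pts[j]))
            for i, j in combinations(range(len(pts)), 2)}

def correspondence_is_consistent(sample_points, reference_library, mapping, tol=0.5):
    """Accept a candidate correspondence only if all pairwise distances are preserved.

    mapping[i] is the reference index proposed for sample point i;
    tol is an assumed tolerance in mm.
    """
    for (i, j), d in distance_library(sample_points).items():
        a, b = sorted((mapping[i], mapping[j]))
        if abs(d - reference_library[(a, b)]) > tol:
            return False
    return True
```

In practice, candidate mappings can be enumerated, or pruned by first matching individual distances, and the consistent mapping is accepted as the point correspondence between the two coordinate systems.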
Table 4. Comparison of the distance measured result with the actual value.

        d12/mm   d13/mm   d14/mm   d15/mm   d16/mm   d17/mm   d18/mm
AVG     34.140   51.296   86.407   75.773   67.386   40.251   28.620
Real    34.130   51.227   86.453   75.742   67.354   40.251   28.714
RMSE    0.010    0.015    0.016    0.015    0.020    0.018    0.017
Table 5. Measurement results obtained with the CMM.

No.            1        2        3        4        5        6        7        8        9        10
Diameter/mm    30.004   30.003   30.001   30.001   30.003   30.006   29.997   29.999   30.003   30.005
Length/mm      70.000   69.997   70.001   70.000   70.000   69.998   69.998   70.001   70.000   70.000
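The paper does not state explicitly how the reference ("Real") values used later in Table 7 were obtained, but averaging the CMM results of Table 5 reproduces them, so the following worked arithmetic is offered as a plausible reading:

\[
\bar{D} = \frac{1}{10}\sum_{i=1}^{10} D_i = \frac{300.022}{10}\ \mathrm{mm} = 30.002\ \mathrm{mm},
\qquad R_{\mathrm{ref}} = \frac{\bar{D}}{2} \approx 15.001\ \mathrm{mm},
\]
\[
\bar{L} = \frac{1}{10}\sum_{i=1}^{10} L_i = \frac{699.995}{10}\ \mathrm{mm} \approx 70.000\ \mathrm{mm}.
\]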
Table 6. Measurement results of the standard sphere radius and gauge block length.

       Radius R/mm                        Length L/mm
No.    Our System    Robot Positioning    Our System    Robot Positioning
1      14.958        15.236               70.241        70.170
2      15.125        15.157               69.883        70.236
3      15.004        15.100               70.358        70.215
4      15.182        15.118               70.215        69.876
5      15.091        15.152               70.163        70.129
6      14.963        15.085               69.943        70.174
7      15.166        14.983               70.136        69.954
8      15.029        15.034               70.082        70.190
9      14.916        15.082               69.755        70.143
10     14.900        15.041               69.926        70.091
11     15.079        14.994               70.202        69.984
12     15.050        15.066               70.156        70.230
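The radii in Table 6 come from the scanned point clouds of the standard sphere (Figures 17 and 19). The paper does not spell out the fitting procedure, so the sketch below shows one common choice, a linear least-squares (algebraic) sphere fit, as a hedged illustration of how a radius estimate can be extracted from a point cloud.

```python
import numpy as np

def fit_sphere_least_squares(points):
    """Algebraic least-squares sphere fit.

    Solves x^2 + y^2 + z^2 = 2a x + 2b y + 2c z + d for (a, b, c, d);
    the center is (a, b, c) and the radius is sqrt(d + a^2 + b^2 + c^2).
    points: (N, 3) array of scanned coordinates in mm.
    """
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([2.0 * pts, np.ones(len(pts))])
    b = (pts ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

# Quick self-check on synthetic data: points on a 15 mm sphere centered at (10, 20, 30).
rng = np.random.default_rng(0)
v = rng.normal(size=(500, 3))
v /= np.linalg.norm(v, axis=1, keepdims=True)
center, radius = fit_sphere_least_squares(np.array([10.0, 20.0, 30.0]) + 15.0 * v)
print(center, radius)  # approximately (10, 20, 30) and 15.0
```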
Table 7. Comparison of the measured value with the actual value.

           Radius R/mm                        Length L/mm
           Our System    Robot Positioning    Our System    Robot Positioning
Average    15.039        15.088               70.088        70.116
Real       15.001        15.001               70.000        70.000
RMSE       0.097         0.111                0.190         0.162
SD         0.090         0.069                0.168         0.112
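The summary statistics in Table 7 can be reproduced from the twelve repeated measurements in Table 6. The sketch below computes the average, the RMSE with respect to the CMM reference value, and the (population) standard deviation for the radius measured by our system; the same code applies to the other three columns. It reproduces the reported 15.039 / 0.097 / 0.090 to three decimals, which suggests these are the formulas behind the table, although the paper does not state them explicitly.

```python
import numpy as np

# Radius measured by our system, Table 6 (mm), and the CMM reference value (mm).
radius_ours = np.array([14.958, 15.125, 15.004, 15.182, 15.091, 14.963,
                        15.166, 15.029, 14.916, 14.900, 15.079, 15.050])
radius_ref = 15.001

average = radius_ours.mean()                              # 15.039
rmse = np.sqrt(np.mean((radius_ours - radius_ref) ** 2))  # 0.097
sd = radius_ours.std(ddof=0)                              # 0.090 (population SD)

print(f"AVG = {average:.3f} mm, RMSE = {rmse:.3f} mm, SD = {sd:.3f} mm")
```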
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
