Article

Research on a Handheld 3D Laser Scanning System for Measuring Large-Sized Objects

Engineering College, Ocean University of China, Qingdao 266100, China
* Authors to whom correspondence should be addressed.
Sensors 2018, 18(10), 3567; https://doi.org/10.3390/s18103567
Submission received: 3 September 2018 / Revised: 17 October 2018 / Accepted: 18 October 2018 / Published: 21 October 2018
(This article belongs to the Section Physical Sensors)

Abstract

A handheld 3D laser scanning system is proposed for measuring large-sized objects on site. The system is mainly composed of two CCD cameras and a line laser projector: the two CCD cameras constitute a binocular stereo vision system that locates the scanner's position online in the fixed workpiece coordinate system, while the left CCD camera and the line laser projector constitute a structured light system that captures the laser lines modulated by the workpiece features. At each moment, the marked points and the laser line are both obtained in the coordinate system of the left camera. To obtain the workpiece outline, the handheld scanner's position is evaluated online by matching the marked points obtained by the binocular stereo vision system with those in the workpiece coordinate system, which are measured beforehand by a TRITOP system; the laser line carrying the workpiece's features at that moment is then transformed into the fixed workpiece coordinate system. Finally, the 3D contour composed of the laser lines is reconstructed in the workpiece coordinate system. To test the system accuracy, a ball arm with two standard balls, placed on a glass plate with many marked points randomly stuck on it, is measured. The distance errors between the two balls are within ±0.05 mm, the radius errors of the two balls are within ±0.04 mm and the distance errors from the scatter points to the fitted sphere are distributed evenly within ±0.25 mm, without accumulated error. Measurement results for two typical workpieces show that the system can measure large-sized objects completely with acceptable accuracy while avoiding deficiencies such as sheltering and a limited measuring range.

1. Introduction

In recent years, 3D contour measurement has been widely applied in many fields, such as heritage conservation, aerospace and automobile manufacturing. The coordinate measuring machine (CMM) and the articulated arm measurement system (AAMS) are the most frequently used 3D measurement devices. However, the CMM can hardly measure a large-sized object on site because it is inconvenient to deploy there. The AAMS is portable and flexible and can measure an object on site, but its measurement range is generally limited to several meters.
3D contour measurement based on optics [1,2,3,4,5,6] is now recognized as one of the most significant technologies, with advantages such as high accuracy, high efficiency and non-contact operation; it includes light coding, moiré topography, the structured light technique, spacetime stereo vision and so forth. Light coding [7] is used to capture the movement features of human joints and figures but offers low accuracy. Moiré topography [8] can complete a measurement quickly, but its ability to determine whether an object is concave or convex is limited. Phase measurement profilometry (PMP) [9] can measure the contour of a moving object in real time but may fail to find the optimum demodulation phase. Fourier transform profilometry (FTP) [10], with high sensitivity, can obtain the surface points of an object from a single image of deformed grating fringes, but its algorithm is very time-consuming. Handheld 3D line laser scanning [11] scans a contour online within several minutes using a structured light system composed of a fixed camera and a handheld cross-line laser projector whose external parameters are obtained by self-calibration, but its accuracy needs further improvement. Spacetime stereo vision [12] achieves fast 3D contour measurement by adding a projector to a binocular stereo vision system to solve the matching problem, but its field of view is small. All the methods mentioned above measure an object from one viewpoint at a time; to measure an object completely, they must scan it from different viewpoints. Since each data patch has its own local coordinate system, data registration is required to assemble the complete surface points in a common coordinate system, with unavoidable registration errors at the junctions of the patches. The TRITOP system [13] can measure the rough geometry of a large-sized object on site by capturing pictures of the workpiece from different viewpoints; the pictures must include scale bars of known length. It is a non-contact measurement system developed by GOM (Gesellschaft für Optische Messtechnik mbH) in Germany, whose primary capability is extracting the 3D coordinates of marked points stuck on the object. Its measurement scale is unlimited, but it cannot acquire the details of the 3D contour of objects. This study builds on the TRITOP system to obtain the detailed 3D contour of objects.
In this paper, a handheld 3D laser scanning system is proposed to extract the 3D contours of large-sized objects on site. The scanner's own measurement range is only several hundred millimeters, and the extracted laser lines lie within this range. To measure a large-sized object, the position and orientation of the scanning system are determined in real time in the common workpiece coordinate system constructed from the known marked points randomly stuck on the workpiece, whose positions are measured beforehand by the TRITOP system. The laser points are then transformed continuously into the workpiece coordinate system. Therefore, the measuring range of the system is unlimited, just as with the TRITOP system, and the measurement accuracy depends largely on the accuracy of the TRITOP system.

2. System Composition and Working Principle

As shown in Figure 1, this measurement system is mainly composed of two CCD cameras, a line laser projector and two sets of lighting devices. The two CCD cameras constitute a binocular stereo vision system, while the left CCD camera and the line laser projector constitute a structured light system. The coordinate system of the left camera is defined as the sensor coordinate system, noted as $o_1x_1y_1z_1$.
At the beginning, marked points are randomly stuck on the surface of the workpiece and then measured by the TRITOP system, whose working principle is shown in Figure 2 [14]. Two scale bars are placed beside the workpiece to be measured; the lengths of the scale bars between the coded marked points at their ends are known. Some coded marked points are placed on the workpiece for the orientation of the 2D images. In the next step, a digital SLR camera with a fixed 24 mm focal length is used to take pictures of the workpiece from different viewpoints; each picture must include at least one complete scale bar and at least 5 coded marked points. These pictures and the parameters of the digital camera are then input to its own software, "TRITOP Professional". Finally, the 3D coordinates of the marked points are obtained with all relative distance errors within 0.2 mm, and a 3D coordinate system is established on the fixed workpiece, defined as the workpiece coordinate system $o_wx_wy_wz_w$.
With the 3D coordinates of the marked points in $o_wx_wy_wz_w$ known, the working principle of scanning the 3D contour of the workpiece is shown in Figure 3. At each position, the system captures two images, shown in Figure 4, each containing several marked points and a laser stripe modulated by the workpiece features. The 3D coordinates of the marked points and the laser stripe are then calculated in $o_1x_1y_1z_1$ according to the binocular stereo vision model and the structured light model, respectively. To obtain the 3D contour of the entire workpiece, the laser stripe in $o_1x_1y_1z_1$ must be transformed into $o_wx_wy_wz_w$. This transformation is solved from the 3D coordinates of the detected marked points in $o_1x_1y_1z_1$ and their corresponding coordinates in $o_wx_wy_wz_w$ measured beforehand by the TRITOP system. After the workpiece has been scanned, the laser points in $o_wx_wy_wz_w$ make up the contour of the workpiece.
Therefore, to obtain the 3D contour of a large-sized workpiece accurately, the binocular stereo vision model and the structured light model must be modeled and calibrated, and the coordinates in $o_wx_wy_wz_w$ corresponding to the marked points in $o_1x_1y_1z_1$ must be identified to compute the transformation between the current $o_1x_1y_1z_1$ and $o_wx_wy_wz_w$.
The organization of this paper is as follows: Section 3 presents the modeling and calibration of the binocular stereo vision system and the structured light system, by which the 3D coordinates of the marked points and laser points are obtained in $o_1x_1y_1z_1$. Section 4 addresses the method for matching up the coordinates of the same marked points in $o_1x_1y_1z_1$ and in $o_wx_wy_wz_w$; using the matched-up coordinates, the transformation from $o_1x_1y_1z_1$ into $o_wx_wy_wz_w$ is obtained and the 3D laser data in $o_1x_1y_1z_1$ can then be transformed into $o_wx_wy_wz_w$. Section 5 describes the experiments and presents the accuracy analysis. The conclusion is given in Section 6.

3. Modeling and Calibration of the Handheld 3D Laser Scanning System

In order to obtain the 3D contours of large-sized objects accurately, the internal and external parameters of both the binocular stereo vision system and the structured light system must be modeled and calibrated.

3.1. Modeling and Calibration of the Binocular Stereo Vision System

As shown in Figure 5, $o_1x_1y_1z_1$ is the left camera coordinate system, also defined as the scanner coordinate system. According to the perspective projection principle, the transformations from the camera coordinate systems to the CCD array planes are given in Equations (1) and (2), respectively.
$$\rho_1 \begin{bmatrix} X_1 \\ Y_1 \\ 1 \end{bmatrix} = \begin{bmatrix} f_1 & 0 & 0 \\ 0 & f_1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_1 \\ y_1 \\ z_1 \end{bmatrix} \tag{1}$$

$$\rho_2 \begin{bmatrix} X_2 \\ Y_2 \\ 1 \end{bmatrix} = \begin{bmatrix} f_2 & 0 & 0 \\ 0 & f_2 & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_2 \\ y_2 \\ z_2 \end{bmatrix} \tag{2}$$

where $\rho_1$ and $\rho_2$ are scale factors.
Taking only radial distortion into consideration, the relationship between the distorted position $P_{id}(X_{id}, Y_{id})$ and the ideal position $P_i(X_i, Y_i)$ in $O_iX_iY_i$ is

$$\begin{cases} X_i = X_{id}\,(1 + k_{i1} q^2 + k_{i2} q^4) \\ Y_i = Y_{id}\,(1 + k_{i1} q^2 + k_{i2} q^4) \end{cases} \tag{3}$$

where $i = 1, 2$ indexes the two cameras, $q^2 = X_{id}^2 + Y_{id}^2 = \left(\frac{u_d - u_{i0}}{N_x}\right)^2 + \left(\frac{v_d - v_{i0}}{N_y}\right)^2$, $(u_{i0}, v_{i0})$ is the principal point of camera $i$, and $k_{i1}$ and $k_{i2}$ are its first-order and second-order distortion coefficients.
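A minimal sketch of this correction in Python, assuming pixel coordinates are first shifted by the principal point and scaled by $N_x$, $N_y$ as defined above (the function and variable names are illustrative, not the authors' implementation):

```python
import numpy as np

def undistort_point(ud, vd, u0, v0, Nx, Ny, k1, k2):
    """Map a distorted pixel (ud, vd) to the ideal position per Equation (3)."""
    Xd = (ud - u0) / Nx            # distorted image-plane coordinates
    Yd = (vd - v0) / Ny
    q2 = Xd ** 2 + Yd ** 2         # squared radial distance
    scale = 1.0 + k1 * q2 + k2 * q2 ** 2
    return Xd * scale, Yd * scale  # ideal coordinates (Xi, Yi)
```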
The transformation from $o_1x_1y_1z_1$ to $o_2x_2y_2z_2$ is

$$P_2 = M_1 \tilde{P}_1 = [\,R_1 \;\; T_1\,]\,\tilde{P}_1, \qquad R_1 = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{14} & r_{15} & r_{16} \\ r_{17} & r_{18} & r_{19} \end{bmatrix}, \quad T_1 = \begin{bmatrix} t_{1x} \\ t_{1y} \\ t_{1z} \end{bmatrix} \tag{4}$$

where $P_1 = [x_1\ y_1\ z_1]^T$ and $P_2 = [x_2\ y_2\ z_2]^T$ are the coordinates of one 3D point in $o_1x_1y_1z_1$ and $o_2x_2y_2z_2$ respectively, $\tilde{P}_1 = [x_1\ y_1\ z_1\ 1]^T$ is the homogeneous form of $P_1$, $R_1$ is the 3 × 3 rotation matrix from $o_1x_1y_1z_1$ to $o_2x_2y_2z_2$ and $T_1$ is the translation vector.
From Equations (1), (2) and (4), the transformation from $O_1X_1Y_1$ to $O_2X_2Y_2$ is derived as

$$\rho_2 \begin{bmatrix} X_2 \\ Y_2 \\ 1 \end{bmatrix} = \begin{bmatrix} f_2 r_{11} & f_2 r_{12} & f_2 r_{13} & f_2 t_{1x} \\ f_2 r_{14} & f_2 r_{15} & f_2 r_{16} & f_2 t_{1y} \\ r_{17} & r_{18} & r_{19} & t_{1z} \end{bmatrix} \begin{bmatrix} z_1 X_1 / f_1 \\ z_1 Y_1 / f_1 \\ z_1 \\ 1 \end{bmatrix} \tag{5}$$
The 3D coordinates of a point in $o_1x_1y_1z_1$ can then be calculated from Equation (6), once the unknown internal and external parameters in Equations (1)–(4) have been calibrated.

$$\begin{cases} x_1 = z_1 X_1 / f_1 \\[2pt] y_1 = z_1 Y_1 / f_1 \\[2pt] z_1 = \dfrac{f_1\,(f_2 t_{1y} - Y_2 t_{1z})}{Y_2\,(r_{17} X_1 + r_{18} Y_1 + f_1 r_{19}) - f_2\,(r_{14} X_1 + r_{15} Y_1 + f_1 r_{16})} \end{cases} \tag{6}$$
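A minimal sketch of Equation (6), assuming $R_1$ and $T_1$ are the calibrated extrinsics of Equation (4) as NumPy arrays (the function and argument names are illustrative):

```python
import numpy as np

def triangulate(X1, Y1, Y2, f1, f2, R1, T1):
    """Recover a 3D point in o1x1y1z1 from undistorted image-plane
    coordinates (X1, Y1) in the left camera and Y2 in the right camera."""
    r14, r15, r16 = R1[1]          # second row of R1 in Equation (4)
    r17, r18, r19 = R1[2]          # third row of R1
    t1y, t1z = T1[1], T1[2]
    num = f1 * (f2 * t1y - Y2 * t1z)
    den = (Y2 * (r17 * X1 + r18 * Y1 + f1 * r19)
           - f2 * (r14 * X1 + r15 * Y1 + f1 * r16))
    z1 = num / den
    return np.array([z1 * X1 / f1, z1 * Y1 / f1, z1])
```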
The binocular stereo vision system is calibrated using the calibration target shown in Figure 6. To make the calibration results accurate, the target points should fill the whole field of view and the postures of the target in the stereo system should be fully varied. A five-pose calibration method is therefore adopted to collect the calibration points [15,16]: they are extracted from 15 image pairs, with 3 pairs captured at each of 5 different postures.
The 2D coordinates of the marked points are extracted from these 15 pairs of images with sub-pixel precision [17]. The unknown parameters in Equations (1)–(4), including $R_1$, $T_1$, $f_1$, $f_2$, $u_{10}$, $v_{10}$, $u_{20}$, $v_{20}$, $k_{11}$, $k_{12}$, $k_{21}$ and $k_{22}$, are then calibrated with the binocular stereo calibration function of OpenCV.
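A hedged sketch of this calibration step, assuming the standard OpenCV idiom of calibrating each camera individually and then fixing the intrinsics while solving the stereo extrinsics; the flags and array layouts are illustrative choices, not necessarily the authors' exact configuration:

```python
import cv2

def calibrate_stereo(obj_pts, img_pts_l, img_pts_r, image_size=(640, 480)):
    """obj_pts: per-view (N, 3) float32 target coordinates;
    img_pts_l / img_pts_r: per-view (N, 2) float32 sub-pixel image points."""
    # Per-camera intrinsics: focal length, principal point and the two
    # radial terms of Equation (3) only.
    flags = cv2.CALIB_FIX_K3 | cv2.CALIB_ZERO_TANGENT_DIST
    _, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, img_pts_l, image_size,
                                          None, None, flags=flags)
    _, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, img_pts_r, image_size,
                                          None, None, flags=flags)
    # Stereo extrinsics R1, T1 of Equation (4), intrinsics held fixed.
    _, K1, d1, K2, d2, R1, T1, E, F = cv2.stereoCalibrate(
        obj_pts, img_pts_l, img_pts_r, K1, d1, K2, d2, image_size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return K1, d1, K2, d2, R1, T1
```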

3.2. Modeling and Calibration of the Structured Light System

The principle of the structured light system is based on the triangulation method shown in Figure 7. The relative position between the left camera and the laser plane is optimally designed to achieve a satisfying working depth and measurement accuracy. The model of the structured light system is the transformation from the 2D image plane to the 2D laser plane, in which a 2D coordinate system $o_Lx_Ly_L$ is established; this is a one-to-one mapping and can be written as
$$\rho_3 \tilde{P} = M_2 \tilde{P}_L, \qquad M_2 = \begin{bmatrix} a_1 & a_2 & a_3 \\ a_4 & a_5 & a_6 \\ a_7 & a_8 & 1 \end{bmatrix} \tag{7}$$

where $\tilde{P} = [u_1\ v_1\ 1]^T$ and $\tilde{P}_L = [x_L\ y_L\ 1]^T$ are the homogeneous coordinates of one calibration point in $ou_1v_1$ and in $o_Lx_Ly_L$ respectively, and $a_1 \sim a_8$ are defined as the intrinsic parameters.
In Equation (7), $(u_1, v_1)$ is a coordinate on the image plane after the lens distortion is corrected using the distortion coefficients calibrated in Section 3.1. Once the intrinsic parameters $a_1 \sim a_8$ are calibrated, the 2D coordinate $(x_L, y_L)$ of a point on a laser stripe can be obtained from $(u_1, v_1)$ on the image plane.
To achieve 3D measurement, the 2D laser points in $o_Lx_Ly_L$ must be transformed into $o_1x_1y_1z_1$. This transformation is the extrinsic model of the structured light system and is written as

$$P_1 = M_3 \tilde{P}_L = [\,R_3 \;\; T_3\,]\,\tilde{P}_L, \qquad R_3 = \begin{bmatrix} r_{31} & r_{34} \\ r_{32} & r_{35} \\ r_{33} & r_{36} \end{bmatrix}, \quad T_3 = \begin{bmatrix} t_{3x} \\ t_{3y} \\ t_{3z} \end{bmatrix} \tag{8}$$

where $P_1 = [x_1\ y_1\ z_1]^T$ is the coordinate of a 3D calibration point in $o_1x_1y_1z_1$ and $\tilde{P}_L = [x_L\ y_L\ 1]^T$ is the homogeneous coordinate of a 2D calibration point in $o_Lx_Ly_L$. $R_3$ is a 3 × 2 matrix whose columns are the unit direction vectors of the $x_L$ and $y_L$ axes in $o_1x_1y_1z_1$, and $T_3$ is the position of $o_L$ in $o_1x_1y_1z_1$. These are the extrinsic parameters to be calibrated.
To solve the intrinsic and extrinsic parameters, calibration points in the laser plane must be collected. During this process, the LED lighting devices are switched off so that the laser points can be extracted accurately. Firstly, the laser plane is projected onto a glass plate painted with white matt paint, and the laser stripe on the plate is captured by both cameras. Secondly, the center of the laser stripe in $ou_1v_1$ is extracted using the gray-weight centroid method [18,19,20]. Thirdly, each point on the laser stripe in the left image is matched with its corresponding point in the right image according to the epipolar geometry constraint, and the 3D coordinates of the laser points in $o_1x_1y_1z_1$ are calculated from Equation (6).
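A minimal sketch of the gray-weight (intensity-weighted) centroid extraction, assuming a roughly vertical stripe so that one sub-pixel center is computed per image row; the threshold and orientation are assumptions, not the authors' exact settings:

```python
import numpy as np

def stripe_centers(gray, threshold=60):
    """Return (row, col) sub-pixel laser stripe centers from a grayscale image."""
    centers = []
    cols = np.arange(gray.shape[1], dtype=np.float64)
    for v, row in enumerate(gray.astype(np.float64)):
        w = np.where(row >= threshold, row, 0.0)   # suppress background pixels
        if w.sum() > 0:
            u = (w * cols).sum() / w.sum()         # intensity-weighted mean
            centers.append((v, u))
    return np.array(centers)
```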
Following the process described above, several laser stripes are captured at different positions so that the laser points are distributed over different regions of the laser plane, as shown in Figure 8. Since the laser stripe projected on the glass plate is a line, the 3D points on it are collinear in $o_1x_1y_1z_1$. In Figure 8, points $1, 2, 3, \ldots, n$ are collinear; they are used to fit a line whose direction, from point 1 to point $n$, is taken as the direction of one axis in the laser plane, defined as $x_L$, with direction vector $[r_{31}\ r_{32}\ r_{33}]^T$ in $o_1x_1y_1z_1$. Subsequently, all of the collected points in $o_1x_1y_1z_1$ are used to fit a plane, whose normal direction is computed as $n$. From $[r_{31}\ r_{32}\ r_{33}]^T$ and $n$, the axis perpendicular to $[r_{31}\ r_{32}\ r_{33}]^T$ within the laser plane is determined according to the right-hand rule; it is defined as $y_L$, with direction vector $[r_{34}\ r_{35}\ r_{36}]^T$. Letting $y_L$ pass through point 1 places the origin of the 2D coordinate frame at point 1, whose coordinate is $(t_{3x}, t_{3y}, t_{3z})^T$.
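A hedged sketch of this frame construction with NumPy; the SVD-based line and plane fits are common choices, not necessarily the authors' exact fitting method:

```python
import numpy as np

def build_laser_frame(line_pts, pts):
    """line_pts: (n, 3) collinear points 1..n of one stripe in o1x1y1z1;
    pts: (N, 3) all collected laser points in o1x1y1z1."""
    # xL: principal direction of one fitted stripe, oriented from point 1 to n
    c = line_pts.mean(axis=0)
    _, _, Vt = np.linalg.svd(line_pts - c)
    x_axis = Vt[0]
    if np.dot(line_pts[-1] - line_pts[0], x_axis) < 0:
        x_axis = -x_axis
    # n: normal of the plane fitted through all collected laser points
    _, _, Vt = np.linalg.svd(pts - pts.mean(axis=0))
    normal = Vt[-1]
    # yL: perpendicular to xL within the laser plane (right-hand rule)
    y_axis = np.cross(normal, x_axis)
    y_axis /= np.linalg.norm(y_axis)
    R3 = np.column_stack([x_axis, y_axis])  # columns [r31 r32 r33], [r34 r35 r36]
    T3 = line_pts[0]                        # origin oL placed at point 1
    return R3, T3
```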
As a result, a 2D coordinate frame $o_Lx_Ly_L$ is established in the laser plane, and the coordinate of $o_L$ and the direction vectors of $x_L$ and $y_L$ in $o_1x_1y_1z_1$ are solved simultaneously while establishing it; the extrinsic parameters in Equation (8) are thereby obtained. To calibrate the intrinsic parameters, the 2D calibration points in the laser plane are established by transforming the 3D calibration points in $o_1x_1y_1z_1$ into $o_Lx_Ly_L$. Deriving from Equation (8), we have
$$\tilde{P}_L = M_3^{-1} P_1 \tag{9}$$
With all the calibration points in $o_Lx_Ly_L$ and their corresponding points in $ou_1v_1$, the intrinsic parameters $a_1 \sim a_8$ in Equation (7) are solved.
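Since Equation (7) is a plane-to-plane homography, the intrinsic parameters can be estimated with a standard homography solver; below is a sketch under that assumption, using cv2.findHomography as one common choice (the paper does not specify its solver):

```python
import numpy as np
import cv2

def solve_intrinsics(pts_laser_2d, pts_image_2d):
    """pts_laser_2d, pts_image_2d: (N, 2) float arrays of corresponding
    calibration points in oLxLyL and ou1v1."""
    M2, _ = cv2.findHomography(pts_laser_2d.astype(np.float64),
                               pts_image_2d.astype(np.float64))
    M2 /= M2[2, 2]       # normalize so the last entry is 1, as in Eq. (7)
    return M2            # a1..a8 are the entries of M2 in row-major order

def image_to_laser_plane(u1, v1, M2):
    """Map an undistorted stripe pixel back to the laser plane (xL, yL);
    Equation (8) then lifts the result into o1x1y1z1."""
    p = np.linalg.inv(M2) @ np.array([u1, v1, 1.0])
    return p[:2] / p[2]
```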

4. Match-Up Method of the Marked Points in $o_1x_1y_1z_1$ and $o_wx_wy_wz_w$

4.1. Matching Up the Marked Points

The objective of matching up the marked points is to identify the coordinates of the same marked point in $o_1x_1y_1z_1$ and in $o_wx_wy_wz_w$. In this study, this is achieved with a distance constraint algorithm.
The position of each marked point in $o_wx_wy_wz_w$ obtained by TRITOP carries a sequence number. To match up the marked points between $o_1x_1y_1z_1$ and $o_wx_wy_wz_w$, the known marked points in $o_wx_wy_wz_w$ are organized according to the relative distances between them.
Firstly, the distances $d_{ij}$ between any two marked points are calculated in $o_wx_wy_wz_w$. Considering the small working range of the binocular stereo vision system, the maximum distance between two marked points visible at once in $o_1x_1y_1z_1$ is within $d_{\max} \le 300$ mm. All distances between marked points in $o_wx_wy_wz_w$ meeting this condition, $d_{ij} \le 300$ mm, are therefore sorted in ascending order and stored in a library (noted as $L_w = \{P_i, d_{ij}, P_j\}$, $j \neq i$), in which each distance occupies one node (the original figure of the library is omitted here): $P_i, P_j, P_r, P_t, P_a, P_c$ are marked points with sequence numbers $i, j, r, t, a, c$ in $o_wx_wy_wz_w$ respectively, $d_{ij}, d_{rt}, d_{ac}$ are the distances between them, and $d_{ij} \le d_{rt} \le \cdots \le d_{ac}$.
Then, for each marked point $P_i$, a workpiece sub-library (noted as $L_{wi}$, $i = \{1, 2, \ldots, n\}$, where $n$ is the number of marked points) is constructed from $P_i$ and its neighbors (the points connected with $P_i$ meeting $d_{ij} \le 300$ mm), also sorted ascendingly by distance. In one workpiece sub-library $L_{wi}$ (figure omitted), point $P_i$ is connected with points $P_j, P_k, \ldots, P_m$ and $d_{ij} \le d_{ik} \le \cdots \le d_{im} \le 300$ mm.
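A hedged Python sketch of the library $L_w$ and the sub-libraries $L_{wi}$; the dictionary layout is an assumption, since the paper specifies only the sorted contents:

```python
from itertools import combinations
import numpy as np

def build_libraries(points_w, d_max=300.0):
    """points_w: dict {sequence_number: (x, y, z) in owxwywzw}."""
    Lw = []                             # global library: (dij, i, j), sorted
    Lwi = {i: [] for i in points_w}     # one sub-library per marked point
    for i, j in combinations(points_w, 2):
        d = np.linalg.norm(np.subtract(points_w[i], points_w[j]))
        if d <= d_max:                  # keep only pairs within the work range
            Lw.append((d, i, j))
            Lwi[i].append((d, j))
            Lwi[j].append((d, i))
    Lw.sort()
    for i in Lwi:
        Lwi[i].sort()                   # ascending distances, as in the text
    return Lw, Lwi
```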
The handheld scanner then begins scanning the workpiece's contour. Suppose $n_{t_j}$ marked points are obtained in $o_1x_1y_1z_1$ at moment $t_j$ and the distances between every two of them are calculated. Similar to $L_{wi}$, scanner sub-libraries (noted as $L_{sj}$, $j = \{1, 2, \ldots, n_{t_j}\}$) are created for each of these marked points, and together the marked points make up a web, noted as $W_{t_j}$ (see Figure 9). Since all the marked points are fixed on the workpiece, the same web must exist in $o_wx_wy_wz_w$, as shown in Figure 10. Because the marked points are randomly stuck on the workpiece, there is one and only one web $W_i$ in $o_wx_wy_wz_w$ identical to the web $W_{t_j}$ in Figure 9.
To find the web $W_i$ in $o_wx_wy_wz_w$, the distance constraint algorithm is explained below, taking the web in Figure 9 as an example (a code sketch follows the steps):
(1) Choose one marked point, such as $P_A$ in Figure 9, together with its scanner sub-library $L_{sA}$ in $o_1x_1y_1z_1$ (the original figure of $L_{sA}$ is omitted here);
(2) find the distances in library $L_w$ meeting the condition $|d_5 - d_{ij}| \le \varepsilon$ and record the sequence numbers of the corresponding marked points in a list. Assuming $|d_5 - d_{ij}| \le \varepsilon$ and $|d_5 - d_{rt}| \le \varepsilon$, the candidate list $l_c = \{i, j, r, t\}$ is generated;
(3) check the workpiece sub-libraries $\{L_{wi}, L_{wj}, L_{wr}, L_{wt}\}$ successively according to the list $l_c$ to determine whether, for every distance $d_k$ in $L_{sA}$ ($k = 1, 6, 7, \ldots$), there exists a distance $d_{\ell m}$ ($\ell = i, j, r, t$; $m$ indexes one distance in the workpiece sub-library $L_{w\ell}$) meeting $|d_k - d_{\ell m}| \le \varepsilon$. For instance, if the distances $d_{jm}$ in $L_{wj}$ satisfy all these conditions, the point $P_j$ is taken to be the same point as $P_A$ in $o_wx_wy_wz_w$;
(4) repeat steps (1)–(3) to find the points corresponding to $P_B$, $P_C$, $P_D$ and $P_E$; the marked points are then fully matched up.
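A minimal sketch of steps (1)–(3) for one scanned point, reusing build_libraries() from above; the tolerance eps and the early exit over the sorted library are illustrative choices:

```python
def match_point(scanner_dists, Lw, Lwi, eps=0.1):
    """Identify one scanned point: scanner_dists holds its ascending
    neighbor distances in o1x1y1z1 (its scanner sub-library)."""
    d0 = scanner_dists[0]
    # Step (2): collect candidate sequence numbers from the global library.
    candidates = set()
    for d, i, j in Lw:
        if d - d0 > eps:
            break                        # Lw is sorted ascendingly
        if abs(d - d0) <= eps:
            candidates.update((i, j))
    # Step (3): a candidate must explain every measured neighbor distance.
    for c in sorted(candidates):
        lib = [d for d, _ in Lwi[c]]
        if all(any(abs(ds - d) <= eps for d in lib) for ds in scanner_dists):
            return c                     # matched sequence number in owxwywzw
    return None                          # no unique match: skip this point
```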

4.2. Obtaining the Contour of the Workpiece in $o_wx_wy_wz_w$

To obtain the whole contour of a workpiece, the laser lines must be transformed from $o_1x_1y_1z_1$ into $o_wx_wy_wz_w$. To do this, the transformation between the two frames (see Equation (10)) is first computed from the matched-up marked points of Section 4.1.
$$P_w = M_4 \tilde{P}_1 = [\,R_4 \;\; T_4\,]\,\tilde{P}_1, \qquad R_4 = \begin{bmatrix} r_{41} & r_{42} & r_{43} \\ r_{44} & r_{45} & r_{46} \\ r_{47} & r_{48} & r_{49} \end{bmatrix}, \quad T_4 = \begin{bmatrix} t_{4x} \\ t_{4y} \\ t_{4z} \end{bmatrix} \tag{10}$$

where $R_4$ is the 3 × 3 rotation matrix from $o_1x_1y_1z_1$ to $o_wx_wy_wz_w$ and $T_4$ is the translation vector. $P_w = [x_w\ y_w\ z_w]^T$ and $P_1 = [x_1\ y_1\ z_1]^T$ are the 3D coordinates of the same point in $o_wx_wy_wz_w$ and $o_1x_1y_1z_1$ respectively, and $\tilde{P}_1$ is the homogeneous form of $P_1$.
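The paper does not name its solver for $R_4$ and $T_4$; one common choice, sketched below under that assumption, is the SVD-based absolute orientation (Kabsch) method applied to the matched point pairs:

```python
import numpy as np

def rigid_transform(P1, Pw):
    """P1, Pw: (N, 3) matched coordinates in o1x1y1z1 and owxwywzw, N >= 3."""
    c1, cw = P1.mean(axis=0), Pw.mean(axis=0)
    H = (P1 - c1).T @ (Pw - cw)          # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R4 = Vt.T @ D @ U.T                  # proper rotation (det = +1)
    T4 = cw - R4 @ c1
    return R4, T4

# Laser points then transform as P_w = R4 @ P_1 + T4.
```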
The laser lines obtained from the structured light system can then be transformed into the fixed $o_wx_wy_wz_w$. When the handheld scanner finishes the scanning process, all the laser lines modulated by the workpiece's features have been transformed into $o_wx_wy_wz_w$, yielding the final result.

5. Experiments and Results

5.1. System Hardware and Structure

The system is shown in Figure 1b. The two cameras are made by Point Grey (Canada), model FL3-FW-03S3M; the resolution of the CCD array is 640 × 480. The frame rate of the two cameras is set to 60 frames/s, the shutter time to 8 ms and the gain to 6 dB. The lenses are made by Computar, with an 8 mm fixed focal length for 1/2″ format sensors. To obtain clear laser lines, a small aperture is adopted. Moreover, to achieve an appropriate measuring range, the distance $L$ from the intersection point $N$ of the two optical axes to the line between the optical centers of the two cameras is designed as $L = 300$ mm and the angle between the two optical axes as 38.6°, considering two factors: (1) at least 5 common marked points should be captured synchronously by the two cameras for the binocular stereo matching; (2) the depth of field should be kept within a suitable range ($280\ \mathrm{mm} \le L_2 \le 350\ \mathrm{mm}$ in this study) to obtain satisfying accuracy. As shown in Figure 11, the gray area is the effective view field.

5.2. System Accuracy Test

To test the accuracy of the system, the device in Figure 12 is adopted: a glass plate painted with white matt paint and a ball arm with two standard spheres. The glass plate measures 400 mm × 500 mm; 69 marked points were randomly stuck on it and their 3D coordinates in $o_wx_wy_wz_w$ were measured accurately beforehand using the TRITOP system. The radius of sphere 1 is 20.030 mm, the radius of sphere 2 is 20.057 mm and the distance between their centers is 198.513 mm.
Since the ball arm rests motionlessly on the glass plate, the two can be regarded as one object, so the system can measure the glass plate and the ball arm simultaneously. As a result, the systematic error and the random error of the system can be estimated from the surface points obtained on the two spheres.

5.2.1. Systematic Error Test

The ball arm was placed at ten different positions with different orientations on the glass plate. It was measured at each position and the collected laser points were used to fit spheres. The fitted radii of sphere 1 and sphere 2 and the distances between them are listed in Table 1. The errors between the standard radii and the measured radii are shown in Figure 13, and the errors between the standard distance and the measured distances in Figure 14.
To benchmark the accuracy of the system, an AAMS [21] (see Figure 15) was used to measure the ball arm ten times at ten different positions and orientations, similar to our system, on one plane of its working range (700 mm × 500 mm × 400 mm); the radius errors of the two spheres and their distance errors are shown in Figure 16 and Figure 17, respectively.
Figure 13 and Figure 14 show that, with our system, the radius errors and the distance errors are within ±0.04 mm and ±0.05 mm respectively, while Figure 16 and Figure 17 show that, with the AAMS, the radius errors of the two spheres are within ±0.07 mm but the distance errors fluctuate considerably with the ball arm's position, up to about ±0.3 mm. The results indicate the high accuracy of our system.

5.2.2. Random Error Test

To test the random error of our system, the distance errors from the scatter points to the fitted sphere surface are examined. Fitting a sphere to the surface points of sphere 1 obtained in Section 5.2.1 yields the distribution of distances from the surface points to the fitted sphere shown in Figure 18. Table 2 and Table 3 list the maximal distance errors from the surface points to the fitted spheres over the ten measurements with our system and with the AAMS, respectively.
All the distance errors from the surface points to the fitted sphere are within ±0.25 mm with our system, while the maximum distance error of the AAMS is only 0.151 mm. The reason is that the scanning path of the arm robot can be preset, so it scans the spheres in an orderly manner and collects only one layer of laser points. In both systems, the positive and negative errors are distributed approximately symmetrically.

5.3. Working Efficiency Test

With the marked points on the workpiece already measured by the TRITOP system, the process of scanning one workpiece consists of capturing images, extracting the centers of the marked points and of the laser stripe, matching up the corresponding 3D coordinates of the marked points in $o_1x_1y_1z_1$ and $o_wx_wy_wz_w$, establishing the transformation from $o_1x_1y_1z_1$ to $o_wx_wy_wz_w$ and transforming the laser stripe into $o_wx_wy_wz_w$. The frame rate of the two cameras is 60 frames/s and the shutter time is 8 ms, so the time to capture one image is 24.7 ms. A timing test shows that extracting the centers of the marked points and the laser stripe in both images takes about 17.5 ms. Therefore, the total time for obtaining one laser stripe in $o_wx_wy_wz_w$ is about 42.2 ms; in other words, about 23 laser lines can be obtained per second. If all 640 points on a line are sampled, the maximum number of laser points obtained per second is 14,720.
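The timing budget can be checked as follows; the 24.7 ms per image presumably combines the 16.7 ms frame interval at 60 frames/s with the 8 ms shutter time:

$$t_{\text{capture}} \approx \tfrac{1}{60}\,\mathrm{s} + 8\,\mathrm{ms} \approx 16.7\,\mathrm{ms} + 8\,\mathrm{ms} = 24.7\,\mathrm{ms}$$
$$t_{\text{stripe}} = 24.7\,\mathrm{ms} + 17.5\,\mathrm{ms} = 42.2\,\mathrm{ms}, \qquad \frac{1\,\mathrm{s}}{42.2\,\mathrm{ms}} \approx 23.7\ \text{lines/s}$$
$$23\ \text{lines/s} \times 640\ \text{points/line} = 14{,}720\ \text{points/s}$$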

5.4. Application

Two workpieces, shown in Figure 19a and Figure 20a, with sizes of 1100 mm × 500 mm × 200 mm and 600 mm × 420 mm × 190 mm respectively, are measured to test the performance of the system.
For the workpiece in Figure 19, to ensure that at least five points are obtained in $o_1x_1y_1z_1$, 221 marked points are randomly stuck on the workpiece, as shown in Figure 19. Since sticking one marked point on the workpiece takes about 1 s, the time consumed by this step is within 4 min. The 3D coordinates of the points in $o_wx_wy_wz_w$ are measured by the TRITOP system, which takes about 5 min, as shown in Figure 19b. To guarantee high accuracy, the laser points obtained in $o_1x_1y_1z_1$ are transformed into $o_wx_wy_wz_w$ only when more than five marked points are matched up between $o_1x_1y_1z_1$ and $o_wx_wy_wz_w$. About 15 min is spent scanning this workpiece, collecting 4,284,509 laser points. Figure 19c presents the laser points reduced by evenly sampling one point out of every three. Figure 19d depicts the surface reconstructed from Figure 19c with the software Imageware.
For the workpiece in Figure 20a, 86 marked points are randomly stuck on the workpiece. The contour measured by this scanning system is shown in Figure 20b,c. Scanning this workpiece takes about 12 min in order to capture as much information as possible, especially in the edge areas and the regions with large curvature.
It can be seen from Figure 19b,c that the TRITOP system can only obtain the marked points, while the scanning system in this study obtains laser points covering the whole workpiece. From Figure 19c,d, we can also find that there are no laser points near the edge of the workpiece and in the regions with large curvature. More time is spent obtaining the laser points in the edge areas and high-curvature regions of the workpiece in Figure 20b. This is influenced by the number of marked points, light noise, the posture of the scanning system and so forth. In particular, the number of marked points captured simultaneously by the two cameras in these regions is usually smaller than elsewhere, and the number of correctly matched marked points often fails to reach five.

5.5. Discussion

The proposed handheld 3D laser scanning system can obtain the whole contours of typical large-sized workpieces with many features on site, with acceptable accuracy and time expenditure. The system's valid depth of field is $280\ \mathrm{mm} \le L_2 \le 350\ \mathrm{mm}$ and its valid view field is about 300 mm × 300 mm. To obtain the contours, the TRITOP system is needed to measure the marked points stuck over the whole workpiece. To achieve acceptable accuracy, usually 8–10 marked points (at least 5) should be captured synchronously by both cameras in the common 300 mm × 300 mm view field.
The accuracy of the system is tested by evaluating the radii of the spheres and the distance between them, with errors within ±0.05 mm; the point cloud thickness is mainly within ±0.25 mm. The errors are distributed evenly, anchored to the marked points measured by the TRITOP system, without accumulated error. The accuracy depends on the coordinates of the marked points measured by the TRITOP system, the internal and external parameters of the scanning system and the transformation between $o_1x_1y_1z_1$ and $o_wx_wy_wz_w$ at each moment.
The performance of the system is verified by scanning a large-sized workpiece (1100 mm × 500 mm × 200 mm) and a medium-sized workpiece (600 mm × 420 mm × 190 mm) with complex features. The time consumption includes three parts: sticking the marked points on the workpiece, measuring the coordinates of the marked points with the TRITOP system and scanning the contour with this system, all of which depend on the size of the workpiece. The contours of the workpieces in Figure 19 and Figure 20 were reconstructed in about 25 min and 20 min, respectively.
However, some defects remain to be improved. The edges of the workpiece and the regions with large curvature are difficult to capture. The main reason is that the marked points in $o_1x_1y_1z_1$ obtained by the binocular stereo vision system in these regions are difficult to detect, which causes the transformation between $o_1x_1y_1z_1$ and $o_wx_wy_wz_w$ to fail.

6. Conclusions

This paper presents a mobile 3D scanning system based on known marked points measured beforehand by the TRITOP system. Compared with existing methods: (1) it can measure the 3D contour of large-sized workpieces with complex features on site, overcoming problems of current 3D scanning methods such as range limitation and sheltering; (2) the system is easy to use and places low demands on operators, and the scanning process can be paused and resumed to check the collected laser points; (3) its errors are distributed evenly.
The accuracy of the system is tested by measuring a ball arm with two standard spheres. The ball arm is placed on a glass plate, on which many marked points are randomly stuck and measured by a TRITOP system. The distance errors between the two sphere centers are within ±0.05 mm, the radius errors of the two spheres are within ±0.04 mm and the distance errors from the surface points to the fitted sphere are within ±0.25 mm. The experimental results demonstrate that the system achieves high accuracy and stability and can satisfy the accuracy demands of measuring the 3D contours of large-sized workpieces on site.
The measurement results for two workpieces with complex structure also reveal the difficulty of collecting data points near the edges of the workpiece and in the regions with large curvature, because the number of marked points correctly matched between $o_1x_1y_1z_1$ and $o_wx_wy_wz_w$ in these regions is fewer than five. To increase the number of matches, it is necessary to increase the density of the marked points on the object or enlarge the working range of the system.

Author Contributions

Conceptualization, Z.X. and X.W.; Methodology, X.W. and Z.X.; Software, X.W.; Validation, Z.X., K.W., and L.Z.; Formal Analysis, K.W.; Investigation, X.W.; Resources, Z.X.; Data Curation, X.W. and K.W.; Writing-Original Draft Preparation, X.W.; Writing-Review & Editing, Z.X., K.W., L.Z.; Visualization, X.W. and K.W.; Supervision, Z.X.; Project Administration, Z.X., L.Z.; Funding Acquisition, Z.X., K.W.

Funding

This research was funded by the National Natural Science Foundation of China, grant numbers 61571478, 61601428, 51709245 and 51509229, and the Doctoral Fund of the Ministry of Education of China, grant number 20110132110010.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Amans, O.C.; Beiping, W.; Ziggah, Y.Y.; Daniel, A.O. The need for 3D laser scanning documentation for select Nigeria cultural heritage sites. Eur. Sci. J. 2013, 9, 75–91. [Google Scholar]
  2. Price, G.J.; Parkhurst, J.M.; Sharrock, P.J.; Moore, C.J. Real-time optical measurement of the dynamic body surface for use in guided radiotherapy. Phys. Med. Biol. 2012, 57, 415–436. [Google Scholar] [CrossRef] [PubMed]
  3. Stevanovic, N.; Markovic, V.M.; Nikezic, D. New method for determination of diffraction light pattern of the arbitrary surface. Opt. Laser Technol. 2017, 90, 90–95. [Google Scholar] [CrossRef]
  4. Ke, F.; Xie, J.; Chen, Y.; Zhang, D.; Chen, B. A fast and accurate calibration method for the structured light system based on trapezoidal phase-shifting pattern. Optik 2014, 125, 5249–5253. [Google Scholar] [CrossRef]
  5. Suresh, V.; Holton, J.; Li, B. Structured light system calibration with unidirectional fringe patterns. Opt. Laser Eng. 2018, 106, 86–93. [Google Scholar] [CrossRef]
  6. Cuesta, E.; Suarez-Mendez, J.M.; Martinez-Pellitero, S.; Barreiro, J.; Zapico, P. Metrological evaluation of Structured Light 3D scanning system with an optical feature-based gauge. Procedia Manuf. 2017, 13, 526–533. [Google Scholar] [CrossRef]
  7. Ganganath, N.; Leung, H. Mobile robot localization using odometry and kinect sensor. In Proceedings of the IEEE Conference Emerging Signal Processing Applications, Las Vegas, NV, USA, 12–14 January 2012; pp. 91–94. [Google Scholar]
  8. Tang, Y.; Yao, J.; Zhou, Y.; Sun, C.; Yang, P.; Miao, H.; Chen, J. Calibration of an arbitrarily arranged projection moiré system for 3D shape measurement. Opt. Laser Eng. 2018, 104, 135–140. [Google Scholar] [CrossRef]
  9. Zhong, M.; Chen, W.; Su, X.; Zheng, Y.; Shen, Q. Optical 3D shape measurement profilometry based on 2D S-Transform filtering method. Opt. Commun. 2013, 300, 129–136. [Google Scholar] [CrossRef]
  10. Zhang, Z.; Jing, Z.; Wang, Z.; Kuang, D. Comparison of Fourier transform, windowed Fourier transform, and wavelet transform methods for phase calculation at discontinuities in fringe projection profilometry. Opt. Lasers Eng. 2012, 50, 1152–1160. [Google Scholar] [CrossRef]
  11. Bleier, M.; Nüchter, A. Towards robust self-calibration for handheld 3D line laser scanning. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 31–36. [Google Scholar] [CrossRef]
  12. Zhang, S. Handbook of 3D Machine Vision: Optical Metrology and Imaging; CRC Press: Boca Raton, FL, USA, 2013; pp. 57–70. [Google Scholar]
  13. Koutecký, T.; Paloušek, D.; Brandejs, J. Method of photogrammetric measurement automation using TRITOP system and industrial robot. Optik Int. J. Light Electron Opt. 2013, 124, 3705–3709. [Google Scholar] [CrossRef]
  14. Gmurczyk, G.; Reymer, P.; Kurdelski, M. Global FEM Model of combat helicopter. J. KONES Powertrain Transp. 2011, 18, 137–144. [Google Scholar]
  15. Xu, H.; Ren, N. Working Principle and System Calibration of ATOS Optical Scanner. Tool Eng. 2006, 40, 81–84. [Google Scholar]
  16. Xie, Z.; Lu, W.; Wang, X.; Liu, J. Analysis of Pose Selection on Binocular Stereo Calibration. Chin. J. Lasers 2015, 42, 237–244. [Google Scholar]
  17. Chen, S.; Xia, R.; Zhao, J.; Chen, Y.; Hu, M. A hybrid method for ellipse detection in industrial images. Pattern Recognit. 2017, 68, 82–98. [Google Scholar] [CrossRef]
  18. Sun, Q.; Liu, R.; Yu, F. An extraction method of laser stripe centre based on Legendre moment. Optik 2016, 127, 912–915. [Google Scholar] [CrossRef]
  19. Tian, Q.; Zhang, X.; Ma, Q.; Ge, B. Utilizing polygon segmentation technique to extract and optimize light stripe centerline in line-structured laser 3D scanner. Pattern Recognit. 2016, 55, 100–113. [Google Scholar]
  20. Sun, Q.; Chen, J.; Li, C. A robust method to extract a laser stripe centre based on grey level moment. Opt. Laser Eng. 2015, 67, 122–127. [Google Scholar] [CrossRef]
  21. Mu, N.; Wang, K.; Xie, Z.; Ren, P. Calibration of a flexible measurement system based on industrial articulated robot and structured light sensor. Opt. Eng. 2017, 56, 054103. [Google Scholar] [CrossRef]
Figure 1. System composition and structure (a) schematic diagram; (b) picture of measurement system.
Figure 2. Working principle of TRITOP.
Figure 3. System working principle.
Figure 4. Two images containing marked points and a laser stripe, captured with a small aperture to obtain a clear laser stripe. (a) Left image; (b) right image.
Figure 5. Binocular stereo vision system model.
Figure 6. Target for binocular vision calibration.
Figure 7. Structured light system model.
Figure 8. Relationship between $o_1x_1y_1z_1$ and $o_Lx_Ly_L$.
Figure 9. The web $W_{t_j}$ constituted by the marked points obtained in $o_1x_1y_1z_1$ at moment $t_j$.
Figure 10. Marked points corresponding to Figure 9 and the web $W_i$ in $o_wx_wy_wz_w$. The black marked points are the same points as those in Figure 9, while the gray marked points are other points stuck on the workpiece.
Figure 11. Field of view of the measurement system. The distance $L$ from the intersection point $N$ of the two optical axes to the line between the optical centers of the two cameras is designed as $L = 300$ mm and the angle between the two optical axes as 38.6°. The depth of field should be kept within a suitable range ($280\ \mathrm{mm} \le L_2 \le 350\ \mathrm{mm}$ in this study). The gray area is the effective view field of the system.
Figure 12. The device used to test the accuracy of the system. The glass plate (400 mm × 500 mm) is painted with white matt paint; 69 marked points are stuck on it with known 3D coordinates in $o_wx_wy_wz_w$ measured accurately by the TRITOP system. A ball arm with two standard spheres is placed on this glass plate; the standard radii of sphere 1 and sphere 2 and the distance between them are 20.030 mm, 20.057 mm and 198.513 mm, respectively.
Figure 13. Errors between the standard radii and the measured radii of the two spheres with our system.
Figure 14. Distance errors between the two spheres with our system.
Figure 15. The measurement of the ball arm by the AAMS.
Figure 16. Errors between the standard radii and the measured radii of the two spheres with the AAMS.
Figure 17. Distance errors between the two spheres with the AAMS.
Figure 18. Distribution of the errors from the scatter points to the fitted sphere surface with our system.
Figure 19. The contour of the large-sized workpiece measured by the handheld scanning system in this study. (a) The workpiece with typical structures for testing the system performance; (b) the marked points measured by the TRITOP system; (c) the laser points reduced to a third by the studied system; (d) the surface of (c) shaped in Imageware.
Figure 20. The contour of the medium-sized workpiece measured by the handheld scanning system in this study. (a) The medium-sized workpiece with typical structures for testing the system performance; (b) the measured laser points reduced to a third by the studied system; (c) the surface of (b) shaped in Imageware.
Table 1. The fitted radii of the two spheres and the distances between them.

Measurement No. | Radius1 (mm) | Radius2 (mm) | Distance (mm)
1       | 20.037  | 20.069  | 198.504
2       | 20.016  | 20.034  | 198.512
3       | 20.043  | 20.036  | 198.532
4       | 19.995  | 20.076  | 198.472
5       | 20.039  | 20.020  | 198.558
6       | 20.054  | 20.097  | 198.546
7       | 20.018  | 20.053  | 198.529
8       | 19.996  | 20.032  | 198.550
9       | 20.042  | 20.081  | 198.481
10      | 20.064  | 20.094  | 198.553
Average | 20.0304 | 20.0592 | 198.523
Table 2. Maximum distance errors from the scatter points to the fitted sphere surface with our system.

Measurement No.       | 1     | 2     | 3     | 4     | 5     | 6     | 7     | 8     | 9     | 10
Max distance1 1 (mm)  | 0.235 | 0.222 | 0.216 | 0.245 | 0.239 | 0.242 | 0.229 | 0.219 | 0.233 | 0.240
Max distance2 2 (mm)  | 0.252 | 0.224 | 0.254 | 0.200 | 0.240 | 0.232 | 0.253 | 0.223 | 0.241 | 0.235

1 Max distance1 is the maximum distance from the scatter points outside the sphere to the fitted sphere surface. 2 Max distance2 is the maximum distance from the scatter points inside the sphere to the fitted sphere surface.
Table 3. Maximum distance errors from the scatter points to the fitted sphere surface with the AAMS.

Measurement No.     | 1     | 2     | 3     | 4     | 5     | 6     | 7     | 8     | 9     | 10
Max distance1 (mm)  | 0.081 | 0.095 | 0.107 | 0.089 | 0.139 | 0.127 | 0.151 | 0.090 | 0.142 | 0.149
Max distance2 (mm)  | 0.117 | 0.097 | 0.119 | 0.143 | 0.092 | 0.105 | 0.126 | 0.134 | 0.095 | 0.113
