Article

Internal Parameters Calibration of Vision Sensor and Application of High Precision Integrated Detection in Intelligent Welding Based on Plane Fitting

Key Laboratory for Advanced Materials Processing Technology, Ministry of Education, Department of Mechanical Engineering, Tsinghua University, Beijing 100084, China
* Author to whom correspondence should be addressed.
Sensors 2022, 22(6), 2117; https://doi.org/10.3390/s22062117
Submission received: 29 January 2022 / Revised: 25 February 2022 / Accepted: 26 February 2022 / Published: 9 March 2022
(This article belongs to the Section Sensing and Imaging)

Abstract
Vision sensing is a key technology for the on-line detection of welding groove sizes and of the position and posture of the welding torch relative to the groove during the arc welding process in intelligent production. For a specially designed vision sensor based on combined laser structured lights, an integrated calibration method for its internal parameters is first proposed, which improves the efficiency, accuracy and comprehensiveness of internal parameter calibration for a line structured light vision sensor and lays a good foundation for the industrial application of the sensor. Then, high precision integrated detection algorithms are derived for the V-groove size parameters and the spatial position and posture (SPP) parameters of the welding torch relative to the welding groove, based on a single modulated laser lines image. The algorithms make full use of the data in a single modulated laser lines image, adopting data segmentation and plane fitting to reconstruct the V-groove surfaces and their adjacent workpiece surfaces of a planar workpiece in 3D, and thus solve the parameters with high precision. In the verification tests, the relative detection error of the V-groove size parameters of a planar workpiece is less than 1%, and the relative detection error of the SPP parameters of the welding torch relative to the welding groove is less than 5%, which demonstrates the effectiveness and accuracy of the calibration method and the detection algorithms. This work provides solid technical support for the practical application of the specially designed vision sensor in intelligent welding production.

1. Introduction

For the welding of workpieces with a determined groove shape and size located in a specified spatial posture, the spatial position and posture (SPP) of the welding torch relative to the welding groove have a great influence on weld formation quality that cannot be ignored. They should therefore be reasonably set before the arc welding process is implemented, in addition to the basic process parameters such as welding current, arc voltage, welding speed and the motion trajectory of the welding torch. However, machining and assembly errors in the welding groove sizes usually exist, and thermal deformation inevitably occurs while workpieces are being welded. Relying only on the preset SPP and motion trajectory of the welding torch relative to the welding groove will therefore often lead to defects and deficiencies in the weld formation. Thus, it is necessary during the actual welding procedure to carry out high-precision detection of the welding groove sizes and weld tracking, that is, control of the motion trajectory of the welding torch and synchronous control of its SPP relative to the welding groove. In other words, realizing intelligent welding by on-line sensing detection and real-time feedback control is an important research and development direction in the welding field, which can not only improve the efficiency of welding production but also effectively ensure the quality of the welded joint and product [1,2].
Vision sensing is a key technology for realizing intelligent welding. Compared with other vision sensing technologies, line structured light vision sensing based on the perspective projection principle has the comprehensive advantages of simple operation, high detection accuracy and low system cost, and has been widely used in modern industrial production [3].
The premise for a line structured light vision sensor to realize high precision detection is to calibrate its internal parameters effectively, including camera calibration and structured light plane calibration [4]. For camera calibration, Zhang's method [5] is flexible and convenient and has been widely used. The calibration of the structured light plane has been studied extensively in recent years. Some specially designed 3D targets have been used as calibrators [6,7,8], with good calibration results for the structured light plane. However, these 3D calibration methods rely on high precision 3D targets or motion devices, which greatly limits their application. Consequently, some planar calibration methods have been developed. Xu et al. [9] constructed the laser structured light plane through multiple Plücker matrices of the 3D crossing lines between the target planes and the laser projection planes. Ha [10] established 3D–3D correspondences between the camera and a laser range finder using a calibration structure with a triangular hole in its plane. Chen et al. [11] proposed a geometric calibration method for a line structured light sensor, which calibrates the camera intrinsic parameters and the laser structured light plane equation using a single circular target designed to construct geometric constraints. However, these structured light plane calibration methods need a specially designed target, auxiliary equipment or a relatively complex calibration model, which reduces their applicability and convenience.
Meanwhile, the line structured light vision sensor has been widely used in the welding field, for example in weld tracking and welding torch location determination [12,13,14]. However, intelligent welding also requires on-line detection of the SPP parameters of the welding torch relative to the welding groove and their reasonable adjustment and control according to the actual shape and size parameters detected in the welding groove, so as to obtain high quality weld formation.
In terms of welding groove size parameter detection based on line structured light vision sensing, some typical research works are as follows. He et al. [15] and Zhu et al. [16] realized the detection of weld geometry sizes, including width, height, etc., using corresponding image processing algorithms. Kim et al. [17] proposed a point cloud registration technique for multiple weld seams, which realized the extraction of 3D weld seam information. However, when using a vision sensor based on a single line structured light for welding groove size detection, the height and posture of the sensor relative to the measured workpiece must be preset and remain unchanged during the welding process; otherwise the vision sensor needs to be recalibrated. To solve this problem, Guo et al. [18,19] proposed a new vision sensor based on combined laser structured lights and derived the corresponding detection algorithm, which realizes the detection of welding groove size parameters and the position deviation of the welding torch relative to the welding groove. However, this detection algorithm assumes that the vision sensor is perpendicular to the workpiece surface when applied to the detection of a planar workpiece, which still has certain limitations.
In terms of relative position and posture detection based on line structured light vision sensing, some research works have also been carried out. Xue et al. [20] and Zeng et al. [21] projected a crosshair laser onto the workpiece surface and obtained the position and posture between the welding torch and the groove by combining 2D and 3D information in the laser images. Kiddee et al. [12] obtained the relative position and posture of the edges of a V-groove weld seam by using modified template matching for an ROI image set by the cross mark of the structured light. Zhang [22] proposed a mathematical model for SPP detection of the welding torch relative to the welding groove based on a combined laser structured lights vision sensor and realized the detection of position and angle parameters of the welding torch relative to the welding groove.
From the above brief review and analysis, we can see that current calibration methods for the line structured light vision sensor are complex and inconvenient. Meanwhile, the mentioned detection methods for welding groove sizes and the relative position and posture parameters of the welding torch depend, to different degrees, on sensor installation and the motion parameters of the welding equipment. Few studies have achieved the integrated detection of welding groove sizes and relative position and posture parameters of the welding torch with high detection accuracy, robustness and adaptability at the same time.
Aiming at the existing problems mentioned above, and on the basis of the deduced detection mathematical model of the vision sensor based on combined laser structured lights, this paper first proposes an integrated calibration method that uses only an ordinary checkerboard calibration board for the internal parameters of the vision sensor, including the camera internal parameters (fx, fy, u0, v0, k1, k2) and the structured light plane equation parameters (Al1, Bl1, Cl1, Dl1 and Al2, Bl2, Cl2, Dl2) in the camera coordinate system. Next, based on the processing of a single modulated laser lines image captured by the camera, the 3D point cloud data of the laser lines are obtained. Then, the V-groove surfaces and their adjacent workpiece surfaces of the planar workpiece are reconstructed by data segmentation and plane fitting. Finally, the integrated detection of the V-groove size parameters (groove widths b1, b2 and groove depth h) of the planar workpiece and the relative position and posture parameters (position parameters e, γ, H and posture parameters α, β) of the welding torch under any relative position and posture is realized. This research effectively improves the robustness and applicability of the vision sensor's detection and has important application value in intelligent welding production.

2. Configuration and Detection Mathematical Model of Vision Sensor

2.1. Configuration of Vision Sensor Based on Combined Laser Structured Lights

The single line structured light vision sensors available on the market need a preset, fixed installation position and posture when detecting the sizes of an object, and they need multiple scans to obtain accurate sizes [23]. Meanwhile, binocular or multi-camera vision sensors suffer from the need to process multiple images simultaneously, complex calibration and other problems. Thus, a vision sensor based on combined laser structured lights is designed (as shown in Figure 1) and the corresponding detection algorithm is proposed, which can realize the detection of welding groove sizes and the relative position and posture parameters of the welding torch under any relative position and posture, with great detection applicability, robustness and accuracy.
The vision sensor shown in Figure 1 is mainly composed of a monocular camera and two line laser transmitters. It adopts a spatial arrangement of oblique incidence and perpendicular receiving, which has the advantages of small detection error and compact structure [24,25]. The central axes of the two line laser transmitters are parallel, and the designed angle between the optical axis of the camera and the central axes of the line laser transmitters is 30°, which makes the detected values of the vision sensor close to the actual values [26]. The vision sensor is fixedly mounted on the forward side of the welding torch (along the welding direction). Meanwhile, the central axis of the welding torch, the optical axis of the camera and the central axes of the two line laser transmitters are theoretically coplanar, and this plane should theoretically be perpendicular to the width direction of the camera image plane. In addition, since the arc light intensity of GMAW is weakest within the wavelength range of 620–700 nm, laser transmitters with a wavelength of 660 nm are selected, and a narrow-band filter with a wavelength of 660 ± 8 nm is installed in front of the camera lens to filter out arc interference while ensuring high laser transmittance. The selected main components of the vision sensor and their parameters are shown in Table 1.

2.2. Detection Mathematical Model of Vision Sensor

The two line laser transmitters of the vision sensor (Figure 1) emit two parallel light planes with a certain thickness and project onto the surfaces of the measured object (workpiece and welding groove). Then, two laser lines modulated by the welding groove are formed on the surfaces of the measured object. The modulated laser lines are captured by the CMOS camera, and the 2D coordinates of the laser lines can be extracted by processing a single image. After that, according to the internal parameters of the vision sensor and the coordinate transformation based on the perspective projection principle, the 3D coordinates of the points in the modulated laser lines projected on the surfaces of the measured object in the camera coordinate system can be solved. Then, the welding groove sizes and the SPP parameters of the welding torch relative to the welding groove can be obtained.
In order to convert the 2D image data of the modulated laser lines captured by the camera into 3D space data through the perspective projection model, four rectangular coordinate systems are established, as shown in Figure 2. In Figure 2, O-xy is the 2D image coordinate system (taking the intersection O of the camera optical axis OCZC and the image plane as the origin) and o-uv is the 2D pixel coordinate system, which are used to characterize the 2D coordinates in the image plane; OC-XCYCZC is the 3D camera coordinate system (OC is the optical center of the camera and f is the focal length) and OW-XWYWZW is the 3D world coordinate system, which are used to characterize the 3D coordinates of space points in the actual physical space. Point P is a sampling point in the modulated laser lines projected on the surfaces of the measured object, and point p is its perspective imaging point on the 2D image plane.
In Figure 2, the coordinate transformation of the sampling point P from the 3D world coordinate system OW-XWYWZW to the 3D camera coordinate system OC-XCYCZC is a rigid body position and posture transformation (including rotation and translation). The transformation matrix is called the external parameter matrix of the camera and can be expressed as follows [27]:
$$\begin{bmatrix} X_C \\ Y_C \\ Z_C \\ 1 \end{bmatrix} = \begin{bmatrix} R_C & T \\ 0 & 1 \end{bmatrix} \begin{bmatrix} X_W \\ Y_W \\ Z_W \\ 1 \end{bmatrix} \tag{1}$$
where RC is a 3 × 3 matrix and T is a 3 × 1 matrix, which represent the posture and position of the camera coordinate system OC-XCYCZC in the world coordinate system OW-XWYWZW, respectively.
According to the perspective projection principle, the matrix relationship between the coordinates of the sampling point P in the 3D camera coordinate system OC-XCYCZC and the coordinates of the imaging point p in the 2D image coordinate system O-xy is as follows:
$$Z_C \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} X_C \\ Y_C \\ Z_C \\ 1 \end{bmatrix} \tag{2}$$
On the image plane, the pixel widths along the x and y directions are dx and dy (mm/pixel), respectively, and u0, v0 (pixel) are the position coordinates of the origin of the 2D image coordinate system O-xy in the 2D pixel coordinate system o-uv. Then, the matrix relationship of the imaging point p between its image coordinates (x, y) and pixel coordinates (u, v) is:
$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} \frac{1}{d_x} & 0 & u_0 \\ 0 & \frac{1}{d_y} & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} \tag{3}$$
Ignoring camera imaging distortion, the sampling point P, the imaging point p and the camera optical center OC satisfy a collinearity constraint. According to the conversion matrices of Equations (1)–(3), the conversion relationship between the 3D world coordinates of the sampling point P and the 2D pixel coordinates of the imaging point p can be obtained:
$$Z_C \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & u_0 & 0 \\ 0 & f_y & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R_C & T \\ 0 & 1 \end{bmatrix} \begin{bmatrix} X_W \\ Y_W \\ Z_W \\ 1 \end{bmatrix} \tag{4}$$
where fx = f/dx and fy = f/dy represent the dimensionless scale factors of the camera in the x and y directions, respectively.
However, the camera lens always has radial and tangential distortion. In general, the tangential distortion is small and can be ignored, and only the radial distortion needs to be considered. A frequently used mathematical model for removing the radial distortion of the camera is the Brown model [28], which is described by a Taylor series expansion around the principal point of the image; generally, only the first two terms are needed. The radial distortion correction formula is:
$$\begin{bmatrix} x_c \\ y_c \end{bmatrix} = \left( 1 + k_1 r^2 + k_2 r^4 \right) \begin{bmatrix} x \\ y \end{bmatrix} \tag{5}$$
where k1 and k2 are the radial distortion coefficients of the camera, r is the normalized distance between the distorted point and the principal point of the image, and xc and yc are the image coordinates after radial distortion correction. Thus, k1, k2, fx, fy, u0 and v0 are called the internal parameters of the camera system.
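To make Equation (5) concrete, the following minimal NumPy sketch applies the two-term correction to a normalized image point; the coefficient values are illustrative placeholders, not the calibrated ones from this work.

```python
import numpy as np

def correct_radial_distortion(x, y, k1, k2):
    """Apply the two-term Brown correction of Equation (5) to a
    normalized image point (x, y)."""
    r2 = x ** 2 + y ** 2                  # squared normalized radius r^2
    scale = 1.0 + k1 * r2 + k2 * r2 ** 2  # (1 + k1*r^2 + k2*r^4)
    return scale * x, scale * y

# Hypothetical coefficients, for illustration only:
xc, yc = correct_radial_distortion(0.12, -0.08, k1=-0.21, k2=0.04)
```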
The constraint provided by Equation (4) alone is not sufficient to complete the detection mathematical model of the vision sensor. Considering that the sampling point P on the surfaces of the measured object is also a point on the structured light plane projected by the line laser transmitter, a complete vision sensing detection mathematical model can be established by taking the structured light plane equation in the 3D camera coordinate system OC-XCYCZC as a supplementary condition. The general expression of the laser structured light plane equation in the 3D camera coordinate system OC-XCYCZC is:
$$A_l X_C + B_l Y_C + C_l Z_C + D_l = 0 \tag{6}$$
where Al, Bl, Cl and Dl are the coefficients of the laser structured light plane equation.
Combining the above equations, the detection mathematical model of the vision sensor based on combined laser structured lights is obtained, as shown in Equation (7). Through this equation, the 3D coordinates (XC, YC, ZC) of the sampling point P in the 3D camera coordinate system OC-XCYCZC can be solved from the pixel coordinates (u, v) of the imaging point p in the image plane.
$$\begin{cases} Z_C \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & u_0 & 0 \\ 0 & f_y & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} X_C \\ Y_C \\ Z_C \\ 1 \end{bmatrix} \\ A_{l1} X_C + B_{l1} Y_C + C_{l1} Z_C + D_{l1} = 0 \quad \text{or} \quad A_{l2} X_C + B_{l2} Y_C + C_{l2} Z_C + D_{l2} = 0 \end{cases} \tag{7}$$
where Al1, Bl1, Cl1, Dl1 and Al2, Bl2, Cl2, Dl2 are the plane equation parameters of the structured light planes projected by the line laser transmitters 1 and 2 in the 3D camera coordinate system OC-XCYCZC.
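As an illustration of Equation (7), the sketch below back-projects a pixel into a viewing ray and intersects it with one calibrated structured light plane; the pixel is assumed to be already corrected for distortion via Equation (5), and all numeric parameter values are hypothetical placeholders rather than the calibration results of this work.

```python
import numpy as np

def pixel_to_3d(u, v, fx, fy, u0, v0, plane):
    """Solve Equation (7): intersect the viewing ray of pixel (u, v)
    with the structured light plane (A, B, C, D) in camera coordinates."""
    A, B, C, D = plane
    ray = np.array([(u - u0) / fx, (v - v0) / fy, 1.0])  # ray direction with ZC = 1
    t = -D / (A * ray[0] + B * ray[1] + C)               # ray-plane intersection depth
    return t * ray                                       # (XC, YC, ZC)

# Hypothetical intrinsics and plane coefficients:
P = pixel_to_3d(850, 620, fx=2800, fy=2800, u0=640, v0=512,
                plane=(0.0, -0.5, 0.866, -120.0))
```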

3. Integrated Calibration for Internal Parameters of Vision Sensor

For the vision sensor based on combined laser structured lights, an integrated calibration method is proposed and applied to the calibration of the vision sensor internal parameters, including the camera internal parameters (fx, fy, u0, v0, k1, k2) and the structured light plane equation parameters (Al1, Bl1, Cl1, Dl1 and Al2, Bl2, Cl2, Dl2) of the two line laser transmitters in the camera coordinate system. This method only requires an ordinary planar checkerboard calibration board (with 12 × 9 squares, a square size of 6 × 6 mm and an accuracy of 1 μm). Images of the calibration board and of the laser lines are collected before and after the two line laser transmitters project the laser lines onto the calibration board, using different exposure times, respectively, as shown in Figure 3. Then, the position and posture of the calibration board are changed, and 20 sets of images of the calibration board and laser lines under different positions and postures are collected successively.
According to the 20 calibration board images and Zhang's camera calibration method (see Reference [5]), the camera internal parameters can be calibrated using the camera calibration toolbox in MATLAB. Meanwhile, the 20 sets of external parameter matrices RC and T of the checkerboard calibration board relative to the 3D camera coordinate system OC-XCYCZC under different positions and postures can be obtained.
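A hedged sketch of this step is given below, using OpenCV's implementation of Zhang's method as a stand-in for the MATLAB toolbox mentioned above; the image file names are assumed, and the board geometry follows this section (a 12 × 9-square board has an 11 × 8 inner-corner grid at 6 mm pitch).

```python
import glob
import cv2
import numpy as np

pattern = (11, 8)                      # inner corners of a 12 x 9-square board
square = 6.0                           # square size in mm
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_pts, img_pts = [], []
for name in sorted(glob.glob("board_*.png")):    # hypothetical file names
    gray = cv2.imread(name, cv2.IMREAD_GRAYSCALE)
    ok, corners = cv2.findChessboardCorners(gray, pattern)
    if ok:
        obj_pts.append(objp)
        img_pts.append(corners)

# K holds fx, fy, u0, v0; dist holds k1, k2 (plus higher-order terms);
# rvecs/tvecs are the per-view external parameters RC (as rotation vectors) and T.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
```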
For the calibration of the laser structured light plane equation parameters, the detailed steps are as follows. Firstly, the laser lines image is preprocessed by median filtering, binary segmentation and morphological processing. Next, the laser centerlines in the image are extracted by skeleton thinning and the Hough line transform. Then, according to the obtained external parameter matrices RC, T of the checkerboard calibration board relative to the 3D camera coordinate system OC-XCYCZC under different positions and postures, and to Equation (4), the 20 sets of 2D coordinate data of the laser centerlines in the laser lines images are converted into the 3D coordinate data of the corresponding points of the laser lines projected on the calibration board, expressed in the 3D camera coordinate system OC-XCYCZC. Finally, according to the 20 sets of 3D coordinate data of the laser lines projected on the calibration board under different positions and postures, the two laser structured light plane equations can be obtained by fitting the two structured light planes separately; these are the actual structured light planes projected by the two line laser transmitters.
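The following sketch outlines the lifting and fitting steps under the procedure just described; `views` (the per-pose centerline pixels together with their external parameters from the calibration sketch above) is an assumed data structure, and the plane fit follows the standard SVD approach.

```python
import cv2
import numpy as np

def lift_to_board(u, v, K, R, T):
    """Intersect the viewing ray of pixel (u, v) with the board plane of
    one view; the board plane has normal R[:, 2] and passes through T."""
    ray = np.array([(u - K[0, 2]) / K[0, 0], (v - K[1, 2]) / K[1, 1], 1.0])
    n = R[:, 2]
    t = (n @ T.ravel()) / (n @ ray)
    return t * ray

pts = []
for pixels, rvec, tvec in views:        # `views` assumed built upstream
    R = cv2.Rodrigues(rvec)[0]          # rotation vector -> matrix RC
    pts += [lift_to_board(u, v, K, R, tvec) for (u, v) in pixels]
pts = np.array(pts)

centroid = pts.mean(axis=0)
_, _, Vt = np.linalg.svd(pts - centroid)   # plane fit via SVD
Al, Bl, Cl = Vt[-1]                        # normal = smallest-singular-value vector
Dl = -Vt[-1] @ centroid                    # so Al*XC + Bl*YC + Cl*ZC + Dl = 0
```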
The calibration results of the internal parameters for the vision sensor are shown in Table 2.

4. Detection Algorithm of Welding Groove Sizes and Relative Position and Posture of Welding Torch

The mathematical algorithms of vision detection differ for vision sensors with different arrangements between the camera and the laser structured light. For the designed vision sensor based on combined laser structured lights, the 3D reconstruction method for the V-groove surfaces and their adjacent workpiece surfaces of a planar workpiece is first studied, after the single modulated laser lines image has been processed effectively. Then, the detection algorithms for the V-groove size parameters and the SPP parameters of the welding torch relative to the welding groove are deduced. Finally, experiments are carried out to verify the effectiveness and accuracy of the deduced detection algorithms.

4.1. Image Processing of Modulated Laser Lines Projected on V-Groove of Planar Workpiece

The laser lines projected on the V-groove surfaces and its adjacent workpiece surfaces are modulated, and then the CMOS camera captures the modulated laser lines image. The flow of processing and feature extraction of modulated laser lines image is shown in Figure 4.
Firstly, the modulated laser lines image (Figure 5a) is preprocessed by median filtering, top-hat transformation, binary segmentation and morphological processing to obtain an image with uniform brightness and distinct features, as shown in Figure 5b. Next, according to the horizontal and vertical gray projection values of the laser lines image, the preprocessed image is segmented by the dynamic region of interest (ROI) method, and the two laser lines in the original image are divided into two small sub-region images. Extracting the features of the two segmented laser line sub-images separately improves the efficiency of image processing. The feature extraction of the modulated laser lines includes single pixel centerline extraction, straight line fitting of the centerline and calculation of the intersection points of adjacent fitted lines.
Considering the requirements of real-time detection accuracy and the anti-interference ability of the image processing, the Zhang-Suen thinning algorithm [29] is selected to process the two modulated laser line sub-images after region segmentation. The image processed by skeleton thinning may have defects such as bifurcation and discontinuity (Figure 5c). Through defect repair operations, e.g., removal of bifurcation points and interpolation between discontinuous pixels, a continuous single pixel laser centerline image can be obtained (Figure 5d).
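A minimal sketch of the thinning step is shown below; it uses the Zhang-Suen implementation shipped in OpenCV's contrib module (opencv-contrib-python) as a stand-in for the paper's implementation, with a hypothetical input file name.

```python
import cv2

# Hypothetical input: one preprocessed laser-line sub-region image.
binary = cv2.imread("laser_roi.png", cv2.IMREAD_GRAYSCALE)
_, binary = cv2.threshold(binary, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
# Zhang-Suen skeleton thinning of the binarized laser line:
skeleton = cv2.ximgproc.thinning(binary,
                                 thinningType=cv2.ximgproc.THINNING_ZHANGSUEN)
```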
Then, for the obtained single pixel laser centerline image, the probabilistic Hough transform method is applied to determine the lines in the image. In actual detection, several line segments are usually detected along each laser centerline. The final values of the laser centerline equations are obtained by averaging the ρ and θ values, in polar coordinates, of the detected segments of each line.
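The following sketch illustrates this step with OpenCV's probabilistic Hough transform; the threshold and length parameters are illustrative, and the detected segments are assumed to have already been grouped by the laser facet they belong to.

```python
import numpy as np
import cv2

# `skeleton` is the single pixel centerline image from the thinning sketch.
segments = cv2.HoughLinesP(skeleton, rho=1, theta=np.pi / 180,
                           threshold=30, minLineLength=20, maxLineGap=5)

def to_polar(x1, y1, x2, y2):
    """Convert a segment to the (rho, theta) of its supporting line."""
    theta = np.arctan2(x2 - x1, y1 - y2) % np.pi   # normal direction of the line
    rho = x1 * np.cos(theta) + y1 * np.sin(theta)
    return rho, theta

# Average rho and theta over the segments of one fitted centerline:
polar = np.array([to_polar(*seg[0]) for seg in segments])
rho_mean, theta_mean = polar.mean(axis=0)
```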
Finally, the intersection points of the two adjacent lines detected can be calculated, which are the image feature points of the V-groove of the planar workpiece (Figure 5e). For the two laser lines projected on the workpiece surfaces of V-groove, the whole laser lines image captured by camera has six feature points (Figure 5f).

4.2. Three-Dimensional Reconstruction for V-Groove Surfaces and Its Adjacent Workpiece Surfaces of Planar Workpiece

The intersections of the fitted lines obtained after image processing and feature extraction, which are the feature points of the modulated laser lines image, inevitably contain pixel errors. If the size parameters of the V-groove were solved directly from the image coordinates of these points, large detection deviations would be produced, the robustness of the detection algorithm would be reduced and the utilization of the laser lines image data would be low.
In order to improve the accuracy and robustness of the detection results of welding groove size parameters, a plane fitting method is proposed, that is, to reconstruct the V-groove surfaces and its adjacent workpiece surfaces based on the 3D data points of the modulated laser lines in the camera coordinate system OC-XCYCZC, and then to realize the high precision calculation of the welding groove sizes and welding torch relative position and posture parameters. The flow of algorithm for solving the welding groove sizes and welding torch relative position and posture parameters is shown in Figure 6.

4.2.1. Two-Dimensional Image Data Segmentation of the Modulated Laser Lines

The coordinates of the six feature points in Figure 5f are recorded as (xi, yi; i = 1, 2, ..., 6) according to their serial numbers. Each modulated laser line contains three image feature points. Taking them as breakpoints, the data of each modulated laser line are divided into four segments, recorded as datai (i = 1, 2, 3, 4), as shown in Figure 7a.
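A minimal sketch of this segmentation, under the assumption that the centerline points of one laser line are sorted by image column, might look as follows (names are illustrative):

```python
import numpy as np

def segment_line(points, breakpoints_x):
    """Split the centerline points (an (N, 2) array of (x, y)) of one laser
    line into the four runs data1...data4 at the three feature-point abscissas."""
    points = points[np.argsort(points[:, 0])]             # sort by image column
    idx = np.searchsorted(points[:, 0], np.sort(breakpoints_x))
    return np.split(points, idx)                          # four segments
```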

4.2.2. Three-Dimensional Coordinates Solution of Segmented Data in the Camera Coordinate System

According to the detection mathematical model of Equation (7), the 2D segmented data of two single pixel modulated laser lines in the image coordinate system o-xy are mapped into the camera coordinate system OC-XCYCZC based on the perspective projection principle. Then, the 3D point cloud segmented data of two single pixel modulated laser lines can be obtained, as shown in Figure 7b.

4.2.3. Plane Fitting of Segmented 3D Data

According to the structural characteristics of the V-groove of a planar workpiece, the data points with the same serial number in the two modulated laser lines lie on the same welding groove surface or adjacent workpiece surface. Theoretically, the welding groove surfaces and their adjacent workpiece surfaces are planes, so the segmented 3D point cloud data can be used for plane fitting. Four planes Si (i = 1, 2, 3, 4) can be obtained by fitting the datai (i = 1, 2, 3, 4) points with the same serial number in the two modulated laser lines, respectively.
Considering that the obtained six image feature points may have pixel errors, their corresponding 3D feature points are not completely accurate segment interval points. Therefore, to ensure the accuracy of plane fitting data points and reduce the influence of the data error caused by segment interval points, the segment interval points and their neighborhood data points are removed during the process of data segmentation.
Singular value decomposition (SVD) is used to fit a plane to each set of segmented 3D data points separately. The coefficient matrix M and the column matrix X are constructed as follows:
$$M = \begin{bmatrix} X_{C1} - \bar{X}_C & Y_{C1} - \bar{Y}_C & Z_{C1} - \bar{Z}_C \\ X_{C2} - \bar{X}_C & Y_{C2} - \bar{Y}_C & Z_{C2} - \bar{Z}_C \\ \vdots & \vdots & \vdots \\ X_{Ci} - \bar{X}_C & Y_{Ci} - \bar{Y}_C & Z_{Ci} - \bar{Z}_C \end{bmatrix}, \quad X = \begin{bmatrix} A \\ B \\ C \end{bmatrix} \tag{8}$$
where $(\bar{X}_C, \bar{Y}_C, \bar{Z}_C)$ are the average coordinates of the 3D segmented data points $(X_{Ci}, Y_{Ci}, Z_{Ci})$ used to fit the corresponding plane, and the column matrix X represents the normal vector of the plane.
The purpose of plane fitting is to minimize the sum of the distances between the fitting plane and all fitting points; thus, the objective function is established as follows:
$$f(X) = \min \left\| MX \right\| \tag{9}$$
The constraint condition is $\|X\| = 1$. In the solution process, the matrix M is decomposed by SVD, and the right-singular vector corresponding to the minimum singular value of the coefficient matrix M is the optimal solution X of the objective function in Equation (9). According to $D = -(A\bar{X}_C + B\bar{Y}_C + C\bar{Z}_C)$, all the parameter values of the fitted plane are obtained. Taking the equation parameters of the four fitted planes Si as Ai, Bi, Ci and Di (i = 1, 2, 3, 4), the plane equations of the four planes S1, S2, S3 and S4 associated with the V-groove are:
$$\begin{cases} S_1: A_1 X_C + B_1 Y_C + C_1 Z_C + D_1 = 0 \\ S_2: A_2 X_C + B_2 Y_C + C_2 Z_C + D_2 = 0 \\ S_3: A_3 X_C + B_3 Y_C + C_3 Z_C + D_3 = 0 \\ S_4: A_4 X_C + B_4 Y_C + C_4 Z_C + D_4 = 0 \end{cases}$$
where the normal vectors of the four planes are m1 (A1, B1, C1), m2 (A2, B2, C2), m3 (A3, B3, C3), m4 (A4, B4, C4), respectively.
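A compact sketch of this SVD plane fit, mirroring Equations (8) and (9) with NumPy, might look as follows:

```python
import numpy as np

def fit_plane(points):
    """SVD plane fit of Equations (8) and (9): points is an (N, 3) array
    of 3D laser points on one surface; returns (A, B, C, D)."""
    centroid = points.mean(axis=0)
    M = points - centroid                     # coefficient matrix M of Equation (8)
    _, _, Vt = np.linalg.svd(M, full_matrices=False)
    A, B, C = Vt[-1]                          # minimizes ||MX|| subject to ||X|| = 1
    D = -Vt[-1] @ centroid
    return A, B, C, D

# One call per segmented point set, e.g.:
# S1, S2, S3, S4 = (fit_plane(d) for d in (data1, data2, data3, data4))
```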
The welding groove surfaces and their adjacent workpiece surfaces obtained by fitting the segmented 3D point cloud data are shown in Figure 8. Thus, the 3D reconstruction of the V-groove of the planar workpiece is realized.
Further, according to the reconstructed 3D welding groove and the position and posture of the welding torch in the camera coordinate system OC-XCYCZC, the welding groove size parameters and the SPP parameters of the welding torch relative to the welding groove can be detected and solved.

4.3. Detection Algorithm of Welding Groove Size Parameters

The main size parameters of the V-groove of a planar workpiece include the groove depth h and the groove widths b1 and b2. Their detection algorithms are as follows.

4.3.1. Groove Depth h

The distance from the groove bottom intersection line l1, which is the intersection line of the left groove surface S2 and the right groove surface S3, to the left workpiece surface S1 or the right workpiece surface S4 is called the groove depth.
The linear equation of the intersection line l1 is:
$$l_1: \begin{cases} A_2 X_C + B_2 Y_C + C_2 Z_C + D_2 = 0 \\ A_3 X_C + B_3 Y_C + C_3 Z_C + D_3 = 0 \end{cases}$$
The average distances from the points on the intersection line l1 to the left workpiece surface S1 and to the right workpiece surface S4 are recorded as the groove depths h1 and h2, respectively. The difference between h1 and h2 reflects the groove misalignment caused by planar workpiece assembly or thermal deformation. When the misalignment is small enough to be negligible, the average of h1 and h2 is taken as the groove depth h, as shown in Figure 8.
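The following hedged sketch illustrates the groove-depth computation; the plane tuples S1 to S4 are placeholders standing in for the fitting results of Section 4.2.3, and the sampling range along l1 is arbitrary.

```python
import numpy as np

def line_of_planes(p, q):
    """Return (point, direction) of the intersection line of two planes,
    each given as an (A, B, C, D) tuple."""
    N = np.array([p[:3], q[:3]], dtype=float)
    d = -np.array([p[3], q[3]], dtype=float)
    point = np.linalg.lstsq(N, d, rcond=None)[0]   # minimum-norm point on the line
    return point, np.cross(N[0], N[1])

def dist_to_plane(X, plane):
    n = np.array(plane[:3], dtype=float)
    return abs(n @ X + plane[3]) / np.linalg.norm(n)

# Placeholder plane parameters (A, B, C, D) for S1...S4:
S1, S2 = (0, 0, 1, -150), (0, 0.94, 0.34, -60)
S3, S4 = (0, -0.94, 0.34, -45), (0, 0, 1, -150)

P0, v = line_of_planes(S2, S3)                     # groove bottom line l1
pts = [P0 + t * v / np.linalg.norm(v) for t in np.linspace(-5, 5, 11)]
h1 = np.mean([dist_to_plane(X, S1) for X in pts])
h2 = np.mean([dist_to_plane(X, S4) for X in pts])
h = 0.5 * (h1 + h2)                  # groove depth when misalignment is negligible
```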

4.3.2. Groove Width b1 and b2

According to the fitted left workpiece surface S1 and left groove surface S2, the intersection line l2 of the two planes can be obtained, and its linear equation is:
$$l_2: \begin{cases} A_1 X_C + B_1 Y_C + C_1 Z_C + D_1 = 0 \\ A_2 X_C + B_2 Y_C + C_2 Z_C + D_2 = 0 \end{cases}$$
The direction vector n2 of line l2 is:
$$\mathbf{n}_2 = \mathbf{m}_1 \times \mathbf{m}_2 = \left( B_1 C_2 - B_2 C_1,\; A_2 C_1 - A_1 C_2,\; A_1 B_2 - A_2 B_1 \right)$$
According to the direction vector n2 of the line l2 and the normal vector m1 of the fitting left workpiece surface S1, the normal vector m5 of the virtual vertical plane S5, which is perpendicular to the left workpiece surface S1 and passing through the line l2, can be obtained, expressed as:
$$\mathbf{m}_5 = \mathbf{m}_1 \times \mathbf{n}_2 = \left( A_1 B_1 B_2 - A_2 B_1^2 - A_2 C_1^2 + A_1 C_1 C_2,\; B_1 C_1 C_2 - B_2 C_1^2 - A_1^2 B_2 + A_1 A_2 B_1,\; A_1 A_2 C_1 - A_1^2 C_2 - B_1^2 C_2 + B_1 B_2 C_1 \right)$$
Taking any point on the line l2, the plane equation of S5 can be obtained:
$$A_5 X_C + B_5 Y_C + C_5 Z_C + D_5 = 0$$
where $\mathbf{m}_5 = (A_5, B_5, C_5)$ and $D_5 = -(A_5 X_{Cq} + B_5 Y_{Cq} + C_5 Z_{Cq})$, with $(X_{Cq}, Y_{Cq}, Z_{Cq})$ the coordinates of any point on the line l2.
Similarly, the plane equation of the virtual vertical plane S6 can be obtained. The average distances from the points on the intersection line l1 to the virtual vertical planes S5 and S6 are the groove widths b1 and b2, respectively, and the sum of b1 and b2 is the total groove width B, as shown in Figure 8.
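A matching sketch for the groove widths, reusing line_of_planes(), dist_to_plane() and the placeholder planes from the groove-depth sketch above:

```python
import numpy as np

def vertical_plane_through(Sa, Sb):
    """Plane through the line Sa ∩ Sb and perpendicular to the plane Sa."""
    q, n_line = line_of_planes(Sa, Sb)       # point on the line and its direction
    m = np.cross(np.array(Sa[:3], dtype=float), n_line)   # normal, cf. m5 = m1 x n2
    return (m[0], m[1], m[2], -(m @ q))

S5 = vertical_plane_through(S1, S2)          # through l2, perpendicular to S1
S6 = vertical_plane_through(S4, S3)          # mirror construction on the right side
P0, v = line_of_planes(S2, S3)               # bottom line l1
pts = [P0 + t * v / np.linalg.norm(v) for t in np.linspace(-5, 5, 11)]
b1 = np.mean([dist_to_plane(X, S5) for X in pts])
b2 = np.mean([dist_to_plane(X, S6) for X in pts])
B_total = b1 + b2                            # total groove width B
```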

4.4. Detection Algorithm of Welding Torch Relative SPP Parameters

4.4.1. Relative SPP Parameters of Welding Torch

Figure 9 is the schematic diagram of the SPP parameters of the welding torch relative to the welding groove, where OW-XWYWZW is the world coordinate system fixedly connected with the planar workpiece, XW is the welding direction (groove direction) vector, YW is the welding groove width direction vector and ZW is the normal vector of the upper surface of the planar workpiece. The vector tor is the direction vector of the central axis of welding torch, which is used to characterize the posture of welding torch.
The relative position parameters of welding torch include transverse deviation e, angular deviation γ and welding torch height H.
Construct the groove bottom plane S7, which passes through the groove bottom intersection line l1 and is parallel to the upper surface of the planar welded workpiece. The points Pj0 and Pj1 are the intersections of the central axis of the welding torch and of the optical axis of the CMOS camera with the bottom plane S7, respectively. The transverse deviation e of the welding torch is defined as the distance from point Pj0 to the groove bottom intersection line l1, and the angular deviation γ is the included angle between the line Pj0Pj1 and the intersection line l1.
The welding torch height H and the camera height H1 are the distances, along their respective axes, from the end of the welding torch conductive nozzle PE and from the camera focus to the upper surface of the planar welded workpiece, respectively. The sensor installation height H0 is the distance, along the direction of the welding torch central axis, from the camera focus to the end of the conductive nozzle PE; it is a fixed value once the sensor is installed. When the central axis of the welding torch is perpendicular to the upper surface of the planar welded workpiece, the relationship among them is:
$$H = H_1 - H_0 \tag{14}$$
The relative posture parameters of the welding torch include the front and rear tilt angle α and the left and right tilt angle β. The front and rear tilt angle α is the included angle, along the welding direction, between the welding torch direction vector tor and the normal vector ZW of the upper surface of the planar workpiece. The left and right tilt angle β is the included angle between tor and ZW in the plane of the groove cross section.

4.4.2. Solution of Relative Position Parameters of Welding Torch

The moving direction of the welding torch during the arc welding process may not always coincide with the direction of the welding groove, which often leads to an angular deviation γ. When different welding beads (layers) of the V-groove are being welded, the transverse deviation e and the height H of the welding torch also need to be adjusted appropriately.
According to the plane equations of the welding groove surfaces and their adjacent workpiece surfaces, and to the definitions of the position parameters of the welding torch relative to the welding groove, the transverse deviation e, the angular deviation γ and the welding torch height H are solved as follows (a consolidated numerical sketch is given after this list):
1. Transverse deviation e
(1). Solve the plane equation of the bottom plane S7.
When the misalignment of the welding groove can be ignored, the average of the normal vectors of the planar workpiece upper surfaces S1 and S4 is taken as the normal vector m7 (A7, B7, C7) of the bottom plane S7:
$$A_7 = \frac{A_1 + A_4}{2}, \quad B_7 = \frac{B_1 + B_4}{2}, \quad C_7 = \frac{C_1 + C_4}{2}$$
Taking any point P on the intersection line l1, the plane equation of the groove bottom plane S7 can be determined as:
$$A_7 X_C + B_7 Y_C + C_7 Z_C + D_7 = 0$$
where $D_7 = -(A_7 X_{Cp} + B_7 Y_{Cp} + C_7 Z_{Cp})$, and $(X_{Cp}, Y_{Cp}, Z_{Cp})$ are the coordinates of the point P on the intersection line l1.
(2). Find the intersection point Pj0 of the central axis of welding torch and the plane S7.
The linear equation of the central axis of the welding torch is XC = D0, YC = 0, where D0 is the distance between the central axis of the welding torch and the optical axis of the camera (two parallel lines); D0 is a fixed value after the sensor has been installed. The coordinates of the intersection point Pj0 of the welding torch central axis and the plane S7 can then be solved as (D0, 0, −(A7D0 + D7)/C7).
(3). Calculate the transverse deviation e.
The direction vector n1 of the groove bottom intersection line l1, which lies in both groove surfaces S2 and S3, is:
$$\mathbf{n}_1 = \mathbf{m}_2 \times \mathbf{m}_3 = \left( B_2 C_3 - B_3 C_2,\; A_3 C_2 - A_2 C_3,\; A_2 B_3 - A_3 B_2 \right)$$
The distance from point Pj0 to the intersection line l1 is the transverse deviation e:
$$e = \frac{\left| \overrightarrow{P P_{j0}} \times \mathbf{n}_1 \right|}{\left| \mathbf{n}_1 \right|}$$
2. Angular deviation γ
(1). Find the intersection point Pj1 between the camera optical axis and plane S7.
The linear equation of the camera optical axis is XC = 0, YC = 0. The coordinates of the intersection point Pj1 of the camera optical axis and the plane S7 are (0, 0, −D7/C7).
(2). Calculate the angular deviation γ.
Since the straight line Pj0Pj1 and the intersection line l1 are both in plane S7, their included angle is the angular deviation γ, which is:
$$\gamma = \arccos \frac{\overrightarrow{P_{j0} P_{j1}} \cdot \mathbf{n}_1}{\left| \overrightarrow{P_{j0} P_{j1}} \right| \left| \mathbf{n}_1 \right|}$$
3. Welding torch height H
The coordinates of the intersection point Pj of the central axis of the welding torch and the plane S1 can be solved by combining the two corresponding equations, giving (D0, 0, −(A1D0 + D1)/C1). In the camera coordinate system OC-XCYCZC, the coordinates of the end of the welding torch conductive nozzle PE are (D0, 0, H0), where the sensor installation height H0 can be determined by calibration.
Thus, for the upper surface S1 of planar welded workpiece, the welding torch height H is the distance between point Pj and PE, which is:
$$H = \frac{-\left( A_1 D_0 + D_1 \right)}{C_1} - H_0$$
For the upper surface S4 of the planar workpiece, the welding torch height H can be calculated similarly. In practical application, the welding torch heights H relative to the two upper surfaces S1 and S4 can be used individually or as their mean value, according to actual needs.
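The consolidated sketch announced above follows; it reuses the placeholder planes and helper functions from the Section 4.3 sketches, and D0, H0 are hypothetical calibrated installation constants rather than values from this work.

```python
import numpy as np

D0, H0 = 45.0, 95.0                               # hypothetical installation constants (mm)

m7 = 0.5 * (np.array(S1[:3]) + np.array(S4[:3]))  # average normal of S1 and S4
P, n1 = line_of_planes(S2, S3)                    # point P on l1 and direction n1
A7, B7, C7 = m7
D7 = -(m7 @ P)                                    # bottom plane S7 passes through P

Pj0 = np.array([D0, 0.0, -(A7 * D0 + D7) / C7])   # torch axis (XC=D0, YC=0) with S7
Pj1 = np.array([0.0, 0.0, -D7 / C7])              # camera axis (XC=0, YC=0) with S7

# Transverse deviation e: distance from Pj0 to the line l1.
e = np.linalg.norm(np.cross(Pj0 - P, n1)) / np.linalg.norm(n1)

# Angular deviation gamma: angle between Pj0Pj1 and l1, both in S7.
g = Pj0 - Pj1
gamma = np.degrees(np.arccos((g @ n1) / (np.linalg.norm(g) * np.linalg.norm(n1))))

# Torch height H relative to the upper surface S1.
A1, B1, C1, D1 = S1
H = -(A1 * D0 + D1) / C1 - H0
```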

4.4.3. Solution of Relative Posture Parameters of Welding Torch

During the arc welding process, the relative position parameters of the welding torch affect the accuracy of the weld forming position, while the relative posture parameters (front and rear tilt angle α, left and right tilt angle β) affect the shape of the weld pool and its fluidity, and hence the quality of weld formation, including penetration, width and reinforcement. The values of the relative posture parameters of the welding torch are closely related to the absolute spatial posture of the welding groove.
1. Front and rear tilt angle α of the welding torch
The bottom plane S7 is parallel to the upper surface of the planar workpiece. Thus, the relationship between the direction vector of the line Pj0Pj1, which lies in the bottom plane S7, and the direction vector tor of the welding torch axis can be used to characterize the front and rear tilt angle α of the welding torch relative to the welding groove.
The direction vector of the line Pj0Pj1 is q1 = (D0, 0, −A7D0/C7). The direction vector of the central axis of the welding torch is taken as tor = (0, 0, 1). Therefore, the front and rear tilt angle α of the welding torch can be solved as follows:
$$\alpha = \arcsin \frac{\mathbf{tor} \cdot \mathbf{q}_1}{\left| \mathbf{tor} \right| \left| \mathbf{q}_1 \right|}$$
The front and rear tilt angle α is generally expressed as an acute angle during the actual welding process. A positive value of α indicates that the welding torch tilts forward along the welding direction, and a negative value indicates that it tilts backward.
2. Left and right tilt angle β of the welding torch
The relationship between the direction vector tor of welding torch and the vector q2, which is simultaneously perpendicular to the normal vector m7 and the vector q1, can be used to characterize the left and right tilt angle β of welding torch relative to the welding groove. The vector q2 is:
$$\mathbf{q}_2 = \mathbf{m}_7 \times \mathbf{q}_1 = \left( -\frac{A_7 B_7 D_0}{C_7},\; \frac{A_7^2 D_0}{C_7} + C_7 D_0,\; -B_7 D_0 \right)$$
Therefore, the left and right tilt angle β of the welding torch is:
$$\beta = \arcsin \frac{\mathbf{tor} \cdot \mathbf{q}_2}{\left| \mathbf{tor} \right| \left| \mathbf{q}_2 \right|}$$
Similarly, the left and right tilt angle β is generally expressed as an acute angle during the actual welding process. A positive value of β indicates that the welding torch slants to the left in the welding groove cross section, and a negative value indicates that it slants to the right. A consolidated sketch of both angle computations is given below.
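The combined sketch of both posture angles, continuing from the position sketch above (m7, Pj0 and Pj1 as defined there; tor is the torch axis direction in camera coordinates):

```python
import numpy as np

tor = np.array([0.0, 0.0, 1.0])
q1 = Pj0 - Pj1                       # direction of the line Pj0Pj1 in S7
q2 = np.cross(m7, q1)                # perpendicular to both m7 and q1

alpha = np.degrees(np.arcsin((tor @ q1) / np.linalg.norm(q1)))   # |tor| = 1
beta = np.degrees(np.arcsin((tor @ q2) / np.linalg.norm(q2)))
# Sign convention: alpha > 0 -> forward tilt along the welding direction;
# beta > 0 -> leftward tilt in the groove cross section.
```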
According to the above derived detection algorithms for the welding groove sizes and the relative position and posture parameters of the welding torch, the designed vision sensor based on combined laser structured lights does not require any special or fixed relative posture between the welding torch (or the vision sensor) and the welding groove in practical application; the relative posture can be arbitrary. The whole process only requires that the modulated laser lines be imaged completely, and the integrated detection of the welding groove sizes and the relative SPP parameters of the welding torch can then be realized. The derived detection algorithms thus have few application restrictions and do not depend on other devices. Except for the sensor installation parameters H0 and D0 (both obtainable by calibration), which are used in the solutions of the relative SPP parameters of the welding torch, the solutions of the other parameters depend only on the internal parameters of the designed vision sensor, which effectively improves its detection applicability.

5. Experimental Verification and Discussion

In order to verify the correctness of the internal parameter calibration method proposed in this paper for the designed vision sensor and the accuracy of the calibration results, as well as the effectiveness of the derived detection algorithms for the welding groove sizes and the SPP parameters of the welding torch relative to the welding groove, verification tests under three different conditions were carried out on the test platform shown in Figure 10. Considering the height of the welding torch during arc welding and the sensor installation height H0, the object distance of the camera was controlled at 140–160 mm in the verification tests, which ensures a similar camera magnification across the tests; in this case, the camera resolution is about 0.03–0.04 mm/pixel in the width direction and 0.07–0.10 mm/pixel in the depth direction. The three tests were used to verify the detection accuracy of the welding groove size parameters (groove depth h, groove widths b1 and b2), the spatial position parameters (e, γ, H) and the posture parameters (α, β) of the welding torch relative to the welding groove, respectively.
T1: In test 1, symmetric and asymmetric V-grooves were used to verify the effectiveness and accuracy of detection algorithms of groove depth h, groove width b1 and b2, respectively. Table 3 shows the measured and detected values of welding groove sizes.
During the detection test of T1, on the premise of ensuring that the modulated laser lines are completely located in the field of vision of the camera, the workpiece was placed on the platform in any position and posture. The feature extraction diagram of the laser lines image modulated by typical V-groove and the 3D reconstruction diagram of the welding groove surfaces and its adjacent workpiece surfaces are shown in Figure 11a,b.
From Table 3, it can be seen that the maximum absolute error of the welding groove size parameters between the detected and measured values did not exceed 0.08 mm, the maximum relative error did not exceed 1% and the maximum repetition error did not exceed 0.04 mm. These results indicate that the plane fitting method used for the 3D reconstruction of the welding groove surfaces and their adjacent workpiece surfaces can greatly reduce the detection error caused by deviations in feature point extraction. Meanwhile, since detection under any relative position and posture of the workpiece can capture the variation of welding groove sizes caused by thermal deformation of the welded workpiece, the proposed detection method can, to a certain extent, eliminate the impact of thermal deformation on weld formation during the arc welding process. Therefore, the detection method has good accuracy, repeatability and adaptability.
T2: Since the central axis of the welding torch is coplanar with the optical axis of the CMOS camera and the central axes of the two line laser transmitters, and the central axis of the welding torch is parallel to the optical axis of the camera, the SPP of the camera relative to the welding groove can be used to characterize the SPP of the welding torch relative to the welding groove. In test 2, a symmetric V-groove was used in the detection test of the camera (characterizing the welding torch) position parameters (e, γ, H1) relative to the welding groove. In order to facilitate the setting of the position parameters and the test verification, the planar workpiece with the V-groove was placed directly under the camera, that is, the upper surface of the planar workpiece was perpendicular to the optical axis of the camera. Under this condition, the camera height H1 can be expressed as H1 = |D1/C1|, and the relationship between the torch height H and the camera height H1 is given by Equation (14). As shown in Figure 10, the horizontal precision displacement platform was used to set the transverse deviation e, the rotary precision displacement platform was used to set the angular deviation γ and the vertical precision displacement platform was used to set the relative height H1 of the camera.
The set values of the precision displacement platform are taken as the measured values of the position deviation, and the position deviations obtained by vision sensing detection algorithms are taken as the detected values. Table 4 shows the measured and detected values of the position parameters of the camera (characterizing welding torch) relative to the welding groove. The feature extraction diagram of the laser lines image modulated by typical V-groove and the 3D reconstruction diagram of the welding groove surfaces and its adjacent workpiece surfaces are shown in Figure 11c,d.
From Table 4, it can be concluded that the maximum absolute errors of the transverse deviation e and the height H1 did not exceed 0.04 mm, and the maximum absolute error of the angular deviation γ did not exceed 0.1°. Meanwhile, the maximum relative errors of all parameter detection results were no more than 2%, which indicates that the detection algorithms can be used for accurate detection of the position of the welding torch relative to the welding groove and for its real-time feedback adjustment.
T3: In test 3, a symmetric V-groove was used in the detection test of the camera (characterizing the welding torch) posture parameters (α, β) relative to the welding groove. A dual-axis tilt sensor was placed on the upper surface of the planar workpiece to display the posture parameter values of the planar workpiece in real time.
In the test of T3, the front and rear tilt angle α and the left and right tilt angle β of the V-groove of the planar workpiece were respectively set by the two knobs of the biaxial angle adjusting platform in Figure 10. The reading values of the dual-axis tilt sensor were taken as the measured values of relative posture parameters and the calculated values obtained by vision detection algorithms were taken as the detected values of relative posture parameters. Table 5 shows the measured and detected values of the posture parameters of the camera (characterizing welding torch) relative to the welding groove. The feature extraction diagram of the laser lines image modulated by typical V-groove and the 3D reconstruction diagram of the welding groove surfaces and its adjacent workpiece surfaces are shown in Figure 11e,f.
From Table 5, it can be seen that the detection algorithms can also accurately detect the posture parameters of the camera (characterizing the welding torch) relative to the welding groove. The maximum absolute error of the detected values did not exceed 0.2° and the maximum relative error was no more than 5%, which shows that this research work can fully meet the requirements of detection and control of the relative posture of the welding torch during the arc welding process.
Analyzing the above test results further, the main detection errors of tests 2 and 3 are found to come from the fact that the initial posture of the camera (characterizing the welding torch) relative to the welding groove is not ideally vertical. This leads to coupling among the parameters, so the measured values cannot fully represent the actual values of the parameters when detecting the relative position and posture of the camera (characterizing the welding torch).
In short, based on the processing of a single modulated laser lines image, the derived detection algorithms can effectively realize the integrated detection of the welding groove size parameters (h, b1, b2) and the SPP parameters (e, γ, H and α, β) of the welding torch relative to the welding groove. Their detection accuracy can fully meet the needs of actual welding production, indicating that the algorithms can provide strong support for the on-line detection and feedback control of intelligent welding production.

6. Conclusions

Three main research works were accomplished in this paper. Some valuable conclusions can be drawn as follows:
(1) For the specially designed vision sensor based on combined laser structured lights, an integrated calibration method for the internal parameters of the vision sensor is proposed, which uses only an ordinary planar checkerboard calibration board. The internal parameters (including the camera internal parameters fx, fy, u0, v0, k1, k2 and the laser structured light plane equation parameters Al1, Bl1, Cl1, Dl1 and Al2, Bl2, Cl2, Dl2 in the camera coordinate system) can be calibrated effectively in an integrated manner by the proposed method. This greatly reduces the requirement for high installation accuracy of the two laser transmitters, avoids the influence of non-parallelism between the two laser structured light projection planes on the detection results and eliminates the cumulative error of stepwise calibration. Thus, the proposed integrated calibration method improves the efficiency, accuracy and comprehensiveness of internal parameter calibration for a line structured light vision sensor and provides a good foundation for the industrial application of the vision sensor.
(2) The derived high precision integrated detection algorithms for the V-groove size parameters (groove widths b1, b2 and groove depth h) of a planar workpiece and the SPP parameters (position parameters e, γ, H and posture parameters α, β) of the welding torch relative to the welding groove can be applied under any SPP of the welding torch (or vision sensor). Based on the 3D data of the modulated laser lines obtained by processing a single modulated laser lines image, the algorithms reconstruct the V-groove surfaces and their adjacent workpiece surfaces of the planar workpiece in 3D by data segmentation and plane fitting. This improves the utilization of the modulated laser lines image data and reduces the interference of image processing errors with the parameter detection.
(3) According to the proposed integrated calibration method and the derived high precision integrated detection algorithms, verification tests were carried out. The experimental results show that the derived integrated detection algorithms can be applied under any position and posture of the welding torch (or vision sensor) and have good detection accuracy and robustness, which improves the applicability of the vision sensor and the integration of the detection algorithms. This work has important value for the application of the vision sensor in intelligent welding production.

Author Contributions

C.Z. wrote the paper and conceived and designed the experiments; Z.Z. supervised the overall work and reviewed the paper; Z.K. and T.Z. gave some suggestions about the overall work. All authors have read and agreed to the published version of the manuscript.

Funding

The research was funded by the National Natural Science Foundation of China (grant number 51775301) and the Shunyi District “Beijing Scientific and Technological Achievements Transformation Coordination and Service Platform” construction special project.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The research was financially supported by the National Natural Science Foundation of China (grant number 51775301) and the Shunyi District “Beijing Scientific and Technological Achievements Transformation Coordination and Service Platform” construction special project.

Conflicts of Interest

The authors declare no conflict of interest.

Figure 1. Schematic diagram of vision sensor based on combined laser structured lights.
Figure 2. Schematic diagram of working principle of vision sensor.
Figure 3. Image acquisition process of vision sensor calibration: (a) image acquisition, (b) captured image of checkerboard calibration board and (c) captured image of two laser projection lines.
Figure 4. Image processing flow of modulated laser lines image.
Figure 5. Laser lines image processing and feature extraction: (a) original laser lines image, (b) preprocessing for laser lines image, (c) skeleton thinning processing for ROI, (d) defect repair for ROI of single pixel laser line, (e) Hough line detection for ROI of single pixel laser line and (f) feature extraction of laser lines image.
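For readers who want to reproduce the stripe-extraction chain named in Figures 4 and 5 (preprocessing, skeleton thinning to a single-pixel line, Hough line detection), the following is a minimal sketch. The specific OpenCV calls (`cv2.ximgproc.thinning`, which requires the opencv-contrib package, and `cv2.HoughLinesP`) and all parameter values are illustrative stand-ins, not the authors' exact implementation:

```python
import cv2
import numpy as np

def extract_laser_segments(gray):
    """Illustrative stripe-extraction chain: denoise, binarize,
    thin to a one-pixel skeleton, then detect line segments."""
    blur = cv2.GaussianBlur(gray, (5, 5), 0)
    # Otsu thresholding separates the bright laser stripe from the background.
    _, binary = cv2.threshold(blur, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Skeleton thinning to a single-pixel centerline (opencv-contrib module).
    skeleton = cv2.ximgproc.thinning(binary)
    # Probabilistic Hough transform returns segment endpoints [x1, y1, x2, y2].
    segments = cv2.HoughLinesP(skeleton, rho=1, theta=np.pi / 180,
                               threshold=50, minLineLength=40, maxLineGap=10)
    return segments  # None if no segments are found
```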
Figure 6. The solution flow of the welding groove sizes and welding torch relative position and posture parameters.
Figure 7. Data segmentation of modulated laser lines: (a) 2D data segmentation of laser lines image and (b) 3D data segmentation of laser lines.
Figure 8. Three-dimensional reconstruction of welding groove surfaces and its adjacent workpiece surfaces of planar workpiece based on plane fitting.
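The plane fitting behind the Figure 8 reconstruction can be sketched as a total-least-squares fit of each segmented 3D point cluster: subtract the centroid and take the singular vector with the smallest singular value as the plane normal. This is a generic formulation, not necessarily the authors' exact solver:

```python
import numpy as np

def fit_plane(points):
    """Total-least-squares plane fit to an (N, 3) array of 3D points.
    Returns a unit normal n and offset d such that n . p = d on the plane."""
    centroid = points.mean(axis=0)
    # The right singular vector of the smallest singular value is the
    # direction of least variance, i.e., the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return normal, normal @ centroid

def dihedral_angle_deg(n1, n2):
    """Angle between two fitted planes, e.g., the two V-groove faces."""
    c = abs(n1 @ n2)  # normals from the SVD are unit length
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))
```

Groove size parameters then follow from intersections and angles of the fitted planes rather than from isolated stripe points, which is what makes the detection robust to single-pixel noise.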
Figure 9. SPP of welding torch relative to welding groove.
Figure 10. Parameters detection test platform based on designed vision sensor.
Figure 11. Detection test of welding groove sizes and relative position and posture parameters: (a) feature extraction for asymmetric V-groove image, (b) 3D reconstruction of asymmetric V-groove surfaces and its adjacent workpiece surfaces, (c) feature extraction for V-groove image with position deviation, (d) 3D reconstruction of V-groove surfaces and its adjacent workpiece surfaces with position deviation, (e) feature extraction for V-groove image with posture deviation and (f) 3D reconstruction of V-groove surfaces and its adjacent workpiece surfaces with posture deviation.
Table 1. Component selection and their main parameters of vision sensor.

Component Designation    Model and Main Parameters
Industrial camera        CMOS: MER2-503-23GM; resolution: 2448 × 2048; exposure mode: global shutter; exposure frequency: 23.5 fps; sensor matrix size: 2/3″; pixel size: 3.45 × 3.45 μm
Line laser emitter       Wavelength: 660 nm; lens: glass lens; size: Φ16 × 70 mm; power: 200 mW; focal length: adjustable
Camera lens              Model: Computar M1228-MPW3; focal length: 12 mm; angle of view (D × H × V): 49.3° × 40.3° × 30.8°; working distance: 100 mm to infinity; maximum compatible target size: 2/3″
Filter                   Filter wavelength: 660 ± 8 nm
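As a quick plausibility check on this hardware choice (a rough thin-lens estimate, not a calculation from the paper), the object-space footprint of one camera pixel at stand-off distance Z is roughly Z·p/f, using the Table 1 focal length and pixel pitch and taking the H1 values of Table 4 as representative stand-offs:

```python
# Rough thin-lens estimate of the object-space pixel footprint: Z * p / f.
f_mm = 12.0         # lens focal length (Table 1)
pixel_mm = 0.00345  # 3.45 um pixel pitch (Table 1)
for z_mm in (143.0, 153.0):  # representative stand-offs (H1 in Table 4)
    print(f"Z = {z_mm:.0f} mm -> ~{z_mm * pixel_mm / f_mm:.3f} mm/pixel")
# -> ~0.041 and ~0.044 mm/pixel, so the sub-0.1 mm absolute errors
#    reported in Table 3 are plausible at this geometry.
```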
Table 2. Calibration results of internal parameters for vision sensor.

Internal parameters of the camera:
  fx = 3544, fy = 3543
  u0 = 1239, v0 = 1225
  k1 = −0.0508, k2 = 0.0738
Structured light plane equation parameters of the two laser emitters (laser 1 / laser 2):
  Al1/Al2 = 0.8771/0.8748
  Bl1/Bl2 = 0.1340/0.0044
  Cl1/Cl2 = 0.4801/0.4844
  Dl1/Dl2 = 51.63/71.57
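The calibrated intrinsics and laser-plane coefficients in Table 2 are what turn a stripe pixel into a 3D point. Below is a minimal back-projection sketch, assuming the plane is expressed as A·x + B·y + C·z = D in camera coordinates (the paper's sign convention may differ) and ignoring the k1, k2 lens-distortion correction for brevity:

```python
import numpy as np

# Camera intrinsics from Table 2 (k1, k2 distortion ignored for brevity).
fx, fy, u0, v0 = 3544.0, 3543.0, 1239.0, 1225.0
# Laser plane of emitter 1 (Table 2), assumed as A*x + B*y + C*z = D
# in camera coordinates; the paper's sign convention may differ.
A, B, C, D = 0.8771, 0.1340, 0.4801, 51.63

def stripe_pixel_to_3d(u, v):
    """Intersect the camera ray through pixel (u, v) with the laser plane."""
    ray = np.array([(u - u0) / fx, (v - v0) / fy, 1.0])  # ray direction, z = 1
    t = D / (A * ray[0] + B * ray[1] + C * ray[2])       # ray-plane intersection
    return t * ray  # (x, y, z) in camera coordinates, millimeters

print(stripe_pixel_to_3d(1239, 1225))  # principal point -> [0, 0, ~107.5]
```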
Table 3. V-groove size parameter detection of planar workpiece (three repeated detections per parameter; all values in mm unless given as a percentage).

Symmetric V-groove
  h:  measured 13.065; detected 13.070 / 13.069 / 13.139; mean 13.093; standard deviation 0.033;
      absolute error 0.005 / 0.004 / 0.074 (max 0.074); relative error 0.04% / 0.03% / 0.57% (max 0.57%)
  b1: measured 5.976; detected 5.980 / 5.992 / 5.972; mean 5.981; standard deviation 0.009;
      absolute error 0.004 / 0.016 / 0.004 (max 0.016); relative error 0.07% / 0.27% / 0.07% (max 0.27%)
  b2: measured 6.022; detected 6.002 / 6.011 / 6.051; mean 6.021; standard deviation 0.021;
      absolute error 0.020 / 0.011 / 0.029 (max 0.029); relative error 0.33% / 0.18% / 0.48% (max 0.48%)

Asymmetric V-groove
  h:  measured 12.071; detected 12.134 / 12.137 / 12.140; mean 12.137; standard deviation 0.002;
      absolute error 0.063 / 0.066 / 0.069 (max 0.069); relative error 0.52% / 0.55% / 0.57% (max 0.57%)
  b1: measured 6.996; detected 6.990 / 6.996 / 7.013; mean 7.000; standard deviation 0.010;
      absolute error 0.006 / 0.000 / 0.017 (max 0.017); relative error 0.09% / 0.00% / 0.24% (max 0.24%)
  b2: measured 4.425; detected 4.465 / 4.458 / 4.440; mean 4.454; standard deviation 0.011;
      absolute error 0.040 / 0.033 / 0.015 (max 0.040); relative error 0.90% / 0.75% / 0.34% (max 0.90%)
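For reference, the Table 3 summary statistics follow the standard definitions; a minimal sketch (the population standard deviation, ddof = 0, reproduces the reported values):

```python
import numpy as np

def detection_stats(measured, detected):
    """Combine repeated detections into the Table 3 summary statistics."""
    d = np.asarray(detected, dtype=float)
    abs_err = np.abs(d - measured)
    return {
        "mean_detected": d.mean(),
        "std_dev": d.std(ddof=0),        # population SD matches the table
        "max_abs_err": abs_err.max(),
        "max_rel_err": abs_err.max() / measured,
    }

# Depth h of the asymmetric V-groove (Table 3):
print(detection_stats(12.071, [12.134, 12.137, 12.140]))
# -> mean 12.137, SD ~0.002, max abs error 0.069 mm, max rel error ~0.57%
```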
Table 4. Position parameters detection of camera (characterizing welding torch) relative to welding groove.

  e (mm):  measured 3 / 5 / 8; detected 2.996 / 5.006 / 8.004;
           absolute error 0.004 / 0.006 / 0.004 (max 0.006); relative error 0.13% / 0.12% / 0.05% (max 0.13%)
  γ:       measured 1.6 / 3.2 / 4.8; detected 1.613 / 3.158 / 4.715;
           absolute error 0.013 / 0.042 / 0.085 (max 0.085); relative error 0.81% / 1.31% / 1.77% (max 1.77%)
  H1 (mm): measured 143 / 148 / 153; detected 143.011 / 147.960 / 153.009;
           absolute error 0.011 / 0.040 / 0.009 (max 0.040); relative error 0.01% / 0.03% / 0.01% (max 0.03%)
Table 5. Posture parameters detection of camera (characterizing welding torch) relative to welding groove.

  α: measured −4.94 / −2.56 / 2.82; detected −4.862 / −2.549 / 2.696;
     absolute error 0.078 / 0.011 / 0.124 (max 0.124); relative error 1.58% / 0.43% / 4.40% (max 4.40%)
  β: measured −10.06 / 2.18 / 4.36; detected −10.131 / 2.240 / 4.372;
     absolute error 0.071 / 0.060 / 0.012 (max 0.071); relative error 0.71% / 2.75% / 0.28% (max 2.75%)
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
