Article

Three-Dimensional Scanning of Pipe Inner Walls Based on Line Laser

by Lingyuan Kong 1, Linqian Ma 1, Keyuan Wang 1, Xingshuo Peng 1 and Nan Geng 1,2,3,*

1 College of Information Engineering, Northwest Agriculture and Forestry University, Xianyang 712100, China
2 Key Laboratory of Agricultural Internet of Things, Ministry of Agriculture and Rural Affairs, Northwest A&F University, Xianyang 712100, China
3 Shaanxi Key Laboratory of Agricultural Information Perception and Intelligent Service, Northwest A&F University, Xianyang 712100, China
* Author to whom correspondence should be addressed.
Sensors 2024, 24(11), 3554; https://doi.org/10.3390/s24113554
Submission received: 28 April 2024 / Revised: 25 May 2024 / Accepted: 29 May 2024 / Published: 31 May 2024
(This article belongs to the Section Optical Sensors)

Abstract:
In this study, an innovative laser 3D-scanning technology is proposed for scanning pipe inner walls, addressing the exorbitant expense and operational complexity of current equipment for the 3D data acquisition of pipe inner walls, as well as the difficulty traditional light stripe-center extraction methods have in balancing efficiency and accuracy. The core of this technology comprises a monocular-structured light 3D scanner, an image processing strategy based on tracking speckles, and an improved gray barycenter method. The experimental results demonstrate a 52% reduction in the average standard error of the improved gray barycenter method when compared to the traditional gray barycenter method, along with an 83% decrease in operation time when compared to the Steger method. In addition, the size data of the pipe inner wall obtained using this technology are accurate: the average deviations of the inner diameter and length of the pipe are less than 0.13 mm and 0.41 mm, respectively. In general, the technology not only reduces cost, but also ensures high efficiency and high precision, providing a new and efficient method for the 3D data acquisition of pipe inner walls.

1. Introduction

Pipes play an important role in modern production activities, but defects, damage, and corrosion of the pipe inner wall pose a serious threat to their safe use [1,2]. Conventional pipe inspection methods have traditionally depended on hands-on measurements, which are influenced by factors such as material distortion due to contact and the proficiency of workers, leading to reduced work efficiency and inaccurate measurements [3]. Furthermore, population aging is expected to escalate labor expenses, prompting the industrial inspection sector to seek alternative solutions [4].
As scanning technology progresses, the use of scanners for acquiring 3D pipe data has evolved into a proficient, precise, and nondestructive method [5], offering robust technical assistance in the building and upkeep of pipes. Such scanners include ultrasonic [6], magnetic [7], structured light [8], and capacitive [9] types. These technologies accurately gauge the 3D shape and structure of the pipe's inner wall without direct contact. In contrast to conventional inspection technologies, these methods present notable benefits, including nondestructive detection, enhanced precision, and superior efficiency.
In the realm of 3D shape acquisition, optical-based structured light scanners stand out among diverse technologies for their notable benefits, including high precision, efficiency, adaptability, and affordability [10,11]. The primary components of structured light scanners include a structured light emitter and a camera [12]. After capturing the object’s surface reflection of structured light, the camera, through stereo vision models and image processing methods, swiftly and precisely gathers the object’s 3D shape data for the analysis and evaluation of feature information [13,14]. Due to the low space requirements of monocular-structured light scanning equipment, this is advantageous for scanning in narrow environments, and can easily cope with the space limitations of narrow diameter pipes.
Given the complex manual measurement of narrow diameter pipes, the expensive nature of current apparatus for obtaining the 3D inner wall shapes of pipes, and the difficult equilibrium between speed and precision in conventional light stripe center extraction methods [15], this study introduces a low-cost, efficient, and high-precision 3D-laser-scanning technology for the inner wall of multi-size pipes, which can accurately obtain the key data on the shape and size of the inner wall. This technology is essential for assessing the health of pipes, identifying potential problems, and planning maintenance.

2. Literature Review

The acquisition of the 3D shape of the inner wall of the pipe is an important research direction in the field of pipe inspection. The primary difficulty is found in the exactness of the 3D-scanning devices and in the efficiency and accuracy of the light stripe’s center extraction. A multitude of academics, both domestically and internationally, have engaged in comprehensive studies in this domain, suggesting diverse solutions and pioneering technologies [16].

2.1. 3D-Scanning Technology

Owing to its benefits of being non-contact, highly accurate, and rapid, 3D-scanning technology finds extensive applications across diverse sectors [17,18,19]. In the field of pipe inspection, there has been extensive research on pipe 3D scanning. Shang et al. proposed a single-pass inline pipe 3D-reconstruction method using a depth camera array, which achieved good results in terms of accuracy. However, the size of the measuring equipment designed through this method was difficult to adapt to the inner wall scanning of smaller pipes [20]. Bahnsen et al. studied the 3D scanning of pipes using a single forward-looking RGBD camera. The scanning accuracy was obtained through the use of an active infrared projector or through providing suitable lighting conditions, but the study only achieved centimeter-level accuracy, which was not enough to accurately restore the 3D morphology of the pipe [21]. Yang et al. proposed a pipe 3D-reconstruction method using a 3D-active-stereo omnidirectional-vision scanner, which realized the scanning of the inner wall of a multi-size pipe. However, the measuring equipment designed with this method needed to be in contact with the inner wall of the pipe, which would cause damage to the inner wall of the soft material in the measurement process [22]. In view of the limitations of the above methods in the scanning of the pipe inner wall, a new pipe inner wall-scanning technology based on structured light is proposed in this study.

2.2. Extraction of Light Stripe Center

In linear-structured light 3D-scanning technology, the linear-structured light projected by the emitter is reflected by the target object, and a light stripe appears in the image. The accuracy of the 3D morphology acquisition depends largely on the extraction of the center of the light stripe [23,24]. Initial methods for extracting the center of light stripes predominantly involved the extremum method [25], threshold method [26], and skeleton thinning method [27], among others. These methods rely on the structural features of light stripes to extract centers at the pixel level and are readily influenced by noise. A second class of methods, including the gray barycenter method (GBM) [28], direction template method [29], curve fitting method [30], and Steger method [31], determines the center of structured light stripes based on their gray scale features. However, these methods struggle to balance accuracy, stability, and speed. Consequently, in the past few years, numerous researchers have enhanced traditional methods for extracting the center of light stripes. Cai et al. calculated the normal direction through principal component analysis, followed by determining the sub-pixel central point via second-order Taylor expansion; the technique relies on Gaussian convolution, leading to high computational durations [32]. Liu et al. suggested a center extraction algorithm utilizing the Hessian matrix and region expansion; this method is noted for high accuracy and strong resilience, yet it falls short in real-time efficiency when the stripe width is large [33]. Wang et al. obtained the approximate position of the light stripe center by extracting the skeleton, and then calculated the coordinates of the light stripe center along the skeleton normal direction using the weighted gray barycenter method.
The algorithm achieved significant improvement in both efficiency and extraction effect [34]. However, current light stripe center extraction methods lack accuracy in regions of large curvature and are prone to breakpoints. Consequently, an algorithm tailored for arc-shaped light stripes is essential, one ensuring sub-pixel accuracy and rapid extraction to satisfy the real-time needs of 3D-measurement systems.

2.3. Contributions

The 3D-scanning technology of pipe inner walls faces many challenges, including the demand for high efficiency, high precision, flexible operation, and low equipment cost. In response to these challenges, a new method for the automatic 3D scanning of multi-size pipes using monocular-structured light is proposed. This study’s key innovative contributions include the following:
  • This study proposes an image-processing strategy based on tracking speckles to solve the influence of speckle noise in the image on subsequent light stripe center extractions. The strategy consists of speckle aggregation region extraction, weak speckle grayscale enhancement, and accurate speckle recognition. The problem that the traditional filtering method cannot remove speckles completely is solved by the targeted processing of the speckles.
  • Aiming at the morphological characteristics of arc-shaped light stripes, this study improved the gray barycenter method. On the basis of the traditional gray barycenter method, the center point is modified through fitting a Gaussian curve, and the breakpoint problem in the process of light stripe-center extraction is solved with interpolation based on tangent direction guidance.
  • Utilizing a camera, an annular-structured light emitter, and a mobile control system, this study develops and builds an automatic 3D scanner for the inner wall of multi-size pipes, which enhances the cost efficiency, improves the operational adaptability, and achieves non-contact inner wall detection, providing an effective instrument for the accurate detection of the inner wall of the multi-size pipe.

3. Methodology

3.1. Creation and Assembly of a Monocular Structured Light 3D Scanner

3.1.1. Elements and Operations of the Scanner

As demonstrated in Figure 1a, this research designed and built a monocular-structured light 3D scanner for the automated 3D scanning of multi-size pipes. Figure 1b illustrates that the primary components of the scanner are an annular-structured light emitter, a camera, and a mobile control platform. As depicted in Figure 2a, the annular-structured light emitter primarily consists of a point laser emitter and a tapered mirror with a 90 degree top angle. As shown in Figure 2b, since the top angle of the tapered mirror is 90 degrees, the angle between the two beams of reflected light $l_{up}$ and $l_{down}$ is
$\theta = 2 \cdot \frac{\pi}{4} + 2 \cdot \frac{\pi}{4} = \pi$
Consequently, the surface of an annular-structured light can be represented by an arbitrary plane of three degrees of freedom as follows:
$\omega : A x + B y + C z + D = 0$
In order to improve detection accuracy, the scanner adopts the mode of "sample moving, structured light emitter and camera fixed", and is equipped with a stepper motor to realize translation along the sample axis and control of the image acquisition process, thus ensuring uniform stratification and fast data acquisition. In addition, once the relative position of the structured light emitter and the camera is fixed, pipes of different specifications can be detected without repeatedly adjusting the positions of the emitter and the camera, thus improving the flexibility of the 3D detection of the inner wall of multi-size pipes.

3.1.2. Adjusting the Annular Structure Light Emitter

Installation can lead to a misaligned horizontal alignment between the pipe’s axis and the tapered reflector’s axis, potentially causing errors in the equipment. Figure 3 illustrates that, if the tapered reflector’s axis deviates by an angle value of α from the camera’s optical axis, the resultant light plane strays by an angle value of 2 α from the theoretical standard light plane. Consequently, this results in radial inaccuracies as follows:
$\Delta r = r_{detection} - r_{truth} = \frac{r_{truth}}{\cos 2\alpha} - r_{truth}$
When the deflection angle is large, it adversely affects the accuracy of the subsequent 3D-shape acquisition. To align the axis of the tapered reflector with the axis of the pipe horizontally, this study uses a correction method based on the light plane equation, aiming to reduce radial errors. In the camera coordinate system, the coordinate deviation $c$ between the point cloud centroids of sections at different positions of the standard pipe is calculated, and the relative position of the sample bracket and the camera is adjusted according to $c$ so that $c$ approaches 0, thereby fixing the position of the sample bracket and the camera. In this case, the error caused by the deflection angle between the camera optical axis and the pipe axis can be neglected. When the axis of the tapered reflector is parallel to the axis of the pipe, the normal vector $n$ of the light plane in the camera coordinate system is parallel to the $z$-axis. Therefore, based on the actual parameters of the light plane equation calculated in the camera coordinate system, the annular-structured light emitter is fine-tuned to make the coefficients $A$ and $B$ approach 0, thereby reducing radial errors.
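To give a feel for the magnitudes involved, the radial error formula above can be evaluated numerically. The following is a minimal Python sketch (the paper's implementation is in C++; the pipe radius and tilt angle used here are illustrative assumptions, not values from the paper):

```python
import math

def radial_error(r_truth_mm, alpha_rad):
    """Radial measurement error when the tapered reflector's axis deviates
    by alpha from the camera's optical axis, so the light plane deviates by
    2*alpha: error = r_truth / cos(2*alpha) - r_truth."""
    return r_truth_mm / math.cos(2.0 * alpha_rad) - r_truth_mm

# Illustrative example: a 47.5 mm radius pipe with a 1 degree reflector tilt
err_1deg = radial_error(47.5, math.radians(1.0))
```

Even a 1 degree tilt produces a radial error of a few hundredths of a millimeter, which is on the same order as the sub-0.13 mm deviations reported later, motivating the fine adjustment of the coefficients $A$ and $B$.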
With the adjustment, the relative position relationship between the camera and the annular-structured light emitter can be accurately determined. In addition, by mounting a flip seat at the bottom of the annular-structured light emitter bracket, as shown in Figure 3b, it is possible to ensure that the relative position relationship between the camera and the annular-structured light emitter remains stable after the seat is flipped and reset. This design not only provides convenience for the installation and disassembly of pipes, but also eliminates the need for secondary adjustments when continuously measuring multi-size pipes, thus improving the rigor of the overall operation of the equipment.

3.2. Image Processing Strategy Based on Tracking Speckles

The scattering phenomenon caused by the strong reflectivity of the inner wall material will significantly affect the image quality and produce speckle noise near the light stripe. In this study, an image-processing strategy based on tracking speckles is proposed to accurately identify and process the speckle, thus solving the problem that the traditional filtering method cannot completely remove the speckle, and effectively eliminating the influence of speckle noise on the accuracy of the light stripe-center extraction.

3.2.1. Extraction of Speckle Aggregation Regions

Firstly, for very bright scattered speckles, this study employs a bilateral filtering method based on the spatial distribution of the Gaussian filter function [35]. For each pixel $I(x, y)$ in the image, the value of the pixel after bilateral filtering, $I'(x, y)$, is described as follows:
$I'(x, y) = \frac{1}{W_p} \sum_{(s, t) \in S} I(s, t) \, f(\|p - q\|) \, g(|I_p - I_q|)$
In the above equation,
$f(\|p - q\|) = e^{-\frac{\|p - q\|^2}{2 \sigma_d^2}}$
and
$g(|I_p - I_q|) = e^{-\frac{|I_p - I_q|^2}{2 \sigma_r^2}}$
Among them, $p = (x, y)$ is the current pixel position, $q = (s, t)$ is a pixel position within the neighborhood $S$ around $p$, $\sigma_d$ is the standard deviation of the spatial kernel, $\sigma_r$ is the standard deviation of the range kernel, and $W_p$ is a normalization factor.
Then, a contour recognition method based on image connectivity analysis [36] is used to extract the connected regions in the image, and connected regions whose area is less than a set threshold are determined to be speckles. However, this method cannot accurately extract every speckle region, and some speckles are missed. Therefore, after the initial recognition and extraction of speckles, the K-nearest neighbor algorithm is further adopted to perform cluster analysis on the extracted initial speckles in order to obtain the speckle aggregation region, against which further speckle recognition is carried out. Figure 4 shows the process of extracting the initial speckle region, where the details of the extraction effect are shown in the red square.
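The bilateral filtering formula above can be sketched as follows. This is a deliberately naive Python reference implementation (the paper's implementation is in C++ with OpenCV, where `cv::bilateralFilter` would normally be used; parameter values here are illustrative):

```python
import math

def bilateral_filter(img, sigma_d=1.0, sigma_r=25.0, radius=1):
    """Naive bilateral filter: each output pixel is a weighted average of its
    neighborhood, weighted by a spatial Gaussian f and a range Gaussian g,
    normalized by W_p, so strong edges (large gray differences) are preserved."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for x in range(h):
        for y in range(w):
            wp, acc = 0.0, 0.0
            for s in range(max(0, x - radius), min(h, x + radius + 1)):
                for t in range(max(0, y - radius), min(w, y + radius + 1)):
                    f = math.exp(-((x - s) ** 2 + (y - t) ** 2) / (2 * sigma_d ** 2))
                    g = math.exp(-((img[x][y] - img[s][t]) ** 2) / (2 * sigma_r ** 2))
                    wp += f * g
                    acc += f * g * img[s][t]
            out[x][y] = acc / wp
    return out

flat = [[100.0] * 5 for _ in range(5)]
smooth = bilateral_filter(flat)                      # uniform region unchanged
step = [[0.0, 0.0, 200.0, 200.0, 200.0] for _ in range(5)]
edged = bilateral_filter(step)                       # sharp edge preserved
```

The edge-preserving behavior is what makes the filter suitable near the light stripe: very bright scattered speckles are smoothed within their own intensity range without blurring the stripe boundary.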

3.2.2. Accurate Extraction of Speckles

Due to the low grayscale of some speckle pixels, this study proposes a new method to enhance weak speckle pixels based on the hyperbolic tangent transform. A morphological operation was performed on the extracted speckle aggregation region, and the mean value $avg$ and standard deviation $\sigma$ of the morphologically processed image were calculated. The threshold value $\theta$ was set as follows:
$\theta = avg - \sigma$
Each pixel value $G(x, y)$ in the input image $I$ is mapped according to its relationship to the threshold $\theta$:
$G'(x, y) = \begin{cases} N\left(\frac{1}{2}\left[1 - \tanh\left(\frac{\theta - G(x, y)}{k}\right)\right]\right), & G(x, y) \le \theta \\ N\left(\frac{1}{2}\left[1 + \tanh\left(\frac{G(x, y) - \theta}{k}\right)\right]\right), & G(x, y) > \theta \end{cases}$
where $N(\cdot)$ is a normalization function that normalizes the converted grayscale to the range $[0, 255]$, $\tanh(\cdot)$ is the hyperbolic tangent function, and $k$ is the strength value used for contrast enhancement.
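The hyperbolic tangent mapping above can be sketched in a few lines of Python (the paper's implementation is in C++; the threshold form $avg - \sigma$ and the value of $k$ used here follow the reconstruction above and are illustrative assumptions):

```python
import math

def tanh_enhance(img, k=30.0):
    """Weak-speckle enhancement via the hyperbolic tangent transform: pixels
    above the threshold theta = avg - sigma are pushed toward white, pixels
    below toward black, then the result is normalized to [0, 255]."""
    pix = [v for row in img for v in row]
    n = len(pix)
    avg = sum(pix) / n
    sigma = math.sqrt(sum((v - avg) ** 2 for v in pix) / n)
    theta = avg - sigma
    # note: since tanh is odd, both branches of the piecewise mapping reduce
    # to the same monotone function 127.5 * (1 + tanh((G - theta) / k))
    raw = [[127.5 * (1.0 + math.tanh((v - theta) / k)) for v in row] for row in img]
    lo = min(min(r) for r in raw)
    hi = max(max(r) for r in raw)
    scale = 255.0 / (hi - lo) if hi > lo else 0.0
    return [[(v - lo) * scale for v in row] for row in raw]

sample = [[10.0, 20.0, 200.0], [15.0, 25.0, 210.0], [12.0, 22.0, 205.0]]
enhanced = tanh_enhance(sample)
```

The sigmoid shape of $\tanh$ stretches contrast around $\theta$ while saturating the extremes, which is what lifts dim speckle pixels without blowing out already-bright ones.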
Afterwards, the level set function (LSF) is used to identify the contour of the speckle after image enhancement. The zero level set of the LSF $\phi(x, y, t)$ at time $t$ is expressed as follows:
$C = \{(x, y) : \phi(x, y, t) = 0\}$
The determination of the contour $C$ is translated into the solution of the partial differential equation (PDE) in Equation (10), that is, the evolution equation of the level set [37].
$\frac{\partial \phi}{\partial t} = F |\nabla \phi|$
where $F$ is the speed function that controls the motion of the contour, and $\nabla$ is the gradient operator. In the traditional level set method, the LSF usually develops irregularities during evolution, which leads to numerical errors. While the regularity of the LSF can be maintained through re-initialization as a numerical remedy, re-initialization may mistakenly move the zero level set away from the intended location. In order to maintain the regularity of the LSF without re-initialization, the distance regularization term proposed by Li et al. [38] is used as a new energy term as follows:
$R(\phi) \triangleq \int_\Omega \frac{1}{2} \left( |\nabla \phi| - 1 \right)^2 dx$
where $\Omega$ is the domain of the LSF $\phi : \Omega \to \mathbb{R}$. Define the energy functional as
$E(\phi) = \mu R(\phi) + E_{ext}(\phi)$
where $\mu > 0$ is a constant, $E_{ext}(\phi)$ is the external energy defined by $E_{ext}(\phi) = \lambda L(\phi) + \alpha A(\phi)$, $\lambda > 0$ and $\alpha \in \mathbb{R}$ are coefficients, and the energy functionals $L(\phi)$ and $A(\phi)$ are defined through
$L(\phi) \triangleq \int_\Omega g \, \delta(\phi) \, |\nabla \phi| \, dx$
and
$A(\phi) \triangleq \int_\Omega g \, H(-\phi) \, dx$
where $g \triangleq 1 / (1 + |\nabla G_\sigma * I|^2)$ is the edge indicator function of image $I$ in the domain $\Omega$, $G_\sigma$ is a Gaussian kernel function with standard deviation $\sigma$, and $\delta$ and $H$ are the Dirac delta function and the Heaviside function [39,40], respectively.
According to [38], the energy is designed to reach a minimum when the zero level set of the LSF is at the desired location. According to the variational calculus [41], the energy $E(\phi)$ can be minimized by solving the following PDE, where $\mathrm{div}(\cdot)$ is the divergence operator.
$\frac{\partial \phi}{\partial t} = \mu \, \mathrm{div}\left( \left( 1 - \frac{1}{|\nabla \phi|} \right) \nabla \phi \right) - \frac{\partial E_{ext}}{\partial \phi}$
In edge detection in image processing, the $E_{ext}$ energy is used to describe the edge information, and the energy functional can be minimized through solving for the following gradient flow:
$\frac{\partial \phi}{\partial t} = \mu \, \mathrm{div}\left( \left( 1 - \frac{1}{|\nabla \phi|} \right) \nabla \phi \right) + \lambda \, \delta(\phi) \, \mathrm{div}\left( g \frac{\nabla \phi}{|\nabla \phi|} \right) + \alpha \, g \, \delta(\phi)$
Figure 5 shows the process of speckle recognition after image enhancement based on the hyperbolic tangent transformation.
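To illustrate why the distance regularization term keeps re-initialization unnecessary, the following Python sketch evaluates one explicit Euler step of that term alone, $\mu \, \mathrm{div}((1 - 1/|\nabla\phi|)\nabla\phi)$, with central finite differences (a simplified fragment, not the full gradient flow; grid size, $\mu$, and the epsilon guard are illustrative assumptions):

```python
def drlse_reg_step(phi, mu=0.2, eps=1e-10):
    """Rate of change of phi under the distance-regularization term only:
    d(phi)/dt = mu * div((1 - 1/|grad(phi)|) * grad(phi)).
    Computed on the inner grid with central differences; borders stay zero."""
    h, w = len(phi), len(phi[0])
    def vfield(i, j):
        gx = (phi[i + 1][j] - phi[i - 1][j]) / 2.0
        gy = (phi[i][j + 1] - phi[i][j - 1]) / 2.0
        mag = (gx * gx + gy * gy) ** 0.5
        c = 1.0 - 1.0 / (mag + eps)   # vanishes when |grad(phi)| = 1
        return c * gx, c * gy
    dphi = [[0.0] * w for _ in range(h)]
    for i in range(2, h - 2):
        for j in range(2, w - 2):
            vx_r, _ = vfield(i + 1, j)
            vx_l, _ = vfield(i - 1, j)
            _, vy_u = vfield(i, j + 1)
            _, vy_d = vfield(i, j - 1)
            dphi[i][j] = mu * ((vx_r - vx_l) / 2.0 + (vy_u - vy_d) / 2.0)
    return dphi

# a signed distance function (|grad(phi)| = 1) is a fixed point of this term
phi_sdf = [[float(i)] * 9 for i in range(9)]
rate_sdf = drlse_reg_step(phi_sdf)
# a non-distance LSF (here phi = x^2, so |grad(phi)| = 2x) is driven back
phi_quad = [[float((i + 1) ** 2)] * 9 for i in range(9)]
rate_quad = drlse_reg_step(phi_quad)
```

A signed distance function satisfies $|\nabla\phi| = 1$, so the term is zero and $\phi$ stays regular; any deviation produces a nonzero restoring rate, which is exactly the role Li et al.'s term plays in the full evolution equation.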

3.2.3. Binarization Processing

In order to suppress the influence of speckles and other background regions, this study first assigned the grayscale of speckle pixels obtained based on Section 3.2.2 to 0, and then set a threshold α to binarize the input image I as follows:
$g(x, y) = \begin{cases} I(x, y), & I(x, y) \ge \alpha \\ 0, & I(x, y) < \alpha \end{cases}$

3.3. Improved Gray Barycenter Method

In this study, the light stripe region is first divided into four regions, and the center of the light stripe is then extracted for each region, respectively. Finally, the coordinates of the extracted center points are restored to the original image coordinate system. Building on the traditional gray barycenter method, the improvement consists of two steps: (1) the gray barycenter method optimized via fitting a Gaussian curve, and (2) interpolation guided by the tangent direction.

3.3.1. Optimized Gray Barycenter Method through Fitting Gaussian Curve

In this study, the initial center point of the light stripe is extracted using the traditional gray barycenter method as follows:
$C_0 = \frac{\sum x \, G(x, y)}{\sum G(x, y)}$
According to [42], the center line of the light stripe can be quickly and accurately extracted by using the center point of a fitted Gaussian curve as the center point of the light stripe. The set of input points is $\{(-n, G_{C_0 - n}), (-n+1, G_{C_0 - n + 1}), \ldots, (0, G_{C_0}), \ldots, (n-1, G_{C_0 + n - 1}), (n, G_{C_0 + n})\}$. As shown in Figure 6a, $C_0$ is the initial center point and $G_0$ is the grayscale of $C_0$. Through Gaussian curve fitting of the grayscale distribution of $C_0$ and the pixels on both sides, the corrected center point of the light stripe, $C_0'$, that is, the central coordinate of the Gaussian curve, is obtained. Equation (19) is the corresponding ideal Gaussian curve function, where $a$ is the amplitude, $C_0'$ is the central coordinate of the ideal Gaussian curve, and $w$ characterizes the width of the light stripe.
$G_i = a \, e^{-\frac{(C_i - C_0')^2}{w^2}}$
In this study, the least square method is used for the Gaussian curve fitting. In order to construct the error function, the logarithms of both sides of Equation (19) are first taken, converting it into a polynomial as follows:
$\ln G_i = -\frac{C_i^2}{w^2} + \frac{2 C_0' C_i}{w^2} + \ln a - \frac{C_0'^2}{w^2}$
Let $z_i = \ln G_i$, $b_0 = \ln a - C_0'^2 / w^2$, $b_1 = 2 C_0' / w^2$, and $b_2 = -1 / w^2$; then Equation (20) becomes the polynomial $z_i = b_0 + b_1 C_i + b_2 C_i^2$, and the residual sum of squares is
$M = \sum_{i=1}^{n} \left[ z_i - (b_0 + b_1 C_i + b_2 C_i^2) \right]^2$
By minimizing $M$, $b_0$, $b_1$, and $b_2$ can be solved, so that the transverse coordinate of the corrected center point of the light stripe, that is, the central coordinate of the Gaussian curve, can be determined as follows:
$C_0' = -\frac{b_1}{2 b_2}$
The actual grayscale distribution of the light stripe is shown in Figure 6b. Because the Gaussian curve better simulates the grayscale distribution of the light stripe, the center point obtained via Gaussian curve fitting is more accurate than that of the traditional gray barycenter method, and the interpolation operation will be more accurate and effective in the subsequent processing of light stripe breakpoints.
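The log-domain least-squares fit of Equations (19) and (20) can be sketched as follows. This Python fragment (the paper's implementation is in C++; the sample profile is synthetic) solves the 3x3 normal equations for the quadratic $z = b_0 + b_1 C + b_2 C^2$ and returns $C_0' = -b_1 / (2 b_2)$:

```python
import math

def gaussian_center(cs, gs):
    """Sub-pixel stripe center via Gaussian fitting: take logs of the gray
    profile, least-squares fit a parabola, return its vertex -b1/(2*b2)."""
    zs = [math.log(g) for g in gs]
    # normal equations A.b = r for the quadratic least-squares fit
    s = [sum(c ** k for c in cs) for k in range(5)]
    t = [sum(z * c ** k for c, z in zip(cs, zs)) for k in range(3)]
    A = [[s[0], s[1], s[2]], [s[1], s[2], s[3]], [s[2], s[3], s[4]]]
    r = t[:]
    # Gaussian elimination with partial pivoting
    for col in range(3):
        piv = max(range(col, 3), key=lambda k: abs(A[k][col]))
        A[col], A[piv] = A[piv], A[col]
        r[col], r[piv] = r[piv], r[col]
        for k in range(col + 1, 3):
            fct = A[k][col] / A[col][col]
            for m in range(col, 3):
                A[k][m] -= fct * A[col][m]
            r[k] -= fct * r[col]
    b = [0.0, 0.0, 0.0]
    for col in (2, 1, 0):
        b[col] = (r[col] - sum(A[col][m] * b[m] for m in range(col + 1, 3))) / A[col][col]
    return -b[1] / (2.0 * b[2])

# synthetic profile: amplitude 200, center 3.7, width parameter w = 2
cols = list(range(8))
grays = [200.0 * math.exp(-((c - 3.7) / 2.0) ** 2) for c in cols]
center = gaussian_center(cols, grays)
```

For a noiseless Gaussian profile the fit is exact, so the recovered center matches the true sub-pixel position; on real stripe data the log-domain fit remains a cheap closed-form refinement of the barycenter estimate.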

3.3.2. Interpolation Guided by the Tangent Direction

As shown in Figure 7a, by fitting a line, via the least square method, to each extracted center point of the light stripe and the points in its neighborhood on both sides, the fitted line with equation $A x + B y + C = 0$ can be obtained, and then the slope $k$ of the tangent line at each center point can be extracted. To better illustrate the extraction effect, Figure 7b shows the normals corresponding to the tangent lines of the center points of a light stripe calculated using the above method, and Figure 7c shows the center line of the light stripe after interpolation. According to Equations (23) and (24), the coordinates of the interpolation points along the tangential direction can be calculated to solve the breakpoint problem of the center points in the arc-shaped light stripe, where $t$ is the step length along the tangential direction and $\alpha$ is the angle between the tangent line and the $x$-axis.
$x' = x \pm t \cos\alpha, \quad y' = y \pm t \sin\alpha$
where
$\cos\alpha = \frac{1}{\sqrt{1 + k^2}}, \quad \sin\alpha = \frac{k}{\sqrt{1 + k^2}}$
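Equations (23) and (24) can be sketched directly in Python (the paper's implementation is in C++; the sample points below are illustrative):

```python
def local_slope(points):
    """Least-squares slope k of a line fitted to a small neighborhood of
    center points (x, y)."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    num = sum((x - mx) * (y - my) for x, y in points)
    den = sum((x - mx) ** 2 for x, _ in points)
    return num / den

def tangent_interpolate(p, k, t):
    """Step distance t forward and backward along the tangent of slope k
    from center point p, using cos(a) = 1/sqrt(1+k^2), sin(a) = k/sqrt(1+k^2)."""
    norm = (1.0 + k * k) ** 0.5
    cos_a, sin_a = 1.0 / norm, k / norm
    x, y = p
    return (x + t * cos_a, y + t * sin_a), (x - t * cos_a, y - t * sin_a)

pts = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]   # lie on y = 2x + 1
k = local_slope(pts)
fwd, back = tangent_interpolate((1.0, 3.0), k, 5 ** 0.5)
```

For collinear center points the interpolated points stay exactly on the fitted line; on an arc, the short tangential steps bridge breakpoints while closely following the local stripe direction.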

3.4. Point Cloud Generation

3.4.1. Monocular Line-Structured Light 3D-Reconstruction Model

This study is based on a monocular line-structured light 3D-measurement model, which obtains 3D coordinates from the pixel coordinates of the centers of the light stripes. As shown in Figure 8, in this model, the light plane $P_{AB}$, generated by the annular line-structured light emitter, forms circular light stripes on the inner wall of the pipe, where $P$ is a point on the circular light stripe, $P'$ is the imaging point of $P$ on the imaging plane, and $P''$ is the imaging point of $P$ on the normalized plane.
$P(x_c, y_c, z_c)$ is the intersection point between the straight line $O_c P'$ and the light plane $P_{AB}$. Since $P_{uv}(u, v)$ is known in the pixel coordinate system, the coordinates of this point in the image coordinate system, $P'(x, y)$, can be obtained as shown in the following equation:
$x = (u - u_0) \, d_x, \quad y = (v - v_0) \, d_y$
Here, $(u_0, v_0)$ is the coordinate of the origin of the image coordinate system in the pixel coordinate system, $d_x$ represents the actual physical size of a unit pixel in the $u$ direction, and $d_y$ represents the actual physical size of a unit pixel in the $v$ direction. Since the distance from the image plane to the camera origin is the focal length $f$, transforming $P'(x, y)$ to the camera coordinate system results in $P'(x, y, f)$. Considering $O_c$ as the origin of the camera coordinate system, the parametric equation of the line $O_c P'$ is as follows:
$x_c = x t, \quad y_c = y t, \quad z_c = f t$
As shown in Figure 9, this study calibrates the camera using the classic Zhang Zhengyou calibration method by adjusting the checkerboard calibration board placed at multiple different angles [43]. Simultaneously calibrating the camera, the light plane generated by the annular line-structured emitter intersects with the blank area of the checkerboard calibration board to form a linear light stripe. The light plane equation is fitted based on the classical least square method via the linear light strips intersected with the calibration plates at different angles. The fitted light plane equation is shown in Equation (2). For ease of calculation, assuming the light plane equation obtained through fitting is as shown in the formula, then
$z_c = a x_c + b y_c + c$
The coordinates of points ( x c , y c , z c ) on the measured pipe in the camera coordinate system can be obtained from Equations (25)–(27), as shown in the following formula:
$x_c = \frac{c \, d_x (u - u_0)}{f - a \, d_x (u - u_0) - b \, d_y (v - v_0)}, \quad y_c = \frac{c \, d_y (v - v_0)}{f - a \, d_x (u - u_0) - b \, d_y (v - v_0)}, \quad z_c = \frac{c f}{f - a \, d_x (u - u_0) - b \, d_y (v - v_0)}$
Here, $d_x$, $d_y$, and $f$ are the camera's own parameters and cannot be obtained individually through camera calibration (calibration only recovers the ratios $f / d_x$ and $f / d_y$). To enhance the universality of the method, the transformation relationship between pixel coordinates and camera coordinates is constructed in this study as shown in Equation (29), where $M_1$ is the intrinsic parameter matrix obtained through camera calibration.
$z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} \frac{f}{d_x} & 0 & u_0 \\ 0 & \frac{f}{d_y} & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = M_1 \begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix}$
From Equation (29) above, it can be inferred that the coordinates ( x c , y c , z c ) in the camera coordinate system corresponding to a certain pixel ( u , v ) on the image are as shown in the following equation:
$\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = z_c M_1^{-1} \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}$
To eliminate the influence of the variable $z_c$ in Equation (30), a normalization plane is established. The image points of $P$ in the image plane and the normalized plane are $P'$ and $P''$, respectively. Because the pixel coordinates of $P'$ are $(u, v)$, the coordinates of $P''(x'', y'', 1)$ in the camera coordinate system are as follows:
$\begin{bmatrix} x'' \\ y'' \\ 1 \end{bmatrix} = M_1^{-1} \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}$
$P(x_c, y_c, z_c)$ can be regarded as the intersection point between the line $O_c P''$ and the light plane $P_{AB}$. Repeating the above line and plane intersection method yields the following coordinates:
$x_c = \frac{c x''}{1 - a x'' - b y''}, \quad y_c = \frac{c y''}{1 - a x'' - b y''}, \quad z_c = \frac{c}{1 - a x'' - b y''}$
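The line and plane intersection above reduces to a few arithmetic operations per pixel. The following Python sketch (the paper's implementation is in C++ with OpenCV; intrinsics and plane coefficients below are illustrative assumptions, and the intrinsic matrix is passed as its four nonzero entries rather than inverted explicitly) maps a stripe pixel to camera coordinates and checks it by a round trip through projection:

```python
def pixel_to_camera(u, v, fx, fy, u0, v0, a, b, c):
    """Intersect the viewing ray of pixel (u, v) with the light plane
    z = a*x + b*y + c. fx = f/dx, fy = f/dy are the intrinsic focal terms."""
    xpp = (u - u0) / fx          # normalized-plane point P'' = M1^{-1}[u, v, 1]
    ypp = (v - v0) / fy
    denom = 1.0 - a * xpp - b * ypp
    zc = c / denom               # depth along the ray from the plane equation
    return xpp * zc, ypp * zc, zc

# round-trip check with assumed intrinsics and plane coefficients
fx = fy = 2000.0
u0, v0 = 1236.0, 467.0
a, b, c = 0.0001, -0.0002, 472.75
xc, yc = 10.0, -5.0
zc = a * xc + b * yc + c                     # pick a 3D point on the plane
u = fx * xc / zc + u0                        # project it to a pixel
v = fy * yc / zc + v0
X, Y, Z = pixel_to_camera(u, v, fx, fy, u0, v0, a, b, c)
```

Because the stripe pixel is known to lie on the calibrated light plane, a single view suffices to recover depth, which is the core of the monocular-structured light model.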

3.4.2. Pipe Inner Wall Reconstruction

As shown in Figure 9, during the scanning process, the pipe is translated along the pipe axis, and the annular-structured light is continuously projected to form a series of circular light stripes at different positions on the inner wall of the pipe. Based on Section 3.4.1, the point cloud data of a single section of the inner wall can be obtained using the parameters from the system calibration and the two-dimensional coordinates of the center points of the light stripe [44]. Each cross-section point cloud represents the cross-section shape of the inner wall at a specific position. Since the pipe axis is parallel to the camera optical axis, the pipe axis is parallel to the $z$-axis in the camera coordinate system. According to the preset movement step $k$ of the scanning stage, each section's point cloud is translated along the $z$-axis by the corresponding multiple of $k$, thus achieving a complete point cloud reconstruction of the inner wall of the pipe. The converted coordinates are shown in Equation (33).
$x_i' = x_i, \quad y_i' = y_i, \quad z_i' = z_i + k$

4. Analysis of Results

This study conducts a thorough evaluation of the proposed 3D-laser-scanning technology for pipe inner walls through rigorously designed experiments. The performances of the improved gray barycenter method, the traditional gray barycenter method, the Steger method, and Wang's method are compared and analyzed in terms of accuracy and processing speed. Furthermore, the accuracy and stability of the proposed technology are validated through conducting 3D scans and acquiring the 3D morphologies of pipes of various sizes. The processing algorithms in this study are implemented in C++ using the OpenCV and PCL libraries.

4.1. Experimental Setup

In order to avoid the influence of ambient light and brightness saturation on scanning, the experiments are carried out in a dark room, and the monocular line-structured light 3D scanner is equipped with a 6-megapixel black and white industrial camera and a laser emitter which projects light stripes of 1 mm width, as shown in Figure 10. The test pipe is placed on a sample bracket, and an annular-structured light is projected onto the inner wall of the pipe, forming light stripes. During the experiment, the acquisition frame rate is set to eight frames per second, and the acquired image size is set to 2472 × 934 pixels. The scanning step of the moving platform is 0.50 mm. The annular-structured light emitter is finely adjusted based on the coefficients of the light plane equation to make the normal vector of the light plane close to parallel with the pipe axis and the camera optical axis. The camera calibration results based on Zhang's calibration method are shown in Table 1. After multiple adjustments, the final fitted light plane equation is Equation (34).
$0.000121087 x + 0.000227188 y - z + 472.753 = 0$

4.2. Evaluation of the Extraction of the Light Stripe Center

In order to evaluate the performance of the improved gray barycenter method, this study conducted comparative experiments with the traditional gray barycenter method, the Steger method, and Wang’s method. Three standard pipes with different inner diameters were selected for the experiments, and images of light stripes at various positions on the inner walls of the pipes were captured to form the experimental dataset, thus ensuring the practical applicability evaluation of the methods. Subsequently, these four methods were applied to extract the centers of the light stripes from the images in the dataset, and the comparative diagrams (as shown in Figure 11) and residual comparison table (as shown in Table 2) of the effects of the four methods in extracting the centers of the light stripes were presented.
From the comparison in Figure 11, it is evident that although the Steger method identifies the center of the light stripes well overall, it has poor sensitivity, especially in Figure 11a. Because it does not analyze the geometric properties of the light stripes, the gray barycenter method is limited in extracting the stripe centers, leading to noticeable discontinuities. Wang's method weakens the breakpoint phenomenon through normal weighting, but breakpoints remain. In contrast, the improved gray barycenter method not only provides more accurate stripe centers in arc-shaped regions but also maintains good continuity. Regarding runtime, as shown in Table 2, the traditional gray barycenter method takes the least time owing to its simple calculations, the Steger method takes the longest because it involves more complex mathematical operations, and Wang's method ranks in the middle. Although the improved gray barycenter method increases the computational burden, its running time remains low, second only to the traditional gray barycenter method. The method thus balances efficiency and accuracy, ensuring its feasibility in practical applications and its value in industrial detection and other time-sensitive scenarios.
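For reference, the traditional gray barycenter baseline computes, for each image column, the intensity-weighted mean row index of the stripe cross-section. The following is a minimal sketch on a synthetic Gaussian profile, not the authors' implementation:

```python
import numpy as np

def gray_barycenter(column):
    """Traditional gray barycenter: intensity-weighted mean row index."""
    rows = np.arange(len(column), dtype=float)
    return float((rows * column).sum() / column.sum())

# Synthetic stripe cross-section: Gaussian profile centered at row 10.3.
rows = np.arange(21, dtype=float)
true_center = 10.3
profile = 255.0 * np.exp(-(rows - true_center) ** 2 / (2 * 2.0 ** 2))

center = gray_barycenter(profile)  # close to 10.3 for a symmetric profile
```

On a clean, symmetric profile this estimate is essentially exact; the discontinuities seen in Figure 11 arise when speckle noise or the arc-shaped stripe geometry makes the cross-section asymmetric, which is what the Gaussian-curve correction in the improved method addresses.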

4.3. Evaluation of the Accuracy of Local Measurements

To verify the accuracy of point cloud extraction, a set of standard ring gauges with known inner diameters (95 mm, 105 mm, and 115 mm) was used as the experimental benchmark for local accuracy. As shown in Figure 12, the image captured by the industrial camera was first preprocessed and the center line of the light stripe was accurately extracted. On this basis, the monocular-structured light 3D-reconstruction model in Section 3.4.1 was used to calculate and generate the corresponding point cloud data. As shown in Table 3, the comparative analysis reveals a very small deviation between the measured values and the reference values of the standard ring gauges: the average deviation is 0.0833 mm, and the root mean square error (RMSE) is 0.0839 mm. This result demonstrates that the error caused by the coordinate conversion is strictly limited to a very small range, ensuring the accuracy and stability of the subsequent 3D-surface-morphology acquisition.
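The reported statistics follow directly from the ring-gauge values in Table 3; a short check of the average deviation and RMSE:

```python
import math

# Reference and measured inner diameters of the ring gauges (Table 3, mm).
reference = [95.00, 105.00, 115.00]
measured = [94.91, 104.91, 114.93]

errors = [abs(m - r) for m, r in zip(measured, reference)]
mean_dev = sum(errors) / len(errors)                        # ~0.0833 mm
rmse = math.sqrt(sum(e * e for e in errors) / len(errors))  # ~0.0839 mm
```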

4.4. Evaluation of the Accuracy of 3D Surface Morphology Acquisition on Pipe Inner Walls

In this experiment, the inner walls of pipes of four different sizes were scanned to obtain point cloud data, and the reconstructed 3D morphology of the pipe inner walls is shown in Figure 13. The point cloud data were processed using a cylindrical dimension analysis method based on cylinder fitting to accurately extract the inner diameters, which were taken as the measured values. Because the program controls the stepper motor and belt that drive the carriage forward, and the backlash of all moving parts had been compensated and could therefore be ignored, the travel distance recorded by the program directly corresponds to the scanned length; the accumulated travel was taken as the measured pipe length. To ensure the accuracy and reliability of the experiment, each pipe was measured five times with a vernier caliper with an accuracy of 0.02 mm, and the average was taken as the reference value. The experimental measurements were then compared with the reference values to evaluate the accuracy of the obtained 3D morphology. The experimental results in Tables 4 and 5 show that the average deviation between the measured inner diameters of the four pipes and the reference values remains below 0.13 mm, with an RMSE of 0.1344 mm. Similarly, the average deviation between the measured lengths and the reference values remains within an acceptable range of less than 0.41 mm, with an RMSE of 0.4193 mm, limited by the precision of the lead screw. High-precision 3D morphology can also be obtained for small-diameter pipes of less than 70 mm. These results indicate that the laser 3D-scanning technology proposed in this study is applicable not only to the detection of ordinary pipes but also maintains high accuracy in narrow pipes.
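The paper does not detail the cylinder-fitting step, but its per-cross-section core can be illustrated with a least-squares circle fit (the Kåsa method) on one ring of points. This is a hedged sketch on synthetic data for pipe (a) of Table 4, not the authors' implementation:

```python
import numpy as np

def fit_circle(x, y):
    """Kasa least-squares circle fit.

    Solves A*x + B*y + C = x^2 + y^2 in the least-squares sense; the
    circle center is (A/2, B/2) and the radius is sqrt(C + A^2/4 + B^2/4).
    """
    M = np.column_stack([x, y, np.ones_like(x)])
    rhs = x ** 2 + y ** 2
    (A, B, C), *_ = np.linalg.lstsq(M, rhs, rcond=None)
    cx, cy = A / 2.0, B / 2.0
    r = np.sqrt(C + cx ** 2 + cy ** 2)
    return cx, cy, r

# Synthetic cross-section: inner diameter 106.94 mm with mild point noise.
rng = np.random.default_rng(0)
theta = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
r_true = 106.94 / 2.0
x = r_true * np.cos(theta) + 0.02 * rng.standard_normal(theta.size)
y = r_true * np.sin(theta) + 0.02 * rng.standard_normal(theta.size)

_, _, r_fit = fit_circle(x, y)
diameter = 2.0 * r_fit  # recovers ~106.94 mm despite the noise
```

Fitting a full cylinder additionally estimates the axis direction, but once the cloud is resliced perpendicular to that axis, the diameter estimate reduces to circle fits of this form.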

5. Conclusions

To improve the accuracy and operational flexibility of pipe inspections while reducing overall costs, this study designed and constructed a monocular line-structured light 3D scanner consisting of a camera, an annular-structured light emitter, and a mobile control platform. Additionally, an image-processing strategy based on tracking speckles was proposed. Through accurate speckle processing, it overcomes the disadvantage of traditional filtering methods, which struggle to completely eliminate speckles, thus reducing the interference of speckle noise with the accuracy of light stripe-center positioning. To address the morphological characteristics of arc-shaped light stripes, the traditional gray barycenter method was extended: the center point is corrected by fitting a Gaussian curve, and the breakpoint problem is solved through interpolation based on tangent guidance, which improves the accuracy of light stripe-center extraction while preserving extraction speed.
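The Gaussian-curve correction mentioned above can be sketched, under simplifying assumptions, as a log-parabola refinement of the peak sample and its two neighbors; this three-point form is exact for a pure Gaussian profile, though the authors' full curve fit may differ in detail:

```python
import numpy as np

def gaussian_refine(profile):
    """Sub-pixel peak location via a parabola fitted to the log-intensities
    of the peak sample and its two neighbors (exact for a pure Gaussian)."""
    p = int(np.argmax(profile))
    l0, l1, l2 = np.log(profile[p - 1:p + 2])
    delta = 0.5 * (l0 - l2) / (l0 - 2.0 * l1 + l2)
    return p + delta

# Noiseless Gaussian stripe cross-section centered at row 9.7.
rows = np.arange(21, dtype=float)
true_center = 9.7
profile = 200.0 * np.exp(-(rows - true_center) ** 2 / (2 * 1.8 ** 2))

center = gaussian_refine(profile)  # recovers 9.7 for noiseless data
```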
In order to evaluate the performance of the proposed laser-based 3D-scanning technology and the improved gray barycenter method, this study utilized a constructed 3D scanner to measure the inner walls of pipes of different sizes. It assessed the accuracy and operation time of the improved gray barycenter method, along with the accuracy of laser-based 3D-scanning technology.
The results show that, compared with the traditional gray barycenter method, the standard error of light stripe-center extraction is reduced by 52%, and, compared with the Steger method, the average operating time is reduced by 83%. In addition, compared with Wang's method, the standard error of light stripe-center extraction is reduced by 7%, and the average operating time is reduced by 37%. The average deviations of the pipe inner diameter and length obtained through the laser 3D-scanning technology are less than 0.13 mm and 0.41 mm, respectively, demonstrating sufficient measurement accuracy.
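These percentages can be cross-checked against the per-image figures in Table 2; averaging the per-image relative reductions reproduces them (whether the authors averaged in exactly this way is an assumption):

```python
# Standard error (pixel) and runtime (ms) for the three test images (Table 2).
sigma = {"Steger": [0.40, 0.60, 0.64], "GBM": [0.95, 1.33, 1.36],
         "Wang": [0.46, 0.68, 0.76], "Ours": [0.44, 0.63, 0.68]}
time_ms = {"Steger": [171.13, 74.02, 50.83], "GBM": [17.21, 7.93, 6.21],
           "Wang": [41.45, 20.13, 14.22], "Ours": [23.62, 12.35, 9.96]}

def avg_reduction(ours, baseline):
    """Mean per-image relative reduction of `ours` vs. `baseline`, in percent."""
    return 100.0 * sum((b - o) / b for o, b in zip(ours, baseline)) / len(ours)

sigma_vs_gbm = avg_reduction(sigma["Ours"], sigma["GBM"])           # ~52%
time_vs_steger = avg_reduction(time_ms["Ours"], time_ms["Steger"])  # ~83%
sigma_vs_wang = avg_reduction(sigma["Ours"], sigma["Wang"])         # ~7%
time_vs_wang = avg_reduction(time_ms["Ours"], time_ms["Wang"])      # ~37%
```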
This study provides an efficient and accurate technology for acquiring the 3D morphology of the inner walls of pipes, which is expected to play a significant role in pipe engineering construction and maintenance. However, in future applications, consideration needs to be given to how to optimize the technology to adapt to more complex and harsh pipe environments. This can be achieved through introducing more advanced mechanical designs and control algorithms in order to improve the platform’s motion accuracy and response speed, thus ensuring efficient and stable operation in various environments.

Author Contributions

Conceptualization, methodology, software and writing—original draft preparation, L.K.; software and validation, L.M.; visualization, formal analysis, and data curation, K.W.; investigation, X.P.; writing—review and editing, resources, supervision, project administration, and funding acquisition, N.G. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Key Research and Development Program of Shaanxi (Grant No. 2019ZDLNY07-06-01).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

The authors thank the teachers and students for all their help.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Liu, X.; Zheng, J.; Fu, J.; Nie, Z.; Chen, G. Optimal inspection planning of corroded pipelines using BN and GA. J. Pet. Sci. Eng. 2018, 163, 546–555. [Google Scholar] [CrossRef]
  2. Piciarelli, C.; Avola, D.; Pannone, D.; Foresti, G.L. A vision-based system for internal pipeline inspection. IEEE Trans. Ind. Inf. 2018, 15, 3289–3299. [Google Scholar] [CrossRef]
  3. Buschinelli, P.; Pinto, T.; Silva, F.; Santos, J.; Albertazzi, A. Laser triangulation profilometer for inner surface inspection of 100 millimeters (4″) nominal diameter. J. Phys. Conf. Ser. 2015, 648, 12010. [Google Scholar] [CrossRef]
  4. Gao, Y. Mathematical Modeling of Pipeline Features for Robotic Inspection. Ph.D. Thesis, Louisiana Tech University, Ruston, LA, USA, 2012. [Google Scholar]
  5. Yokota, M.; Koyama, T.; Takeda, K. Digital holographic inspection system for the inner surface of a straight pipe. Opt. Lasers Eng. 2017, 97, 62–70. [Google Scholar] [CrossRef]
  6. Song, S.; Ni, Y. Ultrasound imaging of pipeline crack based on composite transducer array. Chin. J. Mech. Eng. 2018, 31, 1–10. [Google Scholar] [CrossRef]
  7. Heo, C.G.; Im, S.H.; Jeong, H.S.; Cho, S.H.; Park, G.S. Magnetic hysteresis analysis of a pipeline re-inspection by using preisach model. IEEE Trans. Magn. 2020, 56, 1–4. [Google Scholar] [CrossRef]
  8. Ye, Z.; Lianpo, W.; Yonggang, G.; Songlin, B.; Chao, Z.; Jiang, B.; Ni, J. Three-dimensional inner surface inspection system based on circle-structured light. J. Manuf. Sci. Eng. 2018, 140, 121007. [Google Scholar] [CrossRef]
  9. Demori, M.; Ferrari, V.; Strazza, D.; Poesio, P. A capacitive sensor system for the analysis of two-phase flows of oil and conductive water. Sens. Actuators A Phys. 2010, 163, 172–179. [Google Scholar] [CrossRef]
  10. Dong, Y.; Fang, C.; Zhu, L.; Yan, N.; Zhang, X. The calibration method of the circle-structured light measurement system for inner surfaces considering systematic errors. Meas. Sci. Technol. 2021, 32, 75012. [Google Scholar] [CrossRef]
  11. Javaid, M.; Haleem, A.; Singh, R.P.; Suman, R. Industrial perspectives of 3D scanning: Features, roles and it’s analytical applications. Sens. Int. 2021, 2, 100114. [Google Scholar] [CrossRef]
  12. Guerra, M.G.; De Chiffre, L.; Lavecchia, F.; Galantucci, L.M. Use of miniature step gauges to assess the performance of 3D optical scanners and to evaluate the accuracy of a novel additive manufacture process. Sensors 2020, 20, 738. [Google Scholar] [CrossRef] [PubMed]
  13. Almaraz-Cabral, C.; Gonzalez-Barbosa, J.; Villa, J.; Hurtado-Ramos, J.; Ornelas-Rodriguez, F.; Córdova-Esparza, D. Fringe projection profilometry for panoramic 3D reconstruction. Opt. Lasers Eng. 2016, 78, 106–112. [Google Scholar] [CrossRef]
  14. Li, Y.; Zhou, J.; Huang, F. High precision calibration of line structured light sensors based on linear transformation over triangular domain. In Proceedings of the 8th International Symposium on Advanced Optical Manufacturing and Testing Technologies: Optical Test, Measurement Technology, and Equipment, Suzhou, China, 26–29 April 2016; SPIE: Bellingham, WA, USA, 2016; Volume 9684, pp. 40–46. [Google Scholar]
  15. Coramik, M.; Ege, Y. Discontinuity inspection in pipelines: A comparison review. Measurement 2017, 111, 359–373. [Google Scholar] [CrossRef]
  16. Sutton, M.A.; McNeill, S.R.; Helm, J.D.; Chao, Y.J. Advances in two-dimensional and three-dimensional computer vision. In Photomechanics; Springer: Berlin/Heidelberg, Germany, 2000; pp. 323–372. [Google Scholar]
  17. Zhang, Z.; Yuan, L. Building a 3D scanner system based on monocular vision. Appl. Opt. 2012, 51, 1638–1644. [Google Scholar] [CrossRef] [PubMed]
  18. Haleem, A.; Javaid, M. 3D scanning applications in medical field: A literature-based review. Clin. Epidemiol. Glob. Health 2019, 7, 199–210. [Google Scholar] [CrossRef]
  19. Vázquez-Arellano, M.; Griepentrog, H.W.; Reiser, D.; Paraforos, D.S. 3-D imaging systems for agricultural applications—A review. Sensors 2016, 16, 618. [Google Scholar] [CrossRef] [PubMed]
  20. Shang, Z.; Shen, Z. Single-pass inline pipeline 3D reconstruction using depth camera array. Autom. Constr. 2022, 138, 104231. [Google Scholar] [CrossRef]
  21. Bahnsen, C.H.; Johansen, A.S.; Philipsen, M.P.; Henriksen, J.W.; Nasrollahi, K.; Moeslund, T.B. 3d sensors for sewer inspection: A quantitative review and analysis. Sensors 2021, 21, 2553. [Google Scholar] [CrossRef] [PubMed]
  22. Yang, Z.; Lu, S.; Wu, T.; Yuan, G.; Tang, Y. Detection of morphology defects in pipeline based on 3D active stereo omnidirectional vision sensor. IET Image Process 2018, 12, 588–595. [Google Scholar] [CrossRef]
  23. Forest, J.; Salvi, J.; Cabruja, E.; Pous, C. Laser stripe peak detector for 3D scanners: A FIR filter approach. In Proceedings of the 17th International Conference on Pattern Recognition, 2004, ICPR 2004, Cambridge, UK, 23–26 August 2004; IEEE: Piscataway, NJ, USA, 2004; Volume 3, pp. 646–649. [Google Scholar]
  24. Sun, Q.; Chen, J.; Li, C. A robust method to extract a laser stripe centre based on grey level moment. Opt. Lasers Eng. 2015, 67, 122–127. [Google Scholar] [CrossRef]
  25. Bazen, A.M.; Gerez, S.H. Systematic methods for the computation of the directional fields and singular points of fingerprints. IEEE Trans. Pattern Anal. Mach. Intell. 2002, 24, 905–919. [Google Scholar] [CrossRef]
  26. Bo, J.; Xianyu, S.; Lurong, G. 3D measurement of turbine blade profile by light knife. Chin. J. Lasers A 1992, 19. [Google Scholar] [CrossRef]
  27. Fan, J.; Jing, F.; Fang, Z.; Liang, Z. A simple calibration method of structured light plane parameters for welding robots. In Proceedings of the 2016 35th Chinese Control Conference (CCC), Chengdu, China, 27–29 July 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 6127–6132. [Google Scholar]
  28. Wang, H.; Wang, Y.; Zhang, J.; Cao, J. Laser stripe center detection under the condition of uneven scattering metal surface for geometric measurement. IEEE Trans. Instrum. Meas. 2019, 69, 2182–2192. [Google Scholar] [CrossRef]
  29. Pang, S.; Yang, H. An algorithm for extracting the center of linear structured light fringe based on directional template. In Proceedings of the 2021 4th International Conference on Advanced Electronic Materials, Computers and Software Engineering (AEMCSE), Changsha, China, 26–28 March 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 203–207. [Google Scholar]
  30. Xue, B.; Chang, B.; Peng, G.; Gao, Y.; Tian, Z.; Du, D.; Wang, G. A vision based detection method for narrow butt joints and a robotic seam tracking system. Sensors 2019, 19, 1144. [Google Scholar] [CrossRef] [PubMed]
  31. Zhang, W.; Cao, N.; Guo, H. Novel sub-pixel feature point extracting algorithm for three-dimensional measurement system with linear-structure light. In Proceedings of the 5th International Symposium on Advanced Optical Manufacturing and Testing Technologies: Optical Test and Measurement Technology and Equipment, Dalian, China, 26–29 April 2010; SPIE: Bellingham, WA, USA, 2010; Volume 7656, pp. 936–941. [Google Scholar]
  32. Cai, H.; Feng, Z.; Huang, Z. Centerline extraction of structured light stripe based on principal component analysis. Chin. J. Lasers 2015, 42, 10–3788. [Google Scholar]
  33. Liu, J.; Liu, L. Laser stripe center extraction based on hessian matrix and regional growth. Laser Optoelectron. Prog. 2019, 56, 21203. [Google Scholar]
  34. Wang, J.; Wu, J.; Jiao, X.; Ding, Y. Research on the center extraction algorithm of structured light fringe based on an improved gray gravity center method. J. Intell. Syst. 2023, 32, 20220195. [Google Scholar] [CrossRef]
  35. Piao, W.; Yuan, Y.; Lin, H. A digital image denoising algorithm based on gaussian filtering and bilateral filtering. ITM Web Conf. 2018, 17, 1006. [Google Scholar] [CrossRef]
  36. Zhao, Y.Q.; Wang, X.F.; Li, G.Y. Liver image segmentation based on multi-scale and multi-structure elements. J. Optoelectron. Laser 2009, 20, 563–566. [Google Scholar]
  37. Malladi, R.; Sethian, J.A.; Vemuri, B.C. Shape modeling with front propagation: A level set approach. IEEE Trans. Pattern Anal. Mach. Intell. 1995, 17, 158–175. [Google Scholar] [CrossRef]
  38. Li, C.; Xu, C.; Gui, C.; Fox, M.D. Distance regularized level set evolution and its application to image segmentation. IEEE Trans. Image Process 2010, 19, 3243–3254. [Google Scholar] [CrossRef] [PubMed]
  39. Osher, S.; Fedkiw, R.; Piechor, K. Level set methods and dynamic implicit surfaces. Appl. Mech. Rev. 2004, 57, B15. [Google Scholar] [CrossRef]
  40. Peng, D.; Merriman, B.; Osher, S.; Zhao, H.; Kang, M. A PDE-based fast local level set method. J. Comput. Phys. 1999, 155, 410–438. [Google Scholar] [CrossRef]
  41. Aubert, G.; Kornprobst, P.; Aubert, G. Mathematical Problems in Image Processing: Partial Differential Equations and the Calculus of Variations; Springer: Berlin/Heidelberg, Germany, 2006. [Google Scholar]
  42. Fasogbon, P.; Duvieubourg, L.; Macaire, L. Fast laser stripe extraction for 3D metallic object measurement. In Proceedings of the IECON 2016—42nd Annual Conference of the IEEE Industrial Electronics Society, Florence, Italy, 24–27 October 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 923–927. [Google Scholar]
  43. Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334. [Google Scholar] [CrossRef]
  44. Huang, Z.; Zhao, S.; Qi, P.; Li, J.; Wang, H.; Li, X.; Zhu, F. A high-accuracy measurement method for shield tail clearance based on line structured light. Measurement 2023, 222, 113583. [Google Scholar] [CrossRef]
Figure 1. Monocular line-structured light scanner structure. (a) Physical diagram. (b) Structural diagram.
Figure 2. Structure of the annular line-structured light emitter. (a) Structural diagram. (b) Schematic diagram.
Figure 3. Design of equipment adjustment. (a) Analysis of errors generated by annular-structured light emitters. (b) Structural diagram in case of flipping.
Figure 4. The extraction of initial speckle regions. (a) Input image. (b) Image after bilateral filtering. (c) Initial extraction of speckles. (d) Extraction of speckle aggregation regions.
Figure 5. The recognition of speckle region. (a) Input image. (b) Image enhancement. (c) Speckle regions extraction.
Figure 6. Gaussian curve fitting. (a) Gaussian curve calculation. (b) The actual grayscale distribution of light stripe.
Figure 7. Interpolation point calculation. (a) Tangent fitting. (b) Center point modified based on Gaussian curve fitting. (c) Normal direction. (d) The center point after interpolation.
Figure 8. Monocular-structured light 3D-reconstruction model.
Figure 9. Process of the generating point cloud of the pipe inner wall.
Figure 10. Experimental collection work diagram. (a) Overall working diagram. (b) Internal workings of the pipe.
Figure 11. Comparison of the effects in extracting the center of the light stripes [28,31,34]. (a) The effect of center extraction in the lower left part of the circular light stripe. (b) The effect of center extraction in the upper left part of the circular light stripe. (c) The effect of center extraction in the lower right part of the circular light stripe.
Figure 12. Standard ring gauge measurement diagram. (a) Measurement diagram of a standard ring gauge with an inner diameter of 95 mm. (b) Measurement diagram of a standard ring gauge with an inner diameter of 105 mm. (c) Measurement diagram of a standard ring gauge with an inner diameter of 115 mm.
Figure 13. 3D-morphology diagram of the inner wall of the pipe. (a) 3D-morphology diagram of pipe numbered a. (b) 3D-morphology diagram of pipe numbered b. (c) 3D-morphology diagram of pipe numbered c. (d) 3D-morphology diagram of pipe numbered d.
Table 1. Results of camera calibration.

Type                              Parameter
Internal parameter matrix         [2850.83, 0, 1245.61; 0, 2849.61, 471.12; 0, 0, 1]
Radial distortion (K1, K2, K3)    [0.1460, −0.8254, 2.7391]
Tangential distortion (P1, P2)    [−0.0136, −0.0082]
Table 2. Performance comparison of different methods for extracting the centers of various light stripes.

                 648 pixel              449 pixel              362 pixel
Method           σ/pixel   Time/ms      σ/pixel   Time/ms      σ/pixel   Time/ms
Steger [31]      0.40      171.13       0.60      74.02        0.64      50.83
GBM [28]         0.95      17.21        1.33      7.93         1.36      6.21
Wang's [34]      0.46      41.45        0.68      20.13        0.76      14.22
Ours             0.44      23.62        0.63      12.35        0.68      9.96
Table 3. Measurement error results for different standard ring gauges.

Number            (a)       (b)        (c)
Reference/mm      95.00     105.00     115.00
Measurement/mm    94.91     104.91     114.93
Error/mm          0.09      0.09       0.07
Table 4. Measurement error results for different pipe inner diameters.

Number            (a)       (b)       (c)       (d)
Reference/mm      106.94    82.66     70.70     60.46
Measurement/mm    106.86    82.54     70.55     60.29
Error/mm          0.08      0.12      0.15      0.17
Table 5. Measurement errors of pipe lengths for different pipe sizes.

Number            (a)       (b)       (c)       (d)
Reference/mm      265.28    265.38    263.50    254.98
Measurement/mm    265.00    265.00    263.00    254.50
Error/mm          0.28      0.38      0.50      0.48