Article

Modeling and Calibration of Active Thermal-Infrared Visual System for Industrial HMI

1 Beijing Smart-Chip Microelectronics Technology Co., Ltd., Beijing 100192, China
2 Zhongguancun Xinhaizeyou Technology Co., Ltd., Beijing 100994, China
3 Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China
4 Graduate School of Advanced Science and Engineering, Hiroshima University, Hiroshima 739-8527, Japan
* Author to whom correspondence should be addressed.
Electronics 2022, 11(8), 1230; https://doi.org/10.3390/electronics11081230
Submission received: 15 March 2022 / Revised: 6 April 2022 / Accepted: 8 April 2022 / Published: 13 April 2022
(This article belongs to the Special Issue Advances in Augmenting Human-Machine Interface)

Abstract: In industrial applications of the human-machine interface (HMI), thermal-infrared cameras can detect objects that visible-spectrum cameras cannot. A thermal-infrared camera receives the energy radiated by a target through an infrared detector and produces a thermal image corresponding to the heat-distribution field on the target surface. Because of this imaging principle, a thermal-infrared camera is not affected by the light source. Compared with visible-spectrum cameras, thermal-imaging cameras can better detect defects that have poor visual contrast but a temperature difference, as well as internal defects in products, so they can be used in many specific industrial inspection applications. However, thermal-infrared imaging suffers from thermal diffusion, which makes thermal-infrared images noisy and limits their use in high-precision industrial environments. In this paper, we propose a high-precision measurement system for industrial HMI based on thermal-infrared vision. An accurate measurement model of the system is established to deal with the problems caused by imaging noise. The experiments conducted suggest that the proposed model and calibration method are valid for the active thermal-infrared visual system and achieve high-precision measurements.

1. Introduction

Computer vision has made considerable progress in industry in recent years [1]. In visual measurement systems, cameras are widely used as measuring sensors [2,3,4]. Thermal-infrared cameras are becoming increasingly popular [5], but visible-spectrum cameras are still far more common. Thermal-infrared imaging sensors have demonstrated excellent performance in unconventional environments and were historically limited to military, security, and medical applications [6]. However, with increasing image quality and decreasing prices, thermal sensing devices are finding commercial deployment in home and office monitoring [7,8] as well as industrial HMI applications [9]. Thermal image sensors are typically sensitive in the 7–14 μm wavelength band, and their capacity to appropriately capture images of objects depends on the objects' emissivity and reflectivity [10]. In general, thermal-infrared cameras are superior or complementary to conventional visible-spectrum cameras in the following aspects:
  • More adaptive to poor or changing lighting;
  • Effective for detecting an object with low contrast but differences in temperature;
  • Can detect surface information of objects as well as internal information.
In some industry fields, the infrared camera plays an irreplaceable role in in-process inspection tasks [11]. For example, in an automated fiber placement inspection system, the defects inside composites are difficult to detect by visible-spectrum cameras because the black prepreg slit-tapes prevent a high visual contrast between the single tows and the plies on the tooling surface, but the thermal-infrared camera achieves good performance in detection of these defects inside composites [12].
However, research on infrared vision systems has been limited so far. Infrared cameras have not been widely used for industrial vision measurement because blurred imaging and calibration difficulties make it hard to build reliable and efficient vision measurement systems with thermal-infrared image sensors. The objective of this paper is to address this concern.
A common visual measurement system used in industry is the eye-in-hand inspection system [13]. The camera is mounted at the end of a robot arm and scans the detected objects. The eye-in-hand inspection system should be calibrated before measurement. The calibration results are used to establish the corresponding relationship between the detected target and its image. For a visual measurement system based on a visible-spectrum camera, we can calibrate the system by the camera’s intrinsic parameters calibration and eye-in-hand calibration, which can acquire the relationship between the camera frame and the robot’s end frame [14]. However, for a visual measurement system based on a thermal-infrared camera, the method above is not feasible. On the one hand, the thermal infrared camera is unable to capture the calibration patterns of the visible-spectrum camera [15]. On the other hand, even if a calibration pattern with temperature difference is used, the thermal diffusion of the infrared image will cause blurring of the image and make it difficult to acquire calibration features. Figure 1a–c show the calibration board with temperature difference taken continuously by the thermal infrared camera, which can clearly show the phenomenon of thermal diffusion. The edges of the calibration pattern are not clear and become increasingly blurred.
In recent years, some researchers have worked on the geometric calibration of thermal-infrared cameras. In [16], the authors performed calibration with the well-known Zhang method [17] by heating a chessboard calibration block. In many cases, hollow calibration targets backed by burning lamps are useful [18,19,20]. However, the procedure is complex, and the calibration object is large, which is unsuitable for calibrating high-precision vision systems with a small field of view (FOV). An alternative is to create calibration patterns from combinations of materials with different emissivity, which makes calibration more convenient [21,22]; however, the imaging of such patterns is not clear enough, and complex heating equipment is still required to heat the calibration target [20]. Besides the lack of a good calibration target for geometric calibration, processing the calibration images is also difficult [23]: the corners in a thermal image are blurred and are thus extracted manually. Since the subpixel positions of the corners cannot be obtained accurately, the thermal-infrared camera cannot be calibrated accurately either [24].
The main contribution of this paper is an active thermal-infrared visual measurement system for high-precision industrial HMI applications, consisting mainly of a thermal-infrared camera, a heating lamp, and a robotic arm. Mathematical modeling and calibration methods for this system are also proposed, and the results are verified to achieve high-accuracy measurements. The rest of this paper is organized as follows. Section 2 describes the system construction and working principle. Section 3 proposes the mathematical model of the thermal-infrared visual measurement system. Section 4 introduces the calibration method. Section 5 presents experiments that test the performance of the proposed method and closes with an application case for industrial HMI.

2. System Construction and Working Principle

As shown in Figure 2, the thermal-infrared visual measurement system mainly consists of an industrial robotic arm, a thermal-infrared camera, and a heating lamp. The robot arm is an ABB IRB640 M2240. The thermal-infrared camera is an i3 Thermal Expert TE-EQ1 with a resolution of 640 × 480 pixels and a frame rate of 60 fps; its wavelength range is 8–14 μm, and it can image objects in the temperature range of −10 to 120 °C. The heating lamp is mounted on the left side of the end of the robotic arm at an angle of about 0°, and the thermal-infrared camera is mounted on the right side at an angle of about 45°. The distance between the camera and the detected target is about 15 cm. When the system starts to work, the heating lamp heats the surface of the inspected workpiece to create a temperature difference between defects and the surrounding surface, so the infrared camera can detect the defects of the workpiece. The camera scans the workpiece with the movement of the robot arm, acquires infrared images of defects, and measures their size and position from the real-time images. The system has great potential for industrial HMI, especially for online inspection and real-time measurement of defects that are difficult to detect with visible-spectrum cameras. Its accuracy depends mainly on the mathematical model and calibration method, so an accurate calibration model and a convenient calibration procedure are essential.

3. Modeling of the Active Thermal-Infrared Visual Measurement System

To facilitate modeling of the thermal-infrared visual measurement system, coordinate frames were established, as shown in Figure 3. The workpiece coordinate system is denoted as {W}, the robot coordinate frame is {R}, the camera coordinate frame is {C}, and the end of the robot arm coordinate frame is {E}. A reference coordinate frame {S} is established, which is associated with {E}. Coordinate frames at time t are denoted as {C(t)}, {E(t)}, and {S(t)}.
The surface of the workpiece is modeled as a point set Ω and denoted as:

$$\Omega : \{(x, y, z) \mid f(x, y, z) = 0\} \tag{1}$$
where Ω describes the shape of the workpiece; it is given by an analytical formula or recorded in an engineering diagram. The position of a point P in {W} is denoted as WP = [wx, wy, wz]T, and IP = [u, v, 1]T denotes its corresponding point in the thermal-infrared image. The aim is to obtain WP, the position of the point P in the workpiece coordinate frame. Using the pin-hole camera model, we obtain:
$$M_{in} \cdot {}^{C(t)}T_{W} \cdot {}^{W}P = z_c \, {}^{I}P \tag{2}$$
where Min is the camera's intrinsic parameter matrix and C(t)TW represents the extrinsic transformation between {C} and {W}. Min is a constant matrix. In the robotic vision system, the camera is fixed at the end of the robot and moves with the arm, so C(t)TW depends on the robot's posture. Existing methods transform Equation (2) into Equation (3):
$$M_{in} \cdot {}^{C(t)}T_{E(t)} \cdot {}^{E(t)}T_{W} \cdot {}^{W}P = z_c \, {}^{I}P \tag{3}$$
Min can be obtained by camera intrinsic calibration, C(t)TE(t) by eye-in-hand calibration, and E(t)TW from the output of the robot arm; the relationship between IP and WP can then be established. However, as analyzed in Section 1, existing calibration methods cannot obtain accurate Min and C(t)TE(t) in a thermal-infrared visual system. Therefore, this paper obtains the relationship between IP and WP through the following analysis and modeling. From Equation (2), WP can be expressed as:
$${}^{W}P = z_c \, {}^{W}T_{C(t)} \cdot M_{in}^{-1} \cdot {}^{I}P \tag{4}$$
When t = 0, the coordinate origin of {S(0)}, denoted Os(0), is established in the camera's FOV with Os(0) ∈ Ω. Then WTC(t) can be decomposed as WTS(t) · S(t)TC(t), which gives Equation (5):
$${}^{W}P = z_c \, {}^{W}T_{S(t)} \cdot {}^{S(t)}T_{C(t)} \cdot M_{in}^{-1} \cdot {}^{I}P \tag{5}$$
WTS(t) is further divided as WTS(0) · S(0)TS(t); combined with Equation (5), we obtain:
$${}^{W}P = z_c \, {}^{W}T_{S(0)} \cdot {}^{S(0)}T_{S(t)} \cdot {}^{S(t)}T_{C(t)} \cdot M_{in}^{-1} \cdot {}^{I}P \tag{6}$$
The transformation between {S(0)} and {S(t)} is equal to the transformation between {E(0)} and {E(t)}:
$${}^{S(0)}T_{S(t)} = {}^{E(0)}T_{E(t)} \tag{7}$$
Usually, E(0)TE(t) can be obtained directly from the robot's control system as the so-called TCP pose, which consists of two parts: the position WPTCP = (WXTCP, WYTCP, WZTCP) and the quaternion Q = (q1, q2, q3, q4). Four quantities in Equation (6) remain to be solved: zc, WTS(0), Min, and S(t)TC(t). The product Min · C(t)TS(t) is a constant matrix and is denoted as M. The imaging model between the coordinate frames {S(t)} and {W} can then be established. The position of P in {S(t)} is denoted as S(t)P, so we have:
$$z_c \, {}^{I}P = M_{in} \cdot {}^{C(t)}T_{S(t)} \cdot {}^{S(t)}P = M \cdot {}^{S(t)}P \tag{8}$$
The robot is controlled to move along the workpiece surface at a fixed distance so that Os(t) stays on the surface, i.e., Os(t) ∈ Ω. When the camera's FOV is small enough, the detected surface can be approximated by a plane. According to Equation (1), the tangent plane at the origin is:
$$A({}^{w}x - {}^{w}x_d) + B({}^{w}y - {}^{w}y_d) + C({}^{w}z - {}^{w}z_d) = 0 \tag{9}$$
where ({}^{w}x_d, {}^{w}y_d, {}^{w}z_d) is the point of tangency in this plane. Therefore,
$$[{}^{w}x, \; {}^{w}y, \; {}^{w}z, \; 1]^{T} = {}^{W}T_{S(0)} \cdot {}^{E(0)}T_{E(t)} \cdot [{}^{S}x, \; {}^{S}y, \; {}^{S}z, \; 1]^{T} \tag{10}$$
We can then solve WP from Equations (8)–(10). It can be seen that WP is obtained from IP once M and WTS(0) are calibrated. We refer to these two quantities as the system parameters in the following.
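Because the target lies on the tangent plane (Sz = 0), the mapping from the target plane to the image reduces to a 3 × 3 matrix formed by columns 1, 2, and 4 of M, and recovering WP from IP amounts to inverting that matrix and applying the two calibrated transforms. A minimal numpy sketch under these assumptions (all matrix values are illustrative, not the paper's calibration results):

```python
import numpy as np

def backproject(H, T_w_s0, T_e0_et, uv):
    """Map a pixel (u, v) to workpiece coordinates.

    H       : 3x3 matrix built from columns 1, 2, 4 of M (valid since s_z = 0)
    T_w_s0  : 4x4 transform from {S(0)} to {W}   (system parameter W_T_S(0))
    T_e0_et : 4x4 transform from {E(t)} to {E(0)} (robot TCP pose)
    """
    s = np.linalg.inv(H) @ np.array([uv[0], uv[1], 1.0])
    s /= s[2]                               # plane coordinates (s_x, s_y)
    p_s = np.array([s[0], s[1], 0.0, 1.0])  # homogeneous point with s_z = 0
    return (T_w_s0 @ T_e0_et @ p_s)[:3]     # point expressed in {W}
```

The forward model projects a plane point through H up to scale; `backproject` inverts it and chains the rigid transforms of Equation (10).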

4. Calibration Method

4.1. Calibration of System Parameters M

The system parameter M combines the intrinsic parameters of the thermal-infrared camera with the transform matrix between {C(t)} and {S(t)}. As mentioned in Section 1, calibration methods for visible-spectrum cameras can hardly be transplanted to thermal-infrared cameras; calibration of thermal-infrared cameras is poorly studied, and existing methods cannot calibrate them accurately. To overcome this problem, we propose an active calibration method for the system parameters of the thermal-infrared visual measurement system. The robot is controlled to scan the detected target according to its mechanical model, ensuring that Os(t) ∈ Ω and that the z-axis of {E} coincides with the surface normal at Os(t). Under these constraints, S(t)z = 0, so the parameters m13, m23, and m33 in M are unnecessary:
$$M = \begin{bmatrix} k_x & 0 & u_0 & 0 \\ 0 & k_y & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} n_x & o_x & a_x & p_x \\ n_y & o_y & a_y & p_y \\ n_z & o_z & a_z & p_z \\ 0 & 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} m_{11} & m_{12} & m_{13} & m_{14} \\ m_{21} & m_{22} & m_{23} & m_{24} \\ m_{31} & m_{32} & m_{33} & m_{34} \end{bmatrix} \tag{11}$$
According to the analysis above, a flat polygonal calibration target that is smaller than the camera's FOV and of negligible thickness can be used. It is placed on the workpiece so that it lies on the tangent plane of the workpiece. The coordinate frame established on the calibration target is denoted {S(0)}, whose z-axis is parallel to the normal of the tangent plane. In Equation (11), m34 is the distance between the control points and the camera's optical center, so m34 ≠ 0 [23]. The following equations can be obtained:
$$\begin{cases} m'_{11}\,{}^{S(t)}x + m'_{12}\,{}^{S(t)}y + m'_{14} - m'_{31}\,{}^{S(t)}x\,u - m'_{32}\,{}^{S(t)}y\,u = u \\ m'_{21}\,{}^{S(t)}x + m'_{22}\,{}^{S(t)}y + m'_{24} - m'_{31}\,{}^{S(t)}x\,v - m'_{32}\,{}^{S(t)}y\,v = v \end{cases} \tag{12}$$
where m′ = m/m34. Equation (12) has eight unknowns, so four control points are enough to solve all of them, as long as no three of them are collinear; using more control points can optimize the result. The calibration target can be designed as in Figure 4a or Figure 4b. The edges of the calibration target are detected in its thermal image, and the intersections of the detected edges serve as the control points.
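Each control point contributes the two linear equations of Equation (12), so the eight entries of m′ can be solved by (least-squares) linear algebra. A sketch of this step, assuming planar coordinates (x, y) on the target and pixel coordinates (u, v); the function name and data are illustrative:

```python
import numpy as np

def solve_m_prime(plane_pts, image_pts):
    """Solve Eq. (12) for m' = m / m34 from >= 4 control points.

    Unknown order: [m11', m12', m14', m21', m22', m24', m31', m32'].
    """
    A, b = [], []
    for (x, y), (u, v) in zip(plane_pts, image_pts):
        A.append([x, y, 1, 0, 0, 0, -x * u, -y * u]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -x * v, -y * v]); b.append(v)
    sol, *_ = np.linalg.lstsq(np.array(A, float), np.array(b, float), rcond=None)
    return sol
```

With exactly four well-placed points the system is square; additional points over-determine it, and the least-squares solution averages out extraction noise.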

4.2. Calibration of System Parameters WTS(0)

The system parameter WTS(0) is the transform matrix between {S(0)} and {W}. The coordinate frame {S(0)} is established on the calibration target, as shown in Figure 5: its origin is a corner of the rectangular calibration target, its x-axis and y-axis run along the sides of the target, and its z-axis is normal to the target. The calibration target is placed on the x–y plane of {W}. Since the target is thin enough, the x–y planes of {W} and {S(0)} can be considered coplanar. The transform matrix WTS(0) therefore has only three independent variables: two translations and one rotation. It can be written as:
$${}^{W}T_{S(0)} = \begin{bmatrix} \cos\alpha & -\sin\alpha & 0 & p_x \\ \sin\alpha & \cos\alpha & 0 & p_y \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{13}$$
Since the calibration target is much smaller than the workpiece, it is difficult to measure the rotation angle α directly. Two calibration targets with corresponding edges on the same straight line are used to solve this problem; thermal-infrared images of the two targets are captured by moving the robot. In industrial applications, two identical calibration patterns can be printed at both ends of the workpiece, with the corresponding edges of the targets on the same straight line. The patterns are printed in materials whose thermal conductivity differs from that of the workpiece, so that they appear in the thermal-infrared image after heating.
The two calibration targets are denoted cal.1 and cal.2. First, an image of cal.1, denoted Img1, is captured. The robot is then moved along the y-axis of {W} until cal.2 appears in the camera's FOV, and an image of cal.2, denoted Img2, is captured. The distance of the robot's movement is recorded as Δy. From the corresponding points in Img1 and Img2 and the calibration result for M, the offset Δx can be calculated. The angle α between {S(0)} and {W} is then:
$$\alpha = \arctan\!\left(\frac{\Delta y}{\Delta x}\right) \tag{14}$$
When using this method, the robot's accuracy and the calibration accuracy should be considered. If the robot's motion accuracy is high enough, the distance between the two calibration targets can be increased to improve the calibration result. Otherwise, the distance should be shortened, because motion errors may place one of the calibration targets outside the camera's FOV.
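The two steps above, Equation (14) and the transform of Equation (13), can be sketched together. The function below is a hypothetical helper; Δx = 3.64 mm in the demo is an illustrative value (only Δy = 400 mm is reported in Section 5.3), and px, py stand for the separately determined translation terms:

```python
import numpy as np

def build_T_w_s0(dx, dy, px, py):
    """Assemble W_T_S(0) from the offset dx observed over travel dy (Eq. 14)."""
    alpha = np.arctan2(dy, dx)        # rotation angle between {S(0)} and {W}
    c, s = np.cos(alpha), np.sin(alpha)
    T = np.eye(4)
    T[0, 0], T[0, 1] = c, -s          # rotation block of Eq. (13)
    T[1, 0], T[1, 1] = s, c
    T[0, 3], T[1, 3] = px, py         # translation terms
    return T
```

A quick sanity check is that the 3 × 3 rotation block stays orthonormal regardless of the measured offsets.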

4.3. Calibration Points Extraction Algorithm

For a given object, the thermal-infrared image is much blurrier than its visible-spectrum image: the continuous temperature change at the object's edge inevitably blurs the infrared image. It is therefore necessary to fit the sidelines of the calibration target and use their intersections as the calibration points. If an edge-extraction operator is used directly, many spurious points are extracted, which harms the correctness of the sideline fitting.
To accurately extract calibration points, an approach is proposed in this article, as shown in Algorithm 1. First, an edge-extraction operator is used to extract approximate edge points. These points are inaccurate, but they determine the region of the edge. The Hough transform then fits these points into straight lines; several lines will be fitted near each true edge. The Hough parameters are clustered in the Hough parameter space into c categories, where c equals the number of sides of the calibration target. After clustering, the mean value of each class is output as the representative parameter of that line. The intersections of these c lines are calculated, and noise points are removed. Then, c line segments are obtained, and a detection region is determined for each segment. The two endpoints of a segment are denoted C1(x1, y1) and C2(x2, y2). The rectangular detection region to be determined is denoted as a region of interest (ROI) (C1′, C2′), where C1′ = (x1′, y1′) and C2′ = (x2′, y2′). The ROI is obtained as follows:
$$x_1' = x_1 + \alpha(x_2 - x_1), \quad y_1' = y_1 + \beta(y_1 - y_2)$$
$$x_2' = x_2 + \gamma(x_1 - x_2), \quad y_2' = y_2 + \varphi(y_2 - y_1)$$
where α, β, γ, and φ are constants smaller than 1. A row scan and a column scan are performed for each ROI separately. If the difference between the x-coordinates of the two intersections is greater than the difference between their y-coordinates, the edge is horizontal and a column scan is used in subsequent processing; otherwise, the edge is vertical and a row scan is used. Taking the column scan as an example, each column of pixels is scanned row by row starting from the top-left corner of the ROI, and the grayscale value of each pixel is obtained. The gray value of the nth pixel is subtracted from that of the (n + 4)th pixel, the positions with the largest difference in each column are found, and the coordinates of the corresponding (n + 2)th points are output as the searched edge points. The least-squares method fits these edge points into a straight line as the extracted edge, and the intersections of these edges are the calibration points. The edge-extraction algorithm is listed in Algorithm 1.
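The column-scan step can be sketched as follows: in each column of the ROI, the strongest gray-level step between pixels n and n + 4 is located, pixel n + 2 is reported as the edge row, and a least-squares line is fitted through the per-column edge points. This is a simplified, numpy-only illustration operating on synthetic data, not the paper's implementation:

```python
import numpy as np

def scan_columns_for_edge(roi, gap=4):
    """Locate a near-horizontal edge in a grayscale ROI by column scanning."""
    rows = []
    for col in range(roi.shape[1]):
        g = roi[:, col].astype(float)
        d = np.abs(g[gap:] - g[:-gap])        # |I(n+gap) - I(n)| for each row n
        rows.append(np.argmax(d) + gap // 2)  # midpoint n + 2 as the edge row
    cols = np.arange(roi.shape[1])
    slope, intercept = np.polyfit(cols, rows, 1)  # least-squares line fit
    return slope, intercept
```

For a vertical edge the same logic is applied row-wise; in practice the four strongest steps per column would be kept and outliers rejected before the fit.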
Algorithm 1: Sideline Extraction
Input: image I
Data containers: edge point set Pe, line parameter set Lh
Pe ← Canny(I);
Lh ← Hough(Pe);
Lh ← Cluster(Lh);
Lh ← SideDetection(Lh);
For each line segment in Lh:
  Pe ← EdgeScan(Lh);
  Lh ← Fitting(Pe);
End
Output: Lh
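The clustering and intersection steps of Algorithm 1 can be illustrated with lines in Hough normal form, x·cos θ + y·sin θ = ρ. A simple threshold grouping stands in for the clustering step here (the tolerances are illustrative assumptions); the output of an off-the-shelf Hough transform such as OpenCV's could be fed in directly:

```python
import numpy as np

def cluster_lines(params, rho_tol=10.0, theta_tol=0.1):
    """Group (rho, theta) line parameters and return one mean line per group."""
    clusters = []
    for rho, theta in params:
        for c in clusters:
            m = np.mean(c, axis=0)
            if abs(rho - m[0]) < rho_tol and abs(theta - m[1]) < theta_tol:
                c.append((rho, theta))
                break
        else:
            clusters.append([(rho, theta)])
    return [tuple(np.mean(c, axis=0)) for c in clusters]

def intersect(l1, l2):
    """Crossing point (x, y) of two lines given in Hough normal form."""
    (r1, t1), (r2, t2) = l1, l2
    A = np.array([[np.cos(t1), np.sin(t1)], [np.cos(t2), np.sin(t2)]])
    return np.linalg.solve(A, np.array([r1, r2]))
```

For a rectangular target, c = 4 mean lines result, and their pairwise intersections (excluding near-parallel pairs) give the four calibration points.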

5. Experiments

The proposed calibration method was used to calibrate the thermal-infrared visual measurement system to evaluate its feasibility and accuracy. After calibration, standard workpieces were measured several times. By comparing the measured values with the true values of the workpieces, the efficiency and accuracy of the proposed calibration method were verified.

5.1. Extraction of Calibration Points in the Thermal-Infrared Image

Figure 6 shows the improvement of the proposed sideline extraction method over the traditional one; the red lines mark the extraction results. The traditional method of directly applying an edge detector extracts multiple small line segments at the edge of the calibration target because the edges are too blurred. In contrast, the line segments extracted by our method are accurate. After the sidelines are extracted, the calibration points are obtained from their intersections.

5.2. Calibration of System Parameters M

According to the method proposed in Section 4.1, a small rectangular calibration target of size 20 mm × 40 mm was used to acquire the thermal-infrared images for system calibration. Using the calibration points extraction method proposed in Section 4.3, the four calibration points were extracted, and their image coordinates were A(260.58, 256.61), B(278.64, 144.45), C(627.01, 148.36), and D(642.30, 261.41). In {S(0)}, the coordinates of the calibration points were A(0, 0), B(0, 20), C(40, 20), and D(40, 0). Combined with their image positions, Equation (12) can be solved; thus, we have:
m′ = [9.47, 2.234, 260.58, 0.09, −4.92, 256.61, −1.142, 0.005]
The precision of m′ can be verified by measuring standard workpieces. In our experiments, two standard blocks of different sizes were used, 20 mm × 40 mm and 30 mm × 50 mm, translated and rotated into different poses. Using the calibrated m′, we computed their sizes and compared them with the true sizes; the measurement errors are shown in Table 1. The absolute error was below 0.3 mm and the relative error was around 0.5%, indicating the correctness and accuracy of our calibration model for the parameter M.
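The two error figures quoted above are computed in the usual way: absolute error is the deviation of the measured size from the nominal size, and relative error divides that by the nominal size. A short sketch with hypothetical measurements (not the entries of Table 1):

```python
import numpy as np

# Nominal block edge lengths (mm) and hypothetical measured values (mm).
true_size = np.array([40.0, 40.0, 50.0])
measured = np.array([40.21, 39.83, 50.26])

abs_err = np.abs(measured - true_size)       # absolute error, mm
rel_err = abs_err / true_size * 100.0        # relative error, percent
```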

5.3. Calibration of System Parameters WTS(0)

According to the method proposed in Section 4.2, two calibration patterns were printed on the workpiece along the same line, denoted Lc. The calibration procedure is shown in Figure 7. First, we took a picture of the first pattern and obtained the camera parameter matrix m′ with the method introduced above. Then, we moved the robot along the y-axis of {W}; when the second pattern appeared in the camera's FOV, the distance of the robot's movement was recorded as Δy = 400 mm. The offset angle between line Lc and the y-axis is exactly what needs to be calibrated. Calibration points extracted from the second pattern and Equation (13) were used to calculate the transform matrix between {W} and {S(0)}. The result is:
$${}^{W}T_{S(0)} = \begin{bmatrix} 0.0091 & -1 & 0 & 36.96 \\ 1 & 0.0091 & 0 & 20.76 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
Positions were measured by the proposed method several times, and the results were compared with those measured by the robot's calibration needle. The measured points are denoted A1–A3. The robot's calibration needle was set on each measured point, as shown in Figure 8, and the robot system recorded its coordinates in {W}. The same points were then measured with the proposed infrared visual system. The results are shown in Table 2.
The results show that the difference between the measurements of the calibration needle and the thermal-infrared camera was small, indicating the correctness of the calibration model of the thermal-infrared measurement system. Since the calibration needle relies on a human to read the 3D position, its measurements carry a certain error; this experiment can therefore only confirm that the infrared measurement system measures correctly, not evaluate its accuracy. Hence, we designed another experiment: we measured the dimensions of a standard workpiece using an infrared measurement system calibrated with the method proposed in this paper, an infrared measurement system calibrated with the method proposed in [25], and the calibration needle. Figure 9 shows the calibration images based on the method of [25]. Table 3 shows the measurement results and errors of the three methods.
The root mean square errors (RMSE) of the three methods were 0.225 mm, 0.185 mm, and 0.044 mm, respectively; the accuracy of the method proposed in this paper is thus improved compared with the other two. The work in [26] uses the infrared visual measurement system calibrated by this method to measure defects in the online inspection of automated fiber placement. Figure 10 shows the human-computer interaction interface. The conditions of the composite placement experiment were set as follows: environment temperature 19 °C, environment humidity 40.5%, lay-up surface temperature 26 °C, heating lamp power 73%, compaction pressure 0.3 MPa, and laying speed 150 mm/s. Defects such as gaps, foreign bodies, and air pockets were detected. By extracting the edges and calculating the pixel coordinates of defects, the positions of defects in the workpiece coordinate frame are obtained from the calibration results. The left side of the interface shows the original image captured by the thermal-infrared camera and the defects detected in real time. The right side shows the location and dimensional information of the defects marked on the workpiece model, forming a 3D defect model for industrial HMI. Through human-computer interaction, the defect model can be manipulated with the mouse to select a defect of interest; the selected defect's position is displayed on the interface, here (319.3, 298.4, 327.2). The results illustrate that the proposed method achieves good performance in practical applications.

6. Conclusions

This paper presented a thermal-infrared visual measurement system for industrial HMI together with its modeling and calibration method. Experiments show a good performance of the proposed system. Our work fills the gap in the application of infrared cameras in industrial measurement. The thermal infrared vision sensor designed in this paper can be used in a wider range of industrial HMI applications such as digital twins, AR/VR [27,28], human modeling scans, and modeling in Industry 4.0 [29].

Author Contributions

Conceptualization, B.W. and M.C.; methodology and validation, M.C.; writing—original draft preparation, M.C. and S.T.; writing—review and editing, F.H. and Q.F.; project administration and funding acquisition, Q.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by The Laboratory Open Fund of Beijing Smart-chip Microelectronics Technology Co., Ltd. Grant number SGITZX00KJQT2108704.

Data Availability Statement

Data are available on request from the first author.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Rasheed, A.; Zafar, B.; Rasheed, A.; Ali, N.; Sajid, M.; Dar, S.H.; Habib, U.; Shehryar, T.; Mahmood, M.T. Fabric Defect Detection Using Computer Vision Techniques: A Comprehensive Review. Math. Probl. Eng. 2020, 2020, 8189403.
  2. Li, B. Research on geometric dimension measurement system of shaft parts based on machine vision. EURASIP J. Image Video Process. 2018, 2018, 101.
  3. Liu, S.; Liu, J.; Jin, P.; Wang, X. Tube measurement based on stereo-vision: A review. Int. J. Adv. Manuf. Technol. 2017, 92, 2017–2032.
  4. Shirmohammadi, S.; Ferrero, A. Camera as the instrument: The rising trend of vision based measurement. IEEE Instrum. Meas. Mag. 2014, 17, 41–47.
  5. Das, M.P.; Matthies, L.; Daftry, S. Online Photometric Calibration of Automatic Gain Thermal Infrared Cameras. IEEE Robot. Autom. Lett. 2021, 6, 2453–2460.
  6. Griffith, B.; Türler, D.; Goudey, H. IR Thermographic Systems: A Review of IR Imagers and Their Use; Lawrence Berkeley National Laboratory: Berkeley, CA, USA, 2001.
  7. Teutsch, M.; Mueller, T.; Huber, M.; Beyerer, J. Low resolution person detection with a moving thermal infrared camera by hot spot classification. In Proceedings of the 2014 IEEE Conference on Computer Vision and Pattern Recognition Workshops, Washington, DC, USA, 23–28 June 2014; pp. 209–216.
  8. Brehar, R.; Nedevschi, S. Pedestrian detection in infrared images using HOG, LBP, gradient magnitude and intensity feature channels. In Proceedings of the 17th International IEEE Conference on Intelligent Transportation Systems (ITSC), Qingdao, China, 8–11 October 2014; pp. 1669–1674.
  9. Li, J.; Gong, W.; Li, W.; Liu, X. Robust pedestrian detection in thermal infrared imagery using the wavelet transform. Infrared Phys. Technol. 2010, 53, 267–273.
  10. Buchlin, J.M. Convective heat transfer and IR thermography (IRTh). J. Appl. Fluid Mech. 2010, 3, 55–62.
  11. Hung, Y.; Chen, Y.; Ng, S.P.; Liu, L.; Huang, Y.; Luk, B.L.; Ip, R.; Wu, L.; Chung, P. Review and comparison of shearography and active thermography for nondestructive evaluation. Mater. Sci. Eng. R Rep. 2009, 64, 73–112.
  12. Denkena, B.; Schmidt, C.; Völtzer, K.; Hocke, T. Thermographic online monitoring system for Automated Fiber Placement processes. Compos. Part B Eng. 2016, 97, 239–243.
  13. Horn, B. Robot Vision; MIT Press: Cambridge, MA, USA, 1986.
  14. Cigliano, P.; Lippiello, V.; Ruggiero, F.; Siciliano, B. Robotic Ball Catching with an Eye-in-Hand Single-Camera System. IEEE Trans. Control Syst. Technol. 2015, 23, 1657–1671.
  15. Vidas, S.; Lakemond, R.; Denman, S.; Fookes, C.; Sridharan, S.; Wark, T. A Mask-Based Approach for the Geometric Calibration of Thermal-Infrared Cameras. IEEE Trans. Instrum. Meas. 2012, 61, 1625–1635.
  16. Saponaro, P.; Sorensen, S.; Rhein, S.; Kambhamettu, C. Improving calibration of thermal stereo cameras using heated calibration board. In Proceedings of the 2015 IEEE International Conference on Image Processing (ICIP), Quebec City, QC, Canada, 27–30 September 2015; pp. 4718–4722.
  17. Zhang, Z. Flexible Camera Calibration by Viewing a Plane from Unknown Orientations. In Proceedings of the 7th IEEE International Conference on Computer Vision (ICCV'99), Kerkyra, Greece, 20–27 September 1999; pp. 666–673.
  18. Lagüela, S.; González-Jorge, H.; Armesto, J.; Arias, P. Calibration and verification of thermographic cameras for geometric measurements. Infrared Phys. Technol. 2011, 54, 92–99.
  19. Luhmann, T.; Piechel, J.; Roelfs, T. Geometric calibration of thermographic cameras. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2010, XXXVIII, 411–416.
  20. St-Laurent, L.; Prévost, D.; Maldague, X. Fast and accurate calibration-based thermal/colour sensors registration. In Proceedings of the 10th Quantitative InfraRed Thermography Conference, Paper QIRT2010-126, Quebec City, QC, Canada, 24–29 June 2010; pp. 1–8.
  21. Dias, A.; Brás, C.; Martins, A.; Almeida, J.; Silva, E. Thermographic and visible spectrum camera calibration for marine robotic target detection. In Proceedings of the 2013 OCEANS-San Diego, San Diego, CA, USA, 23–27 September 2013; pp. 1–5.
  22. Yang, R.; Yang, W.; Chen, Y.; Wu, X. Geometric calibration of IR camera using trinocular vision. J. Light. Technol. 2011, 29, 3797–3803. [Google Scholar] [CrossRef]
  23. Ellmauthaler, A.; da Silva, E.A.; Pagliari, C.L.; Gois, J.N.; Neves, S.R. A novel iterative calibration approach for thermal infrared cameras. In Proceedings of the2013 20th IEEE International Conference on Image Processing (ICIP), Melbourne, VIC, Australia, 15–18 September 2013; pp. 2182–2186. [Google Scholar]
  24. Yang, R.; Chen, Y. Design of a 3-D Infrared Imaging System Using Structured Light. IEEE Trans. Instrum. Meas. 2010, 60, 608–617. [Google Scholar] [CrossRef]
  25. Usamentiaga, R.; Garcia, D.; Ibarra-Castanedo, C.; Maldague, X. Highly accurate geometric calibration for infrared cameras using inexpensive calibration targets. Measurement 2017, 112, 105–116. [Google Scholar] [CrossRef]
  26. Chen, M.; Jiang, M.; Liu, X.; Wu, B. Intelligent Inspection System Based on Infrared Vision for Automated Fiber Placement. In Proceedings of the 2018 IEEE International Conference on Mechatronics and Automation (ICMA), Changchun, China, 5–8 August 2018; pp. 918–923. [Google Scholar]
  27. Szajna, A.; Stryjski, R.; Woźniak, W.; Chamier-Gliszczyński, N.; Kostrzewski, M. Assessment of Augmented Reality in Manual Wiring Production Process with Use of Mobile AR Glasses. Sensors 2020, 20, 4755. [Google Scholar] [CrossRef] [PubMed]
  28. Szajna, A.; Stryjski, R.; Woźniak, W.; Chamier-Gliszczyński, N.; Królikowski, T. The Production quality control process, enhanced with augmented reality glasses and the new generation computing support system. Procedia Comput. Sci. 2020, 176, 3618–3625. [Google Scholar] [CrossRef]
  29. Kostrzewski, M.; Chamier-Gliszczyński, N.; Królikowski, T. Selected reflections on formal modeling in Industry 4.0. Procedia Comput. Sci. 2020, 176, 3293–3300. [Google Scholar] [CrossRef]
Figure 1. Thermal images of the calibration board with thermal diffusion. (a) Thermal image of the calibration board captured at time = 0 s. (b) Thermal image of the calibration board captured at time = 0.2 s. (c) Thermal image of the calibration board captured at time = 0.4 s.
Figure 2. The active thermal-infrared visual measurement system.
Figure 3. Infrared visual measurement system.
Figure 4. Calibration targets. (a) Quadrilateral calibration targets. (b) Polygonal calibration targets.
Figure 5. Calibration of coordinate system.
Figure 6. Edge extraction of thermal images. (a) Traditional method; (b) proposed method.
Figure 7. Parameters calibration.
Figure 8. Position measurement.
Figure 9. Calibration images based on the method of using hollow calibration targets. (a–k) are images captured from different views.
Figure 10. Human-computer interaction interface of defect inspection based on the proposed system.
Table 1. Comparison of measured data with real values.
| Segment | Real Value (mm) | Measured Value (mm) | Absolute Error (mm) | Relative Error (%) |
|---------|-----------------|---------------------|---------------------|--------------------|
| A1B1    | 20.0            | 20.1                | 0.1                 | 0.50               |
| B1C1    | 40.0            | 39.9                | 0.1                 | 0.25               |
| C1D1    | 20.0            | 19.7                | 0.3                 | 1.50               |
| D1A1    | 40.0            | 39.8                | 0.2                 | 0.50               |
| A2B2    | 40.0            | 40.1                | 0.1                 | 0.25               |
| B2C2    | 20.0            | 19.8                | 0.2                 | 1.00               |
| C2D2    | 40.0            | 39.9                | 0.1                 | 0.25               |
| D2A2    | 20.0            | 19.8                | 0.2                 | 1.00               |
| A3B3    | 30.0            | 30.1                | 0.1                 | 0.33               |
| B3C3    | 50.0            | 50.0                | 0.0                 | 0.00               |
| C3D3    | 30.0            | 29.9                | 0.1                 | 0.33               |
| D3A3    | 50.0            | 50.0                | 0.0                 | 0.00               |
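The error columns of Table 1 follow directly from the real and measured segment lengths. As a minimal illustrative sketch (not code from the paper; the helper name `errors` and the values are copied from the table above):

```python
# Recompute Table 1's error columns from a real and a measured length.
# Illustrative sketch only; values are taken from the table above.
def errors(real_mm, measured_mm):
    """Return (absolute error in mm, relative error in % of the real length)."""
    abs_err = abs(measured_mm - real_mm)
    rel_err = 100.0 * abs_err / real_mm
    return round(abs_err, 1), round(rel_err, 2)

# e.g. segment C1D1: real 20.0 mm, measured 19.7 mm
print(errors(20.0, 19.7))  # → (0.3, 1.5)
```

Each row of the table is reproduced this way, e.g. A3B3 gives an absolute error of 0.1 mm and a relative error of 0.33%.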
Table 2. Position measurement results of two methods.
| Index | Point | Needle XW (mm) | Needle YW (mm) | Needle ZW (mm) | Vision XW (mm) | Vision YW (mm) | Vision ZW (mm) | Difference (mm) |
|-------|-------|----------------|----------------|----------------|----------------|----------------|----------------|-----------------|
| 1     | A1    | 511.8          | 403.4          | 256.9          | 511.7          | 404.3          | 256.9          | 0.14            |
|       | A2    | 532.2          | 349.9          | 248.4          | 532.0          | 349.9          | 248.5          | 0.22            |
|       | A3    | 419.1          | 298.5          | 327.3          | 419.3          | 298.4          | 327.2          | 0.24            |
| 2     | A1    | 511.9          | 403.5          | 256.8          | 511.7          | 404.3          | 256.9          | 0.30            |
|       | A2    | 532.3          | 349.7          | 248.3          | 532.1          | 349.9          | 248.4          | 0.30            |
|       | A3    | 419.2          | 298.5          | 327.3          | 419.3          | 298.4          | 327.2          | 0.17            |
| 3     | A1    | 511.8          | 403.5          | 256.9          | 511.7          | 404.3          | 256.9          | 0.22            |
|       | A2    | 532.3          | 349.8          | 248.3          | 532.1          | 349.9          | 248.4          | 0.24            |
|       | A3    | 419.2          | 298.6          | 327.4          | 419.3          | 298.4          | 327.2          | 0.30            |
Table 3. Comparison of measurement results.
| Real Value (mm) | Needle (mm) | Traditional Method (mm) | Proposed Method (mm) | Needle Error (mm) | Traditional Error (mm) | Proposed Error (mm) |
|-----------------|-------------|-------------------------|----------------------|-------------------|------------------------|---------------------|
| 25              | 25.9        | 25.6                    | 25.0                 | 0.9               | 0.6                    | 0.0                 |
| 30              | 30.3        | 30.5                    | 30.1                 | 0.3               | 0.5                    | 0.1                 |
| 35              | 35.2        | 35.4                    | 35.9                 | 0.2               | 0.4                    | 0.1                 |
| 40              | 40.6        | 40.6                    | 40.2                 | 0.6               | 0.6                    | 0.2                 |
| 45              | 45.6        | 45.9                    | 45.0                 | 0.6               | 0.1                    | 0.0                 |
| 50              | 50.4        | 50.3                    | 49.9                 | 0.4               | 0.3                    | 0.1                 |
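A plain mean over the six listed errors in Table 3 summarizes the comparison between the three methods. The following sketch is our own summary, not an analysis from the paper; the error lists are copied from the table above:

```python
# Mean absolute error of each method over the six lengths in Table 3.
# Error values copied from the table; this summary statistic is illustrative.
needle_err   = [0.9, 0.3, 0.2, 0.6, 0.6, 0.4]
trad_err     = [0.6, 0.5, 0.4, 0.6, 0.1, 0.3]
proposed_err = [0.0, 0.1, 0.1, 0.2, 0.0, 0.1]

def mean(xs):
    return sum(xs) / len(xs)

print(round(mean(needle_err), 2),
      round(mean(trad_err), 2),
      round(mean(proposed_err), 2))  # prints: 0.5 0.42 0.08
```

On this summary, the proposed method's mean error (0.08 mm) is roughly a fifth of the traditional method's (0.42 mm).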
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Chen, M.; Tian, S.; He, F.; Fu, Q.; Gu, Q.; Wu, B. Modeling and Calibration of Active Thermal-Infrared Visual System for Industrial HMI. Electronics 2022, 11, 1230. https://doi.org/10.3390/electronics11081230
