Article

Enhancing Data Collection Through Optimization of Laser Line Triangulation Sensor Settings and Positioning

1 Department of Robotics, Faculty of Mechanical Engineering, VSB–Technical University of Ostrava, 70800 Ostrava, Czech Republic
2 Polytechnic Department of Engineering and Architecture, University of Udine, 33100 Udine, Italy
3 Institute of Robotics and Cybernetics, Slovak University of Technology in Bratislava, 812 19 Bratislava, Slovakia
* Author to whom correspondence should be addressed.
Sensors 2025, 25(6), 1772; https://doi.org/10.3390/s25061772
Submission received: 27 January 2025 / Revised: 6 March 2025 / Accepted: 10 March 2025 / Published: 12 March 2025
(This article belongs to the Section Intelligent Sensors)

Abstract

This study proposes a new approach to improving laser sensor data collection through optimised sensor settings. Specifically, it examines the influence of laser sensor configurations on laser scanning measurements obtained by using a laser line triangulation sensor for transparent and non-transparent plastics, as well as aluminium alloys. Distance data were acquired with a three-degree-of-freedom positioning device and the laser sensor under both manual and automatic settings. Measurements were performed at the sensor’s reference distance and across a wide range of positional configurations. The results of extensive experimental tests highlight optimal sensor configurations for various materials and sensor orientations relative to the scanned surface, including both in-plane and out-of-plane angles, to enhance the reliability and accuracy of distance data collection.

1. Introduction

Laser and optical sensors represent a key technology in modern industry, as well as in non-industrial areas, for high-precision distance measurement. Outside of industry, laser sensors are applied in medicine, for example, in measuring the thickness of the cornea during ophthalmological examinations [1]. They are also used in the creation of artificial organs or joints by using additive technology, where laser sensors scan parts of the human body and subsequently create a model of the organ [2].
In geodesy, laser sensors enable accurate terrain mapping and the measurement of distances, slopes, and height differences with high precision and speed. They serve to digitise terrain models or monitor terrain deformations [3,4]. Additionally, laser sensors are essential to the creation of 3D models for virtual reality, the gaming industry, and realistic reconstruction of buildings [5]. Moreover, with the growing popularity and capability of artificial intelligence, the platforms developed for these purposes can generate both structured and unstructured environments [6,7].
Laser sensors have already become a part of agricultural systems, often integrated into mobile robots in modern orchards. These devices monitor and collect data on plants and their fruits, particularly their geometric characteristics [8]. The information obtained provides farmers with important data to plan and manage key operations, such as irrigation, fertilisation, plant pruning, and prevention of disease spread to other plants [9].
In industry, laser scanning is widely used for various applications. These include object localisation for bin-picking applications [10], product inspection, and accurate high-speed dimensional measurement [11]. It is also essential to reverse engineering, where detailed digital models of existing objects are created for analysis and modifications [12]. Additionally, laser scanning is used to monitor manufacturing processes, such as the application of adhesive or sealant layers; inspect welds [13]; and ensure adherence to technological procedures. Another important use is in additive manufacturing, where it is used to monitor the application of material layers [14,15] and check the general condition of an object, including quality control during printing [16].
Laser sensors enable contactless measurement, preventing sensor wear and avoiding damage to the measured object. They provide high precision and, most importantly, fast measurement, which is crucial to meeting the increasingly demanding conditions of production line cycles. Another key advantage is their ability to measure distances or other control variables even in hard-to-reach areas, which is often problematic for contact sensors. Laser sensors also enable the creation of large point clouds, aiding in the digitisation of undocumented spaces for subsequent analysis [17] or the digitisation of unstructured environments [18].
In industrial environments, triangulation laser sensors (1D, 2D, or 3D) are often preferred over contact sensors for several reasons. These sensors operate on the principle of triangulation [19]. They project a laser beam towards the target and capture the reflected beam by using light-sensitive chips, such as Complementary Metal-Oxide-Semiconductor (CMOS) or Charge-Coupled Device (CCD) sensors. The angle at which the reflected beam is captured is then analysed to calculate the distance from the target; see Figure 1.
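The triangulation geometry can be illustrated with a short computation. The following Python sketch assumes a simplified two-dimensional model with a known baseline between the emitter and detector; the function and its angle parameters are illustrative assumptions, not the LJ-X8080's internal factory calibration.

```python
import math

def triangulation_distance(baseline_mm, emit_angle_deg, detect_angle_deg):
    """Perpendicular distance from the emitter-detector baseline to the
    target, solved from the triangle formed by the emitted and reflected
    rays (simplified 2D model, not the sensor's factory calibration)."""
    a = math.radians(emit_angle_deg)    # emitted ray vs. baseline
    b = math.radians(detect_angle_deg)  # reflected ray vs. baseline
    t = math.pi - a - b                 # angle at the target point
    # Law of sines gives the emitter-to-target range; projecting it onto
    # the baseline normal yields the measured distance.
    emitter_to_target = baseline_mm * math.sin(b) / math.sin(t)
    return emitter_to_target * math.sin(a)

# A perpendicular emission (90 deg) detected at 45 deg over a 100 mm
# baseline places the target 100 mm from the baseline.
print(triangulation_distance(100.0, 90.0, 45.0))
```

The same relation explains why the detected angle shifts measurably with distance, which is the basis of the sensor's resolution.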
However, laser triangulation sensors have certain limitations that need to be considered. One of the main constraints is the potential for laser interference from external factors, such as ambient light sources or reflections [20]. These interference phenomena can lead to measurement errors or even complete sensor failure. For instance, a red laser cannot be used to scan red-hot steel because the measured object emits light at the same or similar wavelength (625 to 740 nm). This limitation can be overcome by using a laser with a different wavelength; blue lasers with a wavelength of 405 nm are often used [21]. Another limitation is an operating environment where heavy dust, fog, or smoke is present. Under these harsh conditions, laser sensors may be unusable due to reduced transmission of the laser beam. The geometry of the measured object also poses a challenge. Because of the principle of triangulation, incomplete scanning of the object can occur, leading to so-called blind spots.
One of the main limitations is the sensor’s sensitivity to surfaces with varying reflectivity. When scanning objects with different materials or surface finishes, incorrect measurements or insufficient data recording can occur. However, this problem is partially mitigated by High Dynamic Range (HDR), which allows for the automatic adjustment of the laser beam power depending on the surface reflectivity [22].
Reflectivity is the ability of a material to reflect light, which depends on its optical properties. There are two main types of light beam reflection: diffuse and specular reflection [23], as shown in Figure 2a. Both types of reflection can exist simultaneously, but one usually dominates. In laser scanning, diffuse reflection is the preferred type. A surface with a high light-scattering index is called a Lambertian surface [24]. Specular reflection is dominant in shiny and exceptionally smooth materials, whereas matte materials are characterised by the predominance of diffuse reflection.
Beyond these traditional types of reflection, it is worth mentioning metasurfaces—engineered two-dimensional structures capable of manipulating light in unprecedented ways. By controlling the phase, amplitude, and polarisation of light at the subwavelength scale, metasurfaces enable phenomena like anomalous reflection, where the reflection angle can deviate from the conventional laws of reflection. This effect has been demonstrated in all-dielectric gradient metasurfaces for visible light [25] and in broadband metasurfaces optimised for efficient reflection [26]. These capabilities offer new possibilities for advanced optical device design and precise control over light propagation.
Digitising almost mirror-like and highly reflective surfaces continues to be a challenge, despite significant progress in this field. The emitted light reflects off these surfaces in a mirror-like manner, often away from the CCD/CMOS chip, making it undetectable to the sensor. However, it is possible to scan these surfaces under certain conditions, as demonstrated in our previous study [27].
When light interacts with transparent materials, it also encounters the effects of light refraction. Upon striking a transparent surface, a portion of the light beam is absorbed, while the rest splits; some reflects off the surface, and the remainder travels through the transparent medium, such as water or glass, as depicted in Figure 2b. This phenomenon results in the detection of multiple reflections, typically from the top and bottom surfaces. However, this effect can be exploited to measure the thickness of the material, or it can be mitigated by using masks or a special light spectrum [28].
In general, these phenomena can be minimised by applying a layer of matte material, which dulls the surface and improves diffuse light scattering into the surroundings. However, this approach may not always be feasible, especially in the automotive industry, where the application of materials to visible parts such as headlights is prohibited. Therefore, it is important to address this issue.
In our previous research study [27], we examined the effect of the angle of incidence (AoI) of a laser beam on its reflected intensity and proposed recommendations for sensor positioning relative to the object to improve the reliability of data collection. These recommendations, along with the properties of different materials, were tested in the study [29], which included a comparison of virtual and real scanning. This approach was suitable for flat surfaces, assuming that the sensor was placed perpendicular to the surface or at an out-of-plane angle (refer to Figure 3a). It was primarily used to measure geometric attributes such as radius, angles, gaps, and ovality. However, these characteristics are no longer valid when measuring at in-plane angles (see Figure 3b) or in combination with an out-of-plane angle. This is especially relevant for the comprehensive scanning of entire objects to generate a detailed point cloud, where the sensor positioning must be adjusted or where the component is geometrically complex.
It should be noted that sensor orientation has an impact on the quality of the resulting point cloud [30]. However, systematic and random errors resulting from positioning at in-plane and out-of-plane angles can be eliminated by using the laser line triangulation (LLT) sensor error model, as presented in [31]. Furthermore, in our previous research study [27], we focused on ensuring reliable data acquisition with laser triangulation scanners by analysing the intensity of the reflected laser. The optimal scanning position was identified as the one that yields the highest possible intensity, excluding areas affected by specular reflection. However, the reliance on the reflected intensity posed limitations due to the sensor’s saturation threshold. When this threshold is exceeded, the sensor registers a maximum intensity value, even though the actual intensity might be significantly higher, resulting in data loss—a phenomenon analogous to overexposure in photography. To address these limitations, we have changed our approach in the current study. Instead of relying on the reflected intensity, we evaluate the number of detected points in various scanning positions and their associated accuracy. This approach provides a more robust framework for analysing data reliability across different materials and surface conditions.
In this article, we emphasise the importance of the sensor’s positioning in relation to the scanned surface to ensure reliable data collection. Specifically, we highlight how the sensor’s parameters and its orientation relative to the scanned object influence both the number of detected points and the accuracy of the measurements. The main contributions of this work lie in presenting a systematic evaluation of these effects and providing insights into optimal sensor configurations for both common and challenging materials. Moreover, our approach offers significant practical advantages for industrial applications. By identifying the optimal positioning of the LLT sensor to ensure reliable data collection, the design and deployment of scanning workstations can be accelerated while reducing costs. Knowing the ideal sensor configuration eliminates the need for extensive testing and experimentation with different hardware setups. A key advantage of our method is that it works purely with the sensor itself, without requiring additional investments in new hardware or software. By positioning and orienting the sensor correctly and setting the right parameters, users can achieve more accurate and reliable measurement results. Existing workstations can also be adjusted to achieve even better measurement performance, enhancing their efficiency and data quality.

2. Methodology

In our previous research study [27], a UR10e robot was used for sensor positioning to collect data, which was sufficient for positioning at out-of-plane angles. However, due to its kinematic structure, dynamic effects, and the construction of its joints, vibrations occurred after the robot stopped. This required a longer waiting period for the system to stabilise when changing positions. Additionally, as the joints heated up, the positioning accuracy decreased, as noted in [32]. Although temperature effects could be compensated for, as also demonstrated in [32], this compensation would be significantly time-consuming for complex motions. To address these challenges and improve movement variability, a three-degree-of-freedom positioning device was developed.
The positioning device allows the sample to tilt around the X-axis (roll angle) and the Y-axis (pitch angle) and move linearly along the Z-axis, as illustrated in Figure 4a, which represents a top view of the sensor. The sample rotates around point M, which is defined by the intersection of the X- and Y-axes. For data collection, the sample was gradually tilted around the X-axis from −70° to +40°, as indicated in Figure 4b, which represents a side view of the sensor. Subsequently, the sample was tilted by a specified step around the Y-axis, and the measurements were repeated around the X-axis. This process was repeated until measurements were taken at all positions around the Y-axis from −40° to +45°, as shown in Figure 4c, which represents a front view of the sensor. The step between positions was 1°; therefore, the data for one sample were collected from a total of 9546 positions.

3. Experiment Setup and Data Collection

The positioning device (shown in Figure 5a) allowed the samples to tilt at two angles—roll and pitch—and allowed for the adjustment of the sample’s distance from the sensor. For the measurements, we used a Keyence LJ-X8080 LLT sensor [33]. This sensor features a blue laser with a wavelength of 405 nm, which reduces interference from ambient light. Additional sensor parameters are provided in Table 1.

3.1. Experimental Workplace

The positioning device was assembled from aluminium profiles and standard structural components, with custom-designed parts produced through additive manufacturing. These parts were modified to improve their resistance to temperature, as described in [34]. The total weight of the system, including the sensor (1.1 kg), was 10.5 kg. The individual axes were driven by stepper motors. To calibrate the positioning device, two LLT sensors were used to measure the tilt in each axis, using the manufacturer’s software (Terminal Software v 1.2.0). This calibration ensured that the platform holding the sample was set to a default position perpendicular to the projected laser beam. The measuring station is shown in Figure 5a. The Z-axis was adjusted for each sample according to its thickness. The sensor measures the distance of individual points relative to the base of the sensor, which is shown in Figure 5b.
The positioning device was controlled through a custom-developed application running on the control computer. This application, written in C#, provided a user interface and managed real-time command execution and data acquisition from the device. The positioning device was connected to the control computer via USB. When a command was sent to the positioning device, the control unit provided feedback once the movement was completed. Subsequently, a command was sent to the sensor control unit to initiate data collection. The sensor control unit communicated with the computer through Ethernet. The connection diagram is shown in Figure 6.
The measured samples had varying dimensions and surface roughness, as shown in Table 2. In the zero position, with the roll and pitch angles set to 0°, the sensor was placed at a distance of 73 mm from each sample. This reference distance was specified by the sensor manufacturer and represents the midpoint of the sensor’s measurement range along the Z-axis.
The samples chosen for this experiment are commonly used materials in the automotive industry. These samples include transparent acrylic plastics in red, orange, and clear colours (Figure 7a), black and grey plastic sheets (Figure 7b), and aluminium alloy samples with different levels of roughness after face milling (Figure 7c).

3.2. Data Collection

After the sample was inserted, the positioning device was calibrated with the aid of two LLT sensors, which measured the sample’s tilt and ensured precise alignment of the sample in the positioning device. The main LLT sensor (as seen in Figure 5a) measured the tilt of the platform around the X-axis and the distance from the sample along the Z-axis. The auxiliary LLT sensor measured the tilt around the Y-axis. Following calibration, the auxiliary sensor was removed to enable unrestricted movement of the positioning device during measurement. The main sensor projected a laser line onto the sample, aligned with the axis of rotation around the X-axis. The rotation around the Y-axis was perpendicular to the X-axis and passed through the midpoint of the X-range, corresponding to the central point M of the positioning device, as illustrated in Figure 5a.
The measurement sequence began by setting the sample’s tilt angle around the Y-axis within a range of −40° to +45°. Subsequently, the tilt angles around the X-axis were adjusted within a range of −70° to +40°. At each position, a waiting period was allotted for the sample to stabilise after completing its movement. After stabilisation, profile data (points) were collected for further processing. This procedure was repeated until all positions were scanned, resulting in a complete dataset. The measurement process is detailed in Algorithm 1.
Algorithm 1. The measurement process.
for i = −40 to 45    //i represents the pitch angle (rotation around the Y-axis)
    SetPitchPose (i)
    for j = −70 to 40    //j represents the roll angle (rotation around the X-axis)
        SetRollPose (j)
        WaitForStabilization ()
        CollectProfileData ()
    end for
end for
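For readers implementing a similar station, Algorithm 1 can be sketched as a pair of nested loops with stubbed hardware calls. The function names mirror the pseudocode above but are hypothetical stand-ins for the actual USB motion commands and Ethernet profile requests.

```python
def set_pitch_pose(deg):       # rotate the sample around the Y-axis (stub)
    pass

def set_roll_pose(deg):        # rotate the sample around the X-axis (stub)
    pass

def wait_for_stabilization():  # settle time after each move (stub)
    pass

def collect_profile_data():    # one profile of Z distances from the sensor (stub)
    return []

def run_measurement():
    """Scan every 1-degree pitch/roll combination, as in Algorithm 1."""
    dataset = {}
    for pitch in range(-40, 46):      # pitch angle around the Y-axis
        set_pitch_pose(pitch)
        for roll in range(-70, 41):   # roll angle around the X-axis
            set_roll_pose(roll)
            wait_for_stabilization()
            dataset[(pitch, roll)] = collect_profile_data()
    return dataset

# 86 pitch values x 111 roll values = 9546 positions, matching the text.
print(len(run_measurement()))  # 9546
```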
Some sensors require warming up to their operating temperature to ensure accurate measurements. Although this sensor does not require preheating, it was allowed to warm up prior to data collection. For this sensor, the operational temperature did not affect the number of captured points. However, it did influence the accuracy of the entire profile. The accuracy was measured at a single position with multiple scans, immediately after the sensor was powered on, and subsequently at 30 min intervals up to 90 min. After 90 min of operation, a deviation of ±0.016 mm was observed when comparing scans taken at room temperature and after the warm-up period. While this level of deviation may be negligible for some applications, it could become critical in scenarios that demand high precision.
Data were collected by using both manual and automatic sensor settings, with individual parameters configured before each measurement, adjusted to the specific sample and measurement type. In manual mode, the parameters were set based on the user’s observation and estimation. In automatic mode, the sensor applied the “Sensitivity adjustment” function to determine the optimal settings. An overview of the configurable parameters and their possible values is provided in Table 3, based on the sensor datasheet. The specific parameter values used for each sample and type of measurement are listed in Table 4. When using automatic adjustment, the laser power was modified during measurements in response to the reflected beam intensity.

4. Experimental Results

All samples used in this experiment were flat, resulting in an expected point profile that should form a straight line. However, due to overexposure, noise, and material properties, scanning inaccuracies can occur. These inaccuracies cause some points to deviate significantly along the Z-axis, resulting in a noisy profile. To evaluate the accuracy of the scanned profiles, we employed linear regression, as the expected profile should be linear.
An iterative approach was used to refine the profile. In each iteration, the most distant point outside the defined accuracy threshold was removed. The regression equation was then recalculated, and the profile was reevaluated in subsequent iterations without the removed points. The accuracy threshold was set to 0.1 mm (±0.05 mm) along the Z-axis of the regression line. The final accuracy of the profile was determined by the number of points that remained within this threshold. The iterative procedure to calculate the accuracy of the profile is shown in Figure 8a–c.
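The iterative refinement can be sketched as follows. This is an illustrative reconstruction of the described procedure, not the authors' exact implementation, and assumes the profile is given as X/Z coordinate arrays in millimetres.

```python
import numpy as np

def profile_accuracy(x, z, threshold=0.05):
    """Iteratively fit a line to the profile and drop the single most
    distant point until all remaining points lie within +/-threshold
    (mm) of the regression line; return the fraction of points kept."""
    x, z = np.asarray(x, float), np.asarray(z, float)
    n_total = len(z)
    keep = np.ones(n_total, dtype=bool)
    while keep.sum() >= 2:
        slope, intercept = np.polyfit(x[keep], z[keep], 1)
        residuals = np.abs(z - (slope * x + intercept))
        # Most distant point among those still kept.
        worst = np.argmax(np.where(keep, residuals, -np.inf))
        if residuals[worst] <= threshold:
            break  # all remaining points are inside the band
        keep[worst] = False  # remove the outlier and refit
    return keep.sum() / n_total

# Example: a straight profile with two points outside the +/-0.05 mm band.
x = np.arange(10.0)
z = 0.01 * x
z[3] += 0.4
z[7] -= 0.3
print(profile_accuracy(x, z))  # 8 of 10 points remain within the band
```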
The data for each sample are visualised by using two heat maps, where each pixel represents a discrete position in which the measurement (scan) was performed, for a total of 9546 positions.
The first heat map illustrates the number of detected points. The pixel intensity is represented in shades of grey, where black indicates 0% of the maximum detectable points and nearly white represents 100%. For example, a measurement detecting 2915 points out of a maximum of 3200 points would correspond to a grey level reflecting approximately 91% (see Figure 9a).
The second heat map depicts the accuracy of the profile. Again, the intensity of pixels is expressed in shades of grey, but it represents the number of points in the profile that fall within the specified accuracy threshold. For instance, if 2184 out of 2915 points meet the accuracy criterion of ±0.05 mm, the intensity of pixels reflects this proportion (approximately 75%; see Figure 9b).
These heat maps provide a comprehensive visual comparison of scanner performance in terms of both data completeness and accuracy for various positions and material samples. The intensity scale is divided into 10% increments in both heat maps, as shown in Figure 9a,b. As demonstrated by this example, a profile with a large number of points does not necessarily guarantee reliable data.
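The mapping from point counts to grey levels can be sketched as below; the binning function is a hypothetical illustration of the 10% increments described above, using the worked counts from the text.

```python
def grey_level(count, maximum):
    """Map a point count to the lower bound of its 10% intensity bin."""
    pct = 100.0 * count / maximum
    return int(pct // 10) * 10

# First heat map: 2915 of 3200 points detected (~91%) -> 90-100% bin.
print(grey_level(2915, 3200))
# Second heat map: 2184 of 2915 points within +/-0.05 mm (~75%) -> 70-80% bin.
print(grey_level(2184, 2915))
```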

4.1. Non-Transparent Plastics

Measurement results are presented by using heat maps, as shown in Figure 10, to illustrate the outcomes of different sensor settings. The red axis represents the roll with a tilt value of 0°, and the green axis corresponds to the pitch with a tilt value of 0°. The orange rectangle highlights the area with mirror-like reflections. The pixels in the heat maps represent individual positions where measurements were conducted, corresponding to specific combinations of roll and pitch angles.
At first glance, the results obtained by using manual and automatic sensor settings for the non-transparent black plastic appear nearly identical. However, a closer analysis reveals notable differences. In certain areas, the automatic sensor settings resulted in a slight loss of detected points, as indicated by darker regions in the heat maps (Figure 10c). Furthermore, the accuracy of the profile was higher when the sensor was manually configured (Figure 10b). This is evident from the presence of grey pixels in the graphs corresponding to the automatic settings (Figure 10d), which indicate that a small number of points exceeded the specified accuracy threshold. The automatic settings dynamically adjusted the laser power, which may have resulted in data loss or reduced precision due to light absorption by the black surface.
The heat maps for the non-transparent grey plastic demonstrate that both manual and automatic sensor settings performed similarly in terms of the number of captured points. Figure 11a,c show nearly identical results, with most positions reaching near-maximum detection levels, as indicated by the predominantly light shading across the maps. This consistency suggests that the material properties of the grey plastic did not significantly challenge the scanner’s ability to detect points, regardless of the sensor settings used.
However, in terms of accuracy, distinct differences can be observed between the two configurations. Manual settings (Figure 11b) yielded a large light-shaded area, indicating good accuracy across a significant portion of the measured positions. Notably, the automatic settings (Figure 11d) outperformed the manual settings, with an even larger light-shaded region, particularly in the upper part of the map. This indicates that the profiles measured under automatic settings were more precise in these positions.

4.2. Aluminium Alloys

Despite the varying surface roughness of the aluminium alloy samples, the resulting heat maps were very similar across all samples.
Regarding the number of points captured by using manual sensor settings, it was observed that fewer points were detected in the specular reflection region. This can be attributed to overexposure of the image, which caused data loss. Additionally, black spots indicating a low number of points were visible in the lower corners of the graphs, corresponding to extreme positive roll values (approximately 40°) and extreme pitch values (−40° and +45°), as shown in Figure 12a, Figure 13a, Figure 14a, Figure 15a and Figure 16a. For the sample with a surface roughness of Ra 0.8, this region was the largest, as seen in Figure 12a. In these positions, due to reduced sensor power, no light (reflected laser) returned to the sensor, resulting in minimal or no point detection. This was particularly evident for the Ra 0.8 sample, where the smooth surface caused the laser to reflect outside the sensor’s field of view. For samples with higher roughness, light was reflected to the sensor more effectively.
With automatic sensor settings, more points were generally captured, as illustrated in Figure 12c, Figure 13c, Figure 14c, Figure 15c and Figure 16c. Heat maps for all samples showed nearly 90% or more captured points. In the upper region of the graphs (roll of approximately −70° to −40°), the profile coverage was slightly lower, around 80% or more. However, in this region, manual settings detected more points (approximately 90%). In the specular reflection region, the profile coverage was also 80% or higher, while manual settings exhibited significant point loss in this region. For the Ra 0.8 sample, point loss occurred in the Roll 40° and Pitch 45° region (lower right corner), as shown in Figure 12c. This can be explained by the same phenomenon as observed with manual settings (a smooth surface causing the light to reflect away from the sensor).
In terms of accuracy, within the pitch range of ±17°, the accuracy was nearly 100%. However, with the increase in tilt toward the positive or negative extremes, the accuracy decreased sharply, as shown in Figure 12b,d, Figure 13b,d, Figure 14b,d, Figure 15b,d and Figure 16b,d. Notably, low accuracy was observed with manual settings in the specular reflection regions, as shown in Figure 12b, Figure 13b, Figure 14b, Figure 15b and Figure 16b. In contrast, the accuracy was significantly better with automatic settings. For samples with Ra 1.6 and Ra 3.2, the profile accuracy in the specular reflection region was consistent with the accuracy in the rest of the measured positions, as seen in Figure 13d and Figure 14d.

4.3. Transparent Plastics

The differences between manual and automatic sensor settings were particularly pronounced when scanning the clear plastic sample. When using manual settings, the detected points were concentrated in a significantly smaller region, primarily around the area of specular reflections, as shown in Figure 17a. Outside this region, no points were detected, likely due to the fixed intensity of the laser, which could not adapt to the high transparency of the material. In contrast, automatic settings resulted in a much broader distribution of detected points, covering nearly the entire range of positions except for the extreme corners and the region of specular reflections, as indicated by darker pixels in Figure 17c.
In terms of accuracy, the manual settings exhibited relatively stable accuracy within the narrow vicinity of the specular reflection zone. However, further away from this zone, noticeable fluctuations in accuracy were observed, as shown in Figure 17b. These deviations are likely attributable to the properties of the material itself. Transparent plastics are not always perfectly homogeneous; internal defects or structural inconsistencies, invisible to the human eye, may refract or redirect the laser beam. This can lead to noise in the captured profile and reduce accuracy.
With automatic settings, the accuracy followed a trend similar to that observed with aluminium samples. Within the pitch range of ±17°, the accuracy was almost 100% but decreased rapidly with the increase in tilt angles in both the positive and negative directions. Additionally, the accuracy dropped at a roll angle of approximately 40°. A pronounced decrease in accuracy was also evident in the region of specular reflections, as shown in Figure 17d.
For the red and orange transparent plastic samples, the number of detected points with manual sensor settings was very similar for both materials. Points were detected only around the specular reflection region, as shown in Figure 18a and Figure 19a. However, this detected area was slightly smaller than that observed with clear plastic. Automatic settings, while providing a larger detection range compared with manual settings (Figure 18c and Figure 19c), still showed a smaller area than that achieved with automatic settings for clear plastic. Additionally, in both materials, point loss was observed in the specular reflection region.
In terms of accuracy, the manual sensor settings demonstrated high precision (90–100%), as depicted in Figure 18b and Figure 19b. The accuracy achieved with the automatic settings was similar to that seen with the clear plastic sample. Within the pitch angle range of ±17°, the accuracy was close to 100% but dropped significantly with the increase in tilt angles in either direction. A notable decrease in accuracy was also evident in the specular reflection region, as shown in Figure 18d and Figure 19d.
In the case of transparent materials, light refraction could cause the detection of both top and bottom edges, as shown in Figure 20a. When scanning transparent materials, it is important to establish a mask or preferred reflection (either top or bottom), as seen in Figure 20b. The high laser power not only makes the entire profile visible but also causes significant overexposure of the image (Figure 20a). Reducing power helped partially eliminate overexposure. However, the main part of the image was still overexposed, resulting in an inaccurate profile, with step changes in the profile, as shown in Figure 20b. Smoothing the profile can be achieved by reducing the exposure time, as shown in Figure 20c. However, in the specular area, this may lead to reduced intensity at the boundary points of the profile and to the detection of points from the reflection of light from the bottom surface.
By adjusting the exposure time, it was possible to stabilise the central portion of the profile, as illustrated in Figure 21a–d. The figure shows data obtained from clear plastic in specular reflection (roll of −17° and pitch of 0°) with exposure times ranging from 240 μs to 30 μs. Trimming the profile by 20% on both sides helps to avoid capturing noise and allows for data collection with a stable profile.
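The 20% trimming described above can be expressed as a minimal Python sketch (illustrative only; the acquisition software used in this study is not published with the article):

```python
import numpy as np

def trim_profile(profile, trim_fraction=0.2):
    """Discard a fraction of points from each end of a laser line profile.

    Points near the profile edges in the specular region tend to be noisy,
    so keeping only the central portion yields a more stable profile.
    """
    n = len(profile)
    k = int(n * trim_fraction)  # number of points dropped on each side
    return profile[k:n - k]

# Example: a 3200-point profile (the LJ-X8080 reports 3200 points per line)
profile = np.linspace(0.0, 1.0, 3200)
central = trim_profile(profile)
print(len(central))  # 1920 points remain after trimming 20% per side
```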
For applications requiring accuracy within hundredths of a millimetre, it is crucial to account for the scanner's warm-up time to reach its operational temperature. Both the manual and automatic parameter settings of the scanner (see Table 3) share a common limitation: when the sensor's pitch angle exceeds 17° or falls below −17°, the number of detected points may remain consistent, but the profile shows increased noise and inaccuracy. It is therefore advisable to keep the sensor's in-plane angle within the range [−17°, 17°], as larger tilt angles resulted in a significant deterioration of profile accuracy.
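A simple validity check based on this recommendation might look as follows (a sketch; the ±17° limit is the empirical value reported above, and the helper name is ours):

```python
def pitch_within_reliable_range(pitch_deg, limit_deg=17.0):
    """True if the in-plane (pitch) angle lies inside the range in which
    profile accuracy remained close to 100% in our measurements."""
    return -limit_deg <= pitch_deg <= limit_deg
```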
Automatic parameter adjustment is more suitable for scanning larger areas, such as when the scanner is mounted on an industrial robot and must operate across multiple angles. However, for scanning specific regions, manual parameter settings provide better precision and control. In manual scanning, the number of detected points can be increased by extending the exposure time, though this comes with the risk of increased noise; conversely, shorter exposure times reduce noise but may capture fewer points. For transparent materials, an exposure time of 120 μs was suitable in areas of specular reflections and 240 μs outside those areas, while 480 μs was optimal for non-transparent materials.
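These empirical exposure-time recommendations can be summarised as a small selection helper (illustrative only; the function name and arguments are ours, not part of the sensor API):

```python
def suggest_exposure_us(transparent, specular_region):
    """Return the exposure time (in microseconds) found suitable in this
    study for a given material class and scanning region."""
    if transparent:
        # 120 us inside specular reflection areas, 240 us elsewhere
        return 120 if specular_region else 240
    return 480  # non-transparent plastics and aluminium alloys
```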

5. Conclusions

Triangulation laser sensors are used in industry for the precise measurement of distances and object positions. They operate on the principle of laser beam reflection and analysis of the reflected light. These sensors are used in quality control and automated manufacturing processes, increasing efficiency and reducing errors. However, it is essential to obtain reliable data from the sensor to achieve accurate results.
In this article, we focused on enhancing the reliability of data collection when scanning with an LLT sensor at in-plane and out-of-plane angles. The Keyence LLT sensor LJ-X8080 was used for measurements. The positioning device was designed to allow for the precise tilting of samples and data acquisition at various positions.
For non-transparent black and grey plastics, the results demonstrate that both manual and automatic sensor settings can achieve comparable completeness in point detection. However, the automatic settings provided better profile accuracy.
Across samples of aluminium with varying roughness, similar trends were observed in heat maps. Manual settings struggled with point detection in the specular reflection area. Automatic settings captured more points overall and provided better accuracy in these regions.
The scanning of transparent plastics posed unique challenges due to light refraction. For clear plastic, manual settings detected points only near the specular reflection zone, whereas automatic settings captured points across a broader area, though with limitations in extreme positions and regions of specular reflection. For red and orange transparent plastics, similar trends were observed, with manual settings providing high accuracy but limited coverage and automatic settings expanding the detection area while maintaining comparable accuracy.
Manual settings are suitable for tasks that require controlled conditions and high precision in limited regions, such as the specular reflection area. Automatic settings are advantageous for broader applications, offering greater adaptability to challenging surface properties and improving data reliability in less controlled environments.
While the proposed optimisation approach provides valuable insights, it also has certain limitations. First, it is highly dependent on the specific type and geometry of the LLT sensor used. Variations in sensor design, such as differences in triangulation base and camera angle, could lead to a different optimal positioning, requiring additional testing and adjustment. Second, the study is limited to a specific set of materials, and the sensor’s performance may differ significantly on surfaces with other optical or structural properties. Establishing a comprehensive digital database of results for various materials would be essential to generalising the approach. Third, practical deployment of the optimised sensor configuration may face geometric constraints in industrial environments, where physical obstructions or existing equipment can limit the feasible positioning of the sensor, leading to necessary compromises between accuracy and practicality.
Future research could delve into the detailed parameters required for scanning transparent materials and their relative positioning to the sensor. Furthermore, the scope of this methodology could be extended to semi-transparent materials. Another area for potential research is the dynamic optimisation of sensor settings and parameters, such as laser power and exposure time, using feedback from the sensor. These findings could also be applied to digital twins of sensors [35,36] or even entire technological processes [37].

Author Contributions

Conceptualisation, D.H. and T.K.; methodology, D.H. and T.K.; software, D.H.; validation, D.H., J.C. and J.M.; formal analysis, Z.Z., L.S. and M.D.; investigation, D.H., J.C. and J.M.; resources, T.K. and Z.B.; data curation, D.H. and Z.Z.; writing—original draft preparation, D.H.; writing—review and editing, J.C., J.M., L.S. and M.D.; visualisation, D.H. and J.C.; supervision, T.K.; project administration, Z.B.; funding acquisition, Z.B. All authors have read and agreed to the published version of the manuscript.

Funding

This article was co-funded by the European Union under REFRESH (Research Excellence For REgion Sustainability and High-tech Industries) project number CZ.10.03.01/00/22_003/0000048 via the Operational Programme Just Transition and by specific research project SP2025/042, financed by the state budget of the Czech Republic. This paper was also created as part of project No. CZ.02.01.01/00/22_008/0004631, Materials and technologies for sustainable development within the Jan Amos Komensky Operational Program, financed by the European Union and from the state budget of the Czech Republic.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Camburu, G.; Zemba, M.; Tătaru, C.P.; Purcărea, V.L. The measurement of Central Corneal Thickness. Rom. J. Ophthalmol. 2023, 67, 168–174. [Google Scholar] [CrossRef] [PubMed]
  2. Kaushik, A.; Gahletia, S.; Garg, R.K.; Sharma, P.; Chhabra, D.; Yadav, M. Advanced 3D body scanning techniques and its clinical applications. In Proceedings of the 2022 International Conference on Computational Modelling, Simulation and Optimization (ICCMSO), Pathum Thani, Thailand, 23–25 December 2022; pp. 352–358. [Google Scholar] [CrossRef]
  3. Jiang, N.; Li, H.-B.; Li, C.-J.; Xiao, H.-X.; Zhou, J.-W. A Fusion Method Using Terrestrial Laser Scanning and Unmanned Aerial Vehicle Photogrammetry for Landslide Deformation Monitoring Under Complex Terrain Conditions. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–14. [Google Scholar] [CrossRef]
  4. Blair, J.B.; Rabine, D.L.; Hofton, M.A. The Laser Vegetation Imaging Sensor: A medium-altitude, digitisation-only, airborne laser altimeter for mapping vegetation and topography. ISPRS J. Photogramm. Remote Sens. 1999, 54, 115–122. [Google Scholar] [CrossRef]
  5. Andaru, R.; Cahyono, B.K.; Riyadi, G.; Istarno; Djurdjani; Ramadhan, G.R.; Tuntas, S. The Combination of Terrestrial Lidar and UAV Photogrammetry for Interactive Architectural Heritage Visualization Using Unity 3D Game Engine. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, XLII-2/W17, 39–44. [Google Scholar] [CrossRef]
  6. López, A.; Ogayar, C.J.; Jurado, J.M.; Feito, F.R. A GPU-Accelerated Framework for Simulating LiDAR Scanning. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–18. [Google Scholar] [CrossRef]
  7. Gan, C.; Schwartz, J.; Alter, S.; Schrimpf, M.; Traer, J.; De Freitas, J.; Kubilius, J.; Bhandwaldar, A.; Haber, N.; Sano, M.; et al. 3D World: A Platform for Interactive Multi-Modal Physical Simulation. In Proceedings of the Annual Conference on Neural Information Processing Systems, Online, 6 December 2021; Available online: https://research.ibm.com/publications/threedworld-a-platform-for-interactive-multi-modal-physical-simulation (accessed on 20 May 2024).
  8. Dupuis, J.; Holst, C.; Kuhlmann, H. Laser Scanning Based Growth Analysis of Plants as a New Challenge for Deformation Monitoring. J. Appl. Geodesy 2016, 10, 37–44. [Google Scholar] [CrossRef]
  9. Williams, S.R.; Baniya, A.A.; Islam, M.S.; Murphy, K. A Data Ecosystem for Orchard Research and Early Fruit Traceability. Horticulturae 2023, 9, 1013. [Google Scholar] [CrossRef]
  10. Rofalis, N.; Olesen, A.S.; Jakobsen, M.L.; Krüger, V. Fast Visual Part Inspection for Bin Picking. In Proceedings of the 2016 IEEE International Conference on Imaging Systems and Techniques (IST), Chania, Greece, 4–6 October 2016; pp. 412–417. [Google Scholar] [CrossRef]
  11. Phan, N.D.M.; Quinsat, Y.; Lartigue, C. Optimal scanning strategy for on-machine inspection with laser-plane sensor. Int. J. Adv. Manuf. Technol. 2019, 103, 4563–4576. [Google Scholar] [CrossRef]
  12. Yu, Z.; Wang, T.; Wang, P.; Tian, Y.; Li, H. Rapid and Precise Reverse Engineering of Complex Geometry Through Multi-Sensor Data Fusion. IEEE Access 2019, 7, 165793–165813. [Google Scholar] [CrossRef]
  13. Bao-qing, D.; Yu-shu, W. Research on Welding Seam Tracking Technology Based on Linear Laser CCD Robot. In Proceedings of the 2019 3rd International Conference on Robotics and Automation Sciences (ICRAS), Wuhan, China, 1–3 June 2019; pp. 54–57. [Google Scholar] [CrossRef]
  14. Vandone, A.; Baraldo, S.; Valente, A.; Mazzucato, F. Vision-based melt pool monitoring system setup for additive manufacturing. Procedia CIRP 2019, 81, 747–752. [Google Scholar] [CrossRef]
  15. Delli, U.; Chang, S. Automated Process Monitoring in 3D Printing Using Supervised Machine Learning. Procedia Manuf. 2018, 26, 865–870. [Google Scholar] [CrossRef]
  16. Senthilnathan, S.; Raphael, B. Using Computer Vision for Monitoring the Quality of 3D-Printed Concrete Structures. Sustainability 2022, 14, 15682. [Google Scholar] [CrossRef]
  17. Castellazzi, G.; D’Altri, A.M.; Bitelli, G.; Selvaggi, I.; Lambertini, A. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure. Sensors 2015, 15, 18360–18380. [Google Scholar] [CrossRef] [PubMed]
  18. Tallavajhula, A.; Meriçli, Ç.; Kelly, A. Off-Road Lidar Simulation with Data-Driven Terrain Primitives. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, QLD, Australia, 21–25 May 2018; pp. 7470–7477. [Google Scholar] [CrossRef]
  19. Bickel, G.; Häusler, G.; Maul, M. Triangulation with expanded range of depth. Opt. Eng. 1985, 24, 975–977. [Google Scholar] [CrossRef]
  20. Buzinski, M.; Levine, A.; Stevenson, W.H. Laser Triangulation Range Sensors: A Study of Performance Limitations. In International Congress on Applications of Lasers & Electro-Optics; AIP Publishing: Melville, NY, USA, 1991; pp. 1–15. [Google Scholar] [CrossRef]
  21. Wen, X.; Wang, J.; Zhang, G.; Niu, L. Three-Dimensional Morphology and Size Measurement of High-Temperature Metal Components Based on Machine Vision Technology: A Review. Sensors 2021, 21, 4680. [Google Scholar] [CrossRef]
  22. Wu, S.; Feng, Q.; Gao, Z.; Han, Q. A Novel Laser Triangulation Sensor with Wide Dynamic Range. In SAE 2011 World Congress & Exhibition; SAE International: Warrendale, PA, USA, 2011. [Google Scholar] [CrossRef]
  23. Nayar, S.; Ikeuchi, K.; Kanade, T. Surface Reflection—Physical and Geometrical Perspectives. IEEE Trans. Pattern Anal. Mach. Intell. 1991, 13, 611–634. [Google Scholar] [CrossRef]
  24. Oren, M.; Nayar, S.K. Generalization of the Lambertian model and implications for machine vision. Int. J. Comput. Vis. 1995, 14, 227–251. [Google Scholar] [CrossRef]
  25. Tsitsas, N.L.; Valagiannopoulos, C.A. Anomalous reflection of visible light by all-dielectric gradient metasurfaces. J. Opt. Soc. Am. B 2017, 34, D1–D8. [Google Scholar] [CrossRef]
  26. Qi, C.; Wong, A.M.H. Broadband efficient anomalous reflection using an aggressively discretized metasurface. Opt. Express 2022, 30, 15735–15746. [Google Scholar] [CrossRef]
  27. Heczko, D.; Oščádal, P.; Kot, T.; Huczala, D.; Semjon, J.; Bobovský, Z. Increasing the Reliability of Data Collection of Laser Line Triangulation Sensor by Proper Placement of the Sensor. Sensors 2021, 21, 2890. [Google Scholar] [CrossRef]
  28. Yu, Q.; Zhang, Y.; Shang, W.; Dong, S.; Wang, C.; Wang, Y.; Liu, T.; Cheng, F. Thickness Measurement for Glass Slides Based on Chromatic Confocal Microscopy with Inclined Illumination. Photonics 2021, 8, 170. [Google Scholar] [CrossRef]
  29. Heczko, D.; Oščádal, P.; Kot, T.; Boleslavský, A.; Krys, V.; Bém, J.; Virgala, I.; Bobovský, Z. Finding the Optimal Pose of 2D LLT Sensors to Improve Object Pose Estimation. Sensors 2022, 22, 1536. [Google Scholar] [CrossRef] [PubMed]
  30. Van Gestel, N.; Cuypers, S.; Bleys, P.; Kruth, J.-P. A performance evaluation test for laser line scanners on CMMs. Opt. Lasers Eng. 2009, 47, 336–342. [Google Scholar] [CrossRef]
  31. Vlaeyen, M.; Haitjema, H.; Dewulf, W. Error compensation for laser line scanners. Measurement 2021, 175, 109085. [Google Scholar] [CrossRef]
  32. Vocetka, M.; Bobovský, Z.; Babjak, J.; Suder, J.; Grushko, S.; Mlotek, J.; Krys, V.; Hagara, M. Influence of Drift on Robot Repeatability and Its Compensation. Appl. Sci. 2021, 11, 10813. [Google Scholar] [CrossRef]
  33. Keyence. Sensor Head LJ-X8080. KEYENCE UK & Ireland. Available online: https://www.keyence.co.uk/products/measure/laser-2d/lj-x8000/models/lj-x8080/ (accessed on 21 December 2023).
  34. Suder, J.; Bobovsky, Z.; Mlotek, J.; Vocetka, M.; Zeman, Z.; Safar, M. Experimental Analysis of Temperature Resistance of 3D Printed PLA Components. MMSJ 2021, 1, 4322–4327. [Google Scholar] [CrossRef]
  35. Vlaeyen, M.; Haitjema, H.; Dewulf, W. Digital Twin of an Optical Measurement System. Sensors 2021, 21, 6638. [Google Scholar] [CrossRef]
  36. Kot, T.; Bobovský, Z.; Heczko, D.; Vysocký, A.; Virgala, I.; Prada, E. Using Virtual Scanning to Find Optimal Configuration of a 3D Scanner Turntable for Scanning of Mechanical Parts. Sensors 2021, 21, 5343. [Google Scholar] [CrossRef]
  37. Ružarovský, R.; Skýpala, R. A general take on a Tecnomatix Process Simulate’s Digital Twin creation and its exchange of information with the TIA Portal and PLC SIM Advanced. J. Phys. Conf. Ser. 2022, 2212, 012010. [Google Scholar] [CrossRef]
Figure 1. The principle of triangulation on a 2D line sensor.
Figure 2. Undesirable effects in laser scanning: (a) specular reflection, where n is the surface normal, θi is the angle of incidence, and θr is the angle of reflection; (b) refraction of the laser beam, where n is the surface normal and θ is the angle of incidence of the beam.
Figure 3. Positioning of laser line triangulation (LLT) sensor. The colored arrows indicate the sensor’s reference frame: red represents the X-axis, green represents the Y-axis, and blue represents the Z-axis. Dashed reference frame indicating orientation at a given angle: (a) out-of-plane angles, representing rotation around the X-axis; (b) in-plane angles, representing rotation around the Y-axis.
Figure 4. Data acquisition scheme from experimental measurements. The colored arrows indicate the sensor’s reference frame: red represents the X-axis, green represents the Y-axis, and blue represents the Z-axis: (a) top view of the sensor, illustrating the tilting of the sample around the X- and Y-axes; (b) side view of the sensor, illustrating the tilting around the X-axis and the basic parameters of the sensor, where θ is the AoI of the laser beam relative to the surface normal n, and α is the triangulation angle; (c) front view of the sensor, illustrating the tilting around the Y-axis, where δ is the angle of the laser projection cone.
Figure 5. Experimental setup: (a) positioning device with three degrees of freedom; (b) Keyence LLT LJ-X8080 sensor in detail.
Figure 6. A schematic diagram of the workstation.
Figure 7. Measured samples: (a) transparent plastics; (b) non-transparent plastics; (c) aluminium alloys with different roughness (face milling, Ra 0.8–12.5).
Figure 8. Iterative accuracy calculation procedure. The profile data (points) are marked with a bold purple line, the regression line is indicated with a purple dotted line, the positive threshold (+0.05 mm) is represented by a red dashed line, and the negative threshold (−0.05 mm) by a blue dotted line: (a) raw profile; (b) iteration No. 150; (c) iteration No. 300; (d) iteration No. 350.
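The iterative accuracy computation illustrated in Figure 8 can be reconstructed as the following sketch (our interpretation: a regression line is repeatedly refitted while points outside the ±0.05 mm band are excluded; the exact stopping rule used in the study may differ):

```python
import numpy as np

def profile_accuracy(x, z, threshold=0.05, max_iter=350):
    """Estimate profile accuracy as the share of points lying within
    +/- `threshold` mm of an iteratively refitted regression line.

    On each iteration the line is fitted to the currently kept points,
    the band membership of all points is recomputed, and the loop stops
    once the kept set no longer changes.
    """
    x, z = np.asarray(x, float), np.asarray(z, float)
    keep = np.ones(len(x), dtype=bool)
    for _ in range(max_iter):
        slope, intercept = np.polyfit(x[keep], z[keep], 1)
        residuals = z - (slope * x + intercept)
        new_keep = np.abs(residuals) <= threshold
        if np.array_equal(new_keep, keep):
            break  # the regression line has stabilised
        keep = new_keep
    return keep.mean()  # fraction of points within the threshold band
```

For a perfectly flat (or uniformly tilted) surface the accuracy is 1.0; outlier spikes, such as those caused by specular reflections, lower the value proportionally.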
Figure 9. The determination of the grey level of a pixel in the heat map: (a) the number of points in the profile; (b) the accuracy of the profile.
Figure 10. Heat maps for non-transparent black plastic. Red line represents position with roll value 0°, green line represents position with pitch value 0°, orange represents the area with mirror-like reflections: (a) captured points, manual settings; (b) accuracy of the profile, manual settings; (c) captured points, automatic settings; (d) accuracy of the profile, automatic settings.
Figure 11. Heat maps for non-transparent grey plastic. Red line represents position with roll value 0°, green line represents position with pitch value 0°, orange represents the area with mirror-like reflections: (a) captured points, manual settings; (b) accuracy of the profile, manual settings; (c) captured points, automatic settings; (d) accuracy of the profile, automatic settings.
Figure 12. Heat maps for aluminium alloys with surface roughness Ra 0.8. Red line represents position with roll value 0°, green line represents position with pitch value 0°, orange represents the area with mirror-like reflections: (a) captured points, manual settings; (b) accuracy of the profile, manual settings; (c) captured points, automatic settings; (d) accuracy of the profile, automatic settings.
Figure 13. Heat maps for aluminium alloys with surface roughness Ra 1.6. Red line represents position with roll value 0°, green line represents position with pitch value 0°, orange represents the area with mirror-like reflections: (a) captured points, manual settings; (b) accuracy of the profile, manual settings; (c) captured points, automatic settings; (d) accuracy of the profile, automatic settings.
Figure 14. Heat maps for aluminium alloys with surface roughness Ra 3.2. Red line represents position with roll value 0°, green line represents position with pitch value 0°, orange represents the area with mirror-like reflections: (a) captured points, manual settings; (b) accuracy of the profile, manual settings; (c) captured points, automatic settings; (d) accuracy of the profile, automatic settings.
Figure 15. Heat maps for aluminium alloys with surface roughness Ra 6.3. Red line represents position with roll value 0°, green line represents position with pitch value 0°, orange represents the area with mirror-like reflections: (a) captured points, manual settings; (b) accuracy of the profile, manual settings; (c) captured points, automatic settings; (d) accuracy of the profile, automatic settings.
Figure 16. Heat maps for aluminium alloys with surface roughness Ra 12.5. Red line represents position with roll value 0°, green line represents position with pitch value 0°, orange represents the area with mirror-like reflections: (a) captured points, manual settings; (b) accuracy of the profile, manual settings; (c) captured points, automatic settings; (d) accuracy of the profile, automatic settings.
Figure 17. Heat maps for clear transparent plastic. Red line represents position with roll value 0°, green line represents position with pitch value 0°, orange represents the area with mirror-like reflections: (a) captured points, manual settings; (b) accuracy of the profile, manual settings; (c) captured points, automatic settings; (d) accuracy of the profile, automatic settings.
Figure 18. Heat maps for orange transparent plastic. Red line represents position with roll value 0°, green line represents position with pitch value 0°, orange represents the area with mirror-like reflections: (a) captured points, manual settings; (b) accuracy of the profile, manual settings; (c) captured points, automatic settings; (d) accuracy of the profile, automatic settings.
Figure 19. Heat maps for red transparent plastic. Red line represents position with roll value 0°, green line represents position with pitch value 0°, orange represents the area with mirror-like reflections: (a) captured points, manual settings; (b) accuracy of the profile, manual settings; (c) captured points, automatic settings; (d) accuracy of the profile, automatic settings.
Figure 20. Sample of clear plastic, measured in the specular reflection area, roll of −20° and pitch of 0°. The image shows screenshots from the sensor controller. In the upper part, data from the camera are displayed, including reflected light and detected points connected by a red line. In the lower part, the reflected intensity of individual laser beams is shown (colored in cyan): (a) laser power of 100% and exposure time of 480 μs, standard detection; (b) laser power of 20% and exposure time of 480 μs, near detection; (c) laser power of 20% and exposure time of 240 μs, near detection.
Figure 21. Sample of clear plastic, measured in the specular reflection area, roll of −17° and pitch of 0°, with varying exposure times. In the upper part, data from the camera are displayed, including reflected light and detected points connected by a red line. In the lower part, the reflected intensity of individual laser beams is shown (colored in cyan): (a) 240 μs; (b) 120 μs; (c) 60 μs; (d) 30 μs.
Table 1. Parameters of the LJ-X8080 sensor [33].
Reference distance (Z-axis): 73 mm
Measuring range (Z-axis): ±20.5 mm (full scale = 41 mm)
Measuring range (X-axis): 30 mm (near side); 35 mm (reference distance); 39 mm (far side)
Linearity (Z-axis): ±0.03% of the full scale
Profile data count: 3200 points
Laser type: Blue laser
Laser source: 10 mW
Laser wavelength: 405 nm (visible light)
Table 2. Known parameters of scanned objects.
Material: width × height × thickness; roughness
Non-transparent plastics: 100 mm × 100 mm × 3 mm; Ra 1.2
Coloured transparent plastics: 100 mm × 100 mm × 3 mm; Ra 0.01–0.04
Pure transparent plastic: 100 mm × 100 mm × 5 mm; Ra 0.01–0.04
Aluminium alloy: 40 mm × 80 mm × 12 mm; Ra 0.8, 1.6, 3.2, 6.3, and 12.5
Table 3. Description of sensor configuration parameters [33].
ParameterDescriptionValue
Dynamic rangeIt specifies the light-receiving sensitivity range of the capture element in the sensor unit. For high precision, the dynamic range is lowered, and the peak is measured at high sensitivity. This is used for target objects that have a small difference in reflectance.1 to 9
Exposure timeIt sets the exposure time of the capture element in the sensor unit. It is the length of time in which the camera collects light from the sample.15 μs, 30 μs, 60 μs, 120 μs, 240 μs, 480 μs, 960 μs, 1700 μs, 9.6 ms
Detection sensitivityIt sets the threshold value for the received light quantity to be detected. Increasing this value makes it easier for a received light quantity to be detected. Reduce this value to prevent mis-detection due to ambient light or multiple reflected lights.1 to 5
Laser powerLaser power in percentage. The maximum power is 10 mW.1–100%
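The four parameters in Table 3 and their valid ranges can be captured in a small settings container that rejects out-of-range values. This class and its names are our own illustration under the Table 3 constraints, not the sensor's actual API:

```python
from dataclasses import dataclass

# Allowed exposure times from Table 3, in microseconds (9.6 ms = 9600 µs).
EXPOSURE_TIMES_US = (15, 30, 60, 120, 240, 480, 960, 1700, 9600)

@dataclass(frozen=True)
class SensorSettings:
    """Illustrative container for the four configurable parameters in Table 3."""
    dynamic_range: int          # 1 to 9
    exposure_time_us: int       # one of EXPOSURE_TIMES_US
    detection_sensitivity: int  # 1 to 5
    laser_power_pct: int        # 1 to 100 (% of the 10 mW maximum)

    def __post_init__(self):
        # Validate each parameter against the ranges listed in Table 3.
        if not 1 <= self.dynamic_range <= 9:
            raise ValueError("dynamic range must be in 1..9")
        if self.exposure_time_us not in EXPOSURE_TIMES_US:
            raise ValueError("exposure time is not one of the supported values")
        if not 1 <= self.detection_sensitivity <= 5:
            raise ValueError("detection sensitivity must be in 1..5")
        if not 1 <= self.laser_power_pct <= 100:
            raise ValueError("laser power must be in 1..100 %")

# Example: a manual configuration for aluminium alloys.
settings = SensorSettings(dynamic_range=1, exposure_time_us=480,
                          detection_sensitivity=3, laser_power_pct=40)
```

Grouping the parameters this way makes it explicit which combinations are even representable before a measurement run is attempted.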
Table 4. Specific parameter values for manual and automatic sensor settings.

  Dynamic range
    Manual:    1
    Automatic: 7 (aluminium alloys and grey plastic); 9 (black plastic); 1 (transparent plastics)

  Exposure time
    Manual:    120 μs (transparent plastics, roll and pitch); 240 μs (transparent plastics, roll only); 480 μs (other materials)
    Automatic: 480 μs (aluminium alloys and grey plastic); 9.6 ms (black plastic); 9.6 ms (transparent plastics)

  Detection sensitivity
    Manual:    3
    Automatic: 3 (all materials)

  Laser power
    Manual:    100% (black plastic); 40% (aluminium alloys); 20% (grey and transparent plastics)
    Automatic: 1–100% (auto-adjusted, all materials)
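The manual settings in Table 4 amount to a material-to-configuration lookup. The dictionary below is our own summary of those values (exposure times in µs, laser power in % of the 10 mW maximum); the key names and structure are illustrative, and the two transparent-plastic entries reflect the 240 µs (roll only) and 120 µs (roll and pitch) cases listed in the table:

```python
# Manual settings from Table 4, summarised per material (illustrative mapping).
MANUAL_SETTINGS = {
    "black plastic":    {"exposure_us": 480, "laser_power_pct": 100},
    "aluminium alloy":  {"exposure_us": 480, "laser_power_pct": 40},
    "grey plastic":     {"exposure_us": 480, "laser_power_pct": 20},
    # Transparent plastics: shorter exposures, depending on the scan motion.
    "transparent plastic (roll only)":     {"exposure_us": 240, "laser_power_pct": 20},
    "transparent plastic (roll + pitch)":  {"exposure_us": 120, "laser_power_pct": 20},
}

for material, cfg in MANUAL_SETTINGS.items():
    print(f"{material}: {cfg['exposure_us']} µs, {cfg['laser_power_pct']} %")
```

Reading the table this way makes the trend visible: strongly absorbing surfaces (black plastic) receive full laser power, while reflective and transparent surfaces are scanned with reduced power and, for transparent samples, shorter exposures.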

Share and Cite

MDPI and ACS Style

Heczko, D.; Chlebek, J.; Mlotek, J.; Kot, T.; Scalera, L.; Dekan, M.; Zeman, Z.; Bobovský, Z. Enhancing Data Collection Through Optimization of Laser Line Triangulation Sensor Settings and Positioning. Sensors 2025, 25, 1772. https://doi.org/10.3390/s25061772
