#### **4. Implementation and Results**

Based on the developed methodology, software was implemented to verify its ability to remove noise. Multiple workpieces with different reflectivities were analyzed in this study. The variables defined in the methodology were set as follows: the minimization factor was 10, the maximum reduction step was 0.3, the number of iterations was 10, and the minimum rate of change was 0.005%. These values were determined through experimentation with the chosen samples. Three samples were used:



**Figure 6.** Images of three test samples: (**a**) gauge block; (**b**) brushed aluminum; (**c**) SLS printed part.

All objects were selected to have minimal form error, to reduce the effects these errors may have on noise generation. The three samples provided unique challenges for scanning. The flatness of the parts is according to the manufacturers' specifications and was verified by a tactile coordinate metrology probe, where applicable. The gauge block had a reflective surface, yet also carried a thin film of oil which reduced the amount of reflection. The SLS part, which is composed of small, melted granules of plastic, absorbed much of the laser light due to its color, and its surface granules scattered light depending on how they were struck. Finally, the brushed aluminum surface had differently oriented grains that reflected light in different manners depending on how the light interacted with them. This differed from the SLS part in that the aluminum contained long sections of uniform reflectivity, so bands of noise could appear on the part. All these possible sources of error can contribute to the uncertainty of measured values. When determining uncertainty, one of the areas that must be considered is the manufacturer's specifications for the device being used to measure the object [36]. In a non-standard situation, these specifications may not be accurate. Processing the data using knowledge of the measurement and its conditions can help to reduce the uncertainty of the measurement [37]. This method may therefore be a useful tool for reducing the uncertainty of optical scanning of non-ideal materials. The results of a few of these tests will be discussed in detail, and then the results for the entire test set will be shown.
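To make the procedure concrete, the core loop can be sketched as follows. This is not the authors' implementation: the fixed 3% removal step is a placeholder for the paper's adaptive step (governed by the minimization factor and the maximum reduction step, whose exact update rule is not reproduced here), while the iteration count and minimum rate of change match the values stated above.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit: the centroid plus the singular vector with
    the smallest singular value as the unit normal."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]

def iterative_filter(points, step=0.03, iterations=10, min_rate=0.005e-2):
    """Sketch of the iterative scheme: each pass refits the plane, records
    the STD of point-plane distances, and removes the `step` fraction of
    points farthest from the plane, stopping early once the relative change
    in STD falls below `min_rate` (0.005%)."""
    std_history = []
    for _ in range(iterations):
        centroid, normal = fit_plane(points)
        dist = np.abs((points - centroid) @ normal)
        std_history.append(dist.std())
        if len(std_history) > 1:
            rate = abs(std_history[-2] - std_history[-1]) / std_history[-2]
            if rate < min_rate:
                break
        cutoff = np.quantile(dist, 1.0 - step)
        points = points[dist <= cutoff]
    return points, std_history
```

Recording the STD at every pass yields the STD history curve; differencing it yields the STD delta graph discussed throughout this section.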

The first sample to be examined is the brushed aluminum at an angle of +15°. There is a band of noise along the center of the piece, highlighted in red in Figure 7. This is an example of the best-case scenario for the algorithm: the STD delta graph has the optimal shape, so there is a clear point where the effect of point removal becomes minimal. The deviation zone estimated from the data was reduced by 90.8% with only 6% of the point cloud filtered. The STD of deviations was 0.515 mm, which that 6% filtration reduced to 0.047 mm. The total deviation zone was estimated as 0.113 mm, which is fairly close to the known flatness of the part.
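The two quantities reported for each test case, the deviation zone and the STD of deviations, can both be read off the signed point-plane distances. A minimal sketch, assuming a least-squares fit plane:

```python
import numpy as np

def plane_deviation_stats(points):
    """Signed point-plane deviations for a least-squares fit plane.
    Returns the estimated deviation zone (maximum minus minimum signed
    distance) and the STD of deviations."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    d = (points - centroid) @ vt[-1]  # signed distances along the normal
    return d.max() - d.min(), d.std()
```

For the brushed aluminum case above, these are the values that fell from 0.515 mm to 0.047 mm (STD) as the noise band was filtered.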

The next sample is the gauge block at +10°, shown in Figure 8. The noise in this scan is distributed more broadly across the entire surface of the part, both above and below it. Again, this situation is ideal: the STD delta graph follows the sought-after shape, and the large majority of the noise is filtered from the scan. For this sample, the measured flatness was reduced by 97.4%. The STD of deviations was 0.512 mm, which filtration reduced to 0.013 mm. The total deviation zone was estimated as 0.024 mm. Obviously, the gauge block is more precise than the level of accuracy of a laser scanner, and the uncertainties in the mechanical and optical features of the measurement device do not allow accurate measurement of a highly precise gauge block. However, the results show that the algorithm was successful in filtering the dominating outlier noise from the data.

**Figure 7.** Results for brushed aluminum at +15° with STD history and delta graphs combined.

**Figure 8.** Results for gauge block at +10°.

The next sample is the SLS piece, after a 90° rotation, at a scan angle of 0° (Figure 9). The scan of this part looks cupped because the edges of the surface were captured. There is noise below the scan; however, the density of the noise is very low. For this scan, there is no initial quick drop in the standard deviation of point-plane distances, which is a less-than-ideal result. This is because the relatively small amount of noise has only a small effect on the measured flatness compared to the rest of the data set. There was a reduction of only 38.2% for this sample. The STD of deviations was 0.101 mm, which filtration reduced to 0.062 mm. The total deviation zone was estimated as 0.153 mm.

**Figure 9.** Results for the SLS part at 0° after a 90° rotation.

Having reviewed fairly straightforward cases, a few rare problematic cases are presented in the following. As will be seen, these cases provoke overly aggressive filtration, with some results that may be misleading. The next sample is the brushed aluminum piece, under overexposed lighting conditions, after being rotated 90°, at a scan angle of +25° (Figure 10). This scan had very little noise; however, the background data was deliberately not fully removed, in order to determine the effect of additional planar data on the algorithm. This resulted in a "stepped" planar surface with two separate heights. In such a situation, the fit plane is angled to capture both high-density planar areas. As the surface of the workpiece dominated the scan, a portion of it survived the filtering process; however, a large number of correct scan points were removed due to the initial plane fitting. The STD delta graph shows the wildly varying changes in the data set as points were filtered. Once the secondary background plane was eliminated from the data set, there was no further reduction in the flatness measurement, as the rest of the scan was already ideal. The STD of deviations was 4.51 mm, which filtration reduced to 0.057 mm. The total deviation zone was estimated as 0.108 mm. While this behavior may seem desirable, it could lead to misleading results for an operator or another decision-making process. Fortunately, it is easy to tell from the STD delta graph that something went wrong, so the graph could serve as a diagnostic to ensure scans are occurring correctly.

The next example of the problematic cases is the brushed aluminum at +25° (Figure 11). At first glance, the STD delta graph appears to have the two desired linear sections, but with a large region of slowing change between them. The other issue is that the change is quite small to begin with, only thousandths of a millimeter, as indicated by the STD delta scale. This is because the scan had very little noise. As this process is a statistical analysis of the point-plane distances of a data set, if that data set is already very well formed, with only a couple of noisy points standing out from the main data set, the algorithm will overcompensate and begin to remove useful data. This can be seen in the top-left picture of Figure 11, where pieces of the plane have been removed without there being any noise in those areas. Like the previous issue, this can be detected by examining the change in STD over time: if the change is very small, or if the STD history graph is nearly linear, it is likely that the data set is already very clean. The STD of deviations was 0.058 mm, which filtration reduced to 0.039 mm. The total deviation zone was estimated as 0.104 mm.
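The detection rule described above can be captured by a simple heuristic on the STD history. This is an illustrative sketch, not a check from the paper; in particular, the 5% threshold is an assumed value:

```python
import numpy as np

def looks_already_clean(std_history, min_drop=0.05):
    """Heuristic flag for the failure mode described above: if the total
    relative drop in the STD of deviations over the whole run is small,
    the cloud was likely clean to begin with, and further filtering risks
    removing real surface points. The 5% `min_drop` threshold is an
    illustrative choice, not a value from the paper."""
    h = np.asarray(std_history, dtype=float)
    return (h[0] - h[-1]) / h[0] < min_drop
```

An operator (or an automated pipeline) could use such a flag to skip filtration, or to fall back to a gentler setting, when the scan is already clean.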

**Figure 10.** Results for the brushed aluminum part at +25° after 90° rotation, overexposed.

**Figure 11.** Results for the brushed aluminum part at +25°.

Finally, consider the SLS part after a 90° rotation, scanned at −10° while overexposed (Figure 12). This data set was quite noisy and contained a "ghost" layer of data below the actual surface, possibly due to the light-absorption properties of the material: as the laser light hits the surface, it is diffused within it, causing a "glow". The scanner's receiver could still pick this diffused light up, producing an apparent surface below the actual one. This added layer of data caused a shift in the STD delta graph. There are not two linear sections; instead, the line has steps in it after the initial steep increase. Because the algorithm uses only a section from either end of the STD delta graph to determine the intersection point, shown in red in the ideal STD delta graph of Figure 4, the steps are not considered when determining the correct percentage of data to remove. This allowed most real data to remain while removing most of the noise. The STD of deviations was 0.823 mm, which filtration reduced to 0.525 mm. The total deviation zone was estimated as 0.734 mm.

Table 1 shows the overall results for each part, to four significant digits given the number of points used and the accuracy of the scanner. Regardless of the sample or scan angle, there was a decrease in the detected flatness, indicating removal of noise. In the naming of the cases, BA stands for brushed aluminum, GB for gauge block, and SLS for the selective laser sintered part. The number 90 in a name indicates that the part was rotated 90 degrees for the second set of measurements. Overexposed and underexposed items are marked with the letters "O" and "U", respectively, at the end of each name. Generally, the overexposed condition benefited the most from noise reduction, likely because the added light allowed into the receiver caused more noise to be recorded. Conversely, the underexposed condition did not see as great a benefit, as less light was allowed into the receiver.



#### **5. Behavior Analysis**

To determine how effective the developed methodology is, it is important to study how successfully it characterizes the behavior of the data. To do this, several quantitative parameters for effectiveness were defined, based on the ideal shape of the STD delta graph. Since the graph ideally comprises two linear sections, with their intersection determining the amount of data to remove, parameters evaluating the closeness of the data to this model were chosen. The six parameters are the slopes of the two fit lines, the distances from the chosen filtration percentage to each of the fit lines, and the standard deviations of the Euclidean distances of the sample points to each fit line. In the ideal situation, the slope is maximal for the initial line and 0 for the end line, both distances are 0, and both STD values are 0. Figures 13 and 14 show RadViz plots of the test results, used to determine the similarities in the parameters for each material and angle tested. RadViz is a multivariate visualization algorithm in which the variables sit along the outside of a circle and the data points, here the test cases, are placed inside it. Each data point is pulled toward each of the outer variables as though attached by a spring, with the value of the variable acting as the spring constant; the data point is placed where the net force would be zero [38]. This method can be useful for determining whether the data clusters well or whether there is excessive variation within the data set.
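The spring-equilibrium placement has a closed form: after min-max normalizing each parameter, a point lands at the average of the anchor positions weighted by its normalized parameter values. A minimal sketch (pandas also provides this projection as `pandas.plotting.radviz`):

```python
import numpy as np

def radviz_coords(X):
    """Manual RadViz projection for a table of test parameters: each
    column is min-max normalized, the columns' anchors are spaced evenly
    on the unit circle, and each row lands at the normalized-weight
    average of the anchors, i.e. the spring-equilibrium position."""
    X = np.asarray(X, dtype=float)
    Xn = (X - X.min(axis=0)) / (np.ptp(X, axis=0) + 1e-12)
    m = X.shape[1]
    theta = 2.0 * np.pi * np.arange(m) / m
    anchors = np.column_stack([np.cos(theta), np.sin(theta)])
    w = Xn / (Xn.sum(axis=1, keepdims=True) + 1e-12)
    return w @ anchors
```

A test case dominated by one parameter is pulled onto that parameter's anchor, while a case with equal pull from all parameters sits at the center, which is why clustering toward one arc of the circle singles out the most influential parameters.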

**Figure 13.** RadViz of parameters for each test, separated by material.

In these visualizations, clear clustering can be seen. Here, clustering is indicative of the correlation between the different parameters: if the results of many tests cluster toward one section of the circle, the parameters closest to that section have the greatest effect on the outcome of the test. For all but the outliers, there was a very clear pull toward both the distance from the intersection to the initial fit line and the STD of the end line. This means these two parameters likely have the greatest effect on the result of this filtering method and can be used to judge whether a particular set of parameters is optimal for a particular workpiece. For both the SLS part and the gauge block, the results were all very closely clustered together. These parts did not have the extreme reflectivity of the brushed aluminum piece, and so their results are very closely correlated. For the brushed aluminum sample, the test results outside the cluster area all correspond to extreme scan angles. This is likely due to the extreme angle of the scan and the high reflectivity of the material causing only a small amount of light to enter the receiver. In this case, filtering is not an effective strategy, as the collected data are inherently wrong; to fix such cases, different scanning parameters would need to be chosen entirely. In most cases, however, the tight clustering shows that the filtering scheme is effective, even at extreme angles, for two of the three surfaces tested, and at non-extreme angles for the brushed aluminum sample.

**Figure 14.** RadViz of parameters for each test, separated by angle.

#### **6. Conclusions**

In this paper, a noise reduction algorithm for planar surfaces was introduced. The system examines datasets globally in order to remove noisy points from 3D point clouds using a statistics-based approach. The methodology and algorithm were introduced and explained, and three sample parts were measured by a robotic laser scanner following a detailed procedure, 33 times under various conditions, for a total of 198 scans. These datasets were then processed using the algorithm to determine whether the noise reduction was effective. Overall, the results are highly satisfactory: in the majority of cases, automatic filtration of only a small amount of data significantly improved the estimated deviation zones. Examples of these cases are provided in the paper. In datasets with a large amount of noise, the algorithm was very effective at minimizing the effect of noise on the results. However, when the data set was already noise-free, the algorithm tended to slightly overcompensate and remove actual data. There are also rare problematic datasets that are severely affected by the environmental scanning conditions; in these cases, a group of false data was introduced, which challenged the algorithm's filtration process. A few of the worst cases of this kind are also presented in the paper. One example is the case where background data was left in the scan: the fit plane sat at an angle to the actual surface, and so some real data was removed during the filtration process. Regardless of the error, evidence of these issues existed in both the STD delta and STD history graphs. It has been demonstrated that the algorithm was fairly successful in filtering the false data even in these problematic cases. However, since the false data already form patterns in these cases, the recommendation is to remove the false data by employing a pattern recognition method prior to using the developed noise filtration algorithm.
In general, the developed algorithm and methodology are evaluated to be very efficient in removing noise from optical metrology data. The methodology can also be employed at various scales of data and for various industrial applications. It is computationally very efficient and can easily be used for on-line noise removal during the inspection process.

**Author Contributions:** Conceptualization, M.S.G.T. and A.B.; Data curation, C.B. and A.B.; Formal analysis, C.B. and A.B.; Funding acquisition, A.B.; Investigation, C.B. and A.B.; Methodology, M.S.G.T. and A.B.; Project administration, C.B.; Resources, A.B.; Software, C.B.; Supervision, A.B. All authors have read and agreed to the published version of the manuscript.

**Funding:** There is no specific funding for this research.

**Institutional Review Board Statement:** Not applicable.

**Informed Consent Statement:** Not applicable.

**Data Availability Statement:** Not applicable.

**Acknowledgments:** The research support provided by the Natural Science and Engineering Research Council of Canada (NSERC) is greatly appreciated.

**Conflicts of Interest:** The authors declare no conflict of interest.

#### **References**

