2.1.1. ISO 15530-3 Technical Specification

The ISO 15530-3 [46] technical specification describes a substitution method that simplifies the uncertainty evaluation exercise by exploiting the similarity in dimensions and shape between the workpiece and a calibrated reference part. It is based on a statistical evaluation of the measurement errors observed with respect to the calibrated value of the reference part. The user must perform a relevant number (>20) of measurements under the variety of conditions expected while measuring real workpieces. This approach appears straightforward from the user's viewpoint and attempts to cover both intrinsic and extrinsic uncertainty contributors. In practice, however, it is fraught with difficulties: any divergence between the master and the measured parts introduces additional uncertainty. Because of the similarity requirement between the produced workpiece and the calibrated standard, the approach is very arduous and expensive for large-scale metrology, where the storage, maintenance and calibration of large components are a major expense. It is, however, a reliable approach for serial production, usually of small- and medium-sized components, because manufacturing and calibrating a reference part for uncertainty assessment purposes is then affordable. It is typically employed for medium-sized component uncertainty assessments on CMMs or Machine Tools (MT). The approach considers four input quantities, as explained below [46]:

*ub*: standard uncertainty associated with the systematic error of the measurement process;

*up*: standard uncertainty associated with the measurement procedure;

*ucal*: standard uncertainty associated with the uncertainty of the workpiece calibration;

*uw*: standard uncertainty associated with material and manufacturing variations.

$$
U = k \sqrt{u\_p^2 + u\_{cal}^2 + u\_w^2 + u\_b^2} \tag{1}
$$

Finally, the law of propagation of uncertainty is applied to obtain the combined standard uncertainty according to the GUM, JCGM 100:2008 [13], and the result is multiplied by an appropriate coverage factor to yield the expanded uncertainty, as given by Equation (1). Figure 1 shows the practical approach to this method.

**Figure 1.** A practical approach to the ISO 15530-3 [5].

According to the ISO 15530-3 technical specification, the standard uncertainty of the measurement procedure (*up*) is determined using Equation (2), and the systematic error (*b*) is given by Equation (3). Moreover, if the measurement result is not corrected for the systematic error, the error contributes fully to the uncertainty budget, so *ub* = *b*. Thus,

$$\overline{y} = \frac{1}{n} \sum\_{i=1}^{n} y\_i \, , \qquad u\_p = \sqrt{\frac{1}{n-1} \sum\_{i=1}^{n} \left( y\_i - \overline{y} \right)^2} \tag{2}$$

$$b = \overline{y} - x\_{cal} \tag{3}$$

wherein:

*y̅*: arithmetic mean of the *n* repeated measurement results *yi*;

*xcal*: calibrated value of the reference part;

*b*: observed systematic error of the measurement process.

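As an illustration, the four contributors and Equations (1)–(3) can be evaluated with a short script. All numerical values below (the 20 repeated measurements, calibration data and coverage factor) are hypothetical and chosen only to show the arithmetic; the specification asks for a larger number of repetitions.

```python
import math

# Hypothetical repeated measurements y_i of a calibrated reference part (mm)
y = [10.003, 10.001, 10.004, 10.002, 10.005, 10.002, 10.003,
     10.004, 10.001, 10.003, 10.002, 10.004, 10.003, 10.005,
     10.002, 10.003, 10.001, 10.004, 10.003, 10.002]
x_cal = 10.000    # calibrated value of the reference part (mm)
u_cal = 0.0005    # standard uncertainty of the workpiece calibration (mm)
u_w = 0.0008      # material and manufacturing variation contribution (mm)
k = 2             # coverage factor (approx. 95 % for a normal distribution)

n = len(y)
y_bar = sum(y) / n                                              # Equation (2)
u_p = math.sqrt(sum((yi - y_bar) ** 2 for yi in y) / (n - 1))   # Equation (2)
b = y_bar - x_cal                                               # Equation (3)
u_b = b   # result not corrected for the systematic error, so u_b = b

U = k * math.sqrt(u_p ** 2 + u_cal ** 2 + u_w ** 2 + u_b ** 2)  # Equation (1)
print(f"u_p = {u_p:.4f} mm, b = {b:.4f} mm, U = {U:.4f} mm")
```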
#### 2.1.2. ISO 15530-4 Technical Specification

The ISO 15530-4 [47] technical specification introduces a "task-specific uncertainty" assessment method based on computer simulation. Measuring instruments such as CMMs and 3D optical systems are multi-purpose instruments, which means that the measurement uncertainty varies with the task being performed, the environment, the operator and the chosen measurement methodology. The "task-specific uncertainty" in coordinate measurement is the measurement uncertainty that results when a specific feature is measured using a specific inspection plan. The approach is similar to the GUM, but instead of an analytical evaluation based on a complete closed-form mathematical model, it uses a simulation method (for example, the Monte Carlo method) run on a computer to estimate the uncertainty for a particular measurement task. This is even more complex than the GUM approach because an initial model of the measurement instrument and process is required to run the simulation. The simulation, or virtual instrument, model generates perturbed points that represent estimates of what the instrument would have reported when measuring the commanded points. This process is repeated for as many simulated measurements as there are simulation iterations (hundreds or thousands), which yields a distribution of results from which the measurement uncertainty is derived.
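A minimal sketch of such a simulation-based estimate is shown below, assuming a deliberately simplified virtual instrument that perturbs each probed point of a circle with Gaussian noise. All values (nominal diameter, probing strategy, noise level) are hypothetical; a real virtual-instrument model covers many more error sources.

```python
import math
import random

random.seed(1)

NOMINAL_D = 50.0      # hypothetical nominal diameter of the circle (mm)
N_POINTS = 8          # inspection plan: 8 probed points on the circle
SIGMA_POINT = 0.002   # assumed per-point Gaussian probing error (mm)
N_TRIALS = 5000       # Monte Carlo iterations

def simulate_one_measurement():
    """One virtual measurement: perturb each commanded point radially."""
    pts = []
    for i in range(N_POINTS):
        t = 2 * math.pi * i / N_POINTS
        r = NOMINAL_D / 2 + random.gauss(0.0, SIGMA_POINT)
        pts.append((r * math.cos(t), r * math.sin(t)))
    # Simplified evaluation: diameter from mean radial distance to centroid
    cx = sum(p[0] for p in pts) / N_POINTS
    cy = sum(p[1] for p in pts) / N_POINTS
    return 2 * sum(math.hypot(x - cx, y - cy) for x, y in pts) / N_POINTS

diameters = [simulate_one_measurement() for _ in range(N_TRIALS)]
mean_d = sum(diameters) / N_TRIALS
u_sim = math.sqrt(sum((d - mean_d) ** 2 for d in diameters) / (N_TRIALS - 1))
print(f"simulated task-specific standard uncertainty: {u_sim * 1000:.2f} um")
```

The standard deviation of the simulated results is the task-specific standard uncertainty for this particular inspection plan; changing the number of probed points or the noise model changes the result, which is precisely the point of the method.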

The current state of the art shows that the ISO 15530-4 approach is already being applied to measuring instruments such as CMMs and laser trackers, while the modelling of optical sensors is still under development by the research community; a virtual-instrument implementation for optical sensors is therefore still not commercially available. Applied to CMMs, the method is popularly known as the Virtual Coordinate Measuring Machine (VCMM) [48], which performs a point-by-point simulation of measurements, emulating the measurement strategy, the measuring conditions and the physical behaviour of the CMM together with the dominant uncertainty contributions disturbing the measurement [48–50]. For spherical measuring instruments such as laser trackers or laser scanners, a basic spherical error model is used in combination with a Gaussian Probability Density Function (PDF) to apply the law of propagation of uncertainty. Figure 2 shows the VCMM approach, where the thick black lines represent the data flow of a normal CMM measurement and the thick grey lines the additional data flow required to obtain a VCMM estimate. Wilhelm et al. [50] presented a description of the complete VCMM workflow, as shown in Figure 2.

**Figure 2.** The VCMM approach for coordinate measuring machines [50].
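For a spherical instrument, the basic error model mentioned above can be sketched as a Monte Carlo propagation of Gaussian PDFs on the range and the two measured angles. The uncertainty magnitudes below are assumptions chosen for illustration, not specifications of any real tracker.

```python
import math
import random

random.seed(2)

# Hypothetical target in spherical coordinates: range (m) and two angles
r, az, el = 10.0, math.radians(30.0), math.radians(10.0)
u_r = 10e-6 + 0.5e-6 * r              # assumed ranging uncertainty model (m)
u_ang = math.radians(1.0 / 3600.0)    # assumed 1 arc-second angular uncertainty

def to_cartesian(rr, a, e):
    """Spherical (range, azimuth, elevation) to Cartesian coordinates."""
    return (rr * math.cos(e) * math.cos(a),
            rr * math.cos(e) * math.sin(a),
            rr * math.sin(e))

# Monte Carlo propagation: sample each coordinate from its Gaussian PDF
samples = [to_cartesian(random.gauss(r, u_r),
                        random.gauss(az, u_ang),
                        random.gauss(el, u_ang))
           for _ in range(20000)]

u_xyz = {}
for axis, name in enumerate("xyz"):
    vals = [s[axis] for s in samples]
    m = sum(vals) / len(vals)
    u_xyz[name] = math.sqrt(sum((v - m) ** 2 for v in vals) / (len(vals) - 1))
    print(f"u_{name} = {u_xyz[name] * 1e6:.1f} um")
```

At a 10 m range the angular contributions dominate the per-axis uncertainties, which is the typical behaviour such a spherical error model is meant to capture.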

The so-called Digital Metrology/Measurement Twin (D-MT) is a virtual representation of either a measurement instrument or the complete measurement procedure [51–53], and a mathematical model similar to that developed within the ISO 15530-4 approach is frequently used to run the simulation. Artificial Intelligence (AI) algorithms such as machine learning, deep learning and neural networks are being researched for the development of such D-MTs and for uncertainty assessment tasks [54–56].

## *2.2. MBD-Based Metrology*

The MBD approach addresses the challenge of converting dense data into meaningful information within seconds on the production line, enabling quick decision-making in the production environment. However, the current state of industrial MBD implementation shows that manufacturers have applied MBD to product definition for some time, with aerospace and defence customers acting as leaders and adoption proceeding more slowly in other industries [22].

From the CAD suppliers' point of view, MBD is seen as the cornerstone of creating a functioning digital thread. While the goal is a single source of truth for the downstream operations involved in making a part, most CAD suppliers provide MBD in proprietary formats, which means that interoperability between systems remains a challenge. A neutral CAD format already exists: the ISO 10303 STEP format [57] with its AP 242 application protocol, which includes 3D model data representation, geometric tolerances and PMI to enable global design and manufacturing collaboration [23]. However, several questions remain regarding the full definition of MBD. Standards such as ASME Y14.41 [58] and ISO 16792 [59] document how a model should be defined with annotations and help in interpreting the data within the model, but they do not specify the amount of information that the model must contain [60].

From an MBD-based metrology point of view, MBD enables an automatic quality assurance workflow, automating both the measurement program creation and the data evaluation stages [24]. While the former is already available within the main commercial CMM software, the latter can be applied to any point cloud provided that the MBD model of the measurand and the MBD software are available. Model-based inspection has received relatively little attention within the metrology community since the 1990s [61–64].

The MBD-based metrology process starts by creating a 3D CAD model with semantic PMI that is both human- and machine-readable [65]. The 3D model with PMI shall contain all GD&T geometric information for the component under measurement, as well as the Bill of Materials (BOM), surface finish, weld symbols, manufacturing or measurement process plan data, metadata and notes, engineering change order history, legal/proprietary/export control notices and other definitive digital data [65].

Associativity between the CAD model and the MBD is required to obtain a fully semantic smart model that allows automatic part programming and post-processing. The ability of downstream programs to read MBD models and create measurement programs is as important as creating CAD models with attached semantic MBD. Thus, the CMM is virtually configured, and once the MBD file is imported, a set of rules is applied and matched to the configured CMM so that a part program is generated automatically, aided by these a priori digital approaches [24,66,67]. Typically, a second optimisation is performed to reduce the number of probe changes and minimise the CMM path length.
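The rule-matching step can be sketched as follows. The feature schema, rule values and probe name here are hypothetical placeholders standing in for what a commercial package would read from an MBD file and from the virtual CMM configuration.

```python
# Hypothetical MBD feature records, as might be parsed from an MBD file
features = [
    {"id": "hole_1", "type": "cylinder", "diameter": 8.0, "tolerance": 0.02},
    {"id": "face_a", "type": "plane", "flatness": 0.05},
]

# A priori rules: feature type -> probing strategy (illustrative values only)
RULES = {
    "cylinder": {"points": 12, "levels": 2, "probe": "probe_A"},
    "plane":    {"points": 9,  "probe": "probe_A"},
}

def generate_program(feature_list):
    """Match each MBD feature against the rule set for the configured CMM."""
    program = []
    for f in feature_list:
        rule = RULES.get(f["type"])
        if rule is None:
            continue  # no rule for this feature type: flag for manual programming
        program.append({"feature": f["id"], **rule})
    return program

for step in generate_program(features):
    print(step)
```

A subsequent optimisation pass, as noted above, would reorder these steps to minimise probe changes and path length.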

In the post-processing stage, the MBD concept allows fast analysis and evaluation of the measured point cloud. During the automatic evaluation process, the acquired point cloud is aligned to the CAD model, and an automatic segmentation is performed using the available MBD data. At this stage, each measured point is associated with its corresponding geometric feature. Then, the geometric features are adjusted using regression methods, rejecting possible outliers. Finally, the actual relationships among the adjusted features (dimension, form error, relative position, etc.) are estimated through fully automatic interpretation and evaluation of the previously defined GD&T. Thus, the process of converting dense data into meaningful metrology-rich information is executed automatically in seconds.
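A minimal sketch of the fitting and outlier-rejection step for a single planar feature is given below, using a least-squares plane fit on synthetic data. The data, the 3-sigma rejection rule and the peak-to-valley flatness estimate are illustrative simplifications; a real package would use the MBD segmentation and, for flatness, a minimum-zone evaluation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic segmented point cloud: a nominally flat face with noise + 1 outlier
n = 200
xy = rng.uniform(0, 50, size=(n, 2))                # mm
z = 0.001 * xy[:, 0] + rng.normal(0, 0.002, n)      # slight tilt + noise (mm)
pts = np.column_stack([xy, z])
pts[0, 2] += 0.5                                    # gross outlier (e.g. dust)

def fit_plane(points):
    """Least-squares plane z = a*x + b*y + c."""
    A = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
    coef, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    residuals = points[:, 2] - A @ coef
    return coef, residuals

# First fit, then reject points beyond 3 sigma and refit (simple outlier rule)
coef, res = fit_plane(pts)
keep = np.abs(res - res.mean()) < 3 * res.std()
coef, res = fit_plane(pts[keep])

flatness = res.max() - res.min()   # peak-to-valley form error estimate (mm)
print(f"rejected {int((~keep).sum())} outlier(s), flatness = {flatness:.4f} mm")
```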
