**2. Overview of Contributions**

The following is a brief overview of the first ten contributions published in the Topical Collection.

The core of every contribution is the concept of "measurement uncertainty", introduced in 1995 by the "Guide to the expression of uncertainty in measurement", generally known as the GUM. The word "uncertainty" has a lexical meaning: it reflects a lack of exact, or complete, knowledge about something. Accordingly, the value associated with a measured value, which expresses the lack of exact knowledge about the value of the measurand, is called the "uncertainty value". This value can be evaluated according to the suggestions of the GUM, following the mathematical probabilistic approaches proposed therein.
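As a minimal illustration of the GUM's probabilistic approach, the law of propagation of uncertainty for uncorrelated inputs can be sketched as follows; the model R = V/I and all numerical values are hypothetical, chosen only for this example:

```python
import math

def combined_standard_uncertainty(sensitivities, uncertainties):
    """GUM law of propagation for uncorrelated inputs:
    u_c(y) = sqrt( sum_i (dy/dx_i * u(x_i))**2 )."""
    return math.sqrt(sum((c * u) ** 2
                         for c, u in zip(sensitivities, uncertainties)))

# Hypothetical indirect measurement: resistance R = V / I.
V, u_V = 5.0, 0.01      # volts, standard uncertainty of V
I, u_I = 0.1, 0.0005    # amperes, standard uncertainty of I
R = V / I
# Sensitivity coefficients: dR/dV = 1/I, dR/dI = -V/I**2
u_R = combined_standard_uncertainty([1 / I, -V / I ** 2], [u_V, u_I])
print(f"R = {R:.1f} ohm, u(R) = {u_R:.3f} ohm")
```

This yields R = 50.0 ohm with a combined standard uncertainty of about 0.269 ohm.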

In recent decades, however, other methods have been proposed in the literature that try to encompass the definitions of the GUM while overcoming its limitations. Some of these methods are based on possibility theory, such as the random-fuzzy variable (RFV) method. The authors of [1] briefly recall the RFV method, starting from the very beginning and its initial motivations, and summarize the most relevant results obtained.
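The RFV method itself is beyond the scope of this overview, but the possibilistic representation it builds on can be conveyed with a toy example: a triangular possibility distribution described by its alpha-cuts. The values below are purely illustrative and are not taken from [1]:

```python
def triangular_alpha_cut(a, m, b, alpha):
    """Alpha-cut [lower, upper] of a triangular possibility distribution
    with support [a, b] and core {m}: the interval of values whose
    possibility degree is at least alpha."""
    return (a + alpha * (m - a), b - alpha * (b - m))

# Illustrative: a measured value around 10.0 with support [9.5, 10.5].
for alpha in (0.0, 0.5, 1.0):
    lo, hi = triangular_alpha_cut(9.5, 10.0, 10.5, alpha)
    print(f"alpha={alpha}: [{lo:.2f}, {hi:.2f}]")
```

At alpha = 0 the cut is the full support [9.5, 10.5]; at alpha = 1 it collapses to the core {10.0}.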

**Citation:** Salicone, S. New Frontiers in Measurement Uncertainty. *Metrology* **2022**, *2*, 495–498. https://doi.org/10.3390/metrology2040029

Received: 28 October 2022; Accepted: 7 December 2022; Published: 12 December 2022

Kalman filters, a concept that has existed for decades, are widely used in numerous areas. The Kalman filter provides a prediction of the system states as well as the uncertainty associated with it. In [2], the authors of [1] propose a new application of the RFV method to Kalman filters, with the specific aim of reducing the overall uncertainty associated with the state predictions. In particular, a possibilistic Kalman filter is defined, which uses random-fuzzy variables; not only does it consider and propagate both random and systematic contributions to uncertainty, but it also reduces the overall uncertainty associated with the state predictions by compensating for the unknown residual systematic contributions.
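For context, the classical probabilistic Kalman filter that this work extends can be sketched in scalar form; the possibilistic variant of [2] replaces the Gaussian description with random-fuzzy variables and is not reproduced here. The noise variances and readings below are arbitrary:

```python
def kalman_step(x, P, z, A=1.0, Q=1e-4, H=1.0, R=0.01):
    """One predict/update cycle of a scalar Kalman filter.
    x, P : state estimate and its variance (the filter's uncertainty);
    z    : new measurement; Q, R : process / measurement noise variances."""
    # Prediction: propagate the state and inflate its variance.
    x_pred = A * x
    P_pred = A * P * A + Q
    # Update: blend prediction and measurement via the Kalman gain.
    K = P_pred * H / (H * P_pred * H + R)
    x_new = x_pred + K * (z - H * x_pred)
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new

# Track a constant quantity from four noisy readings (values invented):
x, P = 0.0, 1.0
for z in (0.9, 1.1, 1.0, 0.95):
    x, P = kalman_step(x, P, z)
print(f"estimate = {x:.3f}, variance = {P:.5f}")
```

Note how the state variance P, the filter's expression of uncertainty, shrinks as measurements accumulate.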

In [3], measurement uncertainty is considered in association with measuring bridges for non-conventional instrument transformers with digital output. In this paper, the authors underline the necessity of synchronization between the analogue output and the digital one. They hence propose an ad hoc measurement setup that is able to monitor and quantify the main quantities of interest. The proposed measurement setup is implemented in the laboratory, and the main sources of uncertainty are discussed and combined through a statistical analysis.

The authors of [4] envisage a future scenario in which the digital reporting of measurement results is ubiquitous, and digital calibration certificates (DCCs) contain information about all the components of uncertainty in a measurement result. To show the benefits of this possible future scenario, the authors consider and compare the current international measurement comparisons used by the International Committee for Weights and Measures (CIPM) and the regional metrology organizations (RMOs). They propose an uncertain-number digital reporting format, which captures all the required information and would simplify comparison analysis, reporting, and linking; the format would also enable a more informative presentation of comparison results.

In [5], the authors deal with measurement uncertainty in measurements of Prompt Fission Neutron Spectra (PFNS), fission cross-sections, and Maxwellian spectrum-averaged neutron capture cross-sections for astrophysical applications. In particular, they demonstrate that these measurements are all subject to Systematic Distortion Factors (SDFs). An SDF may exist in any experiment: it biases the measured value away from an unknown "true" value, and it appears as a real physical effect if it is not removed with additional measurements or analysis. For a set of measured data with the best evaluated true value, differences beyond the stated uncertainties can be explained by the presence of Unrecognized Sources of Uncertainty (USU) in the data. The authors link the presence of USU in the data to the presence of SDFs in the measurement results.

In [6], the topic of digital calibration certificates (DCCs) is considered again. In calibration certificates, information about a quantity is frequently provided in the form of an estimate of the quantity and an associated standard or expanded uncertainty. If the quantity must then be used in another calculation, it is common, in the absence of any additional information, to assign a Gaussian probability distribution to it. However, the true probability distribution of the quantity could differ significantly from the Gaussian one, so this assignment may lead to unreliable results in subsequent calculations. Even when the uncertainty evaluation has been made using a Monte Carlo simulation, only the summary information (the estimate of the quantity and the associated uncertainty) is generally reported in the calibration certificate, for the sake of brevity. Using two examples, the authors show how to present all the information derived from a Monte Carlo simulation in a fully machine-readable form and how to insert this information into digital calibration certificates. In this way, no information is lost.
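A minimal sketch of the idea, assuming a hypothetical two-input product model and JSON as the machine-readable carrier; the actual DCC schema discussed in [6] is, of course, far richer:

```python
import json
import random
import statistics

random.seed(42)  # reproducible illustration

# Monte Carlo propagation through a hypothetical model y = x1 * x2,
# with x1 ~ N(2.0, 0.05) and x2 ~ N(3.0, 0.10).
samples = sorted(random.gauss(2.0, 0.05) * random.gauss(3.0, 0.10)
                 for _ in range(100_000))

def quantile(sorted_samples, p):
    """Nearest-rank quantile of an already sorted sample."""
    return sorted_samples[int(p * (len(sorted_samples) - 1))]

# Machine-readable record: the usual summary plus distribution quantiles,
# so a later user need not assume a Gaussian shape.
record = {
    "estimate": statistics.mean(samples),
    "standard_uncertainty": statistics.stdev(samples),
    "quantiles": {f"{p:.3f}": quantile(samples, p)
                  for p in (0.025, 0.250, 0.500, 0.750, 0.975)},
}
print(json.dumps(record, indent=2))
```

Carrying the quantiles (or the full sample) alongside the summary is what prevents the information loss described above.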

Technologies that can measure, analyse, and make critical decisions autonomously are beginning to emerge; hence, there is great interest in the digitalisation of metrology. In [7], the authors report on a Python package that implements algorithmic data processing using 'uncertain numbers', which satisfy the general requirements of the GUM for the expression of uncertainty. An uncertain number can represent a physical quantity that has not been exactly determined. Using uncertain numbers, measurement models can be expressed clearly and succinctly in terms of the quantities involved. The proposed algorithms provide an example of how metrological traceability can be supported in digital systems. In particular, uncertain numbers provide a format to capture and propagate detailed information about quantities that influence a measurement along the various stages of a traceability chain.
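A toy 'uncertain number' type with first-order propagation for independent inputs conveys the flavour of the idea; the package described in [7] is far more general, and the power model and values below are invented for illustration:

```python
import math

class UncertainNumber:
    """Minimal sketch of an 'uncertain number': a value paired with a
    standard uncertainty, propagated to first order assuming
    independent inputs."""
    def __init__(self, value, u):
        self.value, self.u = value, u

    def __add__(self, other):
        return UncertainNumber(self.value + other.value,
                               math.hypot(self.u, other.u))

    def __mul__(self, other):
        # First-order propagation: u(xy)^2 = (y*u_x)^2 + (x*u_y)^2
        return UncertainNumber(self.value * other.value,
                               math.hypot(other.value * self.u,
                                          self.value * other.u))

    def __repr__(self):
        return f"{self.value:.4g} +/- {self.u:.2g}"

# Hypothetical stage of a traceability chain: power P = V * I.
V = UncertainNumber(5.0, 0.01)
I = UncertainNumber(0.1, 0.0005)
P = V * I
print(P)
```

Because each operation carries the uncertainty along, a measurement model written in terms of `UncertainNumber` objects documents its own uncertainty budget.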

One of the main challenges in designing information fusion systems is deciding on the structure and order in which information is aggregated. The key criteria by which topologies are constructed include the associativity of fusion rules as well as the consistency and redundancy of information sources. Fusion topologies built around these criteria are flexible in design, produce maximally specific information, and are robust against unreliable or defective sources. In [8], an automated, data-driven design approach for possibilistic information fusion topologies is detailed that explicitly considers associativity, consistency, and redundancy. The proposed design is intended to handle epistemic uncertainty and to yield robust topologies.
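A hedged sketch of possibilistic fusion over a discretised domain: pointwise-min (conjunctive) combination when the sources agree, falling back to pointwise-max (disjunctive) combination when they conflict. The threshold and distributions are purely illustrative and do not reproduce the design method of [8]:

```python
def consistency(p1, p2):
    """Degree of agreement between two possibility distributions:
    the height of their pointwise-min combination."""
    return max(min(a, b) for a, b in zip(p1, p2))

def fuse(p1, p2):
    """Adaptive fusion: renormalised conjunctive rule if the sources
    agree, disjunctive rule if they conflict."""
    h = consistency(p1, p2)
    if h > 0.5:  # assumed consistency threshold, purely illustrative
        return [min(a, b) / h for a, b in zip(p1, p2)]
    return [max(a, b) for a, b in zip(p1, p2)]

# Two sources over the same discretised domain:
src1 = [0.0, 0.5, 1.0, 0.5, 0.0]
src2 = [0.0, 0.8, 1.0, 0.2, 0.0]
print(fuse(src1, src2))
```

The conjunctive branch narrows the fused distribution (maximal specificity); the disjunctive branch keeps both hypotheses alive, which is the robust behaviour wanted for defective sources.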

In [9], the authors analyse the measurement uncertainty associated with the estimated frequencies of the spectral tones of signals composed of superimposed sinusoids and white Gaussian noise, when different methods for the spectral analysis of the signals are applied. By comparing the obtained results, the authors draw useful conclusions that can guide a designer in choosing a method for spectral analysis according to the operating conditions.
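As a baseline for the kind of estimator compared in [9], a tone frequency can be estimated as the DFT bin of maximum magnitude. This noise-free sketch, with an invented sampling rate and tone, only illustrates the bin-resolution limit fs/N that such methods start from:

```python
import cmath
import math

def dft_peak_frequency(x, fs):
    """Estimate a tone's frequency as the DFT bin with the largest
    magnitude (resolution fs/N; refined interpolation methods exist)."""
    n = len(x)
    best_k, best_mag = 0, 0.0
    for k in range(1, n // 2):
        s = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        if abs(s) > best_mag:
            best_k, best_mag = k, abs(s)
    return best_k * fs / n

fs, n = 1000.0, 256
f_true = 125.0  # deliberately placed at a bin centre for this fs and n
signal = [math.sin(2 * math.pi * f_true * t / fs) for t in range(n)]
print(dft_peak_frequency(signal, fs))
```

With noise added, or with a tone between bin centres, the estimate scatters; quantifying that scatter for different analysis methods is precisely the uncertainty question addressed in [9].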

Data-driven manufacturing in Industry 4.0 demands that digital metrology not only drive the in-process quality assurance of manufactured products, but also supply reliable data to continuously adjust process parameters towards zero-defect manufacturing. Better quality, improved productivity, and increased flexibility of manufacturing processes are obtained by combining intelligent production systems with advanced information technologies, in which in-process metrology plays a significant role. Today, 3D optical sensors are being massively integrated into manufacturing processes, replacing traditional Coordinate Measuring Machines (CMMs) in the automotive, aerospace, and power-generation industries. However, while 3D optical sensors can deliver millions of points in a matter of seconds, automatically converting such dense data into meaningful information, and assuring the quality of these data, remain challenges. In [10], the authors present a practical approach to addressing both challenges, based on the ISO 15530-3 and ISO 15530-4 technical specifications and the application of MBD-based post-processing for the automatic processing of point clouds.
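As a minimal example of turning a dense point cloud into an inspection result, the sketch below computes signed deviations of measured points from a nominal plane. The data are invented, and this does not implement the ISO 15530-based approach of [10]:

```python
def plane_deviations(points, normal, d):
    """Signed distances from measured points to a nominal plane
    n . p = d (the normal is assumed to be a unit vector): a minimal
    example of reducing a point cloud to an inspection quantity."""
    return [sum(n * c for n, c in zip(normal, p)) - d for p in points]

# Hypothetical cloud measured on a nominally flat face z = 10 mm:
cloud = [(0.0, 0.0, 10.02), (1.0, 0.0, 9.99),
         (0.0, 1.0, 10.01), (1.0, 1.0, 9.98)]
dev = plane_deviations(cloud, (0.0, 0.0, 1.0), 10.0)
flatness = max(dev) - min(dev)
print(f"flatness = {flatness:.3f} mm")
```

Real pipelines must additionally register the cloud to the nominal model, filter outliers, and attach an uncertainty statement to the result, which is where the approach of [10] comes in.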
