**1. Introduction**

Measurements of a variable are made using numerical scales. Usually, the value of a measurement is associated with a real number, but this association is not exact because of imperfect or partial knowledge due to uncertainty, vagueness or indiscernibility.

Incomplete knowledge comes from the limited reliability of technical devices, partial knowledge, an insufficient number of observations, or other causes [1]. Among the different types of uncertainty we find imprecision, vagueness, and indiscernibility. Vagueness, in the colloquial sense of the term, refers to ambiguity that remains in a datum due to a lack of precision, although its meaning is understood. An example is the measurement of a person's weight using a scale, which provides a value within a range given by the scale's accuracy, e.g., between 75 and 75.2 kg. Uncertainty refers to imperfect or unknown information. For example, it may be known that the weight of a car lies within certain limits (1000–1500 kg), but the exact value is unknown due to missing information, such as the number of occupants and the load.
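Both situations above can be captured, at a first approximation, by a closed interval of real numbers. The following Python sketch is purely illustrative (the `Interval` type and its methods are our own hypothetical names, not part of the marks formalism developed later in this paper):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    """A closed real interval [lo, hi] representing partial knowledge of a value."""
    lo: float
    hi: float

    def __contains__(self, x: float) -> bool:
        return self.lo <= x <= self.hi

    def width(self) -> float:
        return self.hi - self.lo

# Vagueness: a scale reading is only known to within the scale's accuracy.
weight = Interval(75.0, 75.2)      # kg; accuracy 0.2 kg
# Uncertainty: the car's weight is bounded, but the exact value is unknown.
car = Interval(1000.0, 1500.0)     # kg

print(75.1 in weight)   # True
print(car.width())      # 500.0
```

Both cases use the same representation, even though the epistemic situation differs: in the first, the device cannot distinguish values inside the interval; in the second, a single true value exists but is unknown.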

The problems of vagueness and uncertainty have long received attention from philosophers and logicians (e.g., [2,3]). Computational scientists have also provided new tools for dealing with uncertainty and vagueness, such as interval analysis, with either classic intervals [4,5] or modal intervals [6,7], fuzzy set theory [8,9], and rough set theory [10].

**Citation:** Sainz, M.A.; Calm, R.; Jorba, L.; Contreras, I.; Vehi, J. Marks: A New Interval Tool for Uncertainty, Vagueness and Indiscernibility. *Mathematics* **2021**, *9*, 2116. https://doi.org/10.3390/math9172116

Academic Editor: Ioannis Konstantinos Argyros

Received: 12 July 2021; Accepted: 27 August 2021; Published: 1 September 2021

Indiscernibility has also received the attention of philosophers. The identity of indiscernibles [11] states that no two distinct things are exactly alike. It is often referred to as "Leibniz's Law" and is usually understood to mean that no two objects have exactly the same properties. The identity of indiscernibles is interesting because it raises questions about the factors that individuate qualitatively identical objects. The marks presented and developed in this paper are designed to address indiscernibility.

For example, the temperature in a room can be measured with a thermometer at different points within the room to obtain a spatial distribution. The temperature is then not a single value, e.g., 20 °C, but an interval of values, e.g., between 19 and 21 °C, representing the different temperatures across the room. Such an interval does not represent *the* temperature of the room, which might be needed for modeling an air-conditioning system; rather, the temperature is at any given time one value of the interval [19, 21], with the points of this interval regarded as indistinguishable. This is a known issue in handling "lumped" versus "distributed" quantities. Moreover, the thermometer used as a measurement device has a specific precision and provides a reading on a specific scale, which is likely translated to another digital scale for computation. Therefore, neither a real number nor an interval is able to represent the measured temperature.
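The effect of a finite-resolution reading can be made concrete with a minimal sketch (the function name and the half-step convention are our own illustrative assumptions, not the marks construction): all real values within half a resolution step of the displayed reading are indistinguishable on the device's scale.

```python
def reading_to_interval(reading: float, resolution: float) -> tuple[float, float]:
    """Return the interval of real values that are indistinguishable from the
    displayed reading on a device with the given scale resolution."""
    half = resolution / 2.0
    return (reading - half, reading + half)

# A thermometer with 0.1 °C resolution displaying 20.0 °C:
lo, hi = reading_to_interval(20.0, 0.1)
# lo ≈ 19.95, hi ≈ 20.05: every temperature in this range produces the same reading.
```

Any subsequent translation to another digital scale (e.g., for computation) widens or shifts this set of indistinguishable values, which is precisely the situation marks are designed to represent.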

Until the 20th century, the preferred theory for modeling uncertainty was probability theory [12], but the introduction of fuzzy sets by Zadeh [8] had a profound impact on the notion of uncertainty. At present, sources of uncertainty remain an active challenge for the scientific community, and different research efforts are directed toward finding solutions to deal with them, such as Bayesian inference for predictions of turbulent flows in aerospace engineering [13], fuzzy sets for time series forecasting based on particle swarm optimization techniques [14], modal intervals for prediction modeling in grammatical evolution [15], interval analysis based on Taylor expansion for distributed dynamic load identification [16], or rough sets to evaluate indoor air quality [17].

In this article, marks are presented as a framework to deal with quantities represented in digital scales because this methodology can take into account many of the sources of uncertainty. In any use of a mathematical model of a physical system, such as simulation, fault detection, or control, the system of marks provides values for the state variables and, simultaneously, their corresponding granularities, which measure the errors accumulated in the successive computations. This leads to the following:
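The idea of carrying a value together with its accumulated error can be sketched as follows. This is a deliberately simplified illustration, not the formal definition or arithmetic of marks given later in the paper; the class name and propagation rules are our own assumptions:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MarkSketch:
    """Simplified stand-in for a mark: a value paired with a granularity.
    (Illustrative only; the formal marks arithmetic is developed in the paper.)"""
    value: float
    granularity: float

    def __add__(self, other: "MarkSketch") -> "MarkSketch":
        # Granularities accumulate, tracking the growth of computational error.
        return MarkSketch(self.value + other.value,
                          self.granularity + other.granularity)

    def __mul__(self, other: "MarkSketch") -> "MarkSketch":
        # First-order propagation of granularities through a product.
        g = abs(self.value) * other.granularity + abs(other.value) * self.granularity
        return MarkSketch(self.value * other.value, g)

a = MarkSketch(2.0, 0.01)
b = MarkSketch(3.0, 0.02)
s = a + b   # value 5.0, granularity ≈ 0.03
p = a * b   # value 6.0, granularity ≈ 0.07
```

The point of the sketch is that each operation returns both a result and a measure of how trustworthy that result is, so that after a long chain of computations the granularity reports the accumulated error alongside the state-variable values.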


In the following sections of this paper, we present and review marks theory and basic arithmetic operations. The main contributions of this paper are the following:


To demonstrate the applicability and potential of marks, we introduce a well-known process-control benchmark in which the problems of uncertainty, imprecision, and indiscernibility are all present. Three different problems built on this benchmark are then presented and solved using marks: simulation, fault detection, and control.
