#### *2.4. Routing of Modelling and Simulation*

The basic principle of simulation lies in the simplified representation of the real system by its simulation model, which describes only those characteristics of the real system that are of interest for its study (simulation). In other words, simulation is a supportive tool that allows the experimenter to test the effects of decisions on the simulation model and thereby obtain an answer to the question "what happens if". The great advantage of this approach is that the future behaviour of the system can be previewed and the necessary interventions in the real system can be made on the basis of this knowledge [25].
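The "what happens if" principle can be sketched with a hypothetical toy model: a single work queue on which two decisions (one server versus two) are compared before touching any real system. All parameter names and values below are illustrative assumptions, not taken from the cited sources.

```python
import random

def simulate_queue(num_servers, arrival_rate, service_time, horizon=1000, seed=42):
    """Toy discrete-time queue model: returns the average queue length."""
    random.seed(seed)
    queue = 0
    total = 0
    for _ in range(horizon):
        # at most one arrival per time step
        if random.random() < arrival_rate:
            queue += 1
        # each server finishes a job with probability 1/service_time
        for _ in range(num_servers):
            if queue > 0 and random.random() < 1.0 / service_time:
                queue -= 1
        total += queue
    return total / horizon

# "What happens if" we add a second server?
base = simulate_queue(num_servers=1, arrival_rate=0.6, service_time=2.0)
alt = simulate_queue(num_servers=2, arrival_rate=0.6, service_time=2.0)
print(f"avg queue, 1 server:  {base:.2f}")
print(f"avg queue, 2 servers: {alt:.2f}")
```

The experiment answers the what-if question on the model: the overloaded single-server configuration accumulates a long queue, while the two-server alternative keeps it short, and this is learned without intervening in any real system.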

In view of the future need for simulation to support strategic decision-making, new classes of simulation systems must be developed that enable work with aggregated data at different hierarchical levels of the systems being analysed. Such solutions will require the development of entirely new integrated, hierarchical simulation systems capable of modelling complex corporate systems and working with heterogeneous modelling approaches [26]. The main task of the creators of such systems will be to integrate heterogeneous environments into a single, holonic concept. The hierarchy will require integration at the micro-, meso- and macrolevels, and the simulation environment will provide modelling techniques and approaches for all corporate hierarchical structures.

The digital twin (DT) is a concept for the functioning of future production systems based on the application of digital technology and currently promoted by Siemens. Although the principles of digital twins have been known for some time, Siemens has advanced their development to the stage of products that are now offered on the market. The digital twin is currently presented mainly at the product level, and its essence consists in the creation of a virtual (digital) model of a developed product, machine, or device. The virtual model thus created (the digital twin) can be used in all phases of the development, operation, and improvement of the product. For example, the digital twin of a car allows the costs of developing and testing the car to be reduced: the entire development and most of the tests can be carried out through virtual testing and simulations using the digital model, and physical tests are used only for the calibration of the test method [27].

The concept of the digital twin has gradually been expanded from the product level to process levels and from manufacturing systems to the enterprise level. The digital twin can be used in many business processes, whether logistics, manufacturing, assembly, or machining [28]. Industry 4.0 requires phenomenon twins to functionalise the relevant systems (e.g., cyber-physical systems); a phenomenon twin is the computable virtual abstraction of a real phenomenon [29]. The digital factory includes the digitisation of the three most important business areas: products, processes and resources. Thus, an era is launched in which all critical physical production entities are represented by their digital copies and digital models, also called digital mock-ups (DMUs). In addition to the real manufacturing system, all companies will also have available a digital manufacturing system, represented by a set of static, kinematic and dynamic digital models integrated into a single digital development environment, i.e., a digital factory.

This has allowed us to study and analyse the efficiency and performance of production before putting it into real operation. Decision-making has become increasingly algorithmised, with the results of dynamic computer simulations serving as the basis for decisions. Hence, at the beginning of the 21st century, enterprises are confronted with two parallel worlds, real and digital (a real–digital world) [27].

The proliferation of sensors and the rapid development of new communication equipment and systems have enabled the virtualisation of the world of manufacturing. Such a virtual manufacturing world has generated vast amounts of data that businesses have stored, analysed and begun to use for predicting the future behaviour of manufacturing systems. Virtualisation, in this case, means that managers obtain information about the immediate state of the manufacturing system through sensors. Data from sensors, processed by intelligent algorithms, create a dynamic, virtual image of real production, named the "virtual factory", which represents the duality of the real–virtual world (Figure 5). By linking the digital, real and virtual worlds, this new quality is now known as the digital twin.
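The sensor-to-virtual-image loop can be sketched minimally as follows. The `VirtualFactory` class, the machine names and the reading format are purely illustrative assumptions; a real implementation would sit behind a sensor gateway and message broker.

```python
from dataclasses import dataclass

@dataclass
class VirtualMachine:
    """Digital image of one physical machine, kept current from sensor data."""
    name: str
    state: str = "idle"
    temperature: float = 20.0

class VirtualFactory:
    """Dynamic, virtual image of real production, updated from sensor streams."""
    def __init__(self, machine_names):
        self.machines = {n: VirtualMachine(n) for n in machine_names}

    def ingest(self, reading):
        # a reading is a dict arriving from a (hypothetical) sensor gateway
        m = self.machines[reading["machine"]]
        m.state = reading["state"]
        m.temperature = reading["temperature"]

    def snapshot(self):
        # the immediate state of the manufacturing system, as seen by managers
        return {n: (m.state, m.temperature) for n, m in self.machines.items()}

vf = VirtualFactory(["lathe-1", "mill-2"])
vf.ingest({"machine": "lathe-1", "state": "running", "temperature": 41.5})
print(vf.snapshot())
```

Each incoming reading overwrites only the affected machine's digital image, so the snapshot always reflects the most recent known state of the whole system.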

**Figure 5.** Combining three worlds—digital, real and virtual [30].

#### **3. Results**

#### *3.1. Control and Simulation in the Processes of Factories of the Future*

The manufacturing system is a multi-factor system, and its model is dynamic, not static. Therefore, it cannot be said that the efficiency of the manufacturing system is a function of low stock levels or short intermediate times alone. The efficiency of manufacturing depends on a huge set of factors that change dynamically over time and are different for each manufacturing system. Although we do not have to know the functioning of each element of manufacturing in detail, and we do not have to understand it fully, we can control it. However, we evaluate its effectiveness against only a very narrow range of criteria (the most significant parameters) [30].

Correlation is a statistical characteristic of the degree of statistical dependence between two (or more) statistical variables (random quantities). If we consider only two variables, the dependencies are easy to interpret. However, if we move in an n-dimensional space with hundreds of variables, the relationships between variables (statistical dependencies) often begin to acquire a meaningless character. In reality, we work in manufacturing systems with an almost infinite number of variables (factors). It is therefore very complex (if not impossible) to compile an exhaustive mathematical model of the manufacturing system that would faithfully and accurately represent its dynamism; in this case, the approximate method of computer simulation helps. A correlation between a cause and its effect does not always reveal the causes and may, rather, reveal only the consequences. Too much data brings the so-called "elusive correlation": a lot of data is used for many different estimates and predictions [31].
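The "elusive correlation" effect is easy to reproduce: among many mutually independent variables, some pairs will show strong correlation purely by chance. A minimal sketch, with all sample and variable counts chosen purely for illustration:

```python
import math
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

random.seed(0)
n_samples, n_vars = 30, 100

# independent standard-normal variables: no real dependence exists
data = [[random.gauss(0, 1) for _ in range(n_samples)] for _ in range(n_vars)]

# the strongest pairwise correlation found among them
best = max(abs(pearson(data[i], data[j]))
           for i in range(n_vars) for j in range(i + 1, n_vars))
print(f"strongest |r| among {n_vars} independent variables: {best:.2f}")
```

Even though every variable was generated independently, screening thousands of pairs reliably turns up a strong-looking correlation, which is exactly why correlations mined from high-dimensional manufacturing data must be treated with caution.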

The control concept that uses virtualisation contains predictive mechanisms that enable the control system to "see potential scenarios for the future" [31]. The data structure for such a control concept is illustrated in Figure 6.

**Figure 6.** The data structure of the factory control system.

A multi-agent control system is used for the distribution of tasks and for the hierarchical behaviour of the members of the system. In such a system, the holonic principle applies: a holon can be seen as a system consisting of subsystems while at the same time being part of a larger whole (system). A set of holons with their characteristics creates a holonic organisation called a holarchy, which is characterised by the fulfilment of common objectives. A holarchy allows the creation of structures and representations of the behaviour of complex systems, often referred to as social systems. The functioning of holonic systems is based on the abilities of autonomous agents. An agent is a system entity that has a specific degree of independence, allowing it to address tasks autonomously within a defined level of action. Agents accept tasks from the parent level of the holarchy, but solve them autonomously.
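A minimal sketch of a holarchy, assuming a hypothetical three-level factory/cell/machine hierarchy (all names and the task-splitting scheme are illustrative): each holon accepts a task from its parent level and solves it autonomously, either locally or by delegating sub-tasks to its own sub-holons.

```python
class Holon:
    """A holon: a whole made of sub-holons, itself part of a larger whole."""
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.children = []
        if parent is not None:
            parent.children.append(self)

    def accept_task(self, task):
        # the task is accepted from the parent level of the holarchy,
        # then solved autonomously: locally, or by delegation downward
        if not self.children:
            return f"{self.name} solved '{task}'"
        return [child.accept_task(f"{task}/part-{i}")
                for i, child in enumerate(self.children)]

factory = Holon("factory")               # macrolevel
cell = Holon("cell-A", parent=factory)   # mesolevel
Holon("machine-1", parent=cell)          # microlevel
Holon("machine-2", parent=cell)

print(factory.accept_task("order-42"))
```

The nested result mirrors the holarchy itself: the factory holon knows only that its cell solved the order, while the cell autonomously decided how to split the work between its machines.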

An intelligent agent is a computational or natural system capable of perceiving its surroundings and, based on this monitoring, performing actions that drive its objective function to an extremum (minimum or maximum), thereby fulfilling the global objectives of the system. In agent systems, given the vast number of interactions that occur between individual autonomous agents (for example, in social systems), we are no longer able to predict the future behaviour of such a system [32]. When modelling such a system, it is therefore better to define the behaviour of its individual parts (agents). An agent can use the services of a holon, which is used to simulate varying inputs and to see the outcomes of actions. The use of simulation metamodelling within the holon simulation is illustrated in Figure 7.
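The definition above can be sketched as an agent that perceives a state and chooses the action extremising its objective function. The machine-speed setting, the queue perception and the quadratic wear penalty below are hypothetical illustrations, not taken from the cited model.

```python
def intelligent_agent(perceive, actions, objective):
    """Choose the action that maximises the objective in the perceived state."""
    state = perceive()
    return max(actions, key=lambda a: objective(state, a))

# hypothetical setting: a machine agent chooses its speed so as to balance
# throughput against wear, given the perceived queue length
def perceive():
    return {"queue": 7}

def objective(state, speed):
    throughput = min(speed, state["queue"])  # cannot process more than is queued
    wear_cost = 0.1 * speed ** 2             # assumed quadratic wear penalty
    return throughput - wear_cost

best_speed = intelligent_agent(perceive, actions=range(1, 11), objective=objective)
print(best_speed)
```

The agent does not need a model of the whole system; it only perceives its local state and extremises its own objective, and the global behaviour emerges from many such agents interacting.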

**Figure 7.** Use of simulation metamodelling in the manufacturing control of complex manufacturing systems.

Figure 8 also illustrates the principle of applying digital factory instruments to changes in the product range, the exchange of technology and changes of layout. As seen, the entire management concept is first developed and tested offline in the virtual environment of the digital factory. Agents that represent the physical elements of the system use the knowledge of previous actions as well as existing models and carry out experiments on the virtual model of the manufacturing system in which scenarios are tested. The system then selects the scenario that best matches the target characteristics of the system. After the development is complete, the validated control concept is transferred to the real production system. The simulation thus becomes an emulation: starting from the recorded state of a real-element agent at a specific position, it predicts future statuses. The principle of the knowledge-based environment will support the system in the form of learning from process activities.
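The scenario-testing and selection step can be sketched as follows, assuming a hypothetical virtual-model experiment and a squared-deviation match against the target characteristics; the scenario parameters, result measures and noise model are all illustrative.

```python
import random

def run_scenario(params, seed=1):
    """Hypothetical experiment on the virtual model: returns the achieved
    throughput and average work-in-progress for one candidate scenario."""
    random.seed(seed)  # same stochastic conditions for every candidate
    throughput = params["machines"] * params["speed"] * random.uniform(0.9, 1.0)
    wip = params["buffer"] * random.uniform(0.8, 1.2)
    return {"throughput": throughput, "wip": wip}

def select_scenario(scenarios, target):
    """Pick the scenario whose simulated results best match the target
    characteristics (smallest sum of squared deviations)."""
    def distance(s):
        r = run_scenario(s)
        return sum((r[k] - target[k]) ** 2 for k in target)
    return min(scenarios, key=distance)

scenarios = [
    {"machines": 2, "speed": 10, "buffer": 5},
    {"machines": 3, "speed": 8, "buffer": 4},
    {"machines": 4, "speed": 6, "buffer": 8},
]
target = {"throughput": 24, "wip": 4}
print(select_scenario(scenarios, target))
```

Only after this offline selection would the winning control concept be transferred to the real production system, where the same model, now driven by the recorded states of real-element agents, runs as an emulation.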

**Figure 8.** Linking simulations in continuity to control and learning from processes.
