Proposed Solution

We propose the adoption of the broader term "Digital Twin Framework" (DTF) to encapsulate all relevant work; we surmise it conveys the concept well enough to resolve the aforementioned debate. Furthermore, we encourage the continued adoption of the DM–DS–DT classification, as it has already been explored by researchers and found helpful, with the added requirement that a DS reflect the evolution of the real twin in time (a "living" model), in addition to the data-flow requirement. This means that a DTF which automatically receives real-time sensory input but does not include, e.g., aging or fault-progression mechanisms is classified as a DM instead of a DS. The essence of a model is its depiction of a particular effect or mechanism, while a DS reflects the complete state of the visualized system. The minimum requirements of each classification are given in Table 4.

**Table 4.** Improved categorization of DTF by necessary requirements.
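The classification rules above can be sketched as a small decision function. This is a minimal illustration, assuming the distinguishing criteria are exactly the two named in the text: automatic data flow in each direction and a "living" evolution model; the function name and flags are ours, not part of the proposed taxonomy.

```python
def classify(auto_p2v: bool, auto_v2p: bool, living: bool) -> str:
    """Classify a framework per the proposed DM/DS/DT requirements.

    auto_p2v: automatic physical-to-virtual data flow (real-time sensing)
    auto_v2p: automatic virtual-to-physical data flow (feedback/actuation)
    living:   the virtual model evolves with the real twin over time
              (aging, fault progression), i.e., a "living" model
    """
    if auto_p2v and auto_v2p and living:
        return "DT"   # fully bi-directional, living twin
    if auto_p2v and living:
        return "DS"   # automatic input and a living model
    return "DM"       # a static depiction, regardless of data feed

# The example from the text: real-time sensory input without aging or
# fault-progression mechanisms still yields only a Digital Model.
assert classify(auto_p2v=True, auto_v2p=False, living=False) == "DM"
```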


### *3.2. Proposed Complete Definition of Digital Twin Framework*

Stemming from the state-of-the-art literature, we propose a complete definition of the Digital Twin Framework from the perspective of PM. This definition combines the core qualities of the DT from its inception to the present, as mentioned in the reviewed work.

### 3.2.1. Life Cycle

Broad adoption of a well-designed DTF aims to solve one of the main problems encountered in CM and PM: the lack of adequate data handling. Managing the complete lifecycle of a product is of paramount importance in industry. There exists an information gap in a product's lifecycle, mainly between the production and service phases, hindering the provision of adequate after-sales services while driving up their cost. Furthermore, the retire phase of a product is all but ignored in the relevant literature, allowing older-generation issues to pass onto the next when they could have been easily avoided [27]. These problems are depicted more clearly in Figure 6.

**Figure 6.** Illustration of data flow problems in the conventional product lifecycle.

The proposed DTF includes all four lifecycle domains. Design and production data are handled by the provider and stored in the DTF in the appropriate form (discussed below). Data are integrated into the DTF itself using the "Box" paradigm. It is imperative that providers can withhold corporate and technological secrets; the resulting opacity can be accommodated by providing a Black or Gray Box in the DTF, safely stored in the provider's cloud server (and accessed directly by the DTF). Retire-phase data are stored in the DTF, which is perpetual and not destroyed along with the physical twin, as long as storage is provided. These data can be used to improve the next models in the design phase. Most importantly, historical and usage data, mainly degradation and aging, are of paramount importance to the PM industry and will accelerate research efforts immensely.
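The "Box" paradigm described above can be sketched as a pair of wrapper types: a provider ships a model whose internals stay hidden (Black Box) or are partly disclosed (Gray Box), while the DTF interacts only with the exposed surface. The class names and the toy transfer function are illustrative assumptions, not prescribed by the text.

```python
class BlackBoxModel:
    """Provider-hosted model: only inputs and outputs are visible to the DTF."""
    def __init__(self, transfer):
        self._transfer = transfer          # proprietary internals stay hidden

    def evaluate(self, x):
        return self._transfer(x)           # the DTF sees I/O only

class GrayBoxModel(BlackBoxModel):
    """Partly disclosed model: selected parameters are exposed alongside I/O."""
    def __init__(self, transfer, public_params):
        super().__init__(transfer)
        self.public_params = public_params  # e.g., rated values, not design data

# Hypothetical motor efficiency box exposing only a rated parameter.
motor_box = GrayBoxModel(lambda rpm: 0.95 * rpm, {"rated_speed_rpm": 3000})
```

In practice the `transfer` callable would be a call into the provider's cloud service rather than a local function, which is exactly what keeps the internals withheld.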

The proposed paradigm results in a circular lifecycle, as depicted in Figure 7, and constitutes the final guideline of this work; its most important contributions are noted in the figure.

**Figure 7.** Proposed DTF lifecycle. The data layer allows for storage and improvement, making the lifecycle essentially circular.

### 3.2.2. Five-Dimensional Digital Twin Framework

The proposed DTF follows the five-dimensional paradigm discussed in the introduction. We surmise that these are exactly the dimensions needed for a proper understanding, construction, and integration of the DTF in current and future work.

Physical: encompasses the system hardware. In typical CM applications such as EMs, the machine is the core of the physical dimension and is built upon with sensors, controllers, and other needed hardware. One important aspect to consider is the duality of these components: EMs are purely physical, as they are electro-mechanical conversion systems fulfilling one purpose. Sensors and controllers, on the other hand, exist in both the physical and the virtual dimensions. Their hardware, physical indicators, and logical operations belong to the physical dimension; however, they offer a clear virtual footprint to be tapped into by the simulation. We advise that DTFs should not create virtual twins of these "digitally enabled" components but rather treat them as existing in both dimensions, saving on both computational power and complexity.

Virtual: its core is as faithful a mirror representation of the core physical system as possible. Creating this mirror is an iterative process, discussed below. Models follow the "Box" paradigm and are multilevel in fidelity, built upwards. For "digitally enabled" components, the virtual dimension encompasses and displays their software part directly. The virtual dimension is differentiated from the closely related Data and Service dimensions by its pure usage as a representation medium, mirroring the real world's mechanisms and laws, which are themselves written as virtual twins.

Data: this dimension bears no physical representation (barring the actual storage hardware, which is not a concern of the DTF) or virtual workings. The DTF data are the collective information of values and their physical meaning. Their purpose is to feed the physical and virtual dimensions with information to be processed by their mechanisms. Data can reside on-premises and on-cloud and come from the three source iterations discussed above: historical, real-time, and predicted. Finally, data can be found in three different forms: structured, semi-structured, and unstructured [52]. Data used to be part of the virtual dimension but became a dimension of its own due to its complexity and different workings in the DTF. The DTF aims to provide Product-Embedded Information (PEI), meaning all necessary data are included in the framework.

Services: include any and all ancillary utilities of the DTF. "Twinning" means to create something identical, while services are additional inclusions. New approaches such as the DTF enable novel services and completely revamp older ones, warranting their own dimension. The service dimension can be regarded as a toolbox for the DTF; examples include the UI, CM, AI platforms, training utilities, etc. Services are not twins per the discussed definition; they are products of the DTF. Conventional services fell outside the boundaries of what was defined as "the system".

Connections: similar to the data concept, connections are the realization of linking the different parts of the DTF. These include connections between models and mechanisms, blocks and boxes, different DTFs, services, hosts, providers, and users. Akin to the data dimension, connections play a pivotal role in the DTF and concern state-of-the-art research and techniques. Their primary role is the translation of different data forms into the form appropriate for each operation. Connections can be physical (cables, tubes, shafts) or virtual (links, decryptions, translators).
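The five dimensions described above can be summarized as a simple container type. This is a structural sketch only; the class, field names, and example contents are our illustrative assumptions, intended to show how the dimensions partition a concrete CM application.

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwinFramework:
    """Illustrative five-dimensional DTF container (names are ours)."""
    physical: list = field(default_factory=list)      # machine, sensors, controllers
    virtual: list = field(default_factory=list)       # mirror models ("Box" paradigm)
    data: dict = field(default_factory=dict)          # historical / real-time / predicted
    services: list = field(default_factory=list)      # UI, CM, AI platforms, training
    connections: list = field(default_factory=list)   # links between the parts above

# A hypothetical EM condition-monitoring instance.
dtf = DigitalTwinFramework(
    physical=["induction motor", "vibration sensor", "PLC"],
    virtual=["thermal model", "electrical model"],
    data={"historical": [], "real-time": [], "predicted": []},
    services=["condition monitoring dashboard"],
    connections=[("vibration sensor", "thermal model")],
)
```

Note that, per the text, the sensor and PLC appear in the physical dimension but are not mirrored in the virtual one; only their footprints are linked via `connections`.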

The authors of [53] have proposed an eight-dimensional version of the DTF, which follows the same paradigm but analyzes the data-focused dimensions further. We deem the five-dimensional model the maximum complexity necessary for adoption and broader appeal.

### 3.2.3. Creating the DTF Iteratively

We mentioned that the creation of a DTF is an iterative process. There is currently no optimal way to approach it, but the literature agrees on this iterative principle, expertly demonstrated in [38,44,47,49]. The proposed process combines and expands upon the paradigm followed in these works.

**Core Model:** Ref. [38] offers a concise methodology for the core model. The first step is to create the geometric representation of the real twin, following its dimensions, orientation, and all geometric qualities. These are then given physical attributes such as weight, density, and other material qualities. The third step is creating the behavior model: the interactions between components and the environment, following the laws of physics and the virtual laws of the simulation. Finally, a constraint model is realized, giving the model the boundaries of the physical world. At this stage, our model is what we referred to as a Digital Model; it depicts an object and mechanism in steady state. This is also the first step in the other mentioned works.

**Enhancement:** the core model is enhanced with DTF-exclusive techniques such as prediction models, fault progressions, and probability mechanisms. These describe the physical world exceptionally well and are materialized via state-of-the-art technology such as ANNs, Fuzzy Logic, and general AI techniques. The exclusivity refers to the edge this approach gives the DTF over conventional, mutually independent approaches.

**Model Optimization:** the model is now upgraded with better behavioral descriptions, higher-fidelity models, and more extensive analyses. The focus depends on the objective of each DTF and is not strictly enforced. For example, a DS aiming to calculate the thermal strain of a pump should have analytical thermal modeling but can manage with simpler electrical or hydraulic models. Following this concept, the model is optimized according to its purpose; the core model is the same in every similar system, but the optimization can differ dramatically. Examples are Levels 2–4 of [47] or Step 2 of [44].

**Data Validation:** following the clear-cut steps of [44], the next iteration validates the virtual representation model against real output data. Deferring the comparison with the live real twin to the next step, data validation offers greater freedom to make changes and has no time constraints, whereas large-system DTFs are often accompanied by integration constraints and time limits. The proposed DTF should resemble the real twin as closely as possible according to theory.

**Real Twin Validation:** the proposed model is connected to the real twin and validated with comparison methods. Changes made to the model at this stage should be limited to data tables and weight constants, due to the aforementioned constraints.
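The five steps above can be sketched as a driver loop. Everything here is a placeholder under our own assumptions: the functions mirror the step names of this subsection, the error values and tolerance stand in for a real model-versus-data comparison, and halving the error stands in for an actual refinement.

```python
def build_core_model():
    """Geometry -> physical attributes -> behavior -> constraints (a DM)."""
    return {"stage": "core", "fidelity": 1}

def enhance(model):
    """Add DTF-exclusive mechanisms: prediction, fault progression."""
    model["mechanisms"] = ["fault progression", "prediction"]
    return model

def optimize(model, objective):
    """Raise fidelity where the DTF's objective demands it."""
    model["fidelity"] += 1
    model["objective"] = objective
    return model

def validate_against_data(model, error, tol=0.05):
    """Free-form validation against recorded output data (no time limits)."""
    return error < tol

def validate_against_real_twin(model, live_error, tol=0.05):
    """Connected validation; only tables and weight constants may change."""
    return live_error < tol

model = optimize(enhance(build_core_model()), objective="thermal strain")
error = 0.10                       # stand-in for the measured model error
while not validate_against_data(model, error):
    model = optimize(model, "thermal strain")  # free-form refinement
    error /= 2                                 # stand-in for an improved fit
connected_ok = validate_against_real_twin(model, live_error=0.03)
```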

The complete vision of the proposed DTF is illustrated in Figure 8.

### 3.2.4. Software

DTFs can be realized in numerous software packages provided by large vendors. We encourage researchers to use whichever package they prefer, as each specializes in different aspects of the DT. Research focus ought instead to be on integrating the final frameworks into a seamless, plug-and-play package able to cooperate with various mediums. To that end, we surmise that the general guideline for DTF integration is akin to the one proposed by [30].

### *3.3. Contribution of the DTF in Industry*

Finally, this work aims to discuss the purpose of the DTF in the industrial state-of-the-art, focusing on CM and especially PM. In short, the DTF ventures to tap into the data availability experienced in today's IoT, namely:


Modeling and simulation of the physical system have already reached an adequate level, having been in the works for the last 20 years; CNC has been a staple in manufacturing since the 2000s. The cornerstone of the DTF, and the reason for its rapid advancement today, is data handling and integration. State-of-the-art capabilities in CPUs and AI techniques both enable and benefit from the DTF. Thus, focus should be split into two major offensives:


- Models and theories can of course be updated, especially in the second generation of DTFs (assuming today's work launches the first), once historical data are available to improve progression and prediction mechanisms significantly; in our opinion, this is the only lacking aspect of PM in CM (compared to modeling and theories). A review on Big Data and a comparison with DTs can be found in [52]. We once again propose the "build-upon" DTF paradigm.
- Enable IoT integration by ensuring that PEI is structured data, handled internally. Focus appears to be on feeding raw measurement data into hybrid neural networks such as ANFIS, since the literature suggests they provide the best results [56]. Concerning the DTF, even raw data input becomes structured when exiting a proper DTF.

Advanced market and user needs warrant the facilitation of more customized products, in concordance with smarter manufacturing. Smart sensors and IoT integration already exist in most modern shop floors and machines. The DTF's purpose is to interconnect this pre-existing foundation and, in unison with big-data technologies, build the virtual representation of the system. The challenge is combining these heterogeneous devices and organizational structures [57] in a uniform framework.

A real example concerns employing the discussed hierarchical structure. Computer-aided Design (CAD) models are the basis for creating the DT of each part of the system. FEM analysis is optional but can greatly enhance the second part, the behavioral model, typically realized with differential equations. After the creation and validation of each subsystem, the complete representation of the system is connected to its physical counterpart via the sensor footprints. The data layer is then constructed via calculations and experiments, followed by the services after sufficient data handling and accumulation. Connections are handled by frameworks such as AutomationML. Interconnection schemes are left to the discretion of the user/client and bear no importance beyond adhering to literature and industry consensus, such as the work discussed in this paper. Different generic architectures have been proposed, and research remains to determine the feasibility and contribution of each. Core challenges in every studied work include identifying the basic structures and relationships and encapsulating the critical details of each component. An in-depth methodology adhering to the proposed paradigms can be found in [57].
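A behavioral model of the kind mentioned above can be as simple as one differential equation validated against a sensor footprint. The sketch below integrates a first-order thermal model of a machine part with explicit Euler and compares it to a (here synthetic) sensor trace; all parameter values, and the RMSE tolerance, are illustrative assumptions rather than values from the text.

```python
def simulate_temperature(p_loss, t_amb=25.0, r_th=0.5, c_th=200.0,
                         dt=1.0, steps=600):
    """Integrate dT/dt = p_loss/c_th - (T - t_amb)/(r_th*c_th) from T = t_amb.

    p_loss: dissipated power [W]; r_th: thermal resistance [K/W];
    c_th: thermal capacitance [J/K]; dt: time step [s].
    """
    temps, temp = [], t_amb
    for _ in range(steps):
        temp += dt * (p_loss / c_th - (temp - t_amb) / (r_th * c_th))
        temps.append(temp)
    return temps

def rmse(a, b):
    """Root-mean-square error between two equal-length traces."""
    return (sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)) ** 0.5

sim = simulate_temperature(p_loss=100.0)
sensor = [t + 0.1 for t in sim]      # stand-in for a real sensor footprint
assert rmse(sim, sensor) < 1.0       # illustrative validation criterion
```

The model heads toward the steady state `t_amb + p_loss * r_th` (75 °C here); in a real subsystem the `sensor` trace would come from the connected physical twin, and failing the RMSE check would trigger another optimization iteration.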

Real applications can be found in studied work such as the CNC machine tool [33–36] and industrial machine retrofitting [30]. Our future work pertains to the creation of a ship generator and propulsion system DT, and to PM of an industrial machine on a factory shop floor, adhering to industrial IoT and the Industry 4.0 paradigm.

### *3.4. Proposed Definition*

The presented DTF paradigm can be summarized as a hierarchical approach, both microscopically (within a DT) and macroscopically (DT cooperation), in addition to its iterative aspect. In the context of Power and Energy Systems, we expect the DTF to fit adjacent sectors, such as EVs and Micro-Grids, which are built on the same philosophy and face the same challenges [58]. Our complete definition of the term "Digital Twin" follows:

"The Digital Twin is an organic multiphysics, multiscale, probabilistic simulation that can represent the physical counterpart of a system in real-time, based on the bi-directional flow and complete volume of product-embedded information, encapsulating the full lifecycle data to facilitate knowledge sharing and integration."

We deem the above definition complete regarding the discussed requirements for classifying a DT, given in the most concise form possible. We highlight two aspects of the above definition as potentially ambiguous and in need of further clarification, namely:


The DTF is a broader term, encouraged to convey the potential or intended usage of the proposed system, in a way equivalent to the new IoT/Industry 4.0 paradigm, when one or more qualities of the true DT are missing (and may or may not be added in the future). Furthermore, the prefix "i-DT", as in "intelligent DT", which conveys the usage of AI in the framework (not a classification requirement), is encouraged as it provides an additional layer of information. Finally, the "nexDT" term, while excellent in this establishment era of the concept, will be of limited use once the new paradigm is established; it thus has a finite lifespan and is not included in the definition.
