**1. Introduction**

Research activity in Electrical Machine (EM) Predictive Maintenance (PM) has seen renewed interest in recent years, as industrial and commercial applications diversify and expand into novel areas and the role of EMs within them becomes more prominent. Conventionally, the sector's efforts have focused on Squirrel Cage (SC) Induction Motors (IMs) and conventional-rotor Synchronous Machines (SMs), owing to their domination of motor and generator applications, respectively. Nowadays, other types of EMs are seeing increased relative usage in lieu of SCIMs and SMs, such as Wound-Rotor (WR) IMs (especially as generators in wind turbines), Permanent Magnet (PM) SMs (mainly due to electric vehicles), multiphase Alternating Current (AC) machines, and Switched or Synchronous Reluctance Machines (SRMs or SyncRMs). This shift is attributed to new materials, design and control architectures, the exponential expansion of generation from Renewable Energy Sources (RES), and the specialized requirements of new applications, such as spatial availability (power density), efficiency targets, and fault tolerance. Moreover, substantial increases in computing power, together with advancements in sensors, artificial intelligence, and Internet-of-Things (IoT) applications, open new research avenues with novel approaches and powerful tools, both in laboratory tests and in industrial (on-site) applications, promoting a combination of different techniques.

EMs typically constitute the core part of their application and thus dictate its entire condition and performance. As electromechanical processes, they are prone to faults of various severity. Breakdown during runtime leads to severe economic and safety repercussions (significant repair costs, unscheduled production halts, increased man-hours, missed deadlines), while faulty operation significantly reduces efficiency and poses a safety hazard. Consequently, timely and correct fault diagnosis is of paramount importance to industry, enabling necessary repairs to be scheduled during downtime and before breakdown. In addition, online Condition Monitoring (CM), the principal PM operation, ensures that processes run at optimal efficiency, further cutting operating costs and required reserves. These efforts are reinforced by novel cost-effective sensors, Data Acquisition (DAQ) hardware and evaluation techniques, making EM diagnostics a significantly profitable venture in industry.

**Citation:** Falekas, G.; Karlis, A. Digital Twin in Electrical Machine Control and Predictive Maintenance: State-of-the-Art and Future Prospects. *Energies* **2021**, *14*, 5933. https://doi.org/10.3390/en14185933

Academic Editor: Anouar Belahcen

Received: 18 August 2021; Accepted: 15 September 2021; Published: 18 September 2021

Comprehensive reviews, such as [1–6], concerning state-of-the-art PM methods and their application have been published in recent years, addressing techniques along with their applications and comparisons. The literature highlights a need for evaluation benchmarks and new combinations of methods. The authors of this work hypothesize that this endeavor fits the paradigm of creating a Digital Twin (DT) of a part or the entirety of the studied system, as coined by Grieves [7] and expanded upon in recent years [8]. While the next-generation Digital Twin (nexDT) concept is increasingly explored in recent works, the authors noted a lack thereof concerning specialized EM CM and control, which inspired this review. This work reviews DT literature concerning EM CM concepts (rotating machinery, electromechanical systems, PM applications) spanning the last five years, to evaluate its usability in EM CM and control going forward in Industry 4.0, as well as in retrofitting and reverse-engineering applications for older machines.

The main aim of this review is to combine literature descriptions of the nexDT, consider fellow researchers' concerns, emphases, and preliminary work, and merge them with well-established PM paradigms into a definitive description for the realization and usage of the nexDT in EM PM, in an effort to synchronize research efforts. Future prospects include the conjunction of a sensor setup and its connection to a Digital Twin via the appropriate interface and techniques, setting the stage for the design of a state-of-the-art CM test bench, with the goal of making it portable for industrial application. The surmised benefits are three-fold: establishing and validating the sector's cutting-edge approach, extracting and unifying manufacturing and operation data, and combining the latest proposed techniques of various sciences. Research has reached the consensus that a combination of methodologies is needed to achieve the optimal diagnostic procedure, namely obtaining a correct assessment with the minimum measurements in minimum time. According to the latest literature, the DT is the prime candidate in this effort, and the current industrial and research landscape provides the perfect conditions for its development. It is surmised that such a framework can facilitate future research efforts and aid benchmarking. To the best of the authors' knowledge, a focused review of EM DTs has not yet been realized. The DT concept has gained momentum over the last decade and is predicted to become the new paradigm in industry. Its synergy with EMs is even higher than with many other industrial applications, making its exploration mandatory. Ref. [9] provides a specialized review concerning EMs, explaining the concepts introduced.

All citations and studied works were found via the Google Scholar database by employing the following search keywords: digital twin; digital twin electrical machine; digital twin electrical machine fault diagnosis; digital twin electrical machine fault diagnosis review; digital twin machine fault; digital twin fault diagnosis; industry 4.0; digital twin software. This search yielded a low number of relevant works pertaining to the usage of DTs in EM CM and PM, compared to broader DT concepts. While previous literature research into predictive maintenance occasionally refers to abstract DT concepts, no established parallels are drawn. The choice of keywords was based on the DT review papers' (discussed below) methodology of selecting relevant work, with the further limitation of including the keyword "digital twin" in the search. This was done to establish that specifically searching for PM DTs does not guide the user to pertinent work. The authors noted a lack of relevant literature even when expanding the search terms to a broader spectrum. While a general consensus regarding the concept can be derived from the results, we surmise that a specific review will enable and accelerate related work, serving as a starting point for novice and experienced researchers alike.

This work is divided into three major parts: a review of DT literature as explained above, a brief report on the EM PM state of the art as provided by the related reviews mentioned above, and finally a combined report on the most important aspects of and suggestions regarding the two sectors, complete with groupings and visuals. To the best of their ability, the authors have made an effort not to repeat the reviewed work, but rather to provide an outline and create a web of relevant citations.

### **2. Literature Review**

In a broader context, the aim of a DT is to aid in optimization problems. Ref. [10] provides excellent insight into robust design optimization and emerging technologies, specialized for EMs. As that work states, design optimization requires multi-disciplinary analysis and a multi-pronged investigation of the system, areas where a DT excels. The main open challenge in the state of the art is accurate CM of an EM, as methodologies largely depend on accurate values to produce acceptable results. Simplified models, imposed by software, hardware or knowledge limitations, may hinder otherwise productive algorithms. The second open challenge, benchmarking and evaluation of novel AI techniques, stems from the first, as it depends on the quality of the data produced. Thus, the capability of a DT to generate a virtual copy of the system can prove invaluable to the literature in both the design and real-time monitoring stages.

Ref. [11] exposes limitations in the concurrent CM of industrial EMs. Core problems concern modeling and DAQ fidelity, both of which are at the forefront of DT research. Secondary advances are required in EM degradation mechanisms, allowing for faster and more robust PM. Novel AI techniques are yet to be accepted in the field, requiring extensive testing that is hindered by conventional modeling techniques. The literature concludes that a combination of different CM methodologies is required in order to get a complete and reliable overview of the system. Common surveys (with their respective mediums) include: insulation testing (partial discharges), electrical testing (current spectrum), and flux analysis (stray flux). The latest tendencies concern transient analysis, while steady-state analysis employs the Finite Element Method (FEM) to model the machine in greater detail. Focus has shifted to flux analysis, due to sensory advancements and its richer harmonic content [4]. Cutting-edge approaches investigate AI and Fuzzy Cognitive Map (FCM) decision making to distinguish fault indications. The latest trends are covered in depth in the cited work. We surmise that the DT can address the combination of the aforementioned challenges and is a worthwhile venture in both literature and industry. Details are discussed in the framework presentation.
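As an illustration of combining CM modalities, a minimal sketch of indicator fusion is given below; the modality names, normalized values, and weights are hypothetical placeholders, not values from the cited works:

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """One normalized fault indicator from a CM modality (0 = healthy, 1 = faulty)."""
    name: str
    value: float   # normalized severity in [0, 1]
    weight: float  # relative trust placed in this modality (hypothetical)

def fused_health_score(indicators):
    """Weighted average of per-modality indicators into one severity score."""
    total_weight = sum(i.weight for i in indicators)
    return sum(i.value * i.weight for i in indicators) / total_weight

# Hypothetical readings from three common surveys
readings = [
    Indicator("current_spectrum", 0.20, 0.4),   # electrical testing
    Indicator("stray_flux", 0.35, 0.4),         # flux analysis
    Indicator("partial_discharge", 0.10, 0.2),  # insulation testing
]
score = fused_health_score(readings)  # single severity value for decision making
```

In practice, the fusion stage is where AI or FCM decision making would replace the fixed weights.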

### *2.1. Digital Twin Reviews*

Five major DT reviews [8,12–15] were chosen for study in the context of this work. The primary reason is their recent publication dates; the secondary reason is their coverage of older DT reviews. Each review takes a slightly different approach. The goal here is to outline the latest concept while discussing similarities and deviations. This work focuses on DT usage and applications in EM PM; readers are encouraged to consult the mentioned reviews for broader DT coverage. Ref. [16] provides a clear and categorical review of the three digital replica variations that researchers, including the above reviewers, have adopted: Digital Model (DM), Digital Shadow (DS), and Digital Twin. A clear data-flow categorization is used, as depicted in Table 1. Modeling methodology is not, and should not be, limited by this categorization.

This categorization is necessary in the context of EM PM, to clearly classify methodologies. One key example is the usage of a DS for pure diagnosis, AI training and data logging, whereas a DT can also be integrated in control. Some proposed PM methodologies require bidirectional automatic data flow, meaning they should use the term Digital Twin instead of Digital Shadow. Researchers are encouraged to abide by this distinction.
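The data-flow distinction can be stated compactly in code; a minimal sketch following the categorization of [16] (the function and type names are illustrative):

```python
from enum import Enum

class Flow(Enum):
    MANUAL = "manual"
    AUTOMATIC = "automatic"

def classify(physical_to_digital: Flow, digital_to_physical: Flow) -> str:
    """Name the digital construct from its two data-flow directions, per [16]."""
    if physical_to_digital is Flow.AUTOMATIC and digital_to_physical is Flow.AUTOMATIC:
        return "Digital Twin"
    if physical_to_digital is Flow.AUTOMATIC:
        return "Digital Shadow"
    return "Digital Model"
```

For example, a CM setup that streams sensor data automatically but requires an operator to push parameter updates back to the plant is `classify(Flow.AUTOMATIC, Flow.MANUAL)`, i.e., a Digital Shadow, not a Digital Twin.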

**Table 1.** Digital construct categorization as established by [16] and adopted by broader literature.

| Construct | Physical → Digital Data Flow | Digital → Physical Data Flow |
|---|---|---|
| Digital Model (DM) | Manual | Manual |
| Digital Shadow (DS) | Automatic | Manual |
| Digital Twin (DT) | Automatic | Automatic |
### What Is a Digital Twin?

The concept's first appearance is attributed to M. Grieves, in his course on "product lifecycle management" in 2003. Grieves then defined the DT in his 2014 whitepaper [7]. In 2012, the DT concept was reevaluated by the National Aeronautics and Space Administration (NASA). After this point, the DT concept began to rise exponentially in popularity. Since then, researchers have provided numerous definitions and concerns, making the DT an increasingly complete concept that serves today's research needs. Some of the most concise and inclusive definitions are provided in chronological order, to be discussed in the scope of EM PM. For the sake of clarity, only the historical definitions by M. Grieves and NASA are included, along with some of those reviewed and provided by the studied papers. Only definitions that introduce additional clarifications are included.

M. Grieves defines the DT in his whitepaper [7] as a "Virtual representation of what has been produced", after having discussed it in detail in previous years. This broader definition is meant to encapsulate the complete concept of the DT in the scope of production. NASA has defined the DT in greater detail in 2012 [17], catering to a more specific use, as interpreted by Tao et al. [8]: "The DT is a multiphysics, multiscale, probabilistic, ultrafidelity simulation that reflects, in a timely manner, the state of the corresponding twin based on the historical data, real-time sensor data, and physical model". New additions in definitions are mentioned by Gabor et al. [18]: "The DT is a special simulation, built based on the expert knowledge and real data collected from the existing system, to realize a more accurate simulation in different scales of time and space", Chen [19]: "A digital twin is a computerized model of a physical device or system that represents all functional features and links with the working elements", Zhuang et al. [20]: "Virtual, dynamic model in the virtual world that is fully consistent with its corresponding physical entity in the real world and can simulate its physical counterpart's characteristics, behavior, life, and performance, in a timely fashion", Liu et al. [21]: "The DT is a living model of the physical asset or system, which continually adapts to operational changes based on the collected online data and information and can forecast the future of the corresponding physical counterpart", Zheng et al. [22]: "A DT is a set of virtual information that fully describes a potential or actual physical production from the micro atomic to the macro geometrical level", Xu et al. [23]: "Simulates, records and improves the production process from design to retirement, including the content of virtual space, physical space and the interaction between them", Madni [24]: "A DT is a virtual instance of a physical system (twin) that is continually updated with the latter's performance, maintenance, and health status data throughout the physical system's life cycle", and Kannan and Arunachalam [25]: "Digital representation of the physical asset which can communicate, coordinate and cooperate the manufacturing process for an improved productivity and efficiency through knowledge sharing".

NASA's definition, originally given for a flying vehicle [17], is one of the most inclusive to start with. The authors' vision, as described in the paper, is close to today's reality. In summary, the DT is a simulation with the following characteristics:

• Multiphysics, meaning cooperation of different system descriptions, such as aerodynamics, fluid dynamics, electromagnetics, tensions, etc.;

• Multiscale, meaning cooperation of models of different spatial and temporal resolution;

• Probabilistic, meaning that uncertainties in parameters and operating conditions are accounted for;

• Ultra-fidelity, meaning the simulation mirrors its physical counterpart as closely as possible;

• Timely, meaning the state of the twin is reflected as events unfold, based on historical data, real-time sensor data, and the physical model.
Originally proposed as a three-dimensional model [7], the five-dimensional extension model proposed by [8] has attracted a lot of attention in the literature, according to the studied reviews ([12,15]) and reviewed work. The dimensions (initial three and extended two) are:

1. The physical entity;
2. The virtual entity (model);
3. The connection between the two;
4. DT data;
5. Services.
The two extra dimensions (4 and 5) are important enough in differentiating the DT from previous work to warrant a categorization akin to, e.g., the physical space. It is important to note that the DT is not only a virtual representation of an object; it can encapsulate an entire process, e.g., a complete diagnostic procedure, depicting the necessary equipment, data acquisition, flow and handling, connections, and algorithms. This iteration, however, can often produce more confusion than results. Researchers are encouraged to proceed as they see fit, following the established DT paradigm. It is, however, important to emphasize digitization of services and processes along with objects [19]. The correlation between these layers is depicted in Figure 1.

**Figure 1.** Five-dimensional DT model as proposed in literature [8].
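The five dimensions translate naturally into a container structure; the sketch below is an illustrative assumption, not a standardized schema, and all field names and contents are invented:

```python
from dataclasses import dataclass, field

@dataclass
class PhysicalEntity:
    name: str
    sensors: list

@dataclass
class VirtualEntity:
    models: list          # e.g., reduced-order FEM, thermal, analytical models

@dataclass
class Services:
    offered: list         # e.g., condition monitoring, RUL prediction

@dataclass
class DTData:
    physical: dict = field(default_factory=dict)  # measurements from the real twin
    virtual: dict = field(default_factory=dict)   # simulation outputs

@dataclass
class Connections:
    links: list = field(default_factory=list)     # (source, target, flow mode)

@dataclass
class DigitalTwin5D:
    """Container mirroring the five-dimensional DT model of [8]."""
    physical: PhysicalEntity
    virtual: VirtualEntity
    services: Services
    data: DTData
    connections: Connections

twin = DigitalTwin5D(
    physical=PhysicalEntity("induction_motor_01", ["stator_current", "stray_flux"]),
    virtual=VirtualEntity(["thermal_lumped", "fem_reduced_order"]),
    services=Services(["condition_monitoring", "rul_prediction"]),
    data=DTData(),
    connections=Connections([("physical", "virtual", "automatic"),
                             ("virtual", "physical", "automatic")]),
)
```

Both connection links being "automatic" is what qualifies this construct as a DT rather than a DS under the categorization of Table 1.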

The concept of the DT began in manufacturing, and operational DTs are the state of the art in the literature. Stemming from the previous statements, one can make a case for which portion of a system's lifetime is included in the simulation. The obvious answer is "all of them". To elaborate, we can split a system's lifetime into the following phases, as expertly studied by [15] and expanded upon in [26,27], shown in Figure 2:

1. Design;
2. Manufacturing;
3. Service;
4. Retirement.
**Figure 2.** Summary of a product lifecycle and relevant operations.

PM is of course interested in the service phase. However, a recurring problem in the relevant literature is the existence of little to no manufacturing data, which can prove invaluable for state-of-the-art diagnostics. Algorithms often must expend time and resources in evaluating systems and gathering preliminary data. Access to a DT generated in the design and manufacturing phases of the system can enable instantaneous PM methodologies, as well as strengthen existing ones with a slew of data [25]. Ref. [15] concurs that researchers often neglect the retirement phase of the system; integrating it in the DT study can provide valuable information for next-generation manufacturing, completing the circle. Retrofitting older machines is discussed in the conclusion. It is imperative that DT manifestations span the entirety of the system's lifecycle [24].

Data are considered the core of the DT, since it is this bidirectionality and processing that defines it. DT data come from three major sources:

1. The physical twin (sensor and DAQ measurements);
2. The virtual twin (simulation outputs and virtual signals);
3. External sources (manuals, data banks, and ancillary systems).
DT data can take a multitude of forms, such as physical sensor signals, virtual signals, manuals, tables, and data banks. Data can simultaneously reside in the system itself, in adjacent (ancillary) systems, which may or may not be part of the DT itself (although their data are), and in the cloud. Furthermore, these data can be raw (e.g., voltage, current, flux, counts, dimensions) or processed (e.g., health indexes, state values, clustered or labeled data). Therefore, adequate handling is of paramount importance. The data scale corresponds to the Big Data definition, which is itself the sweeping trend in industry according to the reviews.

The most important aspect of the DT regarding PM is its definition as a "living" model [21]. The main concept of this "life" is updating the model itself to reflect the physical twin, meaning health state and history, dynamic behavior, efficiency and performance, and links with outside elements. One very challenging aspect of the DT is keeping these calculations and representations "in a timely manner", as many works have stated [20]. The simplest interpretation is completing the necessary processing (handled by a CPU) before the next required update, according to the required fidelity.
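This deadline interpretation can be made concrete with a simple update loop that flags cycles whose computation overruns the update period; the period and update function below are placeholders:

```python
import time

def run_updates(update_fn, period_s, cycles):
    """Run DT update steps, counting cycles that miss their timing deadline."""
    missed = 0
    for _ in range(cycles):
        deadline = time.monotonic() + period_s
        update_fn()                      # model sync: state estimation, data ingest, etc.
        remaining = deadline - time.monotonic()
        if remaining < 0:
            missed += 1                  # this update was not "timely"
        else:
            time.sleep(remaining)        # idle until the next scheduled update
    return missed
```

A nonzero return value signals that the model fidelity or the hardware must change for the DT to remain a timely reflection of its physical twin.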

### *2.2. Surveyed Literature*

In this section, the chosen relevant papers are briefly reviewed in chronological order of publication. A brief overview of each work is given, followed by a general discussion of the research state of the art and primary concerns.

Modeling an EM is a complex, multi-level task. The dual electro-mechanical behavior of the system, in addition to its high symmetry, requires heterogeneous skills to definitively achieve a robust and reliable model. Ref. [28] provides insight into how DT technology can model and solve EMs and their challenges in an industrial environment. The EM modeling problem stems from their application since the earliest days of modern industry, where inadequate data sheets can skew modeling efforts. Furthermore, high dependence on load and environmental conditions results in a highly nonlinear and stochastic asset. Logging an EM's features requires a considerable testing effort without ensuring minimum uncertainty, and most present drives do not provide an accurate parametrization of the controlled machines. The goals are reduced sensor costs, minimal invasiveness, and estimation accuracy. The paper proposes a reduced-order FEM model of an IM. Design parameters include real-time operation while preserving adequate accuracy, managing this accuracy-computational complexity tradeoff, and realizing the DT as a virtual sensor. Proof of concept is provided via current density and thermal modeling through measurements, while highlighting the benefits of an optimized cooling system, which is frequently overprovisioned due to unreliable temperature readings. Finally, real-time implementation is evaluated.
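As a toy illustration of a virtual temperature sensor (not the reduced-order FEM model of [28]), a first-order lumped-parameter thermal estimate from phase current could look as follows; every machine parameter here is a hypothetical placeholder that a real DT would identify from design data or measurements:

```python
def winding_temperature(currents_a, dt_s=1.0, t_amb_c=25.0,
                        r_phase_ohm=0.5, r_th_k_per_w=0.04, c_th_j_per_k=8000.0):
    """Estimate stator winding temperature from RMS phase current samples.

    Single thermal node: heat in from copper losses, heat out through one
    thermal resistance to ambient. All parameter values are illustrative.
    """
    t = t_amb_c
    history = []
    for i_rms in currents_a:
        p_loss = 3.0 * i_rms ** 2 * r_phase_ohm               # three-phase copper losses [W]
        t += dt_s / c_th_j_per_k * (p_loss - (t - t_amb_c) / r_th_k_per_w)
        history.append(t)
    return history

# One hundred seconds at a constant 10 A RMS: the estimate rises toward its
# steady-state value of t_amb + (3 * I**2 * R_phase) * R_th = 31 degrees C
temps = winding_temperature([10.0] * 100)
```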

Ref. [29] presents a DT application for automotive braking system PM. Although not directly related to EMs, the principle of using a DT in PM remains unchanged. It is important to note that all such works combine different modeling formalisms (FEM, mathematical) and dimensionalities (0D, 2D, 3D) integrated into one master model; in this regard, universal and combinable software should be used. This work uses Modelica models and ANSYS Simplorer FEM simulations. The authors observe an increasing interest from industry giants (Siemens, GE) in employing DTs for predictive maintenance. The resulting physics-based part of the model can be subjected to various failure modes to simulate its response, while combination with machine learning algorithms, themselves trained on the simulated data, can trigger pre-emptive maintenance and optimize operational downtime.

Ref. [30] provides a practical approach to industrial DT employment. The proposed application is focused on machine reconditioning projects, which involve a reverse engineering phase and short commissioning times due to lack of data and production timelines, respectively. The guiding principle is conforming to Industry 4.0 practices while carrying the work from retrofitting through to validating the machine. To be productive, the DT realization should pay off the extra time required to build it. It is important to note that the company retrofitting an old machine is typically not the one that built it. The concept of Virtual Commissioning is proposed, which provides the evaluation party with a DT of the project under scrutiny, reducing travel time and costs while providing the commissioner with preliminary confidence before the real-world application. It is stated that the DT came into more extensive use in the early 2010s and brought about a new wave in modeling and simulation. Since the DT realization depends on the complexity of the system itself, it is viable (scaled) for both simple and complex projects.

The studied application is the reconditioning of a core making machine used for foundry sand cores in the automotive industry, with retrofitting of the old Siemens PLC and HMI devices. The allocated time was dictated by the maximum possible stop in production: four weeks. The DT was successfully built, validated and used for virtual commissioning within this period. The software used is Simumatik3D, with 3D graphics, physics, and logic simulation engines. Importantly, the complete model can be controlled by the same industrial controllers exactly like its real twin. The four application phases are:


Ref. [31] utilizes the DT approach to build a physics-based Remaining Useful Life (RUL) prediction model for an offshore wind turbine power converter. While the model itself is based on literature-proven methodologies, the DT benefits can be summarized as follows: combining the SCADA DAQ system with physics-based models enables medium- and short-term predictions to accompany SCADA long-term data availability, increasing prediction accuracy by a significant margin. Each system is modeled in the appropriate setting (computational, numerical, model, FEM) and then combined via the DT framework. This enables the integration of the large number of components present in the system, while ensuring optimal modeling of each part and a varying degree of precision. Operators and owners can thus make important decisions on every aspect of the turbine (end-of-warranty review, inspections, life extension, re-powering, retrofitting, decommissioning) remotely and with better visualization. Furthermore, the DT platform "converts big data into manageable small data and presents it as high-level performance indicators that influence the decisions of O&M planning and execution". Once again, the DT method is employed to merge the real and virtual twins to achieve the best possible control and maintenance methods, while providing the operators with an optimal Human Machine Interface. In this scenario, uses extend to evaluating the weather condition predictions of the SCADA system, to include operator transit planning and other miscellaneous uses indirectly related to system maintenance. Future work for this and every proposed DT framework is two-fold: optimize the twins' correlation and discover new uses and merges for neighboring equipment and services.
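A common physics-based ingredient of converter RUL models is thermal-cycle damage accumulation (Miner's rule combined with a Coffin-Manson-style lifetime curve). The sketch below illustrates the principle only; the coefficients and cycle counts are hypothetical, and this is not the model of [31]:

```python
import math

def cycles_to_failure(delta_t_k, a=3.0e14, n=-5.0):
    """Lifetime curve Nf = a * dT**n: thermal cycles of swing delta_t_k [K] a
    device survives (hypothetical Coffin-Manson-style coefficients)."""
    return a * delta_t_k ** n

def accumulated_damage(cycle_counts):
    """Miner's rule: sum of (experienced cycles) / (cycles to failure)."""
    return sum(count / cycles_to_failure(d_t) for d_t, count in cycle_counts)

def rul_years(cycles_per_year):
    """Years until accumulated damage reaches 1, assuming a stationary profile."""
    yearly = accumulated_damage(cycles_per_year)
    return math.inf if yearly == 0 else 1.0 / yearly

# Hypothetical profile: one million 40 K junction-temperature swings per year
estimate = rul_years([(40.0, 1.0e6)])
```

In a DT setting, the cycle counts would come from SCADA and mission-profile data, and the lifetime coefficients from accelerated-aging tests.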

Ref. [32], from Siemens AG, defines the DT simulation through three major principles: linked collection (of all available data during manufacturing and service life), evolution mechanism (tracking any and all changes to the system and keeping a history), and behavior description coupled with solution provision (the evaluation and decision-making aspect of the DT). The authors highlight the need to support all stakeholders' interests (from manufacturers to end users) in forming the next-generation DT, meaning a balance between information obscuring and usability. The course is to follow the current paradigm of supporting design tasks and validating system properties while completing the merge of the physical and virtual worlds. The key takeaway of this work is the separation of manufacturing and usage data. While corporate secrets and espionage are an important matter, combination of a product's entire life cycle data will provide researchers with a great source of data. "Digital artefacts" is stated to denote any and all data structures (simulations, measurements, descriptions, values). These are linked via a "Knowledge Graph", to be accessible by any stakeholder at any time. Physics-based models are used to predict the future via differential equations, while builds give an insight into the inner workings of the studied system.

The second part of the work focuses on planning the next-generation DT. The first important idea is that the DT can become a product in and of itself, providing additional functionalities to consumers and following the design principles of normal product features. The second insight is the choice of location depending on the application, ranging from embedded logic to a cloud-based service on demand. The authors conclude with a brief maintenance analysis, stating that the applied logic is the deviation between the two models, and that today's maintenance data are disconnected from the models and the sensor data. To summarize, the DT is the gateway to realizing new services with low effort. The stated paradigm is far from complete, and the need for additional research, benchmarks and applications is highlighted. The two main challenges are connecting the various aspects and structuring the model parts.

Refs. [33–36] make a case for the importance of the DT in synergy with Computerized Numerical Control (CNC). CNC revolutionized the industry and provided us with smarter manufacturing. The tackled problem is the dependence of simulations on user-provided data and manual records, which blocks instantaneous response and reduces accuracy. Provision of a consistent, accurate model (via real-time and reliable data mapping) between design and operation will take advantage of the full capabilities of CNC, namely self-sensing, prediction, and maintenance. The related-work evaluation concludes with the following three approaches:


3. Unified Modeling Languages (UML), such as Modelica.

Finally, a complete Modelica DT model is proposed, comprising the physical, digital, and mapping (connection) layers. The work closes with the common statement that DT research still requires extensive application and validation, and with proposals for future work. The development of the suggested model spans three parts, with the second part advancing the theory. In the third work, a hybrid application for a milling head's cutting tool maintenance is performed, providing extensive results. A continuation of the above work in 2021 concerns model consistency retention, which has been steadily gaining traction in nexDT research. The need for the DT to follow the performance attenuation of its real counterpart is highlighted. The authors propose a method for achieving this based on their previous work, together with a brief review of state-of-the-art attenuation parameters (to be chosen). A rolling guiderail is taken as a case study. The results are promising and fit the next-generation DT paradigm, but, as the authors themselves state, further validation is required.

Ref. [37] presents a clear paradigm for the DT. Industry 4.0 is in the process of transforming its environment into a "networked system of systems", making automation more "production friendly by being more reconfigurable and adaptable" via modular architectures. This approach is what the DT capitalizes on and expands, reshaping pre-existing methodologies to fit the new plant environment while providing human technicians with sophisticated tools and a clear UI.

The focus of this work lies in enabling the human engineer by considering him/her as a part of the framework to increase his/her efficiency on the industrial floor. The DT can serve as a back end for future visual aiding technologies. The paper highlights modularity and describes the approaches of modeling the DT parts (component-, skill-, function-based approach). A proof of concept is provided by modeling a part using the capabilities and built-in library of Modelica while using the AutomationML data exchange protocol. Once again, the DT serves in realizing the multi-disciplinary aspect of any industrial system. Human aid can be provided in two ways:


In this work, a good overview of approaches and the HMI aspect of the DT is provided.

Ref. [38] provides an excellent proof of concept for the employment of the DT in mechanical maintenance. The focus of the study is an aerospace engine main shaft bearing, which is both the key component of the engine and its weakest link, corresponding to the typical EM fault. The work begins with a quick state-of-the-art review of bearing maintenance and then proceeds to explain the integration of the DT approach. The proposed maintenance technique follows the tested modality separation (as excellently depicted in [39]), while the DT approach emphasizes data handling and processing.

The proposed model contains three kinds of elements: physical entity, virtual model and service system. The dimensionality of the model is five, comprising the three entities plus DT data and connections. Furthermore, the model classifies the data via clustering techniques to assign data labels. The DT can then be used both in fault prediction via data feeding and in fault diagnosis via real-time visualization. The process of creating the DT of the bearing is the following, illustrated in Figure 3:


This guarantees that the virtual twin operates within the scope of the real twin, so that information between the twins can be interactively integrated. The virtual aspect of the DT is further explored by training a Neural Network to classify the state of the bearing as depicted in its virtual twin.
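The label-then-classify idea can be illustrated with a nearest-centroid classifier standing in for the trained network; the vibration features and cluster labels below are invented for illustration and do not come from [38]:

```python
def centroid(points):
    """Component-wise mean of a list of feature vectors."""
    return [sum(c) / len(points) for c in zip(*points)]

def nearest_centroid(sample, centroids):
    """Return the label whose centroid is closest (stand-in for the trained NN)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(sample, centroids[label]))

# Hypothetical clustered data: (vibration RMS, kurtosis) per assigned label
clusters = {
    "healthy":    [(0.10, 3.0), (0.12, 3.1)],
    "outer_race": [(0.45, 6.5), (0.50, 7.0)],
}
centroids = {label: centroid(pts) for label, pts in clusters.items()}
state = nearest_centroid((0.47, 6.8), centroids)  # classify a new measurement
```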

Finally, the implementation of creating and validating the repair process in the virtual world requires additional research. The authors propose validating the repair process by integrating it in the DT and performing the actual physical repair only when the simulation results in an acceptable outcome. This aspect of the DT can be integral in assuring maximum cost reduction of the maintenance process and has until now only been speculation.

**Figure 3.** Basic illustration of the creation of one complete part of the DT, as described in [38]. An important aspect of the DT is the 3D representation of the object, including physical flaws. Incorporating geometries with systems of differential equations allows for greater freedom in simulation, a typical usage of FEM software. Images for illustration purposes only, courtesy of [40].

Ref. [41] proposes a theoretical DT approach to monitoring a PMSM in an EV, based on its casing temperature. The authors developed and trained an ANN and a Fuzzy Logic DT with the same inputs and outputs. An important aspect presented in this work is the simulation of the EV driving cycle using MATLAB Simulink and its repercussions on motor maintenance, including RUL, time to refill bearing lubricant, and motor temperatures. An EV emphasizes the interdisciplinary nature of EM health monitoring and maintenance, being a narrow, highly synergistic and "hostile" environment for the machine. In line with the DT guideline of being modular and user-friendly, this theoretical model can be expanded and applied in a wide range of uses and users, from EV manufacturers to service companies and individual users.
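As a rough illustration of the casing-temperature behavior such a DT must capture, a first-order lumped thermal model can be stepped over a crude driving cycle. All parameter values and the loss profile below are assumptions for illustration, not figures from [41]:

```python
import numpy as np

# One thermal node for the motor casing (illustrative parameters):
# dT/dt = (P_loss - (T - T_amb) / R_th) / C_th
R_th, C_th, T_amb = 0.05, 2000.0, 25.0   # K/W, J/K, degC
dt = 1.0                                 # time step [s]
t = np.arange(0, 3600, dt)
# Crude "driving cycle": alternating cruise (800 W) and acceleration
# (1500 W) loss segments of 300 s each.
P_loss = np.where((t // 300) % 2 == 0, 800.0, 1500.0)

T, history = T_amb, []
for P in P_loss:                         # explicit Euler integration
    T += dt * (P - (T - T_amb) / R_th) / C_th
    history.append(T)
T_max = max(history)                     # peak casing temperature [degC]
```

An ANN or fuzzy system trained on such load-to-temperature trajectories could then flag lubricant-refill intervals or RUL thresholds, in the spirit of what the paper describes.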

The authors of [23] provide a clear and concise usage of the DT as the literature suggests it should be used, combining manufacturing and operation. This work focuses on Deep Neural Network (DNN) diagnostics assisted by the DT approach using Deep Transfer Learning (DTL), while also presenting a case study in a car body-side production line to compare the traditional and DT-based methodologies. The aforementioned combination, a guideline of DT evolution, is achieved in two parts:


This work's contribution can be summarized as applying the (sometimes theoretical or untested) intelligent diagnostic methods in the real environment and extending the diagnosis period from operation only to the full life cycle. The latter is a novel approach, using the DT to also optimize the manufacturing process.

The authors provide a quick, targeted explanation of the DT concept in industry, followed by a thorough review of DNN applications in predictive maintenance. The proposed work excellently combines the advantages of both DTL and DT technology, as explored and suggested by the relevant literature and the scientific consensus.

Ref. [42] presents a DT reference model for rotor unbalance diagnosis. Once again, the authors highlight the need to further investigate how to properly construct a DT so that it accurately represents the physical system. The confidence placed in the technology by industry giants such as ANSYS, Oracle, SAP, Siemens and GE is confirmed. One important aspect proposed in this work is a model-updating algorithm, so that the virtual twin can better simulate its real counterpart. Physical systems are subject to wear and tear; a good DT procedure should reflect this, which is useful both in RUL estimation during manufacturing and in real-time condition monitoring.

The work includes a brief description of the physical model and smart sensing, followed by the corresponding DT construction using the three blocks of digital modeling, data analytics and knowledge base. The novel model-updating strategy is then proposed. Finally, the proposed strategy is employed in a rotor simulation combined with a physical measurement using additional masses as rotor imbalances. The results are satisfactory and prove the usefulness of the model update.
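The model-updating idea can be sketched with an assumed linear-in-parameter unbalance response (this is not the algorithm of [42]; the force model, stiffness value and noise level are hypothetical): noisy "measured" vibration amplitudes at several speeds re-estimate the unbalance parameter, so the virtual twin keeps tracking its physical counterpart:

```python
import numpy as np

rng = np.random.default_rng(1)
k = 2.0e6                    # effective stiffness [N/m] (illustrative)
true_me = 5e-4               # physical unbalance m*e [kg*m]
omega = np.linspace(50, 300, 20)                 # rotor speeds [rad/s]

# "Measurements" from the physical twin: amplitude ~ m*e*omega^2 / k,
# with 2% multiplicative sensor noise.
measured = true_me * omega**2 / k * (1 + rng.normal(0, 0.02, omega.size))

# Model update by linear least squares: the amplitude is linear in the
# parameter m*e, so the estimate is a one-line projection.
phi = omega**2 / k
me_est = float(phi @ measured / (phi @ phi))
```

Re-running such a fit whenever fresh measurements arrive is the essence of the "living" model: the parameter drifts with wear, and the virtual twin follows it.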

Ref. [43] illustrates a DT proof of concept in prognostics with low-availability Run-to-Failure (RTF) data. The retrofit solution is readily available and low-cost, in the form of a Raspberry Pi accelerometer mounted on the studied machine (a drill) and feeding data into MATLAB Simulink, where a model is realized. The work provides a brief but concise explanation of Condition-Based Maintenance (CBM) and RUL prediction models. DAQ is performed by the PLC of the drilling machine and the retrofitted Pi accelerometer. The employed prognostic technique is the Exponential Degradation Model, a stochastic approach suited to low data availability. In addition, it is a parametrized model and can be applied to a population of machines, enhancing its diagnostic capability via their comparison. In this work, the DT is used only as a "watchdog agent", meaning a real-time evaluator. The objective is to bring together the concepts of DT and CBM applied to rotating machinery. In the broader context of DT in industry, the work contributes adaptation to retrofitting, simple software-hardware synergy and another proof of concept.
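The Exponential Degradation Model can be illustrated with a minimal sketch: a health indicator grows as θ·exp(βt), the parameters are fitted in log space, and the RUL follows from extrapolating to a failure threshold. All constants below are illustrative assumptions, not values from [43]:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 100, 50)            # operating hours observed so far
theta, beta = 1.0, 0.03                # "true" degradation parameters
h = theta * np.exp(beta * t) * (1 + rng.normal(0, 0.01, t.size))

# Fit log(h) = log(theta) + beta * t by linear regression, then
# extrapolate to the failure threshold for a RUL estimate.
beta_fit, log_theta_fit = np.polyfit(t, np.log(h), 1)
threshold = 50.0                        # failure level of the indicator
t_fail = (np.log(threshold) - log_theta_fit) / beta_fit
rul = t_fail - t[-1]                    # remaining useful life [hours]
```

Because the fit is parametric, the same (θ, β) pair can be compared across a population of similar machines, which is precisely the capability the work highlights.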

The authors of [44] provide a thorough analysis of RUL estimation and how this approach can be integrated into DTs. The three pillars of PM are:


All of them conform to the application of the DT in industry. In addition, in line with the expansion of DT capabilities under the exponential growth of computational power and sensor precision in the scope of Industry 4.0, the authors suggest the "simultaneous consideration of economic and stochastic dependence aiming at determining the optimal trade-off between reducing the RUL of components and decreasing maintenance set-up costs". The caveat of conventional RUL methodologies is mentioned again, referring to their dependence on historical data. The DT is the state-of-the-art approach to closing this gap. The authors conclude that, due to increasing complexity, RUL estimations should be made at the component level. An optimal DT should be able to conform to the required fidelity, with the known trade-off between fidelity and computational time. Furthermore, this work also classifies the components into three categories, useful in making the approach modular, illustrated in Figure 4:


**Figure 4.** Example of a model bearing component in the three different "box" iterations. (**a**) Black Box; (**b**) Grey Box; (**c**) White Box. Black boxes are unknown to the model and can draw their solution from several sources. Grey boxes are models describing the mechanics and nature of the component, using theoretical data to complete them. White boxes are fully known to the user, customized for the application, and simulate the component with near-perfect accuracy. Analysis derived from [45]; FEM model in (**c**) courtesy of [46].
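The grey-box idea in particular can be made concrete: the form of the model is known physics, while one parameter is completed from data. In the sketch below, the ball-bearing life law L10 = (C/P)^3 (in millions of revolutions) is taken as the known structure, and the dynamic load rating C is fitted from synthetic observed lives; the numbers are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(3)
true_C = 20.0e3                         # dynamic load rating [N] (illustrative)
P = np.array([4e3, 6e3, 8e3, 10e3])     # applied radial loads [N]

# "Observed" lives with 5% scatter, in millions of revolutions.
L10_obs = (true_C / P) ** 3 * (1 + rng.normal(0, 0.05, P.size))

# Grey-box completion: L10^(1/3) = C / P  =>  C = P * L10^(1/3);
# average the per-test estimates to fill in the unknown parameter.
C_est = float(np.mean(P * L10_obs ** (1.0 / 3.0)))
```

A black box would instead map load to life with no assumed form (e.g., a lookup table or a learned model), while a white box would resolve the contact mechanics directly, as in the FEM panel of Figure 4c.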

The proposed approach is separated into four stages, common in the DT literature:


The software used in this endeavor is OpenModelica, owing to its alignment with the DT undertaking. A case study is presented, calculating the RUL of a six-axis robotic structure used for welding tasks. The authors conclude that the results are satisfactory, while mentioning the methodology's caveats, and that this process can be part of a more generic PM framework.

An important takeaway from this work is the suggestion that DTs in industry and literature will become exponentially better as more and more actors propose, design, build and evaluate them. Industry giants show an interest in this undertaking, while the current trend of Industry 4.0 provides the perfect ground and timing for such an approach, facilitating and encouraging this effort.

Ref. [47] presents the challenges of developing a DT for RES generation. The author confirms the essentiality of DTs in optimizing the design and reliability of energy systems. In addition, the literature consensus that DTs still lack a serious and comprehensive strategy is further elaborated. DFIGs with Power Electronic converters may have solved the controllability issue, but they come with additional impacts on the functionality, lifetime and reliability of the system, since they introduce additional interactions. The author suggests that optimizing manufacturing and maintenance in these systems is of paramount importance, since any improvement will reflect on energy production and reliability, the greatest issue of this century according to many experts. The DT, with its many benefits, is a prime candidate as a solution.

The author presents an exhaustive study and realization of a large DFIG digital twin, accompanied by a sophisticated test bench. The process handles conventional EM CM challenges such as converter-fed operation and the accompanying harmonics. This issue is of paramount importance, since the magnetic flux can provide a plethora of information about the machine, and accurate modeling and calculation facilitate the process. The undertaking combines the important aspects of DT, namely multi-physical system modeling and data acquisition and handling. Conventional simulation models have been developed, but the combined model is in a reduced state space, making it impossible to calculate all parameters with high enough precision. The approach combines NNs, FEM and simulation modeling.

The proposed DT introduces the novel approach of stochastically modifying internal machine properties, which are traditionally very challenging to compute or model. Modeling is separated into four levels:


The author takes the DT approach a step further by tackling the modification problem with intelligent algorithms, i.e., optimizing the DT with its own capabilities. This is an exemplary usage of the next generation DT approach. Finally, the comprehensive multi-physical model is converted into a "true" DT by combining internal calculations with real test-bench data to estimate the real behavior of the machine. It is important to note that this process requires training, since EM deterioration is not a spontaneous process but a gradual one, and a proper DT should reflect it. According to this work and the literature consensus, this is the biggest challenge in realizing the novel DT approach. In conclusion, the DT should embody a thorough understanding of system characteristics. This approach is one of the most extensive yet.

Ref. [48] proposes the interesting idea of creating a DT for one of the fundamental machines in industry, the Intelligent Machine Tool (IMT). This intelligence originates from the notion that the new IMT is "no longer limited to the operation of machining" but includes "features of multifunctionality, integration, intellectualization and environmental friendliness". These machines should follow the Industry 4.0 protocols to realize the faster creation of better, cheaper products while facilitating the product DT manifestation via appropriate measurements stemming from the IMT itself. This work follows the next generation DT paradigm in creating the IMT DT, meaning the task separation in physical, virtual and connection layers. The authors provide an on-point review of the DT in the industry of interest and explain the tackled challenges.

Previous work on the IMT DT has focused on theoretical design or on data analysis alone. This work provides a clear design paradigm. The main takeaways are the focus on the HMI and on the mapping, since the IMT is an actuator for the creation of another product. IMT DT data falls into the category of Big Data, which is expected but also confirmed by the work. Finally, the authors present two experiments which confirm the great degree of optimization obtained from this endeavor. The IMT DT is a key component in connecting the manufacturing and operation processes through the data it handles, and thus should be a focus of the next generation DT paradigm.

Ref. [49] is one of the first EM-related works encountered in this search. It proposes a precomputed FEM model originating from the machine geometry, fed with online measurements, which is the natural approach to an EM DT according to the state-of-the-art. The main benefit over non-DT methods is the consideration of difficult-to-compute and/or speculated EM quantities such as local flux, bar current and torque, in addition to asymmetries. This work also comments on the ambiguity of the DT in literature.

A summary of this work follows. Industry uses dq models of machines due to their real-time simulation capability, which stems from approximating quantities. Real-time monitoring requires a model which contemporary CPUs can compute faster than the real machine operates. FEM models typically offer the highest fidelity, in a trade-off with high computational times. The authors considered two hybrid (with FEM) approaches to combine high accuracy with low computational cost. The two models are Magnetic Equivalent Circuits (MEC) and lumped circuit models. The resulting model combines MEC with FEM and is named Combination of Finite Element with Coupled Circuits (CFE-CC). Preliminary results show a close resemblance to the real twin, making this model a prime candidate for the virtual twin. The authors present three strong points:


State-space reduction is achieved by using a 2D approach in the center of the machine, while rotor skewing and coil ends are tackled by altering the computed induction matrices. DAQ is guided by the necessary time step (and thus the choice of CPU) as discussed previously, while physical quantities are measured by common techniques, namely an encoder and a PCB. The entirety of the virtual layer is realized using the SimPowerSystems library of MATLAB Simulink. Finally, the work compares the proposed DT with real machine measurements, confirming its feasibility, and specifically its accuracy and computational efficiency.
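The real-time constraint driving the choice of dq-type models can be demonstrated with a deliberately simplified sketch: the stator dynamics reduced to an R-L branch in the synchronous frame, stepped with explicit Euler faster than wall-clock time. This is a toy under stated assumptions, not the CFE-CC model of [49]; all parameter values are illustrative:

```python
import numpy as np
import time

# Stator branch in the synchronous dq frame (illustrative parameters):
# v = R*i + L*di/dt + w*L*J*i, with J the 90-degree rotation.
R, L, w = 0.5, 0.05, 2 * np.pi * 50    # ohm, H, rad/s
dt = 1e-4                              # solver time step [s]
v = np.array([100.0, 0.0])             # applied dq voltage [V]
i = np.zeros(2)

def step(i, v):
    # di/dt = (v - R*i - w*L*J*i) / L, explicit Euler
    cross = w * L * np.array([-i[1], i[0]])
    return i + dt * (v - R * i - cross) / L

t0 = time.perf_counter()
for _ in range(10000):                 # simulate 1 s of machine time
    i = step(i, v)
elapsed = time.perf_counter() - t0     # wall-clock time spent
i_mag = float(np.hypot(i[0], i[1]))    # steady-state current magnitude [A]
```

One second of machine time simulates in well under a second of wall-clock time, which is the margin a real-time DT needs in order to leave headroom for the costlier FEM-derived corrections.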

Ref. [50] offers an in-depth analysis of a DT component model, namely an improvement on the well-established Γ-equivalent circuit of an Induction Machine. The main focus lies in improving the loss distribution between rotor and stator to more closely resemble the results of a FEM model while using an equivalent circuit. Results show that the developed model provides a good alternative to FEM (and thus computationally expensive) modeling and can be used as a component of an IM DT without dramatically increasing its cost. The work continues with the model proposal, which is then fitted to the FEM model in 33 different steady states. While dynamic comparison is difficult, owing to the requirement of integrating the solver with the control software, the proposed model is meant to be used in dynamic simulation. The comparison between the proposal and the FEM model is performed at a state not used in the fitting, so as to evaluate the model's generalization. Results are promising, and this model should be thoroughly considered for coupling iron losses to the thermal analysis inside a complete DT of an IM.
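The fitting step can be sketched as follows. This is not the circuit of [50]: iron loss is represented here by a single shunt resistance R_fe with P = V²/R_fe, fitted by least squares to synthetic "FEM" loss values at several steady states, with one held-out state mirroring the paper's validation idea; all values are assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
V = np.linspace(200, 400, 10)           # airgap voltages [V], steady states
true_Rfe = 800.0                        # "FEM-consistent" loss resistance [ohm]

# Synthetic FEM loss values with 3% scatter: P_fe = V^2 / R_fe.
P_fem = V**2 / true_Rfe * (1 + rng.normal(0, 0.03, V.size))

# Least squares on the conductance G = 1/R_fe (P is linear in G).
x = V**2
G = float(x @ P_fem / (x @ x))
Rfe_fit = 1.0 / G

# Held-out check at a state not used in fitting, as in the paper.
V_test = 350.0
rel_err = abs(V_test**2 / Rfe_fit - V_test**2 / true_Rfe) / (V_test**2 / true_Rfe)
```

Fitting the conductance rather than the resistance keeps the problem linear; the same pattern extends to the multiple loss branches a refined Γ circuit would carry.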

The authors of [51] follow the state-of-the-art DT paradigms discussed to present case studies of energy conversion systems. The five-dimensional model is accepted, while the importance of the "living" model is highlighted. A concise explanation of the concept is presented, followed by relevant tools and process flow. The first main part of the work discusses promising applications for the DT, namely: industrial robotics and virtual testbeds in manufacturing; EV design, CM and control; wind turbine PM; and finally telescopes. These applications are prime candidates for DTs and, apart from telescopes, are in the spotlight. The authors provide a brief but concise explanation to be tied to the next section, modeling methods. The total product lifecycle is discussed, with pertinent work concerning each phase, namely modeling and optimization, energy conversion, maintenance and service, diagnostics, and finally control. Finally, implementation examples are given, concluding with an excellent Strengths-Weaknesses-Opportunities-Threats (SWOT) analysis of the DT application.

### **3. Discussion**

Reviewed work is discussed in terms of adherence to the proposed definitions, starting from the basic Model—Shadow—Twin ambiguity. Results are summarized in Table 2. Many authors acknowledge these differing capabilities of digital representation but refer to, e.g., virtual sensors (namely DSs) as DTs. Proper naming of each work is expected to guide readers to the exact scope of the proposed technology, thus speeding up research.


**Table 2.** Characterization of reviewed work regarding data flow automation definitions.



The authors of [28] provide numerous DT definitions in their introduction. While DT capabilities are reflected, many of these definitions are ambiguous in terms of the differing characterization proposed here. In this work's introduction, we mention that "Modeling methodology is not and should not be limited in the scope of this categorization". A DS is not inferior to a DT; it simply provides a different service. Ref. [29] acknowledges that their proposed DT is model-based, reinforcing the notion that researchers discern the different applications of each proposed characterization. It is only a naming matter.

Incorporating DTFs in PM encourages the usage of the DS, namely a living model reflecting the changes in the system in real time without the necessity of back-transfer of information. Models are not effective in PM except as subsystems, since they do not represent a complete diagnostic technique. DTs are of course encouraged but not strictly needed. The works characterized as DTs provide great examples of automatic correction by the virtual twin. A classification summary regarding the number of each framework is given in Figure 5.

**Figure 5.** Population of each classification of DTF reviewed in this work.

The expectation that most of the proposed work would be categorized as DS is borne out, since the DS is the most useful framework for depicting in-use state-of-the-art diagnostic techniques. Researchers are now essentially merging the concepts of PM and DT into one. This, however, is not a strict requirement or correlation. The different aspects of each subcategory are discussed in Table 3.


**Table 3.** Primary applications according to the different aspects of each classification.

### *3.1. Is This Classification Useful for Literature?*

One problem with this approach is whether the categorization enhances or hinders research efforts. As of now, the primary obstacle to broader DT adoption and evolution is its ambiguity in the literature. The proposed classification resolves one aspect and guides researchers to pertinent work. However, we foresee greater confusion should most researchers choose not to adopt it, or should it introduce further ambiguity in classifying relevant work. Furthermore, one could argue that all relevant work is essentially a digital twin concept, and that this classification introduces unnecessary complexity. Finally, one could form the impression that, since each classification encapsulates the previous one, a DM is lesser than a DS and a DS lesser than a DT, and thus shy away from characterizing their work as a DS in favor of a DT. We argue that this concern perfectly encapsulates the "multiscale" pillar: if a DS is perfectly appropriate for the objective, a DT would only add unnecessary complexity.
