Article

Transforming Manufacturing Quality Management with Cognitive Twins: A Data-Driven, Predictive Approach to Real-Time Optimization of Quality

by Asif Ullah 1, Muhammad Younas 2 and Mohd Shahneel Saharudin 2,*
1 Faculty of Mechanical Engineering, Ghulam Ishaq Khan Institute of Engineering and Technology, Topi, Swabi 23640, Pakistan
2 School of Computing and Engineering Technology, Robert Gordon University, Garthdee Road, Aberdeen AB10 7QB, UK
* Author to whom correspondence should be addressed.
J. Manuf. Mater. Process. 2025, 9(3), 79; https://doi.org/10.3390/jmmp9030079
Submission received: 12 December 2024 / Revised: 17 February 2025 / Accepted: 22 February 2025 / Published: 28 February 2025
(This article belongs to the Special Issue Smart Manufacturing in the Era of Industry 4.0)

Abstract: In the ever-changing world of modern manufacturing, maintaining product quality is of great importance, yet extremely difficult because of the complexity and dynamism of current production paradigms. Quality is still largely measured reactively, through periodic inspections and manual assessments. Traditional quality management systems (QMSs) built on these reactive measures are often inefficient because of their higher operational cost and delayed defect detection and mitigation. This paper introduces a novel cognitive twin (CT) framework, the next evolutionary step of the digital twin (DT), designed to advance quality management in flexible manufacturing systems (FMSs) through real-time, data-driven, and predictive optimization. The proposed framework uses four data types, namely feedstock quality (Qf), machine degradation (Qm), product processing quality (Qp), and quality inspection (Qi). By utilizing machine learning algorithms, the cognitive twin continuously monitors and analyzes real-time data and optimizes these quality components. This enables proactive decision making through an augmented reality (AR) interface that provides real-time visual insights and alerts to the operators. Thorough experimentation was conducted on the aforementioned FMS, and the experiments revealed that the proposed cognitive twin outperforms conventional QMSs by a wide margin. The cognitive twin achieved a 2% improvement in total quality scores, a 60% decrease in defects per unit (DPU), and a sharp 40% decrease in scrap rate. Furthermore, the overall equipment efficiency (OEE) increased to 93–96%: on average, OEE increased by 11.8%, from 82% to 93%, and the scrap rate decreased by 33.3%, from 60% to 40%. These results showcase the effectiveness of cognitive twin quality management via minimal wastage, continuous quality improvement, and enhanced operational efficiency in the smart manufacturing paradigm. This research contributes to the field of Industry 4.0 by providing a comprehensive, scalable, and adaptive quality management solution, paving the way for further advancements in intelligent manufacturing systems.

1. Introduction

In the age of the fourth industrial revolution, maintaining quality has become a core strategy for manufacturers. Consistent quality increases throughput, reduces waste, minimizes downtime, and supports the rising trend of customization [1]. However, owing to the increasing complexity brought by the fourth industrial revolution, traditional quality management (TQM) frameworks struggle to sustain quality improvement, even though they historically guided and achieved it. These frameworks rely on static guidelines, periodic inspections, and defect detection after manufacturing, resulting in extra scrap and inefficient use of resources. Such practices erode profit margins and greatly hurt customer satisfaction. These reactive approaches call for more proactive, data-driven solutions that can monitor and improve manufacturing quality in real time.
In recent decades, Industry 4.0 has brought technologies such as the industrial internet of things (IIoT), advanced sensor networks, artificial intelligence (AI), and cloud computing. These technologies have led the world toward more intelligent and interconnected manufacturing systems. The same technologies are used to collect and analyze high-resolution, high-volume data, especially on machine performance, feedstock properties, environmental conditions, and product characteristics. Building on these data streams, the “Digital Twin (DT)” has emerged. A DT, in essence, is a virtual representation of a physical asset or system, continuously updated with real-world data, enabling simulations, diagnostics, and performance optimizations [2]. Although a DT offers a valuable static or semi-dynamic model of the manufacturing process, it lacks predictive, adaptive, and cognitive abilities, as noted in foundational works such as those by Grieves [2,3] and Tao et al. [4,5,6,7,8]. These capabilities are necessary to truly anticipate challenges and respond effectively in real time. While some researchers have incorporated AI to extend DT capabilities, they often rely on pre-trained models that lack adaptive or self-learning capabilities.
In response to these limitations, researchers and practitioners are exploring the next step beyond the DT, the “Cognitive Twin (CT)”. A CT integrates machine learning algorithms, predictive analytics, real-time sensing, and adaptive feedback into the DT [9]. There is a growing body of literature that explicitly identifies the cognitive twin as the next level of evolution of the DT, as [10,11,12,13,14] have indicated.
This ultimately provides a system with the capacity to learn and respond to changing conditions. A cognitive twin recognizes shifts in the system’s operating parameters even before defects materialize. This transforms the previously static or reactive nature of quality management into a more proactive approach that continuously optimizes quality. Three different classes of DT, according to the assistance provided through augmented reality (AR) [15], are shown in Figure 1. Figure 1 also displays the readiness level of all the functions across three dimensions, namely the virtual twin, hybrid twin, and cognitive twin.
This manuscript proposes a comprehensive framework for implementing a cognitive twin in a flexible manufacturing system (FMS) environment. By utilizing real-time sensor data, advanced predictive algorithms, and augmented reality interfaces for operator decision support, persistent quality improvements and operational efficiency can be achieved. In comparison with conventional manufacturing techniques, the cognitive twin presents a new dimension of quality management that is ongoing, dynamic, and data-driven. It anticipates problems before they arise and optimizes production in real time while reducing waste, downtime, and associated costs.
In the following sections, we delve into the theoretical underpinnings, mathematical modeling, and methodological details of the cognitive twin framework. We also explore how this approach compares to and outperforms traditional quality management techniques, discussing specific examples and experimental results gleaned from its application in an FMS with three prototypical machine tools.

2. Literature Review

The integration of advanced technologies such as augmented reality (AR), DTs, and machine learning into the manufacturing paradigm has substantially enhanced quality management systems (QMSs). This literature review organizes key studies that explore these integrations, highlighting their contributions and methodologies and identifying the gaps that our research aims to address.

2.1. Augmented Reality in Quality Control

Ho et al. [16] worked on augmented reality in conjunction with manufacturing for quality control 4.0. They explained various AR application categories for quality control, such as virtual lean tools, AR-assisted metrology, and AR-based solutions for inline quality control.
Yoo et al. [17] studied AR applications in commerce. The study implements an information system success model in order to investigate the perceived quality of AR technologies in mobile shopping. The areas of focus were information quality and visual quality in relation to consumer satisfaction.
Alves et al. [18] studied AR for industrial quality assurance, particularly on shop floors. The authors proposed an AR-based quality control system that overlays virtual information onto a video stream, providing real-time feedback without the distraction caused by conventional camera recordings and photographs. Szajna et al. [19] worked on AR glasses and supporting algorithms to create a human machine interface (HMI) for the measurement and inspection process.

2.2. Digital Twin and Cognitive Enhancements

The study in [15] explores the integration of augmented reality and DTs, highlighting their potential to transform human-centric industries through high-level human machine interfaces and smart manufacturing. Franciosa et al. [20] worked on quality enhancement through a closed-loop in-process (CLIP) approach. The authors used deep learning and computer-aided engineering techniques to enhance the DT of a laser welding process for aluminum doors.
D’Amico et al. [9] used an ontology approach to review cognitive DTs in terms of the technologies used, applications, and limitations, specifically in the context of maintenance. Zheng et al. [21] explored the emergence of the cognitive variety of DT, along with its challenges and opportunities. Cognitive digital twins (CDTs) can change the landscape of manufacturing through enhanced intelligence and lifecycle management of complex industrial systems. The authors conclude that CDTs are essential for Industry 4.0.
Zhu et al. [22] utilized a process simulation model that captures the production status and quality data and processes through a modified genetic algorithm (GA) and a bidirectional gated recurrent unit (bi-GRU) coupled with an attention mechanism (AM). This methodology is termed digital-twin-driven (DTD) quality control.
Zheng et al. [23] introduced a multi-agent architecture comprising a material, production process, product function/feature, product quality model (MPFQ-Model) for quality management and guidance. Johansen et al. [24] showcased the results of the COGNITWIN project under Spire 2050.
Tao et al. [25] leveraged the concept of DT to enable real-time adjustments and personalization for the enhancement of cognition in robots for rehabilitation. Tao et al. used DT and cameras to capture the stimuli of the patients and then adjust the response of the robot.
The authors in [26] studied the impact of high-mix, low-volume production on manufacturing flexibility within the sustainability paradigm. The study explores optimization parameters such as energy consumption and machine scrap percentage through a multi-criteria optimization method.

2.3. Asset Tracking and Optimization Through Digital Twin

Khan et al. [27] also presented concepts of quality and manufacturing philosophies through virtual manufacturing. The authors in [28] worked on a framework through which they integrated various components of a flexible manufacturing system (FMS) using the internet of things (IoT). Hu et al. [29] proposed a DT solution for building information modeling (BIM). For the measurements, the authors used LiDAR-based 3D mobile mapping through the IoT. Samir et al. [30] used a DT to improve the visibility of information on the job shop floor. Through this visibility and asset tracking, various optimizations can be achieved.

2.4. Machine Learning and Predictive Analytics in Manufacturing

Liu et al. [31] worked on improving the traceability and dynamic control of processing quality. The authors used a Bayesian network model to relate quality-influencing factors for fault identification. They demonstrated the methodology on a diesel engine connecting rod and showcased their findings.
The authors in [32] developed a decision-making DT for manufacturing environments. The model makes use of knowledge graphs and deep learning to offer insights into decision making. Using the model along with ontological context, various types of outcomes were produced, such as predictive analytics to suggest decision-making options.
Fei et al. [4] proposed a framework through which big data can be connected with smart manufacturing. The goal was to have the ability to collect, store, process, and optimize a large set of data from smart manufacturing assets.

2.5. Flexible Manufacturing Systems and Optimization Techniques

Wang et al. [1] proposed a framework that combines data on feedstock quality, machine degradation, product processing quality, and quality inspection status for a flexible manufacturing network (FMN). The framework is referred to as the operation risk assessment framework. Bagherian et al. [33] used the best worst method (BWM), a multicriteria decision-making (MCDM) technique, for the performance ranking of FMSs. Their research reveals that productivity, flexibility, and, most importantly, quality are the influential factors for the performance of FMSs.
Bozzi et al. [34] attempted to solve the resource utilization issue through optimized scheduling for FMSs. They proposed a mixed integer linear programming algorithm for this objective. Daniyan et al. [35] worked on an FMS for railcar assemblies. The experimentation included a highly automated FMS with automatic storage capability and IoT-enabled hardware along with radio frequency identification (RFID) technology. The experiment focused on the conveyor’s performance, and the model predicted excellent results.
Howard et al. [36] experimented with an automated scheduling tool and showed promising results in improving schedule quality. Vincent et al. [37] published an article showcasing an integration framework for a cyber physical production system (CPPS). This framework consisted of a database architecture and a data model, with the capacity to allow multiple agents to work coherently and independently. The framework covered data collection, data storage, data processing, and the insights from the data that support decision making. Wang et al. [38] worked on a novel hybrid data-on-tag approach using radio frequency identification (RFID) for a multi-agent-based decentralized control system. This system was explicitly developed for flexible manufacturing.
The work in [39] discusses FMSs in response to work in process (WIP) in the context of supply chain management. The study focuses on cost optimization that reduces the total supply chain costs through the optimal production rate. Prior to this proposal, the authors designed and developed a framework of digital twin (DT) to control FMSs [40]. Through this framework, the authors showcased an increase in overall equipment efficiency (OEE). The authors also designed an indoor localization system to enhance the reconfigurability of the FMS [41].

2.6. Research Gaps

In light of the literature review, there have been notable advances in smart manufacturing and quality control/assurance. However, a significant gap remains in current quality management systems. Traditional methods rely on reactive measures, post-production inspections, and static process optimization. Reactive measures mean that quality adjustments are made only after production, which results in inefficiencies, higher operational costs, and slower adaptation to changes in production conditions. Traditional QMSs are primarily reactive, relying on end-of-line inspections, periodic audits, and batch testing to detect and correct quality issues. This delay causes increased waste, resource consumption, and downtime. In dynamic manufacturing systems, where real-time adjustments are critical, these methods are ill-suited to addressing quality problems proactively before they escalate.
Predictive techniques for QMSs do exist; however, they are limited, still lead to delays in production, and lack the adaptive capabilities needed to steer production toward higher quality.
There is also a gap in the integration of QMSs with smart manufacturing. Traditionally, production metrics are not integrated with quality metrics, and this disconnect results in low resource utilization, extra waste, and similar losses. The lack of integration between systems also limits continuous improvement, because the holistic nature of production processes is not realized [1,23].

2.7. Proposition of Cognitive Twin Quality Framework

The research gaps identified above underscore the need for a more integrated, real-time, and adaptive quality management system. The cognitive twin framework addresses these gaps by incorporating continuous monitoring, machine-learning-driven predictions, and real-time feedback loops, thereby providing a more agile, data-driven approach to quality control. By leveraging real-time data from sensors, predictive models, and augmented reality interfaces, this framework allows for proactive quality assurance, dynamic optimization, and continuous improvement that traditional methods are unable to achieve.

3. System Design and Analysis

3.1. System Overview

This section explores the proposed quality optimization technique through mathematical and logical structures. The section extends from data collection to data processing and, finally, to quality optimization. The proposed methodology can be used in conjunction with any manufacturing system, but, for this paper, we selected an FMS cell. This cell has three processing machines, i.e., a lathe, a milling machine, and an engraving machine. The FMS cell also has a conveyor belt that is used for the transportation of raw materials, work in process, and finished products. The workpieces are handled by a robotic arm inside the flexible manufacturing cell. Figure 2 shows the logical overview of the proposed framework. According to the figure, the system starts by collecting data. The data are preprocessed and then normalized. The data are then integrated into the CT framework. The CT framework suggests the optimal quality settings, and the identified error is fed back to the actual system. The visualization is sent to the AR display as well.

3.2. Data Structure

The framework was inspired by Liu et al. [31]. The methodology revolves around four types of data. The data types are as follows.
  • Feedstock quality, Qf
  • Machine degradation, Qm
  • Product processing quality, Qp
  • Quality inspection, Qi
These indicators were selected based on their critical role in influencing the overall product quality. They also reflect the key aspects of manufacturing processes. Feedstock quality (Qf) originates from raw material quality that directly impacts the initial conditions of manufacturing. It sets the baseline for the final product quality.
Machine degradation (Qm) and machine health influence precision, reliability, and operation efficiency. It is crucial to monitor performance over time. Product processing quality (Qp) captures the dynamic interaction between materials and machine conditions during production. Finally, quality inspection (Qi) provides a final assessment of the conformance to standards. It ensures that the product meets customer requirements.
The rationale behind the selection of these indicators is based on the impact on the total quality score (Qtotal) and feasibility of real-time monitoring. The aim is to combine these four factors into a single comprehensive quality assessment model. This uses the cognitive twin framework, which is discussed in later sections.

3.3. Mathematical Modelling

  • Feedstock quality, Qf(t): a function of time that represents the quality of raw materials fed into the manufacturing process. It is measured based on material properties like purity, consistency, and other relevant factors. The data are directly fed into the model from the distributor.
    Q_f(t) = f_{feedstock}(t)
  • Machine degradation, Qm(t): a function of time that reflects how the performance of the machine and its cutting tool deteriorates because of wear and tear, usage time, and other factors. This could be measured based on vibration, temperature, or other performance metrics.
    Q_m(t) = g_{machine}(t)
  • Product processing quality, Qp(t): this represents the quality of the product being produced at any given time. It is dependent on both the feedstock and the machine’s condition. A typical model could consider this as a function of feedstock quality and machine degradation.
    Q_p(t) = h(Q_f(t), Q_m(t), t)
  • Quality inspection, Qi(t): a quality score or metric obtained through inspection after the product is processed. This is an output that can be compared with the product’s expected quality standards. It can also depend on environmental factors and inspection methods.
    Q_i(t) = i_{inspection}(Q_p(t), t)

3.4. Real-Time Quality Monitoring Model

To model a real-time quality monitoring algorithm, all the above data factors were integrated. The resulting parameter was termed quality score, Qtotal(t). By including various weights in relation to each corresponding factor, we obtained a more complex multivariate function.
The following is the final version of the total quality model:
Q_{total}(t) = w_1 \cdot Q_f(t) + w_2 \cdot Q_m(t) + w_3 \cdot Q_p(t) + w_4 \cdot Q_i(t)
where w1, w2, w3, and w4 are the corresponding weights assigned to each of the components, representing their relative importance to the overall quality.
The weights are learned and optimized by machine learning techniques, depending on the machine’s historical performance data. This model assumes that each of these factors contributes linearly to the total quality score.
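As an illustration, the weighted aggregation can be computed as in the minimal Python sketch below; the sample component values and weights are placeholders, not data from this study.

```python
import numpy as np

# Hypothetical quality components at time t, each normalized to [0, 1]
Q = np.array([0.92, 0.88, 0.90, 0.95])   # [Q_f, Q_m, Q_p, Q_i]

# Corresponding weights w1..w4 (sum to 1 so the score stays normalized)
w = np.array([0.3, 0.2, 0.3, 0.2])

Q_total = float(np.dot(w, Q))            # w1*Qf + w2*Qm + w3*Qp + w4*Qi
print(f"Q_total(t) = {Q_total:.3f}")
```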

3.5. Time-Dependent Evolution

To introduce the dynamic system model, we leveraged the time-dependent quality components. To illustrate this, let us use differential equations to represent the evolution of quality over time:
\frac{dQ_f(t)}{dt} = f_{feedstock}(t)
\frac{dQ_m(t)}{dt} = g_{machine}(t)
\frac{dQ_p(t)}{dt} = h(Q_f(t), Q_m(t), t)
\frac{dQ_i(t)}{dt} = i_{inspection}(Q_p(t), t)
Through these equations, we can calculate the rate of change in quality metrics over time.
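To make the time-dependent formulation concrete, the sketch below integrates the four quality components with a simple forward-Euler step. The rate functions are illustrative stand-ins, since the paper does not give f, g, h, and i in closed form; all numbers are assumptions.

```python
# Forward-Euler integration of the quality-evolution equations (illustrative only).
dt, steps = 1.0, 50
Qf, Qm, Qp, Qi = 0.95, 0.95, 0.95, 0.95   # initial normalized quality levels

# Assumed rate functions; the paper leaves f, g, h, i unspecified.
def f_feedstock(t):      return -0.0005                              # slow feedstock drift
def g_machine(t):        return -0.0010                              # gradual machine degradation
def h(qf, qm, t):        return 0.5 * (f_feedstock(t) + g_machine(t))
def i_inspection(qp, t): return -0.0008                              # drift in inspection score

for k in range(steps):
    t = k * dt
    Qf += f_feedstock(t) * dt
    Qm += g_machine(t) * dt
    Qp += h(Qf, Qm, t) * dt
    Qi += i_inspection(Qp, t) * dt

print(f"After {steps} cycles: Qf={Qf:.3f}, Qm={Qm:.3f}, Qp={Qp:.3f}, Qi={Qi:.3f}")
```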

3.6. Cognitive Twin Integration

The integration of the aforementioned quality models into the flexible manufacturing system (FMS) transforms the conventional DT into an advanced cognitive twin (CT). The CT goes beyond a virtual representation based only on historical data: it incorporates real-time data analytics, continuous learning, and predictive capabilities.
The proposed CT framework makes use of a machine learning (ML) algorithm. The cognitive twin simulates and predicts the impact of real-time data on overall product quality. This predictive capability enables the system to foresee quality deviations before they manifest as defects, in turn allowing for proactive interventions such as adjustments to feedstock quality or early maintenance actions to mitigate potential downtime. Furthermore, the cognitive twin facilitates feedback loops in which the predicted outcomes are compared with real-time performance; the resulting discrepancies are used to update the system’s knowledge base, ensuring that predictions become more accurate over time.
The machine learning models are the key components of the cognitive twin. The models learn from large amounts of historical data and then adapt their performance based on both the past and the present. Continuous refinement of the estimated weights (w1, w2, w3, w4) of each quality parameter enhances the system’s ability to simulate the effects of quality changes with higher accuracy. A detailed description of the CT framework is provided in the Discussion Section.
Let Q̂total(t) represent the estimated quality score predicted by the cognitive twin:
\hat{Q}_{total}(t) = \hat{w}_1 \cdot Q_f(t) + \hat{w}_2 \cdot Q_m(t) + \hat{w}_3 \cdot Q_p(t) + \hat{w}_4 \cdot Q_i(t)
By comparing the real-time data Qtotal(t) to the predicted Q̂total(t), we can find errors and optimize the real-time manufacturing process.
\delta Q(t) = Q_{total}(t) - \hat{Q}_{total}(t)

3.7. Augmented Reality Interface

We integrated an augmented reality (AR) module in order to further enhance the capabilities of the cognitive twin. An interactive visualization of the system’s performance was provided by this module. The operators and floor personnel were provided with optimum awareness and visibility of the system. This was achieved by overlaying the data of the predictive quality over the physical environment using a head-mounted display. The AR empowers more proactive responses to quality deviations for optimal system performance.
The operators and floor personnel were assisted through AR displays in the monitoring as well as the adjusting of the manufacturing in real time.
  • Machine health heatmap: the heatmap essentially shows the system’s health by showing the quality components of the cognitive twin. Through this, the operators are directed toward potential issues.
  • Deviation alert system: when deviations are detected through AR and cognitive twin, it sends a signal to communicate it to the floor personnel.
  • Product quality predictions: the cognitive twin, through the AR interface, visualizes the predictive quality data. Through this, the predicted outcome for the product can be seen, assuming the conditions do not change.
This augmented decision-making environment not only enhances the operator’s ability to manage complex processes, but also drives a more efficient and responsive manufacturing system. By seamlessly integrating real-time predictions and system health data with the physical workspace, the AR interface ensures that the operators are equipped with actionable insights, leading to reduced errors, an optimized performance, and improved product quality.

3.8. Optimization of the System Through Feedback in the Cognitive Twin Model

The concept of feedback is essential to any dynamic system. The cognitive twin enhances the DT by integrating advanced machine learning, real-time data analytics, and an adaptive feedback system. Through this feedback, the FMS can improve its overall quality by adjusting the key factors that control quality.
There are four basic steps in the feedback loop. The first is real-time data collection and simulation: a continuous stream of data is collected from the machines and raw materials and sent to the cognitive twin framework. The second step processes these data through the quality model, where the weights w1, w2, w3, and w4 are used to calculate the total quality score. The same weights are used to estimate the quality that will result if the system remains in its current condition.
The third step is the calculation of the error between the calculated quality score and the estimated quality score. If a significant deviation is detected, the system identifies it and suggests a decision to adjust the current conditions. This is achieved through the adjustment of the weights of the model. The fourth step is the identification or tracing of the error. If the feedstock quality, Qf, falls short of the mark, adjusting its weight will prompt the operator to look into the key characteristics of the raw material and into how much improvement is required to obtain the desired gain. The same logic applies to the machine degradation quality, Qm. The mathematical and logical formulation of the weight adjustment is as follows.
w_i(t+1) = w_i(t) - \eta \cdot \frac{\partial \delta Q(t)}{\partial w_i(t)}
where
  • w_i(t) is the weight at time t for factor i (feedstock, machine degradation, processing quality, and inspection).
  • \eta is the learning rate.
  • \frac{\partial \delta Q(t)}{\partial w_i(t)} is the gradient of the error with respect to the weight.
This algorithm minimizes the error over time by adjusting the weights of the factors contributing to quality.
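One way to realize this update rule is a gradient-descent step on the squared error; the sketch below assumes that formulation, and the learning rate, weight values, and measurements are placeholders rather than the authors’ actual settings.

```python
import numpy as np

eta = 0.01                                   # learning rate η (assumed value)
w_hat = np.array([0.32, 0.18, 0.30, 0.20])   # current estimated weights ŵ1..ŵ4
Q = np.array([0.92, 0.88, 0.90, 0.95])       # measured [Qf, Qm, Qp, Qi] at time t
Q_total = 0.915                              # calculated quality score at time t

# Prediction and error δQ(t) = Q_total(t) − Q̂_total(t)
Q_hat = float(np.dot(w_hat, Q))
delta_Q = Q_total - Q_hat

# Gradient-descent step on the squared error: ∂(δQ²)/∂ŵ_i = −2·δQ·Q_i
grad = -2.0 * delta_Q * Q
w_hat = w_hat - eta * grad
w_hat = w_hat / w_hat.sum()                  # optional: renormalize so the weights sum to 1

print(f"error = {delta_Q:+.4f}, updated weights = {np.round(w_hat, 3)}")
```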

4. Implementation

The implementation phase of the cognitive twin system is basically the integration of the hardware with the software components. This process aims to establish an intelligent and adaptive environment for managing quality within the flexible manufacturing system (FMS). This section outlines the design, architecture, setup, and practical aspects of deploying the cognitive twin system.

4.1. System Architecture

The cognitive twin system was built around a hybrid architecture that integrates physical machines, microcontrollers, sensor networks, a central processing unit (PC), machine learning models, Unity for digital twinning, and an AR interface. Unity3D is used in many research studies [42]. This structure facilitates continuous monitoring, real-time predictions, and adaptive adjustments to the manufacturing process.

4.1.1. Physical Components and Microcontroller Integration

The system included several critical physical components within the FMS:
  • Cutting machines (lathe, milling, engraving): each of these machines was equipped with sensors such as thermocouples for temperature monitoring, accelerometers for vibration analysis, and piezoelectric sensors for tool wear detection. The data collected by the sensors were sent to the microcontrollers embedded in the framework for constant monitoring of health parameters.
  • Conveyor system: the conveyor was used for the transportation of raw materials and finished products in order to ensure the smooth flow of the workpieces.
  • Robotic arm: a robotic arm was used for material handling, primarily. The robot handled the transfer between the moving raw stock and the machines. It also transferred work-in-process (WIP) items between the machines and returned the finished products to the conveyor.
Data acquisition, device control, and communication with the central processing unit were achieved by serial communications. Figure 3 shows the physical FMS.

4.1.2. Central Data Processing Unit and Communication

The quality data from the FMS components were transmitted to a central processing unit (PC). In the CPU, the data were aggregated, analyzed, and used for predictive modeling. The microcontrollers of each FMS component communicated with the central PC through serial communication at a baud rate of 115,200 bits per second (bps). Serial communication ensures that real-time data flow uninterrupted between devices.
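As an illustration, a hypothetical reading loop on the PC side could use the pySerial library as below; the port name, one-JSON-object-per-line framing, and field names are assumptions, not the authors’ protocol (only the 115,200 bps rate comes from the text).

```python
import json
import serial   # pySerial

# Port name and message framing are assumptions; only the baud rate is from the paper.
port = serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=1.0)

while True:
    line = port.readline().decode("utf-8", errors="ignore").strip()
    if not line:
        continue
    # Assume each microcontroller sends one JSON object per line, e.g.
    # {"machine": "lathe", "temp_C": 41.2, "vib_hz": 2.8, "tool_wear": 0.12}
    try:
        sample = json.loads(line)
    except json.JSONDecodeError:
        continue          # skip malformed frames
    print(sample["machine"], sample["temp_C"], sample["vib_hz"])
```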

4.1.3. Unity 3D for Digital Twinning and Augmented Reality

The processing platform for the DT and augmented reality was Unity 3D. The DT simulated the behavior of the system and served as the base engine for real-time analysis and predictions. The Unity engine was used to visualize the system in a 3D environment, as many researchers have done [42]. This offers the operators and the floor personnel a virtual representation of the machines, conveyor, and robotic arm, as can be seen in Figure 4.
The augmented reality interface was also supported by Unity 3D. The AR glasses displayed real-time data about machine health. They also displayed product quality along with predictive insights. The AR enabled the visualization as well as the interaction with the system through immersive digital overlays. These overlays help with decision making and immediate feedback mechanisms.

4.2. Data Integration

This section details the processing of real-time data. The integration of the data has a very significant effect on the functioning of the cognitive twin system. The system collects high-resolution data from various components of the FMS and processes them in real time for the predictions of machine health and product quality.
The system continuously collects real-time data from various sensors embedded in the machines and other FMS components.

4.2.1. Feedstock Quality Measurement

For the measurement of feedstock quality, two compact machines were used to measure purity and consistency. The experiment materials were primarily of two classes: the first was aluminum, and the second was hard foam. Since the FMS was composed of several machine prototypes and was used only to mimic an industrial setup, it was limited to processing softer materials.
(a) Optical Emission Spectrometer (OES):
  • It essentially measures the elemental composition of the material to ensure purity. Through the OES, the purity was measured and logged in the CPU of the CT framework.
(b) X-Ray Computed Tomography Scanner:
  • It basically analyzes the density and internal consistency of materials. It was also used to detect internal defects and ensure dimensional accuracy.

4.2.2. Machine Sensors

Thermocouples were used for temperature measurements, monitoring heat generation to ensure operational stability. An accelerometer was used to measure vibrations. Piezoelectric sensors were used to measure tool wear by detecting the acoustic emissions generated during machine operations.

4.2.3. Data Streams Integration

Data stream integration is central to the functionality of the CT system. The process began with data acquisition from multiple sources such as feedstock parameter measurements, sensors, machine controllers, and, afterward, quality inspection systems. After acquisition, the data were normalized, and the noise was reduced in the CPU. The data streams were aggregated into a single framework that synchronizes the machine states, material properties, and product quality.
Timestamping was applied to each data point during acquisition. This ensured that all sensor data corresponded to the same operational event or timeframe. These data were fed into the CT framework’s machine learning algorithm.
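A minimal pandas sketch of this timestamping and synchronization step is given below; the function name, column names, and the 1 s resampling window are assumptions, not the authors’ implementation.

```python
import pandas as pd

def tag_and_merge(streams):
    """Timestamp-align numeric sensor streams from several FMS components.

    `streams` maps a source name (e.g., "lathe", "oes") to a DataFrame that
    carries a 'timestamp' column recorded at acquisition time plus numeric
    sensor columns.
    """
    aligned = []
    for name, df in streams.items():
        df = df.copy()
        df["timestamp"] = pd.to_datetime(df["timestamp"])
        # Resample each stream onto a common 1-second grid (assumed window).
        df = df.set_index("timestamp").resample("1s").mean()
        df.columns = [f"{name}_{col}" for col in df.columns]
        aligned.append(df)
    # Join on the shared time index so every operational event lines up,
    # then interpolate short gaps left by slower sensors.
    return pd.concat(aligned, axis=1).interpolate(limit_direction="both")
```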

4.2.4. Product Quality

Post-processing quality checks are important because they ensure that the final product aligns with the predefined standards. The CT system employs quality gauges to measure critical dimensions, such as contour shape and surface finish. This inspection is usually performed by a CMM.
These sensor data were transmitted via microcontrollers and serial communication to the central processing unit, where they were processed for analysis.

4.3. Data Preprocessing and Aggregation

The data collected from the sensors are usually raw and noisy, so they were preprocessed to remove noise and inconsistencies. The cleaning of the data was achieved through normalization and standardization. The subsequent phase consisted of combining the data: data streams from the various machines were aggregated to provide a holistic view of the system’s health and performance.
Once the data were preprocessed and aggregated, they were fed into the cognitive twin’s simulation engine. This engine uses machine learning models to predict future system behaviors and adjust the parameters to optimize the performance. In addition to the simulation engine, the preprocessed data were also fed into the AR interface, allowing operators to see real-time feedback on machine status and product quality.

4.4. Machine Learning Models

As described earlier, cognitive twin is the next evolution of the DT and is achieved by integrating the DT with machine learning models. Hence, the machine learning models are of great significance for the continuous quality improvement and to adjust system parameters in real time. These models use both historical and real-time data to optimize the manufacturing process and ensure high-quality outputs.
Mainly, the historical data on machine performance and product quality are used to optimize the weights (w1, w2, w3, w4). These weights in turn influence the total quality score (Qtotal). For this optimization, the random forest model was selected from several candidate ML models because its optimization was highly efficient. After each manufacturing run, the model added the new data to the dataset and processed the new information along with the old data. This was done to make the system even more capable of reading and controlling quality.
The historical data utilized in this study consisted of records from about 50 production cycles collected over a period of time. The cycles included details on machining parameters such as cutting speed, feed rate, and depth of cut, and on quality metrics such as DPU, OEE, and scrap rate.
For the first run, the system was calibrated using historical data to ensure the accuracy of the machine learning models in predicting product quality and machine health. The operators were trained to use the AR interface, interpret quality predictions, and adjust machine settings to maintain optimal quality standards. Figure 5 shows the AR view of the cognitive twin; it illustrates the AR device processing dual-angle data streams for each machine, enhancing depth and spatial perception for the operators. The differences in the stereoscopic presentation of each machine help the operators distinguish between processes and prioritize tasks effectively.

4.4.1. Model Training and Hyperparameters

The choice of random forest (RF) was justified by its ability to handle non-linear relationships and to mitigate overfitting through ensemble learning. The RF model also provides interpretable feature importance scores, which are critical for proactive quality management.
Raw sensor data from all the sensors were normalized using min–max scaling to ensure uniformity. The model was trained on 70% of the 50 historical production cycles, with 30% reserved for testing. RF was implemented via Scikit-Learn, and the model utilized 100 decision trees with a maximum depth of 10. It was optimized through five-fold grid search cross-validation. A feature importance analysis revealed product quality as the most influential factor (35%), followed by feedstock quality (30%). This aligns perfectly with the framework’s focus on dynamic quality optimization. Validation metrics demonstrated a strong performance, with a 94.2% test accuracy, a 0.93 precision, a 0.95 recall, and an RMSE of 1.8, underscoring the model’s reliability in real-time weight adjustments.
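A sketch of this training setup with scikit-learn is shown below, under the stated 70/30 split, min–max scaling, 100 trees, maximum depth of 10, and five-fold grid search. The file name, feature and target columns, and the regression framing are assumptions; the paper does not fully specify how the target was encoded.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler

# Hypothetical column names for the ~50 historical production cycles.
data = pd.read_csv("historical_cycles.csv")
X = data[["Qf", "Qm", "Qp", "Qi", "cutting_speed", "feed_rate", "depth_of_cut"]]
y = data["Q_total"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.30, random_state=42)

pipe = Pipeline([
    ("scale", MinMaxScaler()),                       # min–max normalization of raw features
    ("rf", RandomForestRegressor(random_state=42)),
])

# Five-fold grid search around the reported hyperparameters.
grid = GridSearchCV(
    pipe,
    param_grid={"rf__n_estimators": [100], "rf__max_depth": [5, 10, 15]},
    cv=5,
    scoring="neg_root_mean_squared_error",
)
grid.fit(X_train, y_train)

print("Test score (negative RMSE):", grid.score(X_test, y_test))
best_rf = grid.best_estimator_.named_steps["rf"]
print("Feature importances:", dict(zip(X.columns, best_rf.feature_importances_)))
```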
The RF training and testing metrics are shown in Table 1. The RF algorithm showed a 96.5% training accuracy and a 94.2% test accuracy. The feature importance ranking was as follows: Qp > Qf > Qi > Qm. These metrics directly correlate with the reported 2% improvement in total quality scores and the 60% reduction in defects per unit (DPU).

4.4.2. Evaluation and Continuous Improvement

After deployment, the cognitive twin system undergoes continuous evaluation to ensure its predictive capabilities are accurate and reliable. The system constantly compares real-time data with historical benchmarks and refines its models to improve prediction accuracy. Over time, the machine learning models evolve, adapting to changes in the FMS environment, machine performance, and feedstock properties. This continuous learning process drives ongoing improvements in both product quality and system efficiency.
The insights generated by the system are fed back into its knowledge base. This allows for iterative improvements in the decision-making process. The system’s continuous learning mechanism is a key driver of operational excellence. It studies feedstock’s consistency and adjusts for it. It studies machine health dynamics and adjusts for them. It is always enhancing process efficiency. The system’s machine learning algorithm plays a pivotal role as new data are ingested; it becomes smarter with every run.

4.5. Comprehensive Integration and Deployment of the Cognitive Twin in the FMS

The integration of the CT framework into the FMS was conducted through a structured workflow. It encompassed the hardware setup, data acquisition and processing, and machine learning model deployment.
The hardware setup involved equipping the three primary workstations, namely lathe, milling, and engraving. To ensure smooth material handling, a robotic arm was programmed for precise movement trajectories, and a conveyor belt facilitated the continuous flow of workpieces.
The workstations were equipped with advanced sensors to monitor real-time parameters. Thermocouples were installed for temperature measurements, accelerometers for analyzing vibration levels, and piezoelectric sensors for detecting tool wear. These sensors transmitted their data to microcontrollers, such as Arduino Mega, and the data were communicated to a central processing unit (CPU) through serial connections.
The CT framework drives on data, with the high-resolution sensor data being continuously collected across the FMS. To maintain consistency and reliability, preprocessing steps such as normalization and noise reduction were applied to the raw data using Python-based algorithms. Each data point was tagged with a timestamp, ensuring synchronization across machines and alignment with the real-time operational timeline.
The quality framework is adapted from [1]. Wang et al.’s framework consisted of four major components, namely feedstock quality, machine degradation quality, product processing quality, and inspection quality. A random forest algorithm was employed as the core of the CT framework [43]. The algorithm facilitates a multi-stage evaluation of quality metrics [44]. The algorithm first assessed Qf using data from the OES and the X-ray CT scanner. The calculated Qf was weighted based on feature importance derived from the random forest model.
In the second stage, Qm was determined using sensor data. It was also weighted based on its importance score from the random forest (RF) model. The algorithm combined the input data to provide a comprehensive assessment for machine health.
The third stage evaluated Qp by integrating Qf and Qm with operational parameters. The RF model dynamically adjusted weights to reflect the relative influence of material quality and machine performance on the production process.
Finally, the Qi was calculated using Qp and data from a coordinate measuring machine. The weights for the Qi components were also optimized using feature importance scores.
The integration was tested on a prototype FMS that included three machines operating under two scenarios. In the first scenario, static conditions were maintained, with fixed machine parameters and no real-time adjustments. In the second scenario, the cognitive twin framework was activated, enabling dynamic monitoring and process optimization. The Unity-3D-based digital twin environment provided operators with a virtual representation of the system, including real-time machine statuses and predictive visualizations. Through this enhanced implementation, the system demonstrated its capability to predict and mitigate quality deviations effectively, ensuring a more robust and adaptive manufacturing process.

5. Results

This study was conducted using an FMS prototype. The prototype included a lathe, a milling machine, and an engraving machine. The machines were connected through a conveyor belt and a robotic arm for material handling. Only this FMS was used for the experiments, and only two scenarios were compared, i.e., the CT framework and traditional quality management. The system processed two types of materials, as stated above: aluminum, representing high-precision industrial applications, and hard foam, which is suitable for prototyping. The aim of this study was to evaluate the performance of the CT framework.
The machining operations performed included turning on the lathe, milling of flat surfaces, and engraving of various slots and designs. The lathe operations involved cylindrical workpieces with a diameter of 500 mm and a length of 1000 mm. The workpieces were machined on the lathe to a tolerance of ±0.2 mm. Milling operations were carried out on 1000 mm × 1000 mm aluminum blocks. To ensure the validity of the experiment, blocks with different degrees of surface roughness were selected. Engraving involved creating patterns and slots with a precision of 0.5 mm. The work cycles began with raw material loading, followed by machining in sequence and a final inspection using a coordinate measuring machine (CMM).
The system was tested under varying conditions, with real-time data from sensors mounted on the machines, to check the validity and efficiency of the proposed methodology. The procedure was as follows.
  • Data collection in real time for:
    • Feedstock quality (Qf) → Properties of materials (soft foam, aluminum).
    • Machine degradation (Qm) → Performance data from lathe, milling, and engraving (temperature, vibration, time).
    • Product processing quality (Qp) → Output quality of the products after processing, assessed by the system.
    • Quality inspection (Qi) → Inspection data gathered from sensors after product processing.
  • Real-time monitoring: data from the above were fed into the cognitive twin system for simulation and prediction of the future state of quality as well as for adjusting w1, w2, w3, and w4 during system operation.
  • Quality calculations: the total quality score Qtotal(t) was calculated using the model.
  • Optimization and feedback: the predicted score Q̂total(t) was compared with the real-time quality score Qtotal(t). Errors were calculated and then used to optimize the process in real time.
  • Augmented reality interface: the AR module visualized the system’s performance in real-time, showing heatmaps of machine health, alerts for quality deviations, and product quality.

5.1. Feedstock Quality (Qf)

Feedstock quality is very critical because the quality of the raw material is most important for the later processes and the final product. This was calculated from key material characteristics such as purity, consistency, or uniformity and from the type of raw material. As stated above, there were two types of materials in this experiment. Soft foam was used as a malleable material for prototyping and demonstrating the system’s ability to handle non-metal material. Aluminum was used as a standard industrial material due to its prevalence in manufacturing. It is in high demand for precision manufacturing.
Consistency refers to the uniformity of the material, usually assessed through density, and it ensures reliable and repeatable machining outcomes. It was measured through the X-ray computed tomography scanner available in the laboratory and quantified on a scale from 0 to 1, where a higher value represents fewer defects. This normalized scale was chosen to accommodate the range of manufacturable materials in industry.
Purity refers to the proportion of desired elements in the feedstock material. It was measured through an optical emission spectrometer (OES) to determine the elemental composition, expressed in percentages. For the sake of normalization, it was quantified in the 0 to 1 range.
The feedstock quality consistently influenced the total quality, which is imperative for the achievement of high quality. In Table 2, weights were used to calculate Qf. The weights were derived from historical data on the machines’ past performance and from expert opinion. Qf was calculated as a weighted average, with a weight of 0.6 for purity and 0.4 for consistency.

5.2. Machine Degradation Quality (Qm)

Machine degradation quality was assessed through several factors. The factors consisted of the vibration of the machine, temperature, and tool wear. These factors are common for all three types of machines, i.e., lathe, mill, and engrave. As stated above, a thermocouple was used to measure the temperature. The sensor was installed near the cutting tool to assess thermal stability during machining. Excessive temperatures can affect the tool life and machining precision. Table 3 showcases the machine cutting parameters.
Vibration consists of oscillatory movements of the machine or, specifically, of the cutting tool during operation. It was measured with an accelerometer in terms of frequency (Hz) and amplitude. Higher vibration levels typically indicate wear, instability, or suboptimal machining conditions.
Tool wear was calculated through a piezoelectric sensor that detects high frequency acoustic waves. These waves were produced during cutting operations. As the tool wears, the cutting processes generate higher acoustic energy due to increased friction, chip formation irregularities, and tool material adhesion. The sensor converts these acoustic waves into an electrical signal for analysis.
The Qm of each individual machine was calculated through a weighted average. The weights were 0.7 for vibration, 0.25 for temperature, and the remaining 0.05 for tool wear. To finalize the Qm for the whole FMS, we took the average of the calculated Qm values from all three machines. Table 4 contains data from the three machines, i.e., lathe, mill, and engraver, on the FMS.
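For clarity, the two-level aggregation can be sketched as below; the per-machine health scores are illustrative placeholders (already normalized to 0–1), while the weights follow the text.

```python
# Normalized 0–1 health scores per machine: (vibration, temperature, tool wear).
machines = {
    "lathe":   (0.93, 0.90, 0.88),
    "mill":    (0.91, 0.94, 0.90),
    "engrave": (0.95, 0.92, 0.91),
}
w_vib, w_temp, w_wear = 0.70, 0.25, 0.05   # weights stated in the text

Qm_per_machine = {
    name: w_vib * vib + w_temp * temp + w_wear * wear
    for name, (vib, temp, wear) in machines.items()
}
Qm_fms = sum(Qm_per_machine.values()) / len(Qm_per_machine)   # FMS-level Qm
print(Qm_per_machine, round(Qm_fms, 3))
```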

5.3. Product Processing Quality (Qp)

These data were influenced by the feedstock quality (Qf) and machine degradation quality (Qm). They can be influenced by environmental and other factors as well, but, for the sake of simplicity, we focused only on the mentioned quality data. These data reflect the interplay of Qf and Qm and are influenced by machine performance along with material quality. Here, the weight of Qf was 0.65 and that of Qm was 0.35. The system-level Qp was simply the average across the machines.
Table 5 shows the calculations of the Qp through individual quality data from Qf and Qm of the corresponding machines, resulting in the Qp of the system. The experimental data show that machine degradation was still in the acceptable range.

5.4. Inspection Quality (Qi)

Inspection is always conducted at the end of the manufacturing cycle to verify that the finished product is within the acceptable limits of the design. The coordinate measuring machine was used for this purpose, and dimensional accuracy was ensured. These data capture the deviation from the expected standard. In the conventional manufacturing paradigm, only this type of quality is considered for the final finished product, giving no foresight into, or traceability of, the actual quality degradation. Qi was correlated with the dimensional accuracy against the machine cutting parameters. However, this was achieved through the previously calculated Qp, which was used to ensure consistency throughout the model.
As previously calculated, Qp was used to calculate the Qi of the corresponding machine via the inspection coefficient, resulting in a very comprehensive data model. The calculations can be seen from Table 6.

5.5. Total Calculated Quality (Qtotal)

The total quality of the FMS model was calculated by combining feedstock quality, machine degradation quality, product processing quality, and inspection results, using predefined weights that represent the relative importance of each factor. Total quality is mainly influenced by feedstock quality and product processing quality. Through this calculated value, the FMS’s performance can be assessed and improved.
The assumed weight was as follows:
  • w1 = 0.3 → Feedstock quality.
  • w2 = 0.2 → Machine degradation.
  • w3 = 0.3 → Product processing quality.
  • w4 = 0.2 → Quality inspection.
In Table 7, we can observe that the greater the feedstock quality, the higher the quality of the products that are manufactured. The same goes for product processing quality as well.

5.6. Estimated Total Quality (Q̂total)

By using the cognitive twin model, we can estimate the total quality. The equation is rewritten below. The cognitive model is a prediction model that simulates the behavior of the flexible manufacturing system based on historical data and real-time measurements.
\hat{Q}_{total}(t) = \hat{w}_1 \cdot Q_f(t) + \hat{w}_2 \cdot Q_m(t) + \hat{w}_3 \cdot Q_p(t) + \hat{w}_4 \cdot Q_i(t)
where ŵ1, ŵ2, ŵ3, and ŵ4 are the estimated weights, which are derived from historical data and machine learning models and represent the optimized influence of each factor.
These estimated weights were learned by machine learning algorithms, such as linear regression and neural networks, trained on historical data.
  • ŵ1 = 0.32 → Feedstock quality.
  • ŵ2 = 0.18 → Machine degradation quality.
  • ŵ3 = 0.30 → Product processing quality.
  • ŵ4 = 0.20 → Inspection quality.
These weights differ from those used for the calculated total quality in that the feedstock weight is slightly higher and the machine degradation weight is slightly lower. Table 8 displays the sample data for the estimated Qtotal.
From the table, it can be observed that the estimated qualities were slightly higher than their calculated counterparts because of dynamic optimization. Nevertheless, the values closely aligned with the calculated real-time qualities.

5.7. Relative Error

The error between the actual and estimated qualities was used to assess the cognitive twin model. The results in Table 9 show that, throughout the series, the error was relatively small. This indicates that the estimation model is highly reliable and closely matches the real-time calculations. The acceptable limits for relative errors are usually based on industry standards and application-specific requirements.
A negative error shows that the estimated value was slightly higher than its calculated counterpart, which is normally an indication of an overly optimistic model. Figure 6 shows the comparative chart of the estimated Qtotal and the calculated Qtotal. The trend shows that the estimated Qtotal was higher than the calculated Qtotal, because the model reflects the theoretical, standard Qtotal rather than the real-world value. The error was within the acceptable region. Figure 7 shows the boxplot of the calculated and the estimated Qtotal; the visual confirms that there was only a slight difference between them.
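The relative error can be reproduced as in the short sketch below; the sample values are illustrative (not taken from Table 9), and the error definition (calculated minus estimated, divided by the calculated value) is an assumption consistent with the sign convention described above.

```python
# Relative error between calculated and estimated total quality.
Q_total_calc = [0.902, 0.915, 0.898, 0.921]     # illustrative calculated values
Q_total_est  = [0.910, 0.918, 0.905, 0.924]     # illustrative estimated values

for qc, qe in zip(Q_total_calc, Q_total_est):
    rel_err = (qc - qe) / qc                     # negative when the estimate is higher
    print(f"calculated={qc:.3f}  estimated={qe:.3f}  relative error={rel_err:+.2%}")
```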

5.8. Conventional Quality Management

Traditional quality management usually involves a reactive approach, meaning that changes are suggested only once a product or a batch of products has been manufactured and validated. There are several traditional quality management techniques, including quality control (QC), quality assurance (QA), total quality management (TQM), six sigma, and lean manufacturing. A detailed explanation of each is provided in later sections.
For the sake of simplicity, we used a comprehensive set of techniques to produce a standard set of metrics to calculate quality. These metrics can be used with any of the above quality techniques.
The comparison with our proposed model using conventional data was achieved through the following metrics.
  • Defects per unit (DPU): it represents the number of defects detected per unit produced.
  • Overall equipment efficiency (OEE%): the percentage of the total equipment time that is productive.
  • Downtime (hrs): time when the equipment is not functioning, typically due to maintenance or unexpected failures.
  • Production rate (units/hr): the number of units produced per hour.
  • Scrap rate (%): percentage of items produced which are discarded due to defects.
Table 10 and Table 11 present the practical results of traditional quality management and cognitive twin quality management, respectively. The controlled experiments were performed on the FMS. In Table 10, the data were recorded after machining and quality inspection, following standard practices such as end-of-line inspection and periodic sampling. Each cycle represents a single run of machining and inspection for a batch of products.
In Table 11, real-time monitoring and feedback loops were implemented to continuously optimize machining parameters and predict quality deviations. From the values, it is observable that the cognitive twin outperforms the traditional quality techniques. Figure 8 portrays the detailed visuals showing that the cognitive twin outperforms the traditional quality system.
The data demonstrate that the cognitive twin framework outperforms the conventional system. By using real-time monitoring and dynamic responses via continuous feedback loops, our proposed system ensures high quality. Because of their reactive nature, conventional quality techniques and systems experience more delays and shutdowns when addressing issues, thus creating more fluctuations in production.

5.9. Augmented Reality Results

After applying the cognitive twin framework to the prototype FMS, the augmented reality interface of the system was simulated. Figure 9 displays the overlay information shown to an operator wearing the head-mounted display (HMD). The display detects the milling machine and instantaneously shows the DT of the milling machine along with the quality metrics. As can be seen in the figure, the display indicates the state of the machine, i.e., whether it is off, idle, or performing an activity, and it classifies the process being performed by the machine. The quality metrics, along with their constituent parameters, are shown as well.
The system employed a Microsoft HoloLens 2 for hands-free operation. The interface was built with Unity 3D and the AR Foundation SDK, enabling real-time 3D visualization. Data streams from the cognitive twin’s central processing unit were transmitted via the MQTT protocol, achieving an end-to-end latency of <200 ms to ensure responsiveness. The AR interface featured dynamic heatmaps (color-coded machine health status), auditory/visual alerts for threshold breaches (e.g., vibration > 3.5 Hz), and predictive overlays projecting Qtotal trends. The AR system reduced the operator response time to defects by 40% (from 12 to 7.2 min).
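To make the data path concrete, the sketch below shows how a quality snapshot and a vibration-threshold alert could be published over MQTT for the HMD client to render. The broker address, topic name, and payload schema are hypothetical, and the paho-mqtt 1.x client API is assumed; the cognitive twin's actual message format is not reproduced here.

```python
import json
import time
import paho.mqtt.client as mqtt  # paho-mqtt 1.x client API assumed

BROKER = "192.168.0.10"      # hypothetical broker address on the shop-floor network
TOPIC = "fms/mill/quality"   # hypothetical topic; not taken from the paper

client = mqtt.Client()
client.connect(BROKER, 1883)
client.loop_start()

def publish_quality(qf, qm, qp, qi, vibration_hz):
    """Publish one quality snapshot for the AR head-mounted display to render."""
    payload = {
        "timestamp": time.time(),
        "Qf": qf, "Qm": qm, "Qp": qp, "Qi": qi,
        # flag a threshold breach so the HMD can raise an auditory/visual alert
        "vibration_alert": vibration_hz > 3.5,
    }
    client.publish(TOPIC, json.dumps(payload), qos=1)

publish_quality(qf=90, qm=12.46, qp=60.27, qi=59.69, vibration_hz=3.0)
```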
The AR interface also reduced human error by 25% through real-time alerts, which is consistent with the 33.3% reduction in the scrap rate. A limitation is that the RF model’s computational overhead may scale poorly for larger datasets.
The display can also show a heatmap of the process if the operator requests it. Figure 10 shows the heatmap calculated by the cognitive twin. The correlations in the heatmap are measured using Pearson’s correlation coefficient, showing the effect of each constituent of the quality metrics on the others and giving the operator or floor personnel a holistic view.
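A heatmap of this kind can be generated directly from the logged quality constituents. The following minimal sketch uses the sample values from Table 7 and pandas’ built-in Pearson correlation; the column names and plotting details are illustrative rather than the cognitive twin’s actual implementation.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Quality constituents per cycle; values taken from Table 7 (column names illustrative)
df = pd.DataFrame({
    "Qf":     [90, 92, 93, 94, 95],
    "Qm":     [12.46, 12.70, 13.00, 13.23, 13.50],
    "Qp":     [60.27, 61.42, 62.26, 63.23, 64.33],
    "Qi":     [59.69, 59.65, 59.77, 60.06, 60.39],
    "Qtotal": [49.15, 49.66, 49.92, 50.30, 50.75],
})

# Pairwise Pearson correlation coefficients
corr = df.corr(method="pearson")

fig, ax = plt.subplots()
im = ax.imshow(corr.values, cmap="coolwarm", vmin=-1, vmax=1)
ax.set_xticks(range(len(corr.columns)))
ax.set_xticklabels(corr.columns)
ax.set_yticks(range(len(corr.columns)))
ax.set_yticklabels(corr.columns)
fig.colorbar(im, ax=ax, label="Pearson r")
ax.set_title("Correlation heatmap of quality constituents")
plt.show()
```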

5.10. Hierarchical Quality Assessment

This section provides a detailed breakdown of the CT framework’s hierarchical quality assessment and validation. First, the feedstock quality was calculated through data from OES and CT scanners, using the RF model. Machine degradation quality was determined using sensor data. Product processing quality integrated feedstock and machine degradation qualities through weighted calculations, while the inspection quality used product processing quality and data from CMM. A controlled experiment compared traditional and CT conditions, demonstrating a 40% reduction in scrap rates and an 11.8% increase in overall equipment efficiency.
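The hierarchical roll-up described above can be sketched as a small set of functions, one per quality layer. The weights, coefficients, and linear forms below are purely illustrative assumptions; the cognitive twin’s actual models (e.g., the RF-based feedstock estimator) are more involved and are not reproduced here.

```python
# Minimal sketch of the hierarchical quality roll-up, assuming simple weighted
# relations between the layers. All coefficients are illustrative only.

def feedstock_quality(purity_pct, consistency, w_purity=0.6, w_consistency=0.4):
    """Qf from raw-material purity (%) and consistency index (assumed weights)."""
    return w_purity * purity_pct + w_consistency * consistency * 100

def machine_degradation(vib_hz, temp_c, tool_wear):
    """Qm per machine from vibration, temperature, and tool wear (assumed coefficients)."""
    return 0.5 * vib_hz + 0.25 * temp_c + 10.0 * tool_wear

def processing_quality(qf, qm_machines, w_f=0.6, w_m=0.4):
    """Qp integrates feedstock quality with the average machine degradation."""
    qm_avg = sum(qm_machines) / len(qm_machines)
    return w_f * qf + w_m * qm_avg

def inspection_quality(qp, k):
    """Qi scales Qp by the inspection coefficient k derived from CMM checks."""
    return k * qp

qf = feedstock_quality(purity_pct=95, consistency=0.85)
qm = [machine_degradation(3.0, 33, 0.05),   # lathe
      machine_degradation(2.8, 34, 0.06),   # mill
      machine_degradation(2.5, 32, 0.04)]   # engraver
qp = processing_quality(qf, qm)
qi = inspection_quality(qp, k=0.98)
print(f"Qf={qf:.2f}  Qm(avg)={sum(qm) / len(qm):.2f}  Qp={qp:.2f}  Qi={qi:.2f}")
```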

5.11. Details of Machined Components

The experimental validation involved machining two distinct categories of components to assess the cognitive twin framework’s performance in diverse scenarios. Industrial-grade aluminum parts were selected to simulate high-precision manufacturing applications, while hard foam prototypes were used to validate rapid prototyping.
Cylindrical shafts measuring 500 mm in diameter and 1000 mm in length were machined to a tolerance of ±0.2 mm, featuring turned grooves (2 mm depth, 5 mm width). Surface roughness was maintained at ≤1 µm to ensure compliance with industrial standards. Flat milling blocks (1000 mm × 1000 mm × 50 mm) were processed to achieve tight geometric tolerances, including a flatness of ≤0.1 mm and a perpendicularity of ≤0.05 mm, with pocket milling (10 mm depth) and threaded holes (M12 × 1.75) for functional testing. Figure 11 portrays various design samples of the workpieces used in the experiments.
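As an illustration of how the final CMM inspection step can be checked automatically against these specifications, the sketch below evaluates one part’s measurements against the stated tolerances. The measured values are hypothetical; only the nominal dimensions and limits quoted above are taken from the experiment.

```python
# Minimal sketch of an automated compliance check against the stated tolerances.
# Measured values are hypothetical CMM / roughness readings for one part.

SPECS = {
    "shaft_diameter_mm":    {"nominal": 500.0, "tol": 0.2},   # ±0.2 mm
    "surface_roughness_um": {"max": 1.0},                     # Ra ≤ 1 µm
    "flatness_mm":          {"max": 0.1},                     # ≤ 0.1 mm
    "perpendicularity_mm":  {"max": 0.05},                    # ≤ 0.05 mm
}

measured = {
    "shaft_diameter_mm": 500.08,
    "surface_roughness_um": 0.80,
    "flatness_mm": 0.06,
    "perpendicularity_mm": 0.04,
}

def within_spec(feature, value):
    spec = SPECS[feature]
    if "nominal" in spec:                      # bilateral tolerance
        return abs(value - spec["nominal"]) <= spec["tol"]
    return value <= spec["max"]                # upper limit only

for feature, value in measured.items():
    status = "PASS" if within_spec(feature, value) else "FAIL"
    print(f"{feature:22s} {value:8.3f}  {status}")
```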
The production workflow began with robotic-arm-assisted loading of raw aluminum billets or foam blocks. Sequential machining steps included lathe operations for roughing and finishing, milling for planar and pocket features, and engraving for textures or markings. Post-processing stages incorporated deburring and cleaning to remove residual material. A final inspection was conducted through automated CMM measurements to ensure compliance with quality standards. This structured workflow highlighted the system’s capability to dynamically adjust parameters in real time, driven by the cognitive twin’s predictive analytics and feedback loops.

6. Discussion

This study compares traditional quality management (TQM) with the cognitive twin framework (CTF). The findings reveal that the CTF represents a significant advance over TQM across multiple dimensions of quality management, enabled by recent developments in technology and data analytics. Table 12 compares TQM with the CTF along these dimensions.
One of the most prominent differences lies in the nature of control mechanisms. Traditional quality management operates reactively, addressing defects post-production through end-of-line inspections and random sampling. This approach inherently delays defect detection and correction, potentially allowing defective products to reach the market. In contrast, the cognitive twin framework employs a proactive control mechanism, leveraging real-time monitoring and intervention to predict and prevent defects before they occur. This shift from reactive to proactive control not only minimizes waste and inefficiencies, but also enhances product reliability and customer satisfaction.
The literature review highlighted that traditional quality management lacks adaptability. Our findings corroborate these observations by demonstrating that the CTF offers superior flexibility, continuous monitoring, and predictive capabilities. However, while some researchers argue that traditional methods remain effective in stable production environments, our analysis suggests that the real-time data utilization and adaptive adjustments provided by the CTF offer tangible benefits even in such settings, challenging the notion that TQM is sufficient for all production contexts.
Industries adopting the CTF can expect substantial improvements in efficiency, cost management, and product quality. Real-time, data-driven decisions enable immediate responses to anomalies, reducing downtime and operational costs. Additionally, the ability to customize and personalize production based on real-time customer feedback enhances customer satisfaction and shortens the time to market.
This comparison contributes to the evolving body of knowledge in quality management by integrating concepts from IoT, machine learning, and real-time analytics. It proposes a framework where quality management is not just a post-production activity, but an integral, continuous process embedded within the production lifecycle.
This study has several limitations. First, it was confined to a prototype FMS, which may limit the generalizability of the findings to other sectors. Second, the reliance on continuous data streams and advanced analytics necessitates robust infrastructure and cybersecurity measures, neither of which was the focus of this study. Additionally, the initial transition from TQM to the CTF may pose challenges related to technology adoption, employee training, and integration with existing systems, potentially impacting the scalability of CTF implementations.
The comparative analysis underscores the transformative potential of the cognitive twin framework in revolutionizing quality management practices. By shifting from reactive to proactive control, enabling continuous real-time monitoring, and fostering adaptive and predictive capabilities, the CTF offers a more efficient, accurate, and customer-centric approach to quality management. As industries continue to evolve in the digital age, the adoption of cognitive twin frameworks is poised to become a cornerstone of advanced manufacturing and production excellence.

7. Conclusions

The integration of cognitive twin technology into flexible manufacturing systems (FMSs) marks a significant advancement in real-time, data-driven quality management. The research shows that a DT coupled with machine learning and augmented reality can improve production quality. By leveraging data analytics and simulations, the system optimizes key quality parameters such as feedstock quality, machine health, and product inspection, leading to reduced defects, minimized downtime, and increased product consistency.
The key findings include a 2% improvement in overall quality. The overall equipment efficiency (OEE) reached 93–96%, compared with 80–84% in conventional systems; on average, it increased by 11.8%, from 82% to 93%. The scrap rate decreased by 33.3%, from 60% to 40%. Furthermore, augmented reality (AR) gives operators additional clarity and real-time decision-making capability, and the cognitive twin’s ability to continuously learn from real-time data allows it to adapt and improve over time.
In conclusion, cognitive twin technology bridges the gap between traditional manufacturing systems and intelligent automation, offering a transformative solution for enhancing FMSs. Future research could focus on refining the machine learning models and on exploring other domains where cognitive twins can be applied, particularly supply chain management and production scheduling.

Author Contributions

Conceptualization, M.S.S., A.U. and M.Y.; methodology, M.S.S. and A.U.; formal analysis, A.U.; investigation, A.U., M.Y. and M.S.S.; data curation, A.U., M.Y. and M.S.S.; writing—original draft preparation, A.U., M.Y. and M.S.S.; writing—review and editing, A.U., M.Y. and M.S.S.; visualization, A.U., M.Y. and M.S.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The data supporting the findings of this study are available from the corresponding author upon reasonable request.

Acknowledgments

The authors would like to acknowledge the Faculty of Mechanical Engineering, Ghulam Ishaq Khan Institute of Engineering and Technology, Topi, Swabi, KpK, Pakistan, and the School of Computing and Engineering Technology, Robert Gordon University, Aberdeen, UK, for their valuable support and resources.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Wang, X.; Ke, Y.; Cai, Z.; Ye, Z. Operation risk assessment of Flexible Manufacturing Networks subject to quality-reliability coupling. Reliab. Eng. Syst. Saf. 2024, 250, 110282. [Google Scholar] [CrossRef]
  2. Grieves, M.W. PLM—Beyond lean manufacturing. Manuf. Eng. 2003, 130, 23. [Google Scholar]
  3. Grieves, M. PLM—Beyond Lean Manufacturing; Michael Grieves Digital Twin Institute. Available online: www.sme.org/manufacturingengineering (accessed on 2 December 2024).
  4. Qi, Q.; Tao, F. Digital Twin and Big Data Towards Smart Manufacturing and Industry 4.0: 360 Degree Comparison. IEEE Access 2018, 6, 3585–3593. [Google Scholar] [CrossRef]
  5. Tao, F.; Zhang, H.; Zhang, C. Advancements and challenges of digital twins in industry. Nat. Comput. Sci. 2024, 4, 169–177. [Google Scholar] [CrossRef] [PubMed]
  6. Ma, X.; Qi, Q.; Tao, F. A Digital Twin–Based Environment-Adaptive Assignment Method for Human–Robot Collaboration. J. Manuf. Sci. Eng. 2014, 146, 1–26. [Google Scholar] [CrossRef]
  7. Zhang, M.; Tao, F.; Nee, A. Digital Twin Enhanced Dynamic Job-Shop Scheduling. J. Manuf. Syst. 2021, 58, 146–156. [Google Scholar] [CrossRef]
  8. Tao, F.; Cheng, J.; Qi, Q.; Zhang, M.; Zhang, H.; Sui, F. Digital twin-driven product design, manufacturing and service with big data. Int. J. Adv. Manuf. Technol. 2018, 94, 3563–3576. [Google Scholar] [CrossRef]
  9. D’amico, R.D.; Erkoyuncu, J.A.; Addepalli, S.; Penver, S. Cognitive Digital Twin: An Approach to Improve the Maintenance Management; Elsevier Ltd.: Amsterdam, The Netherlands, 2022. [Google Scholar] [CrossRef]
  10. Li, Y.; Chen, J.; Hu, Z.; Zhang, H.; Lu, J.; Kiritsis, D. Co-simulation of complex engineered systems enabled by a cognitive twin architecture. Int. J. Prod. Res. 2022, 60, 7588–7609. [Google Scholar] [CrossRef]
  11. D’Amico, R.; Sarkar, A.; Karray, H.; Addepalli, S.; Erkoyuncu, J.A. Detecting failure of a material handling system through a cognitive twin. In IFAC-PapersOnLine; Elsevier: Amsterdam, The Netherlands, 2022; pp. 2725–2730. [Google Scholar] [CrossRef]
  12. Sicard, B.S.; Butler, Q.; Ziada, Y.; Gadsden, S.A. Cognitive dynamic digital twin: Enhancements for digital twin platforms based on human cognition. Big Data V: Learning, Analytics, and Applications. In Proceedings of the SPIE Defense + Commercial Sensing, Orlando, FL, USA, 30 April–5 May 2023; Available online: https://www.spiedigitallibrary.org/conference-proceedings-of-spie/12522/125220B/Cognitive-dynamic-digital-twin--enhancements-for-digital-twin-platforms/10.1117/12.2664017.full?tab=ArticleLink (accessed on 2 December 2024).
  13. Somers, S.; Oltramari, A.; Lebiere, C. Cognitive Twin: A Personal Assistant Embedded in a Cognitive Architecture. In Proceedings of the 18th International Conference on Cognitive Modelling, University Park, PA, USA, 20 July–1 August 2020. [Google Scholar]
  14. Iacono, W.G.; Heath, A.C.; Hewitt, J.K.; Neale, M.C.; Banich, M.T.; Luciana, M.M.; Madden, P.A.; Barch, D.M.; Bjork, J.M. The Utility of Twins in Developmental Cognitive Neuroscience Research: How Twins Strengthen the ABCD Research Design; Elsevier Ltd.: Amsterdam, The Netherlands, 2018. [Google Scholar] [CrossRef]
  15. Yin, Y.; Zheng, P.; Li, C.; Wang, L. A State-of-the-Art Survey on Augmented Reality-Assisted Digital Twin for Futuristic Human-Centric Industry Transformation; Elsevier Ltd.: Amsterdam, The Netherlands, 2023. [Google Scholar] [CrossRef]
  16. Ho, P.T.; Albajez, J.A.; Santolaria, J.; Yagüe-Fabra, J.A. Study of Augmented Reality Based Manufacturing for Further Integration of Quality Control 4.0: A Systematic Literature Review. Appl. Sci. 2022, 12, 1961. [Google Scholar] [CrossRef]
  17. Yoo, J. The effects of perceived quality of augmented reality in mobile commerce-an application of the information systems success model. Informatics 2020, 7, 14. [Google Scholar] [CrossRef]
  18. Alves, J.B.; Marques, B.; Dias, P.; Santos, B.S. Using augmented reality for industrial quality assurance: A shop floor user study. Int. J. Adv. Manuf. Technol. 2021, 115, 105–116. [Google Scholar] [CrossRef]
  19. Szajna, A.; Stryjski, R.; Woźniak, W.; Chamier-Gliszczyński, N.; Królikowski, T. The production quality control process, enhanced with augmented reality glasses and the new generation computing support system. In Procedia Computer Science; Elsevier: Amsterdam, The Netherlands, 2020; pp. 3618–3625. [Google Scholar] [CrossRef]
  20. Franciosa, P.; Sokolov, M.; Sinha, S.; Sun, T.; Ceglarek, D. Deep learning enhanced digital twin for Closed-Loop In-Process quality improvement. CIRP Ann. 2020, 69, 369–372. [Google Scholar] [CrossRef]
  21. Zheng, X.; Lu, J.; Kiritsis, D. The emergence of cognitive digital twin: Vision, challenges and opportunities. Int. J. Prod. Res. 2020, 60, 7610–7632. [Google Scholar] [CrossRef]
  22. Zhu, X.; Ji, Y. A digital twin–driven method for online quality control in process industry. Int. J. Adv. Manuf. Technol. 2022, 119, 3045–3064. [Google Scholar] [CrossRef]
  23. Zheng, X.; Psarommatis, F.; Petrali, P.; Turrin, C.; Lu, J.; Kiritsis, D. A quality-oriented digital twin modelling method for manufacturing processes based on a multi-agent architecture. In Procedia Manufacturing; Elsevier: Amsterdam, The Netherlands, 2020; pp. 309–315. [Google Scholar] [CrossRef]
  24. Johansen, S.T.; Unal, P.; Albayrak, Ö.; Ikonen, E.; Linnestad, K.J.; Jawahery, S.; Srivastava, A.K.; Løvfall, B.T. Hybrid and cognitive digital twins for the process industry. Open Eng. 2023, 13, 20220418. [Google Scholar] [CrossRef]
  25. Tao, K.; Lei, J.; Huang, J. Physical Integrated Digital twin-based Interaction Mechanism of Artificial Intelligence Rehabilitation Robots Combining Visual Cognition and Motion Control. Wirel. Pers. Commun. 2024, 1–16. [Google Scholar] [CrossRef]
  26. Ojstersek, R.; Acko, B.; Buchmeister, B. Simulation study of a flexible manufacturing system regarding sustainability. Int. J. Simul. Model. 2020, 19, 65–76. [Google Scholar] [CrossRef]
  27. Khan, W.A.; Raouf, A.; Cheng, K. Virtual Manufacturing. In Springer Series in Advanced Manufacturing; Springer: London, UK, 2011. [Google Scholar] [CrossRef]
  28. Preethi, V.; Soosairaj, J.; Kandavel, V.P.V. Optimization of Flexible Manufacturing Systems Using IoT. BOHR Int. J. Eng. 2022, 1, 39–43. [Google Scholar] [CrossRef]
  29. Hu, X.; Assaad, R.H. A BIM-enabled digital twin framework for real-time indoor environment monitoring and visualization by integrating autonomous robotics, LiDAR-based 3D mobile mapping, IoT sensing, and indoor positioning technologies. J. Build. Eng. 2024, 86, 108901. [Google Scholar] [CrossRef]
  30. Samir, K.; Maffei, A.; Onori, M.A. Real-Time asset tracking; a starting point for Digital Twin implementation in Manufacturing. Procedia CIRP 2019, 81, 719–723. [Google Scholar] [CrossRef]
  31. Liu, J.; Cao, X.; Zhou, H.; Li, L.; Liu, X.; Zhao, P.; Dong, J. A digital twin-driven approach towards traceability and dynamic control for processing quality. Adv. Eng. Inform. 2021, 50, 101395. [Google Scholar] [CrossRef]
  32. Rožanec, J.M.; Lu, J.; Rupnik, J.; Škrjanc, M.; Mladenić, D.; Fortuna, B.; Zheng, X.; Kiritsis, D. Actionable cognitive twins for decision making in manufacturing. Int. J. Prod. Res. 2022, 60, 452–478. [Google Scholar] [CrossRef]
  33. Bagherian, A.; Chauhan, G.; Srivastav, A.L.; Sharma, R.K. Evaluating the Ranking of Performance Variables in Flexible Manufacturing System through the Best-Worst Method. Designs 2024, 8, 12. [Google Scholar] [CrossRef]
  34. Bozzi, A.; Graffione, S.; Jiménez, J.; Sacile, R.; Zero, E. Reliability Evaluation of Emergent Behaviour in a Flexible Manufacturing Problem. In Proceedings of the 2023 18th Annual System of Systems Engineering Conference, SoSe 2023, Lille, France, 14–16 June 2023; Institute of Electrical and Electronics Engineers Inc.: Piscataway, NJ, USA, 2023. [Google Scholar] [CrossRef]
  35. Daniyan, I.; Mpofu, K.; Ramatsetse, B.; Zeferino, E.; Monzambe, G.; Sekano, E. Design and simulation of a flexible manufacturing system for manufacturing operations of railcar subassemblies. In Procedia Manufacturing; Elsevier: Amsterdam, The Netherlands, 2020; pp. 112–117. [Google Scholar] [CrossRef]
  36. Howard, F.M.; Gao, C.A.; Sankey, C. Implementation of an automated scheduling tool improves schedule quality and resident satisfaction. PLoS ONE 2020, 15, e0236952. [Google Scholar] [CrossRef] [PubMed]
  37. Havard, V.; Sahnoun, M.; Bettayeb, B.; Duval, F.; Baudry, D. Data architecture and model design for Industry 4.0 components integration in cyber-physical production systems. Proc. Inst. Mech. Eng. B J. Eng. Manuf. 2021, 235, 2338–2349. [Google Scholar] [CrossRef]
  38. Wang, C.; Jiang, P.; Ding, K. A hybrid-data-on-tag-enabled decentralized control system for flexible smart workpiece manufacturing shop floors. Proc. Inst. Mech. Eng. C J. Mech. Eng. Sci. 2017, 231, 764–782. [Google Scholar] [CrossRef]
  39. Sarkar, M.; Chung, B.D. Flexible work-in-process production system in supply chain management under quality improvement. Int. J. Prod. Res. 2020, 58, 3821–3838. [Google Scholar] [CrossRef]
  40. Ullah, A.; Younas, M. Development and Application of Digital Twin Control in Flexible Manufacturing Systems. J. Manuf. Mater. Process. 2024, 8, 214. [Google Scholar] [CrossRef]
  41. Ullah, A.; Younas, M.; Saharudin, M.S. Digital Twin Framework Using Real-Time Asset Tracking for Smart Flexible Manufacturing System. Machines 2025, 13, 37. [Google Scholar] [CrossRef]
  42. Zhuang, C.; Miao, T.; Liu, J.; Xiong, H. The connotation of digital twin, and the construction and application method of shop-floor digital twin. Robot. Comput. Integr. Manuf. 2021, 68, 102075. [Google Scholar] [CrossRef]
  43. Myśliwiec, P.; Kubit, A.; Szawara, P. Optimization of 2024-T3 Aluminum Alloy Friction Stir Welding Using Random Forest, XGBoost, and MLP Machine Learning Techniques. Materials 2024, 17, 1452. [Google Scholar] [CrossRef] [PubMed]
  44. Bin, F.; Hosseini, S.; Chen, J.; Samui, P.; Fattahi, H.; Armaghani, D.J. Proposing Optimized Random Forest Models for Predicting Compressive Strength of Geopolymer Composites. Infrastructures 2024, 9, 181. [Google Scholar] [CrossRef]
Figure 1. AR-assisted 3-dimensional digital twin [15].
Figure 2. Data structure and data flow.
Figure 3. Physical flexible manufacturing system.
Figure 4. Digital twin of the flexible manufacturing system.
Figure 5. Augmented reality view of cognitive twins.
Figure 6. Calculated vs. estimated Qtotal.
Figure 7. Boxplot of the calculated vs. estimated values.
Figure 8. Comparative observations between traditional and cognitive twin qualities.
Figure 9. Augmented reality information display.
Figure 10. Heatmap of the cognitive twin constituents.
Figure 11. Samples of workpieces.
Table 1. Training Testing Matrix of the Random Forest Algorithm.
Metric | Training Data | Test Data
Accuracy (%) | 96.5 | 94.2
Precision | 0.95 | 0.93
Recall | 0.97 | 0.95
Feature Importance Ranking | Qp (35%) > Qf (30%) > Qi (20%) > Qm (15%)
Table 2. Sample data on feedstock quality (Qf).
S. No | Material Type | Purity (%) | Consistency (Index) | Feedstock Quality (Qf)
1 | Hard Foam | 95 | 0.85 | 0.92
2 | Aluminum | 88 | 0.75 | 0.81
3 | Hard Foam | 94 | 0.82 | 0.89
4 | Aluminum | 90 | 0.78 | 0.84
5 | Hard Foam | 96 | 0.88 | 0.91
Table 3. Machine cutting parameters.
Cutting Parameter | Measurement Method | Typical Values/Range
Cutting Speed | CNC Machine Input | 50–150 m/min
Feed Rate | CNC Machine Input | 0.05–0.3 mm/rev
Depth of Cut | CNC Machine Input | 0.5–3.0 mm
Table 4. Sample data on machine degradation (Qm).
S. No | Lathe Vib (Hz) | Lathe Temp (°C) | Lathe Tool Wear | Lathe Qm | Mill Vib (Hz) | Mill Temp (°C) | Mill Tool Wear | Mill Qm | Engrave Vib (Hz) | Engrave Temp (°C) | Engrave Tool Wear | Engrave Qm | System Qm
1 | 3.0 | 33 | 0.05 | 10.70 | 2.8 | 34 | 0.06 | 10.60 | 2.5 | 32 | 0.04 | 10.30 | 10.53
2 | 3.2 | 34 | 0.06 | 10.90 | 3.0 | 35 | 0.07 | 11.10 | 2.6 | 33 | 0.05 | 10.50 | 10.83
3 | 3.4 | 36 | 0.08 | 11.20 | 3.2 | 36 | 0.09 | 11.40 | 2.7 | 34 | 0.06 | 10.80 | 11.13
4 | 3.6 | 37 | 0.10 | 11.50 | 3.4 | 37 | 0.11 | 11.80 | 2.8 | 35 | 0.07 | 11.10 | 11.47
5 | 3.8 | 39 | 0.12 | 11.80 | 3.6 | 38 | 0.13 | 12.10 | 2.9 | 36 | 0.08 | 11.40 | 11.77
Table 5. Sample data on processing quality (Qp).
S. No | Lathe Qm | Mill Qm | Engrave Qm | Qf (Feedstock Quality) | Qp (Lathe) | Qp (Mill) | Qp (Engrave) | System Qp
1 | 10.70 | 10.60 | 10.30 | 90 | 60.35 | 60.30 | 60.15 | 60.27
2 | 10.90 | 11.10 | 10.50 | 92 | 61.40 | 61.60 | 61.25 | 61.42
3 | 11.20 | 11.40 | 10.80 | 93 | 62.20 | 62.30 | 62.30 | 62.26
4 | 11.50 | 11.80 | 11.10 | 94 | 63.20 | 63.30 | 63.20 | 63.23
5 | 11.80 | 12.10 | 11.40 | 95 | 64.30 | 64.40 | 64.30 | 64.33
Table 6. Sample data on inspection quality (Qi).
S. No | Qp (Lathe) | Qp (Mill) | Qp (Engrave) | Inspection Coefficient (k) | Qi (Lathe) | Qi (Mill) | Qi (Engrave) | Average Qi
1 | 60.35 | 60.30 | 60.15 | 0.98 | 59.82 | 59.69 | 59.55 | 59.69
2 | 61.40 | 61.60 | 61.25 | 0.97 | 59.50 | 59.80 | 59.65 | 59.65
3 | 62.20 | 62.30 | 62.30 | 0.96 | 59.71 | 59.81 | 59.78 | 59.77
4 | 63.20 | 63.30 | 63.20 | 0.95 | 60.04 | 60.09 | 60.04 | 60.06
5 | 64.30 | 64.40 | 64.30 | 0.94 | 60.35 | 60.48 | 60.33 | 60.39
Table 7. Sample data on total quality of the system (Qtotal).
S. No | Qf (Feedstock Quality) | Qm (System) | Qp (Avg) | Qi (Avg) | Qtotal
1 | 90 | 12.46 | 60.27 | 59.69 | 49.15
2 | 92 | 12.70 | 61.42 | 59.65 | 49.66
3 | 93 | 13.00 | 62.26 | 59.77 | 49.92
4 | 94 | 13.23 | 63.23 | 60.06 | 50.30
5 | 95 | 13.50 | 64.33 | 60.39 | 50.75
Table 8. Sample data on the estimated total quality.
S. No | Qf (Feedstock Quality) | Qm (System) | Qp (Avg) | Qi (Avg) | Estimated Qtotal
1 | 90 | 12.46 | 60.27 | 59.69 | 49.66
2 | 92 | 12.70 | 61.42 | 59.65 | 50.18
3 | 93 | 13.00 | 62.26 | 59.77 | 50.43
4 | 94 | 13.23 | 63.23 | 60.06 | 50.74
5 | 95 | 13.50 | 64.33 | 60.39 | 51.10
Table 9. Error calculations table.
Calculated Qtotal(t) | Estimated Q̂total(t) | Error
49.15 | 49.66 | −0.51
49.66 | 50.18 | −0.52
49.92 | 50.43 | −0.51
50.30 | 50.74 | −0.44
50.75 | 51.10 | −0.35
Table 10. Conventional data matrix.
Cycle | DPU | OEE (%) | Downtime (hrs) | Scrap Rate (%) | Production Rate (Units/hr)
1 | 0.06 | 85 | 2 | 3.5 | 148
2 | 0.07 | 83 | 2.5 | 3.8 | 145
3 | 0.05 | 87 | 1.5 | 2.8 | 152
4 | 0.09 | 81 | 3 | 4.2 | 140
5 | 0.08 | 82 | 3.5 | 3.9 | 143
6 | 0.06 | 84 | 1 | 3.5 | 150
7 | 0.07 | 82 | 2.8 | 3.6 | 142
8 | 0.08 | 79 | 3.2 | 4.5 | 137
9 | 0.05 | 86 | 1.2 | 3.7 | 152
10 | 0.06 | 83 | 3 | 3.8 | 145
Table 11. Cognitive twin data matrix.
Cycle | DPU | OEE (%) | Downtime (hrs) | Scrap Rate (%) | Production Rate (Units/hr)
1 | 0.03 | 92 | 1.2 | 2 | 158
2 | 0.04 | 90 | 1.5 | 2.3 | 155
3 | 0.02 | 94 | 0.8 | 1.5 | 163
4 | 0.03 | 93 | 1 | 2.1 | 160
5 | 0.02 | 95 | 0.9 | 1.8 | 165
6 | 0.01 | 96 | 0.7 | 1.2 | 170
7 | 0.02 | 94 | 0.8 | 1.3 | 161
8 | 0.03 | 93 | 1.1 | 1.4 | 157
9 | 0.02 | 99 | 0.6 | 0.5 | 172
10 | 0.03 | 95 | 0.8 | 1.7 | 160
Table 12. Comparison of the traditional quality management and the cognitive twin framework.
Dimension | Traditional Quality Management | Cognitive Twin Framework
Nature of Control | Reactive (occurs after production) | Proactive (real-time monitoring and intervention)
Inspection Method | End-of-line inspection, random sampling | Continuous monitoring throughout production in real time
Key Focus | Detect defects after they occur | Predict and prevent defects before they occur
Response Time | Can be slow due to inspection delays or post-production checks | Immediate response to detected anomalies or predictive warnings
Accuracy | Limited by random sampling or batch inspection | High accuracy through continuous data analysis and simulation
Process Monitoring | Manually adjusted based on historical data or set schedules | Continuous real-time monitoring using sensor data and machine learning
Predictive Capability | Limited predictive capacity (often based on historical trends) | High-level predictive capability based on dynamic data input (e.g., IoT sensors)
Process Adjustments | Achieved based on predefined rules or operator input | Adaptive adjustments in real time using AI and predictive algorithms
Data Usage | Primarily historical data or scheduled audits | Real-time, data-driven decisions and machine-learning-driven insights
Decision Making | Based on fixed rules or past trends, less agile | Based on continuous learning and evolving insights from real-time data
Data Processing Speed | Typically slow due to batch processing or scheduled audits | Near-instantaneous processing of incoming data streams
Data Integration | Often siloed or disconnected across different departments | Seamless integration of data across production systems, sensors, and analytics platforms
System Flexibility | Rigid, with changes being implemented based on audits or schedules | Highly flexible, adaptive to real-time changes and external factors
Adaptation to Change | Slow adaptation (e.g., fixed schedules, processes) | Rapid adaptation through predictive models and real-time adjustments
Learning Capability | Limited to periodic reviews and adjustments based on experience | Continuously learning and improving through machine learning algorithms
Improvement Process | Manual, often driven by employee feedback or fixed improvement cycles | Automated, continuous improvement through real-time analytics and machine learning
Role of Employees | Heavy reliance on employee training, audits, and manual improvement efforts | Employees interpret automated insights, with much improvement being system driven
Impact on Efficiency | Improvements can be slow, often requiring large, manual interventions | Continuous improvements built into the system, with ongoing incremental gains
Resource Allocation | Based on fixed schedules, standard operating procedures, or historical trends | Dynamically optimized based on real-time conditions, reducing unnecessary resource use
Waste Reduction | Often limited to after-the-fact waste audits and corrections | Proactively reduces waste by predicting problems before they happen, adjusting processes accordingly
Operational Costs | Can be high due to inefficiencies, waste, and downtime | Lower operational costs due to real-time optimization, predictive maintenance, and waste reduction
Customer Feedback | Often analyzed after production, in the form of complaints or returns | Real-time insights into product quality, allowing for faster responses to customer needs
Customization/Personalization | Limited ability to adjust to individual customer needs in real-time | Highly flexible, with the ability to customize or adjust production based on real-time customer demands or feedback
Time to Market | Can be slow, with delays due to quality checks or adjustments | Faster time to market, thanks to predictive quality management and real-time decision making
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
