Article

Reference Architecture for the Integration of Prescriptive Analytics Use Cases in Smart Factories †

1 Fraunhofer Institute for Mechatronic Systems Design, Digital Transformation, 33102 Paderborn, Germany
2 Center for Applied Data Science, Bielefeld University of Applied Sciences and Arts, 33330 Gütersloh, Germany
3 Chair of Advanced Systems Engineering, Heinz Nixdorf Institute, Fürstenallee 11, 33102 Paderborn, Germany
* Author to whom correspondence should be addressed.
This paper is an extended version of our published paper in the 2023 IEEE ADACIS.
Mathematics 2024, 12(17), 2663; https://doi.org/10.3390/math12172663
Submission received: 25 July 2024 / Revised: 20 August 2024 / Accepted: 21 August 2024 / Published: 27 August 2024

Abstract

Prescriptive analytics plays an important role in decision making in smart factories by utilizing the available data to gain actionable insights. However, the planning, integration and development of such use cases still pose manifold challenges: use cases are implemented as standalone solutions, the existing IT infrastructure is not fit for integrative bidirectional decision communication, and implementations reach only low technology readiness levels. We propose a reference architecture for the integration of prescriptive analytics use cases in smart factories. The method for the empirically grounded development of reference architectures by Galster and Avgeriou serves as a blueprint. Through the development and validation of a specific IoT-Factory use case, we demonstrate the efficacy of the proposed reference architecture. We then extend the reference architecture from a single use case to the integration of multiple use cases across a smart factory and identify the interdependencies among use cases in dynamic environments. Our prescriptive reference architecture provides a structured way to improve operational efficiency and optimize resource allocation.

1. Introduction

Humans regularly make decisions that are influenced by a variety of factors such as uncertainty and time pressure, leading to potential human mistakes [1]. Decision making is particularly critical in manufacturing or smart factories, where the consequences of decisions can be profound. In smart factories, decisions can have a substantial impact on operations and outcomes, which shows the importance of understanding the impact of decisions [2]. Poor decisions in this context can trigger chain reactions and impact entire systems, highlighting the interconnected nature of these environments [3] (p. 338).
Factories are increasingly equipped with sensors, and growing volumes of data are recorded. Incorporating these data into the decision-making process reduces human uncertainty and improves overall decision efficiency [4]. Approaches to data-driven decision making are usually structured according to data analytics levels. Within this framework, the different levels of data analytics—descriptive, diagnostic, predictive and prescriptive—have different impacts on manufacturing decision making, with the human influence continuously decreasing and the degree of automation increasing across the four levels [5].
Descriptive analytics uses historical data to gain insight into what happened in the past [6], supporting data-driven decision making based on historical patterns. Diagnostic analytics determines the causes of certain results or problems; it combines historical data and statistics to find patterns, correlations and anomalies, yielding actionable knowledge for error correction and process optimization. Predictive analytics seeks to predict future trends, occurrences and results [7]. Whereas the two previous stages focus on historical data, predictions are usually made using real-time data. Predictive analytics is thus a critical component of data-driven decision making, and the decisions that need to be taken are often time-critical and involve human uncertainty. Prescriptive analytics aims to provide decision recommendations to support actions or to fully automate decisions so that human intervention in the decision-making process is no longer required [8]; it is the most advanced form of decision support system. As a practical example, consider a production line in which data are continuously recorded. Based on the first three data analytics stages, the data are analyzed to determine correlations and causes of errors, which can then be used to predict future errors. Based on these predictions, combined with contextual knowledge of the production (e.g., availability of personnel and resources), the best possible action is determined using prescriptive algorithms. This action is either presented to the plant employees, who then implement it, or it is merely provided to the user for confirmation or veto while the execution is performed autonomously.
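The four analytics levels described above can be illustrated on a toy stream of production-line sensor readings. The following Python sketch is purely illustrative and not part of the original work: the threshold, the naive trend check and all variable names are assumptions.

```python
# Illustrative sketch of the four analytics levels on hypothetical
# production-line sensor data; thresholds and rules are invented.
from statistics import mean

readings = [2.1, 2.3, 2.2, 3.8, 2.2, 2.4, 4.1]  # e.g., vibration amplitudes
THRESHOLD = 3.0  # assumed anomaly threshold

# Descriptive: what happened?
summary = {"mean": mean(readings), "max": max(readings)}

# Diagnostic: why did it happen? (here: locate anomalous readings)
anomalies = [i for i, r in enumerate(readings) if r > THRESHOLD]

# Predictive: what will happen? (naive trend-based failure estimate)
will_fail_soon = mean(readings[-3:]) > mean(readings[:3])

# Prescriptive: what should be done? (rule-based recommendation)
if will_fail_soon and anomalies:
    action = "schedule maintenance for the affected machine"
else:
    action = "continue normal operation"

print(summary, anomalies, will_fail_soon, action)
```

In a real smart factory, each stage would of course be backed by far richer models; the point here is only the increasing degree of automation from summary statistics to a recommended action.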
However, the lack of a comprehensive framework poses challenges in implementing effective prescriptive analytics solutions. For prescription to be integrated into the decision-making process, three to four consecutive steps are generally necessary [9]. The three-step framework consists of the conditional trigger (intelligence), prescription (assessment of alternatives and selection) and execution (implementation or automation). The conditional trigger transforms system-related input into information that builds the foundation for the prescriptive step; it can be based on human input, previous data analytics steps such as a prediction, external factors, or a combination thereof. The prescription step is divided into two phases. The first phase assesses possible actions and provides an action space for the selection process; the actions are drawn from a given knowledge representation, which can vary depending on the chosen implementation (e.g., graphs or a machine model). The second phase selects from this action space. As the framework is meant to be used as a first, solution-independent step to approach a system, both phases could be implemented by the same algorithm (e.g., reinforcement learning) or separately (e.g., a recommender engine combined with a human or automated decision). The last step either automates or executes the given decision. An optional feedback loop to the knowledge representation can be used to create a learning system: the resulting decision–effect relation serves as a parameter changing the current system under observation, and this change is then registered by the intelligence level and evaluated again. The introduction of a prescriptive framework is essential for the success of smart factories [10]. The framework serves as a reusable set of tools, libraries and standards to streamline application development by providing predefined structures.
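The three steps above can be sketched as plain functions. This is a minimal, assumed illustration of the control flow (trigger → prescription → execution with a feedback loop into the knowledge representation); the dictionary-based knowledge representation, the error-rate threshold and the cost-based selection rule are illustrative choices, not prescribed by the framework.

```python
# Hedged sketch of the three-step prescriptive framework; all names,
# thresholds and data structures are hypothetical.

def conditional_trigger(system_input):
    """Intelligence step: turn system-related input into trigger information."""
    return {"triggered": system_input["predicted_error_rate"] > 0.05,
            "context": system_input}

def prescription(trigger, knowledge):
    """Assess alternatives from the knowledge representation, then select one."""
    if not trigger["triggered"]:
        return None
    alternatives = knowledge["actions"]                # phase 1: assessment
    return min(alternatives, key=lambda a: a["cost"])  # phase 2: selection

def execution(decision, knowledge, learn=True):
    """Implement (or automate) the decision; optionally feed back."""
    if decision is None:
        return "no action"
    if learn:  # feedback loop: record the decision–effect relation
        knowledge.setdefault("history", []).append(decision["name"])
    return f"executing: {decision['name']}"

knowledge = {"actions": [{"name": "reduce line speed", "cost": 2},
                         {"name": "swap tool head", "cost": 5}]}
trigger = conditional_trigger({"predicted_error_rate": 0.08})
decision = prescription(trigger, knowledge)
print(execution(decision, knowledge))
```

As the text notes, both prescription phases could equally be realized by a single learned policy (e.g., reinforcement learning) instead of separate assessment and selection functions.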
In addition to frameworks, platforms and reference architectures should also be analyzed in order to describe the overall state of the design aids provided by researchers for the implementation of sophisticated prescriptive analytics initiatives in companies. Platforms provide a basic environment that includes both the hardware and software elements with which a program is executed. These have already been covered in the context of prescriptive analytics, which is why we refer to [11]. However, reference architectures for prescriptive analytics in the context of smart factories have not yet been realized. Reference architectures for prescriptive analytics are important for standardizing best design practices, which speeds up development and minimizes the risk of errors. They also promote the scalability and reliability of systems by drawing on established solutions. The use of common architecture standards simplifies the exchange of knowledge between teams and improves collaboration. They also support adherence to security and compliance standards, resulting in robust and secure systems overall.
Our goal is to simplify the introduction of prescriptive analytics use cases into the smart factory by presenting both a prescriptive framework and a prescriptive reference architecture which seamlessly integrate. Compared to the existing literature, we contribute the following aspects:
  • We perform a literature review on reference architectures and their suitability for integrating prescriptive analytics in smart factories. We further show how prescriptive analytics reference architectures differ from platforms and frameworks;
  • We propose a framework for prescriptive analytics and validate it on a smart factory use case. The framework is derived from human decision making combined with prescriptive components;
  • Further, we propose a prescriptive reference architecture for multiple smart factory use cases. The linking of the use cases is handled via an orchestration layer.

2. Relevant Concepts and Definitions

The section that follows provides an overview of the relevant definitions and concepts for the analyzed work. First, essential terms are defined. Based on this, different perspectives on prescriptive analytics are derived. This serves as a baseline for the analysis of existing use cases, frameworks and reference architectures in the context of both prescriptive analytics and smart factories. We describe the problems that arise for users of prescriptive analytics in smart factories and motivate the need for frameworks and reference architectures.
The concept of the smart factory stems directly from the Industry 4.0 initiative and represents the instantiation of the concept for the production and factory context [12] (p. 14). So-called cyber–physical systems (CPSs) form the cornerstones of the smart factory [13] (p. 795). CPSs are mechatronic systems with the ability to network across products, devices and objects and to interact beyond application boundaries. The concept of the cyber–physical production system (CPPS) is analogous to that presented in [14] (p. 29). A decision is defined as follows:
“A decision is an act in which one of several possible alternative courses of action is selected in order to achieve a certain goal”
[15] (p. 162).
These action alternatives are transformed into actions through the decision-making process [16] (p. 8). The decisions to be made often have factory-wide effects and should be carefully considered. This requires a great deal of knowledge about the production under consideration. However, this knowledge is still concentrated in the heads of a few experts within the company [17] (p. 58) [18] (p. 6). Due to demographic change, short-term layoffs or illness, this knowledge is lost over time and the decision makers are faced with increasingly complex systems [19,20]. This uncertainty should be compensated for by integrating data into the decision-making process [3] (p. 338). To ensure that decision makers do not have to acquire additional knowledge about data analysis, automated systems should be introduced to provide decision support immediately [21]. The most sophisticated form of data analytics that is relevant for this work is prescriptive analytics, in which concrete recommendations for action are made or automatically executed [22].
In the context of decisions made in a smart factory, additional constraints from the factory environment need to be considered (see Figure 1). Decisions are interconnected and vary in their complexity [23,24]. Structured decisions do exist, but there are already techniques to automate these (programmable decisions according to [25]). A factory always exists in an organizational context: steering interests from production, maintenance and quality management collide in the search for a global optimum. Products, processes and resources need to be managed effectively to match production, maintenance and quality targets when disturbances or breakdowns occur [26]. This optimum is usually approached through the application of domain-related strategies. These strategies could function as constraints or loss-function equivalents for prescriptive analytics. This way, a decision can be made under the assumption that managerial strategies in the given time horizon (tactical, operational or strategic) are supported [27].
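One way to read "strategies as loss-function equivalents" is as a weighted score over the colliding steering interests of production, maintenance and quality. The following sketch is an invented illustration: the weights, loss terms and action attributes are assumptions, not taken from the cited literature.

```python
# Hypothetical weighted loss combining colliding steering interests;
# lower is better. Weights would be set by the managerial strategy.

def loss(action, w_production=0.5, w_maintenance=0.3, w_quality=0.2):
    """Score an action against production, maintenance and quality targets."""
    return (w_production * action["downtime_h"]
            + w_maintenance * action["wear_increase"]
            + w_quality * action["scrap_rate"])

actions = [
    {"name": "run through", "downtime_h": 0.0, "wear_increase": 4.0, "scrap_rate": 2.0},
    {"name": "short stop",  "downtime_h": 1.0, "wear_increase": 1.0, "scrap_rate": 0.5},
]
best = min(actions, key=loss)
print(best["name"])
```

Shifting the weights (e.g., prioritizing production over maintenance on a tactical horizon) changes which action the prescription favors, which is exactly the role the text assigns to domain-related strategies.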
Furthermore, Kumari et al. point out that decisions are always taken in a context (environment) and have effects on other decisions (e.g., hierarchical ones) [29]. We further motivate the need for an overarching framework and reference architecture by providing two additional views on prescriptive analytics (see Figure 2). Figure 2 gives an overview of the perspectives one can take on prescriptive analytics use cases. We describe the two most relevant perspectives for this journal article:
  • Perspective 1: Decision Theory
Based on the proposed scope, prescriptive analytics can derive a lot of insight from decision theory. Different models for structuring human-made decisions exist, such as those proposed by Simon or Panagiotou [30,31]. The main recurring element is the differentiation between the actual decision and its execution. A clear distinction must be drawn between the input for a possible decision and the knowledge that is inherent to the decision maker. This knowledge can be enriched by learning from past decisions and the evaluation of their outcome or impact.
  • Perspective 2: Data Analytics and ML
The most prominent model in the domain of data analytics is the one provided by Gartner [22]. Gartner introduced the basic idea that the more advanced the analytics type gets, the less input is needed from humans during the usage of such sophisticated algorithms. Gartner also introduced the idea of differentiating between decision support and automation. No further information regarding the actual synergies with the other analytics types can be derived.
According to the Gartner definition, prescriptive analytics deals with the decision aspect and enables a direct call for action [22]. One needs to differentiate between making decisions and automating the decision outcome. This greatly restricts the amount of research that focuses on solutions with actual prescriptions in mind. If decisions are always crystal clear (structured), the task is more one of automation, and the actual prescription engine is very basic or non-existent (trivial). If they are semi-structured, one might need to apply advanced systems and engines to derive an implementation strategy (e.g., graph-based systems with reasoning [17] or reinforcement learning [32]). Future research needs to address and evaluate possible archetypes of decisions that can be effectively supported and automated via analytics. The following requirements for a prescriptive analytics framework result from the described challenges: the framework needs to match and expand the de facto standard of the Gartner analytics levels; different levels of automation and human decision integration need to be possible (conformity), although these are not the only dimensions that are key to developing prescriptive analytics implementations; the framework must be designed in such a way that existing prescriptive approaches can be incorporated into the model; and an extension or specialization for future research must be possible (like that gathered by [17]).

3. State of the Art

In the section that follows, we examine different perspectives on prescriptive analytics use cases in the context of a smart factory. This knowledge base serves as a baseline for the construction of the reference architecture in Section 5. Section 3.1 examines different prescriptive analytics use cases. Section 3.2 focuses on prescriptive analytics platforms and their implications for a reference architecture. Section 3.3 focuses on adjacent reference architectures in the context of a smart factory that have implications for the given scope. This approach ensures a holistic view on the system under observation (smart factory) because approaches with a different degree of granularity from different research streams are being considered.
To lay a baseline for the following discussion, we differentiate between the often-mixed terms framework, platform and reference architecture. A framework is used to structure the solution space of something that is to be built [33] (p. 511). A platform focuses on the architecture of different system components for a predefined purpose and outcome (of something that is built) [34]. An architecture (e.g., of a platform) can be derived from a reference architecture via instantiation [35] (p. 2). Other uses include the comparison and benchmarking of existing architectures [35] (p. 2). In the context of this journal article, we use this differentiation to include approaches from different levels of abstraction and granularity. In general, these terms are often mixed because of the fuzziness between the different levels of abstraction.

3.1. Prescriptive Analytics Use Cases

Based on existing literature reviews of prescriptive analytics algorithms, systems and use cases (see [17,28,36]), we summarize the main findings in the section that follows. Most prescriptive analytics use cases across different domains are still developed in a specialized manner with regard to their implementation in real-world scenarios [17]. Wissuchek et al. summarize that the human perspective in prescriptive analytics use cases is still under-represented because most implementation approaches focus purely on technical aspects [36]. In ref. [28], 35 different use case implementations for prescriptive analytics in the context of a smart factory were examined and clustered (see Figure 3). For additional details on the structured literature review, we refer to [28] (for the translation table for sources, see Appendix A). These use cases mostly focus on the following:
Focus area 1: prescriptive production planning and control [28];
Focus area 2: holistic prescriptive maintenance systems [28];
Focus area 3: improvement cycle-related prescriptive analytics [28].

3.2. Prescriptive Analytics Platforms

In ref. [11], a systematic literature review was performed to identify the technical readiness of prescriptive analytics platforms. While many papers review data analytics platforms extensively (e.g., [37]), this is the only known work that investigates prescriptive analytics platforms in particular. The main finding is that no platform currently goes beyond decision augmentation, with the majority of platforms only providing decision support. It is concluded that prescriptive platforms are still at an early stage, largely due to missing frameworks and reference architectures. To summarize the elements present in prescriptive analytics platforms, a concept matrix was constructed (see Figure 4). A translation table from platform to source is provided in the appendix (see Appendix A).

3.3. Smart Factory Reference Architectures

On the highest level of abstraction, reference architectures are analyzed. Sixteen reference architectures in the overall scope of “reference architecture for the smart factory domain” were considered. We took overarching studies as well as detailed architectures into account (for overarching studies, see [38,39,40]). The scope of the reference architectures was chosen to include architectures that generally address smart factories and the application of data-driven approaches to the smart factory (see Figure 5).
The analyzed architectures are all structured in either a clustered way, layered way, or a mix of the two for visualization purposes. They either focus the application on just one machine, the whole shopfloor or the factory. Some approaches take a system of systems standpoint and consider the whole network (including overarching aspects). Apart from these trivial combinations, no reference architecture was found that focuses on one use case and its vertical integration (specific for one use case). The scope of this journal article focuses on reference architectures with a relevance for prescriptive analytics.
We did not find any reference architecture that focuses on prescriptive analytics (see Figure 6). The respective sources can be found in Appendix A. Most analyzed architectures already include sufficient information about existing IT systems and shopfloor control systems. Data analytics, in general terms, is also addressed as a general concept in most architectures. A general lack of compatibility between the different architectures is observed. The domain specific approaches are especially tailored to specific needs, and an interconnection to a broader concept of how a smart factory might function is not provided. Most architectures are not derived from a more abstract version but rather derived in a bottom-up manner.
Based on the three previous subsections, we conclude that to the best of our knowledge, no current approach addresses the integration of prescriptive analytics into smart factories.

4. Research Methodology

For the design of the reference architecture, the method proposed by Galster and Avgeriou was used [41]. The method was chosen because it enables input from both a practical and a research-driven dimension. It is based on an empirically grounded approach; in the context of the proposed method, empirically grounded refers to the need for evidence and traceability for each building block of the architecture [41] (p. 154). The method established by Galster and Avgeriou prescribes seven key steps for developing a reference architecture [41]. In what follows, the instantiation of these steps for the given research scope is explained. A summary of the overall procedure, with an overview of the time scale and supporting methods, is displayed in Figure 7.
Phase 1 deals with the preliminary tasks before starting the actual data gathering and construction phase for the reference architecture. The definition of the type of reference architecture (Step 1) leads to a defined purpose and a set role of the reference architecture. Based on this, a design strategy is selected. The scope assessment is based on [43,44] as suggested by Galster and Avgeriou [41]. An overview on the needed design decisions for tailoring the method is provided in Figure 8. Decisions regarding the overall design are affected by the type of reference architecture that is being aimed for. In the main section, a short reasoning for the undertaken choices is provided. This phase is supported by the integration of previous research on a conceptual overview on respective use cases from [23].
Phase 2 focuses on the acquisition of data for the construction of the reference architecture. For this purpose, a rigorous literature search was conducted (compare Section 3). Based on the design strategy, the needed data sources and types of knowledge are identified. The level of detail is decided upon to further refine the input literature for the design. Literature work from the following papers was taken as input for this journal article:
  • Structured literature research with a focus on prescriptive analytics use cases in smart factories [28];
  • Structured literature research with a focus on prescriptive analytics platforms (broader scope and specific for manufacturing) [11];
  • Rigorous literature research on existing reference architectures (for results, see Section 3.3). For this search, we focused on reference architectures for the overall scope of the smart factory with a narrower focus on intersections with “industrial data science”, “industrial analytics” and “prescriptive analytics”. Sources that include the terminology of “industry 4.0” or “industrie 4.0”, “smart manufacturing”, “manufacturing analytics” or “big data analytics for manufacturing” were included in our search.
Phase 3 addresses the actual construction of the reference architecture. Variability is enabled through well-defined aspects that allow instantiation. By defining the different aspects and their purpose, a holistic approach for the reference architecture is ensured. Previous work on a framework for singular prescriptive analytics use cases is integrated into this work (see [42]). A framework and a black box view are provided to enable further detailing of the concept to other domains. Finally, the evaluation of the reference architecture is conducted through a real-world implementation.
The evaluation of the contribution is based on the reference architecture’s correctness and utility as well as the architecture’s support for efficient adaptation and instantiation. Part of the evaluation procedure is mapping existing reference architectures on the resulting one to check for compliance [41] (p. 157). The evaluation step is of utmost importance when building the reference architecture from scratch [41] (p. 157). The quality of the reference architecture can be measured based on its adaptability, understandability, accessibility within an organization and how well it addresses the inclusion of key challenges of a specific domain [41] (p. 157). We demonstrate the functionality of the reference architecture by applying it to a real-world setting. Based on this, we compare the resulting architecture with the requirements which are grounded in the literature. The intersection of our solution with existing architectures is examined. Finally, qualitative expert interviews with practitioners are conducted to evaluate the provided reference architecture. For the interviews, we use the method according to Myers and Newman [45].
Figure 8. Decision pathways for the design of the reference architecture based on [41,44,46].

5. Reference Architecture for Prescriptive Analytics in Smart Factories

The scope of this section is to present the development of the reference architecture based on the framework for one use case and input from the literature. The validation is prepared, and possible instantiation of the reference architecture is enabled. This section’s structure is based on the methodology. First, the framework for a singular use case is presented. Based on this, the scope and strategy for the design of the reference architecture are presented. With the findings from the literature (see Section 3), a conceptual reference architecture is constructed. The final reference architecture is presented in Section 5.6. The last subsection explores avenues of instantiating the created reference architecture.

5.1. Framework for One Use Case

The following subsection is part of the previously published conference paper [42]. The resulting framework for one prescriptive analytics use case (Figure 9) is constructed from the requirements from the three provided perspectives which are derived from the existing literature. The resulting framework is described in this section.
The three-step framework consists of the conditional trigger (intelligence), prescription (assessment of alternatives and selection) and execution (implementation or automation). Each step has a defined input and output (compare Figure 2).
The conditional trigger represents the transformation of system-related input into information that builds the foundation for the prescriptive step (Figure 2). The intelligence unit can be represented by descriptive, diagnostic or predictive algorithms. These are usually developed with known procedure models such as CRISP-DM (see [47,48] (p. 8)). This aligns our model with the analytics model from Gartner [22]. The output of the intelligence step is crucial because it lays the foundation for effective prescriptions: a prescription is only valid if the trigger is valid. This emphasizes the fact that a prescriptive algorithm can be separated from the actual trigger (e.g., in the case of prescriptive maintenance: a breakdown prediction, breakdown information or a root cause analysis outcome). In this way, prescriptive analytics is introduced with a focus on prescribing a recommendation for a decision. The prescription step is divided into two phases. The first phase focuses on the assessment of possible alternatives in the given situation; prescriptive alternatives result as an output, providing a solution space for the selection process. These are drawn from a given knowledge representation. Because the framework is meant to be used as a first, solution-independent step to approach a system, both phases could be implemented by the same algorithm (e.g., reinforcement learning) or separately (e.g., a recommender engine combined with a human or automated decision).
The knowledge representation can vary depending on the chosen implementation (graph, table, environment, machine model, etc.). Thus, it serves as an agnostic representation in the provided framework. The selection step (second phase) focuses on choosing, ranking or listing possible options for the given decision problem. A feedback loop is possible to reassess the alternatives if the selection process does not find a suitable option. Decisions that are based on continuous events can be dealt with in a discrete manner through setting a smart conditional trigger to evaluate (and re-evaluate) prescriptions only if necessary. Additional input (e.g., characteristics of the system) can be derived from the knowledge representation. The output is an evaluated decision which can be used as an input value for the next step. The last step either automates or executes the given decision. An optional feedback loop to the knowledge representation can be used to create a learning system. The resulting decision–effect relation serves as a parameter changing the current system under observation. This change is then registered by the intelligence level and evaluated again.
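The feedback loop from the selection phase back to the assessment phase can be sketched as follows. This is an assumed illustration: the dictionary-based knowledge representation, the cost constraint and the relaxation rule are hypothetical choices, not part of the framework's definition.

```python
# Sketch of the assessment/selection loop: if selection finds no suitable
# option, assessment is re-run with a relaxed (here: doubled) cost budget.

def assess(knowledge, max_cost):
    """Phase 1: build the solution space from the knowledge representation."""
    return [a for a in knowledge["actions"] if a["cost"] <= max_cost]

def select(alternatives):
    """Phase 2: rank and pick; return None if no suitable option exists."""
    return min(alternatives, key=lambda a: a["cost"]) if alternatives else None

def prescribe(knowledge, max_cost, max_retries=3):
    for _ in range(max_retries):
        decision = select(assess(knowledge, max_cost))
        if decision is not None:
            return decision
        max_cost *= 2  # feedback: relax the constraint and reassess
    return None

knowledge = {"actions": [{"name": "replace bearing", "cost": 8},
                         {"name": "full overhaul", "cost": 20}]}
print(prescribe(knowledge, max_cost=5))  # first pass is empty; relaxed to 10
```

A smart conditional trigger would call `prescribe` only when a discrete evaluation is actually warranted, which is how the framework handles decisions based on continuous events.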
The framework should be applied to decision-making processes in a smart factory environment. Thus, it can be applied to decisions related to quality, production processes or maintenance (product, process or resource) cases. Supervising processes (e.g., logistics) and business processes are also part of the decision environment in the smart factory. Its focus is the concept for one singular use case.

5.2. Integration of Multiple Use Cases into a System under Observation

The framework has several characteristics that shape any implementation of it:
  • The degree of automation in each step (human/machine) as introduced into prescriptive analytics by Gartner [22];
  • Possible feedback loops can turn the static system into a framework for learning from past decision processes (e.g., through preferences or metrics measuring the success of the prescribed and executed decision). Implications based on the needed real-time capability need to be regarded when choosing the right implementation pattern for prescriptive analytics [36];
  • The amount of interconnectivity between the given decision engines and engines on the same level or levels beneath that (e.g., in the automation pyramid or a decision on a different time horizon) needs to be regarded for both the input and output of each implemented solution (in regards to a system of systems approach) [23,24];
  • The way the framework is designed, it is applicable to one use case with a defined scope and system under observation. Resulting effects on other systems outside of the system under observation need to be addressed by additional vertically or horizontally connected frameworks (Figure 10). Thus, there are decisions taking place on the same level (e.g., in another robot cell) or above that (e.g., in the manufacturing execution system of the plant).
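The interconnection of frameworks described in the last point can be sketched as a simple message-passing scheme. This is a deliberately simplified assumption for illustration, not the paper's implementation: each use-case framework handles its own system under observation and forwards effects to vertically or horizontally connected frameworks.

```python
# Hypothetical sketch of linked use-case frameworks; class and instance
# names are illustrative.

class UseCaseFramework:
    def __init__(self, name):
        self.name = name
        self.connected = []  # vertically or horizontally linked frameworks
        self.inbox = []      # effects received from other frameworks

    def link(self, other):
        self.connected.append(other)

    def decide(self, event):
        # Handle the event within the own system under observation ...
        decision = f"{self.name} handles {event}"
        # ... and forward the resulting effect to connected frameworks.
        for other in self.connected:
            other.inbox.append((self.name, event))
        return decision

cell = UseCaseFramework("robot cell A")
mes = UseCaseFramework("manufacturing execution system")
cell.link(mes)  # vertical connection: cell level -> MES level
print(cell.decide("tool wear alert"))
```

In the reference architecture presented later, this coordinating role is taken over by an orchestration layer rather than by direct point-to-point links.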
Singular use cases need to be interconnected and integrated into the existing legacy IT to function properly [12] (p. 6). Therefore, interfaces are essential to scale use cases [49]. Prescriptive analytics usually needs a high degree of interconnectivity of data [13] (p. 806) and decision cause–effect relations [50] (p. 141) [51] (p. 876). Existing implementation endeavors still result in systems with a low technical readiness level [17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52]. Problems in integrating AI and analytics into the smart factory (gap between desire and existing technical implementation [53]) also pose ongoing challenges to practitioners. Based on the examined state of the art (see Section 3), we conclude that there is a need for additional support for the development, integration and facilitation of prescriptive analytics use cases for smart factories.

5.3. Definition of Type of Reference Architecture

Reference architectures should serve the purpose of ensuring interoperability through standardization. They should facilitate instantiation of new architectures [41] (p. 154). To meet these goals, reference architectures are usually based on the viewpoint of a certain domain (technical-, application- or problem-related domain) [41] (p. 154). As elaborated on in Section 4, this subsection focuses on the definition of the type of reference architecture that is being designed. The role and purpose of the reference architecture will be defined to ensure an optimal tailoring to the given problem space.
Role: For the introduction of new prescriptive analytics use cases, a set of standard phases is usually executed. Use cases are planned, conceptualized, integrated, developed and later deployed [54]. If a greater set of use cases is being developed, synchronization efforts between them might take place [54]. The integration of prescriptive analytics use cases into smart factories is the focus of this article. This focus is primarily chosen to provide prescriptive advice for the development of future prescriptive analytics use cases in the smart factory environment.
Purpose: The reference architecture is supposed to enable variability and facilitate a discussion on the different aspects that need to be regarded when integrating use cases holistically. It should raise the overall technical readiness level of future implementations. Giving guidelines to practitioners on how to instantiate the reference architecture is part of the scope. We hope to create a common ground for discussion and the further development of the research field of prescriptive analytics. It should enable practitioners to take the next step in terms of data-driven decision making. It should also enable researchers to design and construct possible solutions within their respective scope (platforms, use cases or company-specific reference architectures). Finally, requirements for the construction of a reference architecture are raised. These are later used to challenge the existing state of the art and check whether a trivial combination of existing approaches might meet the requirements (REQ) posed for the given scope.
REQ-Category 1: Integration into an existing shopfloor legacy IT
  • REQ1: Existing manufacturing IT systems taken into consideration (based on Section 3). To ensure practical relevance, it is essential to build on the current state of a factory rather than creating an architectural vision that cannot be realized in the near future. This includes integrating existing approaches from all kinds of analytics as well as control theory and operations research (based on Section 2);
  • REQ2: Compatible with existing and established reference architectures. The relevant state of the art needs to be considered. This results in the integration or design of suitable interfaces to seamlessly integrate into the existing state of the art.
REQ-Category 2: Strong emphasis on prescriptive analytics
  • REQ3: Focus on actionable decisions (based on Section 2). This ensures a sufficient usage of prescriptive analytics [13] (p. 808) [28,55] [56] (p. 149). This also implies a compatibility with different levels of autonomy;
  • REQ4: Representation of different forms of decision interactions (based on Section 2). For prescriptive analytics, different maturity levels and degrees of autonomy from human decision makers are possible [21,36].
To summarize: The overall goal of this work is to provide an industry specific (in this case: smart factory) reference architecture for the integration of prescriptive analytics use cases (see Figure 11). It is of preliminary nature (“when”) because of the low maturity of existing use cases in the field [17]. It is primarily designed to facilitate the development of future use cases with a high degree of maturity (“why”). It should serve implementation in different (multiple) organizations.

5.4. Selection of Design Strategy

The following subsection provides an overview of the relevant preliminary design decisions that were undertaken to further refine the scope for the development of the reference architecture. We build on existing reference architectures to fulfill the stated requirements regarding compatibility. Our approach is to facilitate the development of prescriptive analytics solutions without creating yet another view on the already complex system of research on analytics for smart factories. Prescriptive analytics adds new nuances but does not completely change the way we perceive the overall functionality of a smart factory. We take a research-driven approach to address the uncertainty that practitioners are faced with when trying to raise their companies’ analytics capabilities to the next level. To summarize: The reference architecture will be built on existing reference architectures. Even though this might imply a practice-driven approach based on Galster and Avgeriou [41] (p. 156), we stick to a research-driven approach (see Figure 12). This is due to the research-driven nature of the input reference architectures.

5.5. Empirical Acquisition of Data

The knowledge base for the construction of the reference architecture is presented in this section. A content analysis of the literature is one way of building up this knowledge base [41] (p. 156). The descriptive parts of the findings were already presented in Section 3. Based on the concept matrix (see Figure 6) and the requirements from Section 5.3, we analyze whether the existing state of the art fits our design objectives. The summary of the analysis is depicted in Figure 13.
None of the existing reference architectures proposes a sufficient solution for the integration of prescriptive analytics use cases into smart factories. No trivial combination of existing approaches can be constructed that satisfies all requirements. Thus, development efforts for a new reference architecture need to be made. In addition, the compliance to existing architectures is essential to ensure interoperability and compatibility with the existing state of the art.

5.6. Construction of Reference Architecture

This subsection presents the main contribution of this journal article: the reference architecture for the integration of prescriptive analytics use cases into smart factories. First, the views and mode of variability are defined. Expanding on this matter, a black box view of the reference architecture is proposed to enable a discussion on the factors influencing the given system of interest. A conceptual architecture is proposed to derive the architecture’s high-level features, building blocks and interconnections.
Galster and Avgeriou propose modeling reference architectures based on the established ISO/IEC/IEEE 42010 [57]. We only address the logical architecture view. We frame our concerns (compare [57]) in terms of the requirements (see Section 5.3) and the research gap shown in Section 5.2 [57,58] (p. 49). When integrating, different perspectives such as process, IT, data and physical architecture on the shopfloor need to be considered. The concerns are addressed and summarized by our requirements, which stem from a broad variety of the literature, addressing both engineering and data science concerns. Possible views are reduced to a logical view for the integration of prescriptive analytics into pre-existing smart factory concepts (retrofitting). Additional usage scenarios for the reference architecture like greenfield factory planning are out of scope in the proposed work. To summarize: We choose to address the logical view of a reference architecture. We enable variability by annotation and additional description (see Figure 14). In general, variability is introduced through provided guidelines, additional information or context to further detail the given abstract concept towards the chosen scope and system under observation.
In the context of production management, decisions are made with different time horizons and varying degrees of uncertainty. Different latency factors slow down the time from event to actual decision execution (action). Latencies result from the time required for data transfer, data analysis, decision making and implementation of the decision [3] (p. 334). The business value of a decision support solution depends on minimizing the time between event and decision [17] (p. 58). The quality of the decisions made and the evaluation of decision-making processes are a major challenge. Measuring the quality of decisions is a current subject of research (cf. concept of bounded rationality) [16] (p. 130). The quality of decisions does not necessarily correlate with the amount of data available. Rather, it is important to analyze each decision-making context in detail. Metrics for an overarching assessment of a decision are difficult to determine [59]. For the construction of the black box view, relevant perspectives and adjacent topics for the construction of the reference architecture are considered (see Figure 15). The black box view serves the purpose of summarizing relevant influences on the system of interest. It presents a summary of both data science and smart factory-related aspects:
  • Time scale of decisions taken: Use cases differ in terms of the urgency of the decision (ad hoc to long-term) [60] (p. 11). The degree of interconnectedness and the validity of the scope of the decision vary greatly [23]. The same applies to the impact of an individual decision taken [12] (p. 36). The goals, types and results of decisions vary as well [61] (p. 115). The type of interaction with the decision maker (human/algorithm) also provides for different characteristics of a use case [62] (p. 10);
  • IT-stack: IT systems vary in terms of their overall role, scope and degree of interconnectivity. There are three key activities in the context of smart factory data, which include the horizontal and vertical integration of systems and data streams as well as a holistic systems engineering approach [63] (p. 15). Horizontal integration describes networking via the control and process management level in the context of procurement and distribution to other companies or entities. It enables fast reaction times in the event of changes and facilitates decentralized and flexible production [63] (p. 13). The core of vertical integration is the linking of IT systems across hierarchical levels. The main advantage of this integration effort is the possibility of synchronizing business processes and workflows across different companies [63,64]. The perspective of holistic systems engineering pursues a life cycle approach. Further data is added from the product development and utilization phases. These data can be made available implicitly at all levels on an event-based basis [63] (p. 14). These endeavors toward networking within the automation pyramid are slowly breaking it up. Concepts from the Internet of Things are creating bypasses that enable singular solutions with direct networking and relocation of sub-functions to the cloud. It is rarely about replacing existing systems. Rather, the rigid framework for action is being replaced by a network-like structure. It remains to be seen to what extent both trends will level out [65] (p. 7);
  • Relevant decision makers (departments): Which decision makers are involved (departments, individuals or groups) [16] (p. 13). The decision maker introduces a vital component into the decision-making process based on his preferences and goals.
  • Macro cycle for continuous improvement activities (CIP): Which CIP processes are interlocking with the given decision environment and changing its characteristics for long-term improvement? [66,67] (p. 377)
  • The system under observation (smart factory) itself (system under observation, see Section 2) is a relevant aspect. Here, it is briefly described by its resources, products and manufacturing processes [23]. They are interlocked with existing analytics solutions in the smart factory [68]. The existing analytics solutions usually consist of four layers: use case (business), analysis (algorithms), data pools (where the data is located) and data sources (e.g., resources and sensors) [69].
Firstly, a conceptual architecture is constructed to derive the overall logical connections of the reference architecture based on the black box view. In what follows, all key elements of the conceptual architecture will be described. Based on the key elements, their interconnection and interrelations are examined. Reasoning and sources are provided.
Overview of the respective architecture elements:
  • Existing shopfloor IT systems, existing analytics use cases and shopfloor layer: These elements were already described in the black box view. The shopfloor layer represents the as-is shopfloor with its inherent complexity;
  • System engagement layer: This level represents all real-time oriented systems in a smart factory, which are not based on analytics. This can represent all relevant data sources for analytics like control theory-based algorithms (e.g., compare [70]), digital twin-based technology ([71], descriptive) and other systems authorized to intervene with the shopfloor processes;
  • Data: As present in most other reference architectures, the data layer represents the data flow through the smart factory IT stacks (horizontal and vertical interconnection) [63];
  • Actionable decisions: They represent the core element in the conceptual architecture. They interconnect decision makers and respective systems [72] (p. 18);
  • Human decision maker: The human decision maker is an essential element. Even in systems with up to a full level of autonomy, they will always be involved in decisions or their governance. Based on Gartner’s seven levels of hybrid decision making [21], we differentiate between the following decision interaction schemes: decision confirmation, decision veto, decision audit and decision demand. If a decision is demanded, advice- or recommendation-based outputs can occur;
  • Decision Orchestrator: Lastly, some decision overview mechanisms need to be in place. Even though the concept of Industry 4.0 refers to implementing decentralized decision making, a global optimum still needs to be governed (and achieved). This might diverge from local optima of an area of the smart factory.
To summarize: Connections in the system under observation (smart factory) are visualized in Figure 16. The data streaming from the shopfloor to traditional IT systems is mostly present in modern day factories. The challenge of curating and choosing the right data remains [73]. Analytics needs to be seamlessly integrated to enable actionable decisions and in a cooperative approach with the human decision maker. Some kind of orchestrating element to interconnect the human decision maker, different prescriptive analytics use cases and traditional IT systems is needed.
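The conflict-resolving role of such an orchestrating element can be sketched as follows. This is a hedged illustration only; the resources, use-case names and priority values are assumptions, and a real orchestrator would apply a governed prioritization scheme rather than a fixed integer ranking.

```python
# Hedged sketch of an orchestrating element: recommendations from different
# prescriptive use cases that target the same resource are resolved by a
# simple priority ranking before being handed to the human decision maker.
# All names and priority values are illustrative assumptions.

def orchestrate(recommendations):
    """Per resource, keep only the highest-priority recommendation."""
    chosen = {}
    for rec in recommendations:
        current = chosen.get(rec["resource"])
        if current is None or rec["priority"] > current["priority"]:
            chosen[rec["resource"]] = rec
    return list(chosen.values())

# Two use cases issue conflicting recommendations for the same robot cell:
conflicting = [
    {"resource": "robot_cell_1", "use_case": "maintenance",
     "action": "stop for service", "priority": 2},
    {"resource": "robot_cell_1", "use_case": "quality",
     "action": "adjust gripper pressure", "priority": 3},
    {"resource": "agv_2", "use_case": "logistics",
     "action": "reroute", "priority": 1},
]
decisions = orchestrate(conflicting)
```

The design choice here is a centralized resolution; as discussed in Section 5.7, decentralized or multi-level hierarchies are equally possible instantiations.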
Based on the conceptual architecture, the final reference architecture for the integration of prescriptive analytics use cases into a smart factory is constructed. The reference architecture integrates all aspects of the conceptual one and adds necessary details for efficient use and instantiation. In what follows, the reference architecture (see Figure 17) is briefly explained. The framework for one use case seamlessly integrates into the big picture of the reference architecture.
In addition to the reasoning provided, all building blocks are derived from the studied literature, expanded by our own knowledge and understanding of the domain. We integrate additional input from smaller-scale concepts for use cases and the development of use cases (see Section 3 and Section 5.1). We include common knowledge of the field like the analytics levels according to Gartner [22]. Additional inspiration is drawn from neighboring fields like CPS and intelligent systems reference architectures with a focus on singular products [70,74,75].
In what follows, only the elements not explained in the conceptual architecture and black box view will be described (Elements that were already explained before include the following: existing IT and system engagement layer, actionable decisions and human decision maker):
  • Levels: The reference architecture is organized in three time-related levels. The action level focuses on real-time (matter of urgency) elements. The planning level addresses smart factory related processes on a higher level of abstraction. The business level gathers input from the surroundings of the system under observation (smart factory). This is in line with structuring approaches that often refer to operational, tactical and strategic decisions (e.g., [15] (p. 163)). All levels are in line with the general understanding of IT systems based on the automation pyramid;
  • Decision triggers: There are different triggers (and connections) that may trigger the need for an actionable decision [42]. Business triggers from outside of the system under observation serve as an input. Direct shopfloor triggers may stem from sensors or partially autonomous systems that diagnose the need for actions to be taken. If a demand for a decision occurs from the human decision maker, another trigger type needs to be regarded;
  • Data: The data layer is refined with additional information. Different kinds of data are regarded that are relevant for prescriptive analytics and the non-prescriptive types of analytics (expert knowledge [17] (p. 67), decision data and historical factory data [4]). These data can be available in the form most suitable to the given environment (e.g., graph-based [76]).
The integration of the conceptual model for one prescriptive analytics use case is realized by using it as a building block for singular prescriptive analytics use cases. This way, it is integrated into the holistic system under observation. Interfaces to both existing IT systems and existing analytics use cases need to be analyzed and implemented.

5.7. Variability of Reference Architecture

The instantiation of the reference architecture needs to be enabled through a well-refined mode of variability. Instantiation can be enabled through defining concrete elements and guidelines on how to realize the provided conceptual blocks. All possible building blocks need to be detailed based on the system of interest. A more detailed version can be created per company (if multiple smart factories with different scopes do exist). The provided data concept enables variability through providing insights into what kind of interfaces are needed. The conceptual reference model for single prescriptive use cases serves as a baseline for the design of the use cases. From there on, the mode of interaction needs to be refined. Some concepts are kept vague (e.g., data storage mode and security). These aspects are addressed by other domains and reference architectures and are not part of the inherent scope of the presented reference architecture. This enables practitioners to update these aspects according to the current state of the art in these domains.
The decision interaction concept (all loops, see Figure 18) for the decision orchestrator can be structured based on whether a central, decentralized or multi-level hierarchy of decisions should result. The decision concept defines which prioritization approach is chosen if singular decisions in parts of the given system under observation might contradict or affect each other. This depends on the use cases that exist in the given factory and on the IT systems in place for the interconnection between different decision-making entities. Here, we refer to concepts of observer and controller architectures (compare [70,71,72,73,74,75,76,77]).
Another means to facilitate variability is by supporting the instantiation of the reference architecture by the choice of suitable workshop methods. Workshops can help enable a discussion about the reference architecture. To allow this discussion, we provide a blueprint for using the conceptual reference architecture as a canvas and to enrich it with the derived workshop elements. The workshop elements are derived from the ArchiMate standard (see [78], compare Figure 19) to ensure compatibility with other reference architectures in the domain of enterprise architectures and enterprise architecture management.
For efficient usage in a workshop format, we propose the following method. First, use the conceptual reference architecture as a baseline and add the respective IT infrastructure and existing use cases. Based on this, execute a fit–gap analysis in comparison to the reference architecture. Lastly, derive steps based on the existing gaps, and find suitable measures to counteract these.

6. Evaluation of the Constructed Reference Architecture

The evaluation procedure is organized into three subsections to address the quality criteria and dimensions for evaluation based on Section 3. Section 6.1 demonstrates the general applicability of the reference architecture for a provided real-world usage scenario. The demonstration serves as a case study to provide additional insights into possible loopholes and challenges in the practical application of the proposed architecture.
Section 6.2 provides an overview of how the reference architecture meets the requirements from Section 5.3 and how compliant it is with existing reference architectures for the same domain with a different scope (e.g., RAMI, I4.0 Automation Pyramid). The last subsection provides qualitative expert interview-based insights into the overall quality of the result. This creates a baseline for the discussion in Section 7.

6.1. Demonstration and Application of the Proposed Reference Architecture

The reference architecture is further demonstrated on several use cases in the “IoT factory Gütersloh”. An overview of the factory is provided in Figure 20. First, the use cases are described. Based on the use cases, the instantiated version of the reference architecture is used to construct the implementation in the real-world scenario.
The IoT Factory (IoT Factory Gütersloh is the name provided by the research facility; all characteristics of a smart factory (compare Section 2) are fulfilled) consists of 19 production stations with which IoT products are manufactured and then disassembled in a continuous cycle. The stations include 5 robot arms, 3 quality control stations and 1 external and 1 internal warehouse, as well as a fleet of 4 autonomous guided vehicles (AGVs). Each station is equipped with sensors designed for the tasks performed, e.g., force and torque sensors on the robot arm, acceleration and laser sensors on the AGVs and power meters on the conveyors. The extended equipment in combination with the continuous production maximizes data generation without additional costs. Further, each produced IoT device (see Figure 20) is equipped with multiple sensors to diagnose its state (e.g., quality data) during the assembly and disassembly process. The combination of both machine and product data allows for different viewpoints on the production process [79]. The IoT Factory is used to validate the presented reference architecture with various use cases that are controlled via a decision orchestrator.

6.1.1. Enabler Use Case: Structured Gathering of Prescriptive Analytics Knowledge

The first use case involves the formalization of different knowledge sources for the prescriptive process. This is being tested with a robotic gripper in the IoT factory, which has the task of continuously assembling and disassembling IoT devices. During production, the manufacturing execution system’s data is recorded and combined with existing documents. Text data from documents and time series data from IoT sensors are processed with individual pipelines.
To process the documents, they are divided into small text sections with overlapping windows. Each text section is then stored with meta information, e.g., document name, page number and sender affiliation. In addition, the embeddings of the individual text sections are constructed in order to determine the similarities on this basis. Each text section can now be saved as a node in a knowledge graph. A knowledge graph is a collection of linked descriptions of entities, and the relationships can be determined using the embeddings and meta information. Nodes are used to describe concepts (about the knowledge that is being structured) and (abstract) relations between them are represented by edges [18]. This knowledge graph makes it possible to create a context and is used to perform data integration and analytics, serving as a baseline for different prescriptive analytics use cases in the IoT factory.
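The document pipeline described above can be sketched as follows. The window sizes, field names and example document are assumptions for illustration, not the implemented system; in particular, the embedding step and the graph persistence (e.g., in a graph database) are omitted.

```python
# Illustrative sketch of the document pipeline: documents are split into
# overlapping text sections, and each section is stored together with meta
# information as a node for the knowledge graph. Window sizes, field names
# and the example document are illustrative assumptions.

def chunk_document(text, window=100, overlap=20):
    """Split text into overlapping character windows."""
    step = window - overlap
    return [text[i:i + window] for i in range(0, max(len(text) - overlap, 1), step)]

def to_nodes(doc_name, text, **meta):
    """Attach meta information (document name, section index, ...) to each section."""
    return [
        {"document": doc_name, "section": i, "text": chunk, **meta}
        for i, chunk in enumerate(chunk_document(text))
    ]

# Hypothetical document content and metadata:
doc_text = "The gripper must be serviced when force deviation exceeds tolerance."
nodes = to_nodes("FMEA_gripper.pdf", doc_text, sender="maintenance dept.")
```

Each resulting node would additionally receive an embedding vector, so that edges between similar sections can be derived as described above.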
The time-series data is continuously gathered and processed from the IoT sensors. The processing is performed with a classical descriptive-to-predictive pipeline. First, the raw data is cleaned from faulty and missing samples. Streaming features are then extracted from the data. For sensors, e.g., force sensors that measure during plugging and drawing processes, maximum, minimum and average values are determined, among other things. For audio signals from microphones, on the other hand, features are determined in the frequency range. The features are further combined to define a health index for each device and station. Using similarity measures, we can predict future machine or product errors.
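A minimal sketch of this descriptive-to-predictive pipeline is given below. The cleaning rule, the choice of features and the health-index weighting via a reference value and tolerance are illustrative assumptions, not the deployed feature set.

```python
# Minimal sketch of the pipeline: raw sensor samples are cleaned, simple
# streaming features (min/max/mean) are extracted and combined into a
# health index. The reference value and tolerance are illustrative.

def clean(samples):
    """Drop missing (None) and clearly faulty (infinite) samples."""
    return [s for s in samples
            if s is not None and s != float("inf") and s != float("-inf")]

def features(samples):
    """Extract simple streaming features from a cleaned sensor window."""
    window = clean(samples)
    return {"min": min(window), "max": max(window),
            "mean": sum(window) / len(window)}

def health_index(feats, reference, tolerance):
    """1.0 = healthy; decreases as the mean drifts away from the reference."""
    deviation = abs(feats["mean"] - reference) / tolerance
    return max(0.0, 1.0 - deviation)

# Hypothetical force readings from a plugging process:
force = [10.2, None, 10.4, float("inf"), 10.1, 10.3]
hi = health_index(features(force), reference=10.0, tolerance=2.0)
```

For audio signals, the feature functions would operate in the frequency domain instead, as noted above.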

6.1.2. Use Case: Prescriptive Resource Management

The second use case deals with prescriptive resource management for a robot gripper cell. Decisions need to be made in case a possible system breakdown is predicted (compare Figure 21 and Section 6.1.1). Errors that can occur include, but are not limited to, the following: the gripper loses its capacity to grip the product correctly due to deformation of either the gripper or the product; the assembly station or the conveyor belts are already occupied; or one of the various sensors detects that a worker has performed a manual task that is not part of the automated production line. Based on a prediction under uncertainty (intelligence), an assessment of prescriptive alternatives is made (assessment of alternatives). The assessment of alternatives is based on the knowledge graph which stores relevant production decision knowledge (Section 6.1.1). A set of possible alternatives is provided in step three (selection of alternatives). This could be, for example, that the gripper pressure should be adjusted to also be capable of gripping objects despite a deformed gripper. The sets of recommendations are based on the expert knowledge from the different factory-related expert documents (e.g., FMEA, reports and operation guidelines). The resulting solution space is evaluated based on the existence of possible solutions in the knowledge graph. A prescription is chosen (step three). The following execution of the provided prescription is independent of the actual prescription engine of the robot gripper cell because it just executes what has been decided.
A task within the robot cell, e.g., picking the top cover from the box and placing it on the IoT product or picking and moving a new product from the conveyor belt to the processing station, takes between 15 and 20 s. This time window between production tasks allows our prescriptive engine to process the data and provide a recommendation. Ideally, a prescriptive decision should be made before the start of the next planned action.
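The selection step outlined above can be sketched as follows, under the simplifying assumption that the knowledge graph is queried as a mapping from candidate actions to supporting document nodes; all action names and document references are hypothetical.

```python
# Hedged sketch of the selection step: candidate prescriptions derived from
# expert documents are kept only if the knowledge graph holds supporting
# solutions, and the best-supported candidate is chosen. The graph structure
# and all names are illustrative assumptions.

def select_prescription(candidates, knowledge_graph):
    """Return the candidate with the most supporting evidence, or None."""
    viable = [
        (len(knowledge_graph[c]), c)       # evidence count per candidate
        for c in candidates
        if knowledge_graph.get(c)
    ]
    if not viable:
        return None                        # feedback loop: reassess the alternatives
    return max(viable)[1]

# Hypothetical candidates for a predicted gripper failure:
graph = {
    "adjust gripper pressure": ["FMEA section 4.2", "operation guideline p. 12"],
    "reroute product to station 7": ["shift report 2023-11"],
}
best = select_prescription(
    ["adjust gripper pressure", "reroute product to station 7", "halt line"],
    graph,
)
```

Returning `None` corresponds to the feedback loop of the framework, in which the alternatives are reassessed if no suitable option is found.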
Uncertainties in the proposed action are naturally high, in particular for newly set-up systems. In the current state of implementation, recommendations are provided that are fed back via a human operator. This feedback allows for continuous improvement of future recommendations and therefore reduces the uncertainty over time. Nevertheless, at this stage of research, we suggest always having an expert approve or veto the recommendations and to not automatically execute them.
When building our prescriptive model, we explicitly assumed that the set of available actions contains all possible actions and that they are all physically feasible to execute. In a real production environment, this set of actions may change over time and requires continuous control and updating by an expert. Furthermore, triggers executed by our predictive engine are assumed to be correct. The possibility of error propagation from a faulty prediction is neglected in the current state. These two assumptions further increase the need for a human in the loop.

6.1.3. Use Case: Prescriptive Quality Management

The third use case is related to quality management (product) and has been conceptualized and is currently being implemented and validated. While descriptive, diagnostic and predictive analytics can help to identify, classify and forecast quality problems, the root cause of the defect is not addressed. Prescriptive analytics can be used to recommend actions that restore the quality of the finished products, for example, recommending certain reconfigurations of machines or discarding a batch of faulty input material. Although the maintenance use case can also prescribe reconfigurations of a machine, the prescriptions for maintenance try to maximize availability, efficiency and longevity, whereas prescriptions for this use case are made to maximize product quality. When a defect is detected with data from the three quality control stations of the IoT factory (intelligence), information about the machines involved in the fabrication of the product is gathered from the knowledge base. To achieve this, the knowledge base can be extended with quality-specific documents, like inspection reports, product defect logs and material certificates. Using this information, a set of alternative prescriptions is generated, which are then assessed using IoT data collected from machines and conveyor belts during production (selection). The assessment involves comparing the alternatives to previously well-rated prescriptions by employing a fault similarity metric. The rating, a numerical score assigned by a service worker after executing the prescription (implementation), reflects the effectiveness of the solution in resolving the quality issue.
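The similarity-based assessment described above can be sketched as follows. The fault feature vectors, the cosine-style metric and the similarity-weighted rating are simplifying assumptions for illustration, not the implemented fault similarity metric.

```python
# Illustrative sketch of the assessment: alternative prescriptions are
# compared to previously well-rated ones via a fault similarity metric
# and ranked by similarity-weighted service-worker rating. The feature
# vectors and the cosine-style metric are illustrative assumptions.

def similarity(a, b):
    """Cosine similarity between two fault feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm if norm else 0.0

def rank_prescriptions(fault, history):
    """Score each past prescription by similarity(fault, past fault) * rating."""
    scored = [
        (similarity(fault, h["fault"]) * h["rating"], h["prescription"])
        for h in history
    ]
    return [p for _, p in sorted(scored, reverse=True)]

# Hypothetical history of rated prescriptions:
history = [
    {"fault": [1.0, 0.0], "rating": 4, "prescription": "reconfigure press"},
    {"fault": [0.0, 1.0], "rating": 5, "prescription": "discard input batch"},
]
ranking = rank_prescriptions([0.9, 0.1], history)
```

A new rating entered after execution would simply be appended to the history, closing the implementation feedback loop described above.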

6.1.4. Application of the Reference Architecture to the IoT Factory

Based on the use cases, we demonstrate the instantiated version of the reference architecture in the context of the IoT Factory. In the factory, various databases are in place (e.g., MariaDB, InfluxDB and Neo4j). They are supported by existing shopfloor systems like manufacturing execution systems. The analytics use cases described above are visualized in Figure 22. They are supported by non-prescriptive use cases which serve as input (descriptive factory dashboard, RUL-prediction and robot cell soft sensor). The system engagement layer consists of the programmable controllers of both the inherent factory components and their resources (e.g., robots and automated guided vehicles). The shopfloor layer consists of the processes steering the production of the IoT product. The interconnection to other use cases is ensured via the usage of a microservice architecture for the shopfloor analytics IT backend.
The decision orchestrator is implemented in a rudimentary form (an interface to the human decision maker with prioritization). To integrate the decisions of all use cases efficiently, an overarching user interface for the decision maker was developed. For the design of the decision interface, the human factor needs to be considered (e.g., visual analytics and information visualization [80]). For all three use cases, this single point of access serves as a baseline for a holistic approach to decision confirmation and to decision demands from the human decision maker.
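A rudimentary decision orchestrator of the kind described—collecting decision proposals from several use cases and surfacing them to one user interface in priority order—might look like the following sketch. All names (`Decision`, `DecisionOrchestrator`) and the numeric priority scheme are hypothetical, not taken from the implementation in the paper.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Decision:
    """A decision proposal submitted by one prescriptive analytics use case."""
    priority: int                          # lower value = more urgent (assumption)
    use_case: str = field(compare=False)   # originating use case
    proposal: str = field(compare=False)   # human-readable prescription

class DecisionOrchestrator:
    """Collects proposals from multiple use cases and hands them to the
    single user interface in priority order (rudimentary form)."""
    def __init__(self):
        self._queue = []

    def submit(self, decision: Decision) -> None:
        heapq.heappush(self._queue, decision)

    def next_for_review(self):
        """Pop the most urgent pending decision for the decision maker, or None."""
        return heapq.heappop(self._queue) if self._queue else None
```

A priority queue is only one possible design choice; interdependencies between use cases (e.g., a maintenance reconfiguration conflicting with a quality prescription) would require richer conflict resolution than a scalar priority.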

6.2. Integration of the Reference Architecture into the Existing State of the Art

To ensure compliance with the existing state of the art, we compare our results to the existing reference architectures and to the requirements posed in Section 5.3. The resulting reference architecture fulfills all posed requirements.
  • REQ1: Existing manufacturing IT systems are taken into consideration by integrating them into the core of the reference architecture. Variability is enabled by defining only interaction schemes, leaving room for the usage of different IT systems in the reference architecture;
  • REQ2: The approach is compatible with existing and established reference architectures. This is ensured by building upon the existing state of the art. The detailed interconnection to other architectures is discussed in the following section;
  • REQ3: A focus on actionable decisions was ensured by integrating them into the architecture as a key element. Variability is enabled by defining different interaction modes;
  • REQ4: The representation of different forms of decision interactions is enabled by the elements described for REQ3.
We compared the existing reference architectures to the resulting reference architecture for the integration of prescriptive analytics use cases into smart factories (see Figure 23). All included elements represent interconnections to already established domains. Our contribution focuses on how prescriptive analytics should be integrated into the pre-existing holistic legacy IT landscape of smart factories. Based on this, the following statements regarding compliance with existing reference architectures can be made:
  • The contribution of this article is the integration of prescriptive analytics into the existing logical connections regarding data flow and IT systems. Based on the state of the art (the other analyzed reference architectures, compare Section 3.3), we conclude that the characteristics of prescriptive analytics were not fully addressed in previously contributed architectures. Reference architectures like [81] already provide good insights into the overall integration of machine learning applications into the shopfloor. Use cases like [82,83] add value by defining specialized workflows that can serve as use case-specific instantiations of the reference architecture;
  • The overall system under observation (the smart factory) is already well analyzed in established reference architectures like the automation pyramid and RAMI 4.0, as well as in numerous use case-specific analytics implementations. Thus, we only integrate the already existing findings and focus our contribution on the prescriptive analytics-specific elements of the reference architecture.

6.3. Evaluation through Expert Interviews

We conducted six expert interviews to determine whether all quality criteria (compare Section 4) are met. We followed the method of Myers and Newman [45]. An overview of the interviewees’ backgrounds is given in Table 1. All interview partners stem from the German research or industry sector. Based on a questionnaire for the semi-structured interviews, key results were transcribed. After clustering the results, the following key takeaways were generated.
The interviewees were asked to judge the reference architecture and the demonstration based on the architecture’s overall adaptability, understandability and accessibility, as well as whether it addresses key domain challenges (compare [41]). All interviewees judged the understandability and accessibility as high (4 out of 5). The overall adaptability was rated as high. Interviewees I1 and I4 remarked that further evaluation is needed after implementation. I1 and I4–I6 mentioned that key domain challenges like the integration are addressed, and that future work is needed to further detail possible modes of interaction. Interviewee I2 emphasized that the integration into the existing IT infrastructure is a vital aspect for ensuring interoperability and profitability. If conducted well, a seamless integration of prescriptive analytics use cases into the factory bears the potential of making the smart factory less dependent on tacit expert knowledge. One main use case-related challenge acknowledged in the context of the reference architecture is the complexity of interlocking all the IT systems for the prescriptive analytics use case (I1–I6).
Interviewees I1 and I2 emphasized the focus on human factors in decision making. They raised awareness of the importance of integrating human feedback and of the overall change in how the interaction takes place once more prescriptive analytics use cases are present in the smart factory. The most mentioned concern was that the user interaction needs to be designed in a way that ensures the use cases will be accepted by the broad spectrum of personnel on the shopfloor (I2, I4). Based on this, the following section discusses the integration of the results into the current state of the art.

7. Discussion

In the following discussion, the implications regarding the framework for one use case and the reference architecture for the integration of prescriptive analytics in smart factories are analyzed. The underlying limitations of the presented work are discussed, with a focus on the implications for the current state of the art.

7.1. Discussion of the Framework for Singular Use Cases

The following section is part of the previously published conference paper [42]. A reflection on the main usage and core restrictions of the framework is given here. The main advantage of the framework lies in its applicability to different implementation approaches of realizing prescriptive analytics. The framework presents a novel view on the implementation of deployed prescriptive analytics algorithms. It enables a modularization of prescriptive analytics approaches. It can be used to structure the general architecture of possible prescriptive analytics solutions (e.g., compare Section 3.1).
Firstly, the framework assumes the availability of high-quality and reliable data. Insufficient or biased data can hinder the accuracy and effectiveness of the prescriptive models (a general machine learning challenge [84]). Thus, real-world restrictions might pose boundaries on fully automating decision workflows [36,85] (p. 16). The overall complexity of the presented framework is high. Elements such as the IT systems were kept at a conceptual level to increase accessibility. This approach reduces complexity but leads to fuzziness in future implementation approaches.
While the prescriptive analytics framework proposes a structured approach, it also has certain restrictions. A methodology to develop prescriptive analytics implementations is not presented. Even though the framework is derived from decision theory, additional concepts from decision theory, such as human behavior under uncertainty (e.g., compare [86,87]), need to be considered (perspective: decision theory). The framework does not assist with choosing a suitable implementation strategy (perspective: data analytics). It provides no recommendations on how to choose the degree of automation and how to handle the interconnectivity of different decisions in a smart factory (perspective: smart factory). In the state of the art, we pointed out that existing research on smart factories in combination with prescriptive analytics is rather scarce. Thus, more input from case studies needs to be introduced into the model in the future. To differentiate prescriptive analytics from already existing (and well-established) fields of research in analytics, we emphasize understanding prescriptive analytics as a method that expands existing analytics solutions. Basic automations of already-developed analytics solutions provide great value but should not be regarded as prescriptive per se if they do not provide a prescription. It is essential to narrow down what can be regarded as prescriptive analytics: automations without a prescriptive character should be regarded as automations, and systems with a prescriptive element should be differentiated from them. Basic cause-and-effect relationships are structured decisions and should be handled as such because of their good effort-to-effect ratio.

7.2. Discussion on the Reference Architecture for the Integration of Prescriptive Analytics Use Cases into Smart Factories

The constructed reference architecture is built upon the existing literature (compare Section 3). Its applicability was shown in one scenario; additional applications are needed in the future to further enhance the reference architecture with additional context for variation and instantiation (according to [41]). Our main contribution focuses on the integration of aspects into existing reference architectures like the automation pyramid (compare [65]). It serves as an addition to, not a substitution of, pre-existing and de facto standards like RAMI 4.0 or the big data reference architecture (compare [57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88]). In Section 6.2, the interconnection to existing reference architectures was examined. The interconnection to concepts with a higher level of abstraction (e.g., maturity models) was not made (e.g., compare [89]). Existing limitations stem from the bias introduced by the choice of literature; to the best of our knowledge, we carefully examined reference architectures and their suitability to serve as input for our design process. The gap between the literature and industry (real-world cases) is partly addressed via the evaluation through interviews with industry experts. The validation was conducted in only one system of interest with a restricted degree of complexity. The defined mode of variability is partly based on a high level of abstraction, and only one viewpoint is provided. The provided workshop method can serve as a baseline but has its limits in more complex scenarios; for these, we advise practitioners to refer to model-based tools (e.g., from enterprise architecture management, such as Archi [90]). For future research, we call on practitioners and researchers alike to contribute to defining further viewpoints.
Despite our efforts, there are some existing challenges in the examined field of research that are currently not addressed by our work. Prescriptive analytics use cases are still being developed and are currently implemented at a low technical and organizational readiness level [28,54]. The continuous development of new algorithms for prescriptive analytics is still an immature research field [17] (p. 59). Interconnections to adjacent research fields like generative AI for decision making, causal AI and decision intelligence need to be made. Part of this challenge is that many authors still have different perspectives on what is included in prescriptive analytics and where to draw the boundaries to other fields of research. The effect on human factors was considered but needs to be researched more extensively (e.g., compare [91]). In particular, the change in interaction between different departments and hierarchy levels in the factory might pose new challenges.

8. Conclusions

A reference architecture for the integration of prescriptive analytics in smart factories was developed. Based on the problem analysis, the current state of the art for use cases, platforms and reference architectures regarding the deployment of prescriptive analytics in smart factories was examined. First, the type of reference architecture was determined (industry-specific). A research-driven approach was chosen and supported by a literature review on prescriptive platforms. The reference architecture is constructed based on a black box view, a conceptual architecture and a framework for prescriptive analytics use cases. Variability is enabled through annotation. The evaluation is based on a threefold approach: the instantiation of the reference architecture is demonstrated, the provided approach is compared with the derived requirements, and it is evaluated further through semi-structured expert interviews. Our main contribution is a framework to structure prescriptive analytics, which ensures that related research can be streamlined and focused on its essence—the assessment and support of prescribing decisions or decision options to create actions in the given environment. It is intended to create a baseline for developing future deployable prescriptive analytics concepts, solutions and techniques.
Prescriptive analytics framework for one use case: The presented steps are categorized into conditional trigger, prescription and execution. They were derived from requirements based on the existing literature on decision theory, smart factories and prescriptive analytics. The framework provides a structured approach for organizations and researchers to develop and deploy effective prescriptive analytics solutions. It is supported by examples and a decision environment to specify which kind of prescriptive analytics we refer to in a smart factory environment.
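The three steps named above—conditional trigger, prescription and execution—can be illustrated with a minimal pipeline sketch. The threshold logic, the placeholder option list and the confirmation callback are assumptions introduced for illustration only; they do not reflect a concrete implementation from the paper.

```python
def conditional_trigger(sensor_reading: float, threshold: float = 0.8) -> bool:
    """Step 1: decide whether the observed condition warrants a prescription
    (hypothetical threshold rule)."""
    return sensor_reading > threshold

def prescribe(sensor_reading: float):
    """Step 2: derive a ranked set of decision options (placeholder logic;
    a real system would assess alternatives against production data)."""
    options = [("reduce_speed", 0.9), ("schedule_maintenance", 0.7)]
    return sorted(options, key=lambda o: o[1], reverse=True)

def execute(option, confirm) -> str:
    """Step 3: execution, here gated by human confirmation to model one
    possible interaction mode."""
    action, _score = option
    return f"executed:{action}" if confirm(option) else "rejected"

def run_pipeline(sensor_reading: float, confirm=lambda option: True) -> str:
    """Chain the three framework steps for a single decision cycle."""
    if not conditional_trigger(sensor_reading):
        return "no_action"
    best = prescribe(sensor_reading)[0]
    return execute(best, confirm)
```

The `confirm` callback stands in for the degree of automation discussed in the paper: a fully automated deployment would confirm unconditionally, whereas a human-in-the-loop deployment would route the option through the decision interface.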
Reference architecture for the integration of prescriptive analytics into smart factories: We contributed a reference architecture that focuses on the integration of prescriptive analytics and the resulting actionable decisions into the existing shopfloor IT landscape. The reference architecture is supported by different viewpoints (black box and conceptual architecture). We provide means of instantiating the reference architecture. The evaluation focuses on (a) showing the overall applicability and (b) the accessibility and compliance from the viewpoint of practitioners.
The results of this paper need to be expanded towards other domains and can serve as a means of transferring the concepts to a broader field of application. Synergies with other domains like healthcare analytics and business process analytics should be researched in the future. Additional research questions include, but are not limited to, the following:
  • Related to this specific journal article: The provided view can be expanded to a set of views (framework) to address the conceptualization and implementation of prescriptive analytics. For this, additional views need to be examined and structured. Validation is needed to explore the interconnections to adjacent domains as well as more complex use cases and their connections;
  • Related to the overall field of research: Additional use cases with a higher technical readiness level need to be developed to be able to further judge the implications on organizations and on overall decision optima when multiple use cases are in use. A clearer understanding of what prescriptive analytics provides needs to be established so that the term will be understood more clearly (and used accordingly) in the future.

Author Contributions

Conceptualization, J.W., N.M., T.H. and Y.N.; methodology, J.W.; validation, J.W., N.M. and Y.N.; writing—original draft preparation, J.W., N.M. and T.H.; writing—review and editing, J.W., N.M., T.H., M.K., W.S. and R.D.; visualization, J.W. and Y.N.; supervision, M.K., W.S., A.K. and R.D. All authors have read and agreed to the published version of the manuscript.

Funding

This journal article was funded via the open access program from Fraunhofer.

Data Availability Statement

Data are contained within the article.

Acknowledgments

The results of this paper are based on two research projects. This research was supported by Project Results from the VIP4PAPS Project governed by the German Federal Ministry of Education and Research (BMBF, grant number 03VP10031). The results of this paper are based on the research project Kompetenzzentrum Arbeitswelt.Plus. This research and development project is funded by the German Federal Ministry of Education and Research (BMBF) within “The Future of Value Creation—Research on Production, Services and Work” program and managed by the Project Management Agency Karlsruhe (PTKA). The authors are responsible for the content of this publication.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of this study; in the collection, analyses or interpretation of data; in the writing of the manuscript or in the decision to publish the results.

Appendix A

In the following, translation tables for the literature studied in the state of the art and the main sections are provided.
Table A1. Use Case.
Use Case 1 [92]; Use Case 18 [93]
Use Case 2 [94]; Use Case 19 [95]
Use Case 3 [96]; Use Case 20 [97]
Use Case 4 [98]; Use Case 21 [99]
Use Case 5 [100]; Use Case 22 [10]
Use Case 6 [101]; Use Case 23 [102]
Use Case 7 [29]; Use Case 24 [83]
Use Case 8 [66]; Use Case 25 [24]
Use Case 9 [103]; Use Case 26 [104]
Use Case 10 [105]; Use Case 27 [106]
Use Case 11 [107]; Use Case 28 [108]
Use Case 12 [109]; Use Case 29 [82]
Use Case 13 [110]; Use Case 30 [111]
Use Case 14 [112]; Use Case 31 [113]
Use Case 15 [114]; Use Case 32 [115]
Use Case 16 [116]; Use Case 33 [117]
Use Case 17 [118]; Use Case 34 [119]
Use Case 35 [120]
Table A2. Platforms.
Platform 1 [121]; Platform 9 [122]
Platform 2 [105]; Platform 10 [123]
Platform 3 [124]; Platform 11 [125]
Platform 4 [126]; Platform 12 [127]
Platform 5 [96]; Platform 13 [128]
Platform 6 [129]; Platform 14 [130]
Platform 7 [131]; Platform 15 [132]
Platform 8 [133]; Platform 16 [134]
Table A3. Reference Architectures (RA).
RA01 [135]; RA09 [136]
RA02 [137]; RA10 [138]
RA03 [139]; RA11 [140]
RA04 [141]; RA12 [142]
RA05 [143]; RA13 [144]
RA06 [145]; RA14 [146]
RA07 [147]; RA15 [148]
RA08 [149]; RA16 [81]

References

  1. Gachet, A.; Haettenschwiler, P. Development Processes of Intelligent Decision-making Support Systems: Review and Perspective. In Intelligent Decision-Making Support Systems: Foundations, Applications and Challenges; Gupta, J.N.D., Forgionne, G.A., Mora, M., Eds.; Springer: London, UK, 2006; pp. 97–121. ISBN 978-1-84628-228-7. [Google Scholar]
  2. de Almeida, A.T.; Bohoris, G.A. Decision theory in maintenance decision making. J. Qual. Maint. Eng. 1995, 1, 39–45. [Google Scholar] [CrossRef]
  3. Kiesel, R.; Gützlaff, A.; Schmitt, R.H.; Schuh, G. Methods and Limits of Data-Based Decision Support in Production Management. In Internet of Production; Brecher, C., Schuh, G., van der Aalst, W., Jarke, M., Piller, F.T., Padberg, M., Eds.; Springer International Publishing: Cham, Switzerland, 2024; ISBN 978-3-031-44496-8. [Google Scholar]
  4. Bousdekis, A.; Lepenioti, K.; Apostolou, D.; Mentzas, G. A Review of Data-Driven Decision-Making Methods for Industry 4.0 Maintenance Applications. Electronics 2021, 10, 828. [Google Scholar] [CrossRef]
  5. Duan, L.; Da Xu, L. Data Analytics in Industry 4.0: A Survey. Inf. Syst. Front. 2021, 1–17. [Google Scholar] [CrossRef]
  6. Bell, D.E.; Raiffa, H.; Tversky, A. (Eds.) Descriptive, Normative, and Prescriptive Interactions in Decision Making. In Decision Making; Cambridge University Press: Cambridge, UK, 2011; pp. 9–30. ISBN 9780521351492. [Google Scholar]
  7. von Enzberg, S.; Naskos, A.; Metaxa, I.; Köchling, D.; Kühn, A. Implementation and Transfer of Predictive Analytics for Smart Maintenance: A Case Study. Front. Comput. Sci. 2020, 2, 578469. [Google Scholar] [CrossRef]
  8. Schuetz, C.G.; Selway, M.; Thalmann, S.; Schrefl, M. Discovering Actionable Knowledge for Industry 4.0 from Data Mining to Predictive and Prescriptive Analytics. In Digital Transformation: Core Technologies and Emerging Topics from a Computer Science Perspective; Vogel-Heuser, B., Wimmer, M., Eds.; Springer: Berlin/Heidelberg, Germany, 2023; p. 337. ISBN 978-3-662-65003-5. [Google Scholar]
  9. Weller, J.; Migenda, N.; Naik, Y.; Heuwinkel, T.; Kühn, A.; Kohlhase, M.; Schenk, W.; Dumitrescu, R. Reference Architecture for the Integration of Prescriptive Analytics Use Cases in Smart Factories. In Mathematics, Special Issue for Selected Papers From the 2023 IEEE International Conference on Advances in Data-Driven Analytics and Intelligent Systems, Marrakech, Morocco, 23–25 November 2023; MDPI: Basel, Switzerland, 2024. [Google Scholar]
  10. Choubey, S.; Benton, R.G.; Johnsten, T. A Holistic End-to-End Prescriptive Maintenance Framework. Data-Enabled Discov. Appl. 2020, 4, 11. [Google Scholar] [CrossRef]
  11. Niederhaus, M.; Migenda, N.; Weller, J.; Schenck, W.; Kohlhase, M. Technical Readiness of Prescriptive Analytics Platforms—A Survey. In Proceedings of the 35th FRUCT Conference—Open Innovations Association FRUCT, Tampere, Finland, 24–26 April 2024. [Google Scholar]
  12. Budde, L.; Hänggi, R.; Friedli, T.; Rüedy, A. Smart Factory Navigator: Identifying and Implementing the Most Beneficial Use Cases for Your Company—44 Use Cases That Will Drive Your Operational Performance and Digital Service Business; Springer: Berlin/Heidelberg, Germany, 2023. [Google Scholar]
  13. Zenkert, J.; Weber, C.; Dornhöfer, M.; Abu-Rasheed, H.; Fathi, M. Knowledge Integration in Smart Factories. Encyclopedia 2021, 1, 792–811. [Google Scholar] [CrossRef]
  14. Roth, A. (Ed.) Einführung und Umsetzung von Industrie 4.0: Grundlagen, Vorgehensmodell und Use Cases aus der Praxis; Springer: Berlin/Heidelberg, Germany, 2016; ISBN 3662485044. [Google Scholar]
  15. Mockenhaupt, A. Digitalisierung und Künstliche Intelligenz in der Produktion; Springer: Wiesbaden, Germany, 2021; ISBN 978-3-658-32772-9. [Google Scholar]
  16. Becker, W.; Ulrich, P.; Botzkowski, T. Data Analytics im Mittelstand: Aus: Management und Controlling im Mittelstand; Springer: Berlin/Heidelberg, Germany, 2016. [Google Scholar]
  17. Lepenioti, K.; Bousdekis, A.; Apostolou, D.; Mentzas, G. Prescriptive analytics: Literature review and research challenges. Int. J. Inf. Manag. 2020, 50, 57–70. [Google Scholar] [CrossRef]
  18. von Rueden, L.; Mayer, S.; Beckh, K.; Georgiev, B.; Giesselbach, S.; Heese, R.; Kirsch, B.; Walczak, M.; Pfrommer, J.; Pick, A.; et al. Informed Machine Learning—A Taxonomy and Survey of Integrating Prior Knowledge into Learning Systems. IEEE Trans. Knowl. Data Eng. 2021, 35, 614–633. [Google Scholar] [CrossRef]
  19. Pfeiffer, S. The Vision of “Industrie 4.0” in the Making-a Case of Future Told, Tamed, and Traded. Nanoethics 2017, 11, 107–121. [Google Scholar] [CrossRef]
  20. Auer, J. Industry 4.0—Digitalisation to Mitigate Demographic Pressure: Germany Monitor—The Digital Economy and Structural Change; Deutsche Bank Research: Frankfurt am Main, Germany, 2018. [Google Scholar]
  21. Gartner. When to Augment Decisions with Artificial Intelligence: Guides for Effective Business Decision Making; Guide 3 of 5 2022; Gartner: Stamford, CT, USA, 2022. [Google Scholar]
  22. Steenstrup, K.; Sallam, R.L.; Eriksen, L.; Jacobson, S.F. Industrial Analytics Revolutionizes Big Data in the Digital Business; Gartner, Inc.: Stamford, CT, USA, 2014. [Google Scholar]
  23. Weller, J.; Migenda, N.; Liu, R.; Wegel, A.; von Enzberg, S.; Kohlhase, M.; Schenck, W.; Dumitrescu, R. Towards a Systematic Approach for Prescriptive Analytics Use Cases in Smart Factories. In ML4CPS—Machine Learning for Cyber Physical Systems; Springer: Cham, Switzerland, 2023. [Google Scholar]
  24. Stein, N.; Meller, J.; Flath, C.M. Big data on the shop-floor: Sensor-based decision-support for manual processes. J. Bus. Econ. 2018, 88, 593–616. [Google Scholar] [CrossRef]
  25. Lu, J.; Yan, Z.; Han, J.; Zhang, G. Data-Driven Decision-Making (D 3 M): Framework, Methodology, and Directions. IEEE Trans. Emerg. Top. Comput. Intell. 2019, 3, 286–296. [Google Scholar] [CrossRef]
  26. Luetkehoff, B.; Blum, M.; Schroeter, M. Development of a Collaborative Platform for Closed Loop Production Control. In Collaborative Networks of Cognitive Systems; Camarinha-Matos, L.M., Afsarmanesh, H., Rezgui, Y., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 278–285. ISBN 978-3-319-99126-9. [Google Scholar]
  27. Shivakumar, R. How to Tell Which Decisions are Strategic. Calif. Manag. Rev. 2014, 56, 78–97. [Google Scholar] [CrossRef]
  28. Weller, J.; Migenda, N.; von Enzberg, S.; Kohlhase, M.; Schenck, W.; Dumitrescu, R. Design decisions for integrating Prescriptive Analytics Use Cases into Smart Factories. In Proceedings of the 34rd CIRP Design Conference, Bedford, UK, 3–5 June 2024. [Google Scholar]
  29. Kumari, M.; Kulkarni, M.S. Developing a prescriptive decision support system for shop floor control. Ind. Manag. Data Syst. 2022, 122, 1853–1881. [Google Scholar] [CrossRef]
  30. Simon, H.A. The New Science of Management Decision; Harper & Brothers: New York, NY, USA, 1960. [Google Scholar]
  31. Panagiotou, G. Conjoining prescriptive and descriptive approaches. Manag. Decis. 2008, 46, 553–564. [Google Scholar] [CrossRef]
  32. Haas, S.; Hüllermeier, E. A Prescriptive Machine Learning Approach for Assessing Goodwill in the Automotive Domain. In Proceedings of the Machine Learning and Knowledge Discovery in Databases, European Conference, ECML PKDD 2022, Proceedings, Part VI, Grenoble, France, 19–23 September 2022. [Google Scholar]
  33. Partelow, S. What is a framework? Understanding their purpose, value, development and use. J. Environ. Stud. Sci. 2023, 13, 510–519. [Google Scholar] [CrossRef]
  34. Piezunka, H. Technological platforms. J. Betriebswirtsch 2011, 61, 179–226. [Google Scholar] [CrossRef]
  35. ISO/IEC 20547-3:2020; Information Technology—Big Data Reference Architecture—Part 3: Reference Architecture. ISO: London, UK, 2020.
  36. Wissuchek, C.; Zschech, P. Survey and Systematization of Prescriptive Analytics Systems: Towards Archetypes from a Human-Machine-Collaboration Perspective; ECIS: London, UK, 2023. [Google Scholar]
  37. Arnold, L.; Jöhnk, J.; Vogt, F.; Urbach, N. IIoT platforms’ architectural features—A taxonomy and five prevalent archetypes. Electron. Mark. 2022, 32, 927–944. [Google Scholar] [CrossRef]
  38. Moghaddam, M.; Cadavid, M.N.; Kenley, C.R.; Deshmukh, A.V. Reference architectures for smart manufacturing: A critical review. J. Manuf. Syst. 2018, 49, 215–225. [Google Scholar] [CrossRef]
  39. Ismail, A.; Truong, H.-L.; Kastner, W. Manufacturing process data analysis pipelines: A requirements analysis and survey. J. Big Data 2019, 6, 1. [Google Scholar] [CrossRef]
  40. Soares, N.; Monteiro, P.; Duarte, F.J.; Machado, R.J. Extending the scope of reference models for smart factories. Procedia Comput. Sci. 2021, 180, 102–111. [Google Scholar] [CrossRef]
  41. Galster, M.; Avgeriou, P. Empirically-grounded reference architectures. In Proceedings of the Joint ACM SIGSOFT Conference—QoSA and ACM SIGSOFT Symposium—ISARCS on Quality of Software Architectures—QoSA and Architecting Critical Systems—ISARCS. Comparch ‘11: Federated Events on Component-Based Software Engineering and Software Architecture, Boulder, CO, USA, 20–24 June 2011; Crnkovic, I., Stafford, J.A., Petriu, D., Happe, J., Inverardi, P., Eds.; ACM: New York, NY, USA, 2011; pp. 153–158, ISBN 9781450307246. [Google Scholar]
  42. Weller, J.; Migenda, N.; Wegel, A.; Kohlhase, M.; Schenck, W.; Dumitrescu, R. Conceptual Framework for Prescriptive Analytics Based on Decision Theory in Smart Factories. In Proceedings of the ADACIS 2023: International Conference on Advances in Data-driven Analytics and Intelligent Systems, Marrakesh, Morocco, 23–25 November 2023; pp. 1–7. [Google Scholar] [CrossRef]
  43. Angelov, S.; Grefen, P.; Greefhorst, D. A classification of software reference architectures: Analyzing their success and effectiveness. In Proceedings of the 2009 Joint Working IEEE/IFIP Conference on Software Architecture & European Conference on Software Architecture, 3rd European Conference on Software Architecture (ECSA), Cambridge, UK, 14–17 September 2009; IEEE: Piscataway, NJ, USA, 2009; pp. 141–150, ISBN 978-1-4244-4984-2. [Google Scholar]
  44. Vogel, O.; Arnold, I.; Chughtai, A.; Ihler, E.; Kehrer, T.; Mehlig, U.; Zdun, U. Software-Architektur: Grundlagen-Konzepte-Praxis, 2. Aufl. 2009; Spektrum Akademischer Verlag: Heidelberg, Germany, 2009; ISBN 9783827422675. [Google Scholar]
  45. Myers, M.D.; Newman, M. The qualitative interview in IS research: Examining the craft. Inf. Organ. 2007, 17, 2–26. [Google Scholar] [CrossRef]
  46. Angelov, S.; Trienekens, J.J.M.; Grefen, P. Towards a Method for the Evaluation of Reference Architectures: Experiences from a Case. In Proceedings of the Second European Conference on Software Architecture, Paphos, Cyprus, 29 September–1 October 2008. [Google Scholar]
  47. Shearer, C. The CRISP-DM Model: The New Blueprint for Data Mining. J. Data Warehous. 2000, 5, 13–22. [Google Scholar]
  48. Martinez, I.; Viles, E.; Olaizola, I.G. Data Science Methodologies: Current Challenges and Future Approaches. Big Data Res. 2021, 24, 100183. [Google Scholar] [CrossRef]
  49. Mora, M.; Cervantes, F.; Forgionne, G.; Gelman, O. On Frameworks and Architectures for Intelligent Decision-Making Support Systems. In Encyclopedia of Decision Making and Decision Support Technologies; Adam, F., Humphreys, P., Eds.; IGI Global: London, UK, 2008; pp. 680–690. ISBN 9781599048437. [Google Scholar]
  50. Meister, F.; Khanal, P.; Daub, R. Digital-supported problem solving for shopfloor steering using case-based reasoning and Bayesian networks. Procedia CIRP 2023, 119, 140–145. [Google Scholar] [CrossRef]
  51. Trunk, A.; Birkel, H.; Hartmann, E. On the current state of combining human and artificial intelligence for strategic organizational decision making. Bus. Res. 2020, 13, 875–919. [Google Scholar] [CrossRef]
  52. Big Data Value Association. Big Data Challenges Big Data Challenges: A Discussion Paper on Big Data challenges for BDVA and EFFRA Research & Innovation Roadmaps Alignment; Version 1; Big Data Value Association: Brussels, Belgium, 2018. [Google Scholar]
  53. Gabriel, S.; Falkowski, T.; Graunke, J.; Dumitrescu, R.; Murrenhoff, A.; Kretschmer, V.; Hompel, M.T. Künstliche Intelligenz und industrielle Arbeit–Perspektiven und Gestaltungsoptionen: Expertise des Forschungsbeirats Industrie 4.0; Acatech Expertise–Deutsche Akademie der Technikwissenschaften: München, Germany, 2024. [Google Scholar]
  54. Weller, J.; Migenda, N.; Kühn, A.; Dumitrescu, R. Prescriptive Analytics Data Canvas: Strategic Planning For Prescriptive Analytics In Smart Factories. In Proceedings of the CPSL—Conference on Production Systems and Logistics, Honolulu, HI, USA, 9–12 July 2024; Publish-Ing: Hannover, Germany, 2024. [Google Scholar]
  55. Thiess, T.; Müller, O. Towards Design Principles for Data-Driven Decision Making—An Action Design Research Project in the Maritime Industry. In ECIS 2018 Proceedings; AIS Electronic Library (AISeL): Atlanta, GA, USA, 2018. [Google Scholar]
  56. Karim, R.; Galar, D.; Kumar, U. AI factory: Theories, Applications and Case Studies, 1st ed.; CRC Press Taylor & Francis Group: Boca Raton, FL, USA, 2023; ISBN 9781003208686. [Google Scholar]
  57. ISO/IEC/IEEE 42010:2022; Software, Systems and Enterprise—Architecture Description. ISO: London, UK, 2022.
  58. Lankhorst, M. Enterprise Architecture at Work: Modelling, Communication and Analysis, 4th ed.; Springer: Berlin/Heidelberg, Germany, 2017. [Google Scholar] [CrossRef]
  59. Ghasemaghaei, M.; Turel, O. The Duality of Big Data in Explaining Decision-Making Quality. J. Comput. Inf. Syst. 2023, 63, 1093–1111. [Google Scholar] [CrossRef]
  60. Edwards, J.S.; Rodriguez, E. Analytics and Knowledge Management—Chapter 1: Knowledge Management for Action-Oriented Analytics; CRC Press Taylor & Francis Group: Boca Raton, FL, USA; London, UK; New York, NY, USA, 2018; ISBN 9781315209555. [Google Scholar]
  61. Longard, L.; Bardy, S.; Metternich, J. Towards a Data-Driven Performance Management in Digital Shop Floor Management; Publish-Ing: Hannover, Germany, 2022. [Google Scholar]
  62. Ransbotham, S.; Khodabandeh, S.; Kiron, D.; Candelon, F.; Chu, M.; LaFountain, B. Expanding AI’s Impact with Organizational Learning; MITSloan Management Review Research report in collaboration with BCG; MIT: Boston, MA, USA, 2020. [Google Scholar]
  63. Dumitrescu, R.; Gausemeier, J.; Kühn, A.; Luckey, M.; Plass, C.; Schneider, M.; Westermann, T. Auf dem Weg zur Industrie 4.0–Erfolgsfaktor Referenzarchitektur; It’s OWL Clustermanagement: Paderborn, Germany, 2015. [Google Scholar]
  64. Kagermann, H.; Wahlster, W.; Helbig, J. Forschungsunion Wirtschaft-Wissenschaft. Im Fokus: Das Zukunftsprojekt Industrie 4.0; Handlungsempfehlungen zur Umsetzung; Bericht der Promotorengruppe Kommunikation; Forschungsunion. 2012. Available online: https://www.acatech.de/wp-content/uploads/2018/03/industrie_4_0_umsetzungsempfehlungen.pdf (accessed on 1 May 2024).
  65. Meudt, T. Die Automatisierungspyramide-Ein Literaturüberblick. 2017. Available online: https://www.researchgate.net/profile/Tobias-Meudt/publication/318788885_Die_Automatisierungspyramide_-_Ein_Literaturuberblick/links/619f8d18b3730b67d5679e63/Die-Automatisierungspyramide-Ein-Literaturueberblick.pdf (accessed on 1 May 2024).
  66. Schuh, G.; Prote, J.-P.; Busam, T.; Lorenz, R.; Netland, T.H. Using Prescriptive Analytics to Support the Continuous Improvement Process; Springer International Publishing: Cham, Switzerland, 2019. [Google Scholar] [CrossRef]
  67. Meister, M.; Böing, T.; Batz, S.; Metternich, J. Problem-solving process design in production: Current progress and action required. Procedia CIRP 2018, 78, 376–381. [Google Scholar] [CrossRef]
  68. Geissbauer, R.; Bruns, M.; Wunderlin, J. PwC Digital Factory Transformation Survey: Digital Backbone, Use Cases and Technologies, Organizational Setup, Strategy and Roadmap, Investment Focus 2022. Available online: https://theonliner.ch/uploads/heroes/pwc-digital-factory-transformation-survey-2022.pdf (accessed on 1 May 2024).
  69. Kühn, A.; Joppen, R.; Reinhart, F.; Röltgen, D.; von Enzberg, S.; Dumitrescu, R. Analytics Canvas—A Framework for the Design and Specification of Data Analytics Projects. Procedia CIRP 2018, 70, 162–167. [Google Scholar] [CrossRef]
  70. Dumitrescu, R.; Anacker, H.; Gausemeier, J. Design framework for the integration of cognitive functions into intelligent technical systems. Prod. Eng. Res. Devel. 2013, 7, 111–121. [Google Scholar] [CrossRef]
71. Lick, J.; Disselkamp, J.-P.; Kattenstroth, F.; Trienens, M.; Rasor, R.; Kühn, A.; Dumitrescu, R. Digital Factory Twin: A Practitioner-Driven Approach for Integrated Planning of the Enterprise Architecture. In Proceedings of the 34th CIRP Design Conference, Cranfield, UK, 3–5 June 2024. [Google Scholar]
  72. Cao, L. Data Science. ACM Comput. Surv. 2018, 50, 1–42. [Google Scholar] [CrossRef]
  73. Siemens. Senseye Predictive Maintenance-Whitepaper True Cost of Downtime 2022. 2023. Available online: https://assets.new.siemens.com/siemens/assets/api/uuid:3d606495-dbe0-43e4-80b1-d04e27ada920/dics-b10153-00-7600truecostofdowntime2022-144.pdf (accessed on 1 May 2024).
  74. Wegel, A.; Sahrhage, P.; Rabe, M.; Dumitrescu, R. Referenzarchitektur für Smart Services. Stuttgarter Symposium für Produktentwicklung SSP 2021: Stuttgart, 20. Mai 2021, Wissenschaftliche Konferenz; Fraunhofer-Institut für Arbeitswirtschaft und Organisation IAO: Stuttgart, Germany, 2021. [Google Scholar] [CrossRef]
  75. Rabe, M. Systematik zur Konzipierung von Smart Services. Ph.D. Dissertation, Universität Paderborn, Paderborn, Germany, 2019. [Google Scholar]
  76. Hodler, A.E. Artificial Intelligence & Graph Technology: Enhancing AI with Context & Connections; Neo4j, Inc.: San Mateo, CA, USA, 2021. [Google Scholar]
  77. Branke, J.; Mnif, M.; Müller-Schloer, C.; Prothmann, H.; Richter, U.; Rochner, F.; Schmeck, H. Organic Computing-Addressing Complexity by Controlled Self-Organization. In Proceedings of the Second International Symposium on Leveraging Applications of Formal Methods, Verification and Validation (isola 2006), Paphos, Cyprus, 15–19 November 2006; IEEE: Piscataway, NJ, USA, 2006; pp. 185–191, ISBN 978-0-7695-3071-0. [Google Scholar]
  78. OpenGroup. The ArchiMate® Enterprise Architecture Modeling Language: About the ArchiMate Modeling Language. Available online: https://www.opengroup.org/archimate-forum/archimate-overview (accessed on 1 May 2024).
  79. HSBI, Center for Applied Data Science. IoT-Factory. Available online: https://www.hsbi.de/ium/cfads/projekte/iot-factory (accessed on 26 March 2024).
  80. North, C. INFORMATION VISUALIZATION: Chapter 43. In Handbook of Human Factors and Ergonomics, 4th ed.; Salvendy, G., Ed.; Wiley: Hoboken, NJ, USA, 2012; ISBN 978-0-470-52838-9. [Google Scholar]
  81. Wostmann, R.; Schlunder, P.; Temme, F.; Klinkenberg, R.; Kimberger, J.; Spichtinger, A.; Goldhacker, M.; Deuse, J. Conception of a Reference Architecture for Machine Learning in the Process Industry. In Proceedings of the 2020 IEEE International Conference on Big Data (Big Data), Atlanta, GA, USA, 10–13 December 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1726–1735, ISBN 978-1-7281-6251-5. [Google Scholar]
  82. Padovano, A.; Longo, F.; Nicoletti, L.; Gazzaneo, L.; Chiurco, A.; Talarico, S. A prescriptive maintenance system for intelligent production planning and control in a smart cyber-physical production line. Procedia CIRP 2021, 104, 1819–1824. [Google Scholar] [CrossRef]
  83. Ansari, F.; Glawar, R.; Nemeth, T. PriMa: A prescriptive maintenance model for cyber-physical production systems. Int. J. Comput. Integr. Manuf. 2019, 32, 482–503. [Google Scholar] [CrossRef]
  84. von Enzberg, S.; Weller, J.; Brock, J.; Merkelbach, S.; Panzner, M.; Lick, J.; Kühn, A.; Dumitrescu, R. On the Current State of Industrial Data Science: Challenges, Best Practices, and Future Directions. In Proceedings of the 57th CIRP Conference on Manufacturing Systems 2024 (CMS 2024), Póvoa de Varzim, Portugal, 6–9 July 2024. [Google Scholar]
  85. Koot, M.; Mes, M.; Iacob, M.E. A systematic literature review of supply chain decision making supported by the Internet of Things and Big Data Analytics. Comput. Ind. Eng. 2021, 154, 107076. [Google Scholar] [CrossRef]
  86. Hüllermeier, E.; Waegeman, W. Aleatoric and Epistemic Uncertainty in Machine Learning: An Introduction to Concepts and Methods. Mach. Learn. 2021, 110, 457–506. [Google Scholar] [CrossRef]
  87. Hüllermeier, E. Prescriptive Machine Learning for Automated Decision Making: Challenges and Opportunities. 2021. Available online: http://arxiv.org/pdf/2112.08268v1 (accessed on 1 May 2024).
  88. Hankel, M.; Rexroth, B. The Reference Architectural Model Industrie 4.0; ZVEI: Frankfurt, Germany, 2015. [Google Scholar]
  89. Korsten, G.; Aysolmaz, B.; Turetken, O.; Edel, D.; Ozkan, B. ADA-CMM: A Capability Maturity Model for Advanced Data Analytics. In Proceedings of the 55th Hawaii International Conference on System Sciences, Virtual, 3–7 January 2022. [Google Scholar]
  90. Beauvoir, P.; Sarrodie, J.-B. Archi-Archimate Modelling (Website): Archi Mate Is a Registered Trademark of the Open Group. Available online: https://www.archimatetool.com/ (accessed on 7 May 2024).
  91. Gabriel, S.; Kühn, A.; Dumitrescu, R. Strategic planning of the collaboration between humans and artificial intelligence in production. Procedia CIRP 2023, 120, 1309–1314. [Google Scholar] [CrossRef]
92. Lepenioti, K.; Pertselakis, M.; Bousdekis, A.; Louca, A.; Lampathaki, F.; Apostolou, D.; Mentzas, G.; Anastasiou, S. Machine Learning for Predictive and Prescriptive Analytics of Operational Data in Smart Manufacturing; Springer International Publishing: Cham, Switzerland, 2020. [Google Scholar] [CrossRef]
  93. Jasiulewicz-Kaczmarek, M.; Gola, A. Maintenance 4.0 Technologies for Sustainable Manufacturing—An Overview. IFAC-PapersOnLine 2019, 52, 91–96. [Google Scholar] [CrossRef]
  94. Listl, F.G.; Fischer, J.; Rosen, R.; Sohr, A.; Wehrstedt, J.C.; Weyrich, M. Decision Support on the Shop Floor Using Digital Twins. In Advances in Production Management Systems. Artificial Intelligence for Sustainable and Resilient Production Systems; Dolgui, A., Bernard, A., Lemoine, D., von Cieminski, G., Romero, D., Eds.; Springer International Publishing: Cham, Switzerland, 2021; pp. 284–292. ISBN 978-3-030-85873-5. [Google Scholar]
  95. Matenga, A.; Murena, E.; Mpofu, K. Prescriptive Modelling System Design for an Armature Multi-coil Rewinding Cobot Machine. Procedia CIRP 2020, 91, 284–289. [Google Scholar] [CrossRef]
  96. Gröger, C. Building an Industry 4.0 Analytics Platform. Datenbank Spektrum 2018, 18, 5–14. [Google Scholar] [CrossRef]
  97. Saadallah, A.; Büscher, J.; Abdulaaty, O.; Panusch, T.; Deuse, J.; Morik, K. Explainable Predictive Quality Inspection using Deep Learning in Electronics Manufacturing. Procedia CIRP 2022, 107, 594–599. [Google Scholar] [CrossRef]
  98. Adesanwo, M.; Bello, O.; Olorode, O.; Eremiokhale, O.; Sanusi, S.; Blankson, E. Advanced analytics for data-driven decision making in electrical submersible pump operations management. In Proceedings of the SPE Nigeria Annual International Conference and Exhibition 2017, Lagos, Nigeria, 31 July–2 August 2017. [Google Scholar] [CrossRef]
  99. Silva, S.; Vyas, V.; Afonso, P.; Boris, B. Prescriptive Cost Analysis in Manufacturing Systems. IFAC-PapersOnLine 2022, 55, 484–489. [Google Scholar] [CrossRef]
  100. Beham, A.; Raggl, S.; Hauder, V.A.; Karder, J.; Wagner, S.; Affenzeller, M. Performance, quality, and control in steel logistics 4.0. Procedia Manuf. 2020, 42, 429–433. [Google Scholar] [CrossRef]
  101. Jin, Y.; Qin, S.J.; Huang, Q. Prescriptive analytics for understanding of out-of-plane deformation in additive manufacturing. In Proceedings of the 2016 IEEE International Conference on Automation Science and Engineering (CASE), Fort Worth, TX, USA, 21–25 August 2016. [Google Scholar]
  102. Soltanpoor, R.; Sellis, T. Prescriptive Analytics for Big Data. Database Theory and Applications. In Proceedings of the 27th Australasian Database Conference, ADC 2016, Sydney, NSW, Australia, 28–29 September 2016; Volume 9877, pp. 245–256. [Google Scholar] [CrossRef]
  103. Vater, J.; Schamberger, P.; Knoll, A.; Winkle, D. Fault classification and correction based on convolutional neural networks exemplified by laser welding of hairpin windings. In Proceedings of the 9th International Electric Drives Production Conference, 2019-Proceedings, Esslingen, Germany, 3–4 December 2019. [Google Scholar]
  104. Ansari, F.; Glawar, R.; Sihn, W. Prescriptive Maintenance of CPPS by Integrating Multimodal Data with Dynamic Bayesian Networks. In Machine Learning for Cyber Physical Systems; Beyerer, J., Maier, A., Niggemann, O., Eds.; Springer: Berlin/Heidelberg, Germany, 2020; pp. 1–8. ISBN 978-3-662-59083-6. [Google Scholar]
105. Vater, J.; Harscheidt, L.; Knoll, A. A Reference Architecture Based on Edge and Cloud Computing for Smart Manufacturing. In Proceedings of the International Conference on Computer Communications and Networks (ICCCN), Valencia, Spain, 29 July–1 August 2019; pp. 1–7. [Google Scholar] [CrossRef]
  106. González, A.G.; Nieto, E.; Leturiondo, U. A Prescriptive Analysis Tool for Improving Manufacturing Processes; Springer International Publishing: Cham, Switzerland, 2022; pp. 283–291. [Google Scholar] [CrossRef]
  107. Brodsky, A.; Shao, G.; Krishnamoorthy, M.; Narayanan, A.; Menasce, D.; Ak, R. Analysis and optimization in smart manufacturing based on a reusable knowledge base for process performance models. In Proceedings of the 2015 IEEE International Conference on Big Data (Big Data), Santa Clara, CA, USA, 29 October–1 November 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 1418–1427, ISBN 978-1-4799-9926-2. [Google Scholar]
  108. Tham, C.-K.; Sharma, N.; Hu, J. Model-based and Model-free Prescriptive Maintenance on Edge Computing Nodes. In Proceedings of the 2023 IEEE 97th Vehicular Technology Conference (VTC2023-Spring), Florence, Italy, 20–23 June 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 1–6, ISBN 979-8-3503-1114-3. [Google Scholar]
  109. Faisal, A.M.; Karthigeyan, L. Data Analytics based Prescriptive Analytics for Selection of Lean Manufacturing System. In Proceedings of the 6th International Conference on Inventive Computation Technologies, ICICT 2021, Coimbatore, India, 20–22 January 2021. [Google Scholar] [CrossRef]
  110. Kuzyakov, O.N.; Andreeva, M.A.; Gluhih, I.N. Applying Case-Based Reasoning Method for Decision Making in IIoT System. In Proceedings of the 2019 International Multi-Conference on Industrial Engineering and Modern Technologies (FarEastCon), Vladivostok, Russia, 1–4 October 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 1–6, ISBN 978-1-7281-0061-6. [Google Scholar]
  111. Matyas, K.; Nemeth, T.; Kovacs, K.; Glawar, R. A procedural approach for realizing prescriptive maintenance planning in manufacturing industries. CIRP Ann. 2017, 66, 461–464. [Google Scholar] [CrossRef]
  112. Thammaboosadee, S.; Wongpitak, P. An Integration of Requirement Forecasting and Customer Segmentation Models towards Prescriptive Analytics For Electrical Devices Production. In Proceedings of the 2018 International Conference on Information Technology (InCIT), Khon Kaen, Thailand, 24–26 October 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–5, ISBN 978-974-19-6032-3. [Google Scholar]
  113. Elbasheer, M.; Longo, F.; Mirabelli, G.; Padovano, A.; Solina, V.; Talarico, S. Integrated Prescriptive Maintenance and Production Planning: A Machine Learning Approach for the Development of an Autonomous Decision Support Agent. IFAC-PapersOnLine 2022, 55, 2605–2610. [Google Scholar] [CrossRef]
  114. Vater, J.; Harscheidt, L.; Knoll, A. Smart Manufacturing with Prescriptive Analytics. In Proceedings of the ICITM 2019, Cambridge, UK, 2–4 March 2019; IEEE: Piscataway, NJ, USA, 2019. ISBN 9781728132686. [Google Scholar]
  115. Das, S. Maintenance Action Recommendation Using Collaborative Filtering. Int. J. Progn. Health Manag. 2013, 4. [Google Scholar] [CrossRef]
  116. Gyulai, D.; Bergmann, J.; Gallina, V.; Gaal, A. Towards a connected factory: Shop-floor data analytics in cyber-physical environments. Procedia CIRP 2019, 86, 37–42. [Google Scholar] [CrossRef]
  117. John, I.; Karumanchi, R.; Bhatnagar, S. Predictive and Prescriptive Analytics for Performance Optimization: Framework and a Case Study on a Large-Scale Enterprise System. In Proceedings of the 2019 18th IEEE International Conference on Machine Learning and Applications (ICMLA), Boca Raton, FL, USA, 16–19 December 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 876–881, ISBN 978-1-7281-4550-1. [Google Scholar]
118. Hribernik, K.; von Stietencron, M.; Ntalaperas, D.; Thoben, K.-D. Unified Predictive Maintenance System—Findings Based on its Initial Deployment in Three Use Cases. IFAC-PapersOnLine 2020, 53, 191–196. [Google Scholar] [CrossRef]
  119. Bousdekis, A.; Papageorgiou, N.; Magoutas, B.; Apostolou, D.; Mentzas, G. Sensor-driven Learning of Time-Dependent Parameters for Prescriptive Analytics. IEEE Access 2020, 8, 92383–92392. [Google Scholar] [CrossRef]
  120. Mohan, S.P.; S, J.N. A prescriptive analytics approach for tool wear monitoring using machine learning techniques. In Proceedings of the 2023 Third International Conference on Secure Cyber Computing and Communication (ICSCCC), Jalandhar, India, 26–28 May 2023; pp. 228–233. [Google Scholar] [CrossRef]
  121. Vater, J.; Schlaak, P.; Knoll, A. A Modular Edge-/Cloud-Solution for Automated Error Detection of Industrial Hairpin Weldings using Convolutional Neural Networks. In Proceedings of the 2020 IEEE 44th Annual Computers, Software, and Applications Conference (COMPSAC), Madrid, Spain, 13–17 July 2020; pp. 505–510. [Google Scholar] [CrossRef]
  122. Divyashree, N.; Nandini Prasad, K.S. Design and Development of We-CDSS Using Django Framework: Conducing Predictive and Prescriptive Analytics for Coronary Artery Disease. IEEE Access 2022, 10, 119575–119592. [Google Scholar] [CrossRef]
  123. Hentschel, R. Developing Design Principles for a Cloud Broker Platform for SMEs. In Proceedings of the 2020 IEEE 22nd Conference on Business Informatics (CBI), Antwerp, Belgium, 22–24 June 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 290–299, ISBN 978-1-7281-9926-9. [Google Scholar]
  124. Perea, R.V.; Festijo, E.D. Analytics Platform for Morphometric Grow out and Production Condition of Mud Crabs of the Genus Scylla with K-Means. In Proceedings of the 4th International Conference of Computer and Informatics Engineering (IC2IE), Depok, Indonesia, 14–15 September 2021; pp. 117–122. [Google Scholar] [CrossRef]
  125. Madrid, M.C.R.; Malaki, E.G.; Ong, P.L.S.; Solomo, M.V.S.; Suntay, R.A.L.; Vicente, H.N. Healthcare Management System with Sales Analytics using Autoregressive Integrated Moving Average and Google Vision. In Proceedings of the 2020 IEEE 12th International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment, and Management (HNICEM), Manila, Philippines, 3–7 December 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1–6, ISBN 978-1-6654-1971-0. [Google Scholar]
  126. Bashir, M.R.; Gill, A.Q.; Beydoun, G. A Reference Architecture for IoT-Enabled Smart Buildings. SN Comput. Sci. 2022, 3, 493. [Google Scholar] [CrossRef]
  127. Lepenioti, K.; Bousdekis, A.; Apostolou, D.; Mentzas, G. Human-Augmented Prescriptive Analytics With Interactive Multi-Objective Reinforcement Learning. IEEE Access 2021, 9, 100677–100693. [Google Scholar] [CrossRef]
  128. Sam Plamoottil, S.; Kunden, B.; Yadav, A.; Mohanty, T. Inventory Waste Management with Augmented Analytics for Finished Goods. In Proceedings of the Third International Conference on Artificial Intelligence and Smart Energy (ICAIS), Coimbatore, India, 2–4 February 2023; pp. 1293–1299. [Google Scholar] [CrossRef]
  129. Filz, M.-A.; Bosse, J.P.; Herrmann, C. Digitalization platform for data-driven quality management in multi-stage manufacturing systems. J. Intell. Manuf. 2023, 35, 2699–2718. [Google Scholar] [CrossRef]
  130. Rehman, A.; Naz, S.; Razzak, I. Leveraging big data analytics in healthcare enhancement: Trends, challenges and opportunities. Multimed. Syst. 2022, 28, 1339–1371. [Google Scholar] [CrossRef]
  131. Ribeiro, R.; Pilastri, A.; Moura, C.; Morgado, J.; Cortez, P. A data-driven intelligent decision support system that combines predictive and prescriptive analytics for the design of new textile fabrics. Neural. Comput. Appl. 2023, 35, 17375–17395. [Google Scholar] [CrossRef]
  132. Adi, E.; Anwar, A.; Baig, Z.; Zeadally, S. Machine learning and data analytics for the IoT. Neural. Comput. Appl. 2020, 32, 16205–16233. [Google Scholar] [CrossRef]
  133. von Bischhoffshausen, J.K.; Paatsch, M.; Reuter, M.; Satzger, G.; Fromm, H. An Information System for Sales Team Assignments Utilizing Predictive and Prescriptive Analytics. In Proceedings of the 2015 IEEE 17th Conference on Business Informatics (CBI), Lisbon, Portugal, 13–16 July 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 68–76, ISBN 978-1-4673-7340-1. [Google Scholar]
  134. Mustafee, N.; Powell, J.H.; Harper, A. Rh-rt: A data analytics framework for reducing wait time at emergency departments and centres for urgent care. In Proceedings of the 2018 Winter Simulation Conference (WSC), Gothenburg, Sweden, 9–12 December 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 100–110, ISBN 978-1-5386-6572-5. [Google Scholar]
  135. DIN SPEC 91345:2016-04, 2016, ICS 03.100.01; 25.040.01; 35.240.50. Available online: https://www.dinmedia.de/en/technical-rule/din-spec-91345/250940128 (accessed on 1 May 2024).
  136. Ma, J.; Wang, Q.; Jiang, Z.; Zhao, Z. A hybrid modeling methodology for cyber physical production systems: Framework and key techniques. Prod. Eng. Res. Devel. 2021, 15, 773–790. [Google Scholar] [CrossRef]
  137. Fortoul-Diaz, J.A.; Carrillo-Martinez, L.A.; Centeno-Tellez, A.; Cortes-Santacruz, F.; Olmos-Pineda, I.; Flores-Quintero, R.R. A Smart Factory Architecture Based on Industry 4.0 Technologies: Open-Source Software Implementation. IEEE Access 2023, 11, 101727–101749. [Google Scholar] [CrossRef]
  138. Kahveci, S.; Alkan, B.; Ahmad, M.H.; Ahmad, B.; Harrison, R. An end-to-end big data analytics platform for IoT-enabled smart factories: A case study of battery module assembly system for electric vehicles. J. Manuf. Syst. 2022, 63, 214–223. [Google Scholar] [CrossRef]
  139. Parri, J.; Patara, F.; Sampietro, S.; Vicario, E. A framework for Model-Driven Engineering of resilient software-controlled systems. Computing 2021, 103, 589–612. [Google Scholar] [CrossRef]
  140. Bozhdaraj, D.; Lucke, D.; Jooste, J.L. Smart Maintenance Architecture for Automated Guided Vehicles. Procedia CIRP 2023, 118, 110–115. [Google Scholar] [CrossRef]
  141. Malburg, L.; Hoffmann, M.; Bergmann, R. Applying MAPE-K control loops for adaptive workflow management in smart factories. J. Intell. Inf. Syst. 2023, 61, 83–111. [Google Scholar] [CrossRef]
  142. Friederich, J.; Francis, D.P.; Lazarova-Molnar, S.; Mohamed, N. A framework for data-driven digital twins of smart manufacturing systems. Comput. Ind. 2022, 136, 103586. [Google Scholar] [CrossRef]
  143. Guha, B.; Moore, S.; Huyghe, J.M. Conceptualizing data-driven closed loop production systems for lean manufacturing of complex biomedical devices—A cyber-physical system approach. J. Eng. Appl. Sci. 2023, 70, 50. [Google Scholar] [CrossRef]
  144. Woo, J.; Shin, S.-J.; Seo, W.; Meilanitasari, P. Developing a big data analytics platform for manufacturing systems: Architecture, method, and implementation. Int. J. Adv. Manuf. Technol. 2018, 99, 2193–2217. [Google Scholar] [CrossRef]
  145. Zhang, X.; Ming, X. A Smart system in Manufacturing with Mass Personalization (S-MMP) for blueprint and scenario driven by industrial model transformation. J. Intell. Manuf. 2023, 34, 1875–1893. [Google Scholar] [CrossRef]
  146. Kaniappan Chinnathai, M.; Alkan, B. A digital life-cycle management framework for sustainable smart manufacturing in energy intensive industries. J. Clean. Prod. 2023, 419, 138259. [Google Scholar] [CrossRef]
  147. Farbiz, F.; Habibullah, M.S.; Hamadicharef, B.; Maszczyk, T.; Aggarwal, S. Knowledge-embedded machine learning and its applications in smart manufacturing. J. Intell. Manuf. 2023, 34, 2889–2906. [Google Scholar] [CrossRef]
  148. García, Á.; Bregon, A.; Martínez-Prieto, M.A. Digital Twin Learning Ecosystem: A cyber–physical framework to integrate human-machine knowledge in traditional manufacturing. Internet Things 2024, 25, 101094. [Google Scholar] [CrossRef]
  149. Simeone, A.; Grant, R.; Ye, W.; Caggiano, A. A human-cyber-physical system for Operator 5.0 smart risk assessment. Int. J. Adv. Manuf. Technol. 2023, 129, 2763–2782. [Google Scholar] [CrossRef]
Figure 1. Overview of the respective manufacturing decision environment for prescriptive analytics use cases based on [28].
Figure 2. Influencing factors on the terminology of prescriptive analytics based on [30].
Figure 3. Overview of prescriptive analytics use cases (UCs) in the context of smart factories based on [28].
Figure 4. Overview of the analyzed reference architectures for prescriptive analytics platforms based on [11].
Figure 5. Structuring approaches from existing reference architectures based on the analyzed literature (visual based on and expanded from [3]).
Figure 6. Concept matrix to analyze the reference architectures (RAs) for smart factories.
Figure 7. Instantiation of the research method for the construction of a reference architecture based on Galster and Avgeriou [11,23,28,41,42].
Figure 9. Framework for one prescriptive analytics use case [42].
Figure 10. Deployment of a prescriptive analytics use case in a smart factory environment based on [42].
Figure 11. Scope of the reference architecture based on [41,44,46].
Figure 12. Design strategy for the reference architecture based on [41,44,46].
Figure 13. Comparison of the reference architectures (RAs) from the state-of-the-art section and the posed requirements.
Figure 14. Construction of the reference architecture based on [41,44,46].
Figure 15. Influencing factors for the integration of prescriptive analytics use cases into smart factories.
Figure 16. Conceptual reference architecture for the integration of prescriptive analytics use cases into smart factories.
Figure 17. Reference architecture for the integration of prescriptive analytics use cases into smart factories.
Figure 18. Possible ways of introducing variability into the reference architecture for the interconnection of the prescriptive analytics use cases based on [77].
Figure 19. Elements for the usage in a workshop to enable variability.
Figure 20. (Left) IoT Factory demonstrator used to validate the reference architecture. (Right) The IoT device that is continuously being manufactured [79].
Figure 21. Resulting use case pipeline for prescriptive resource management in the IoT Factory.
Figure 22. Resulting interconnection of analytics use cases in the IoT Factory.
Figure 23. Compliance with existing smart factory concepts and reference architectures.
Table 1. Overview of the respective interviewees (interv.) and their demographic data.
Interv. | Experience | Industry | Job Title
I1 | 8 years | Manufacturing | Industry 4.0 manager
I2 | 11 years | Mechanical Engineering | Industry 4.0 manager
I3–I6 | 4 years, 4 years, 3 years, 1 year | Industrial Data Science | Researcher
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Weller, J.; Migenda, N.; Naik, Y.; Heuwinkel, T.; Kühn, A.; Kohlhase, M.; Schenck, W.; Dumitrescu, R. Reference Architecture for the Integration of Prescriptive Analytics Use Cases in Smart Factories. Mathematics 2024, 12, 2663. https://doi.org/10.3390/math12172663

