**1. Introduction**

Our civilization faces acute and critical challenges, such as climate change, safe drinking water availability, food scarcity, and secure energy supplies, which endanger current and future generations. Therefore, society and industry need to shape their activities around sustainable principles and to efficiently adopt the rapidly evolving technologies that can potentially address these challenges. Specifically, this work focuses on integrating new technologies into the decision-making of the process and manufacturing industries, although the proposed methodology applies to the workflow of any productive sector or area where decision-making plays a crucial role.

As for the process and manufacturing industries, complex decision-making lurks at all enterprise levels and across the whole product lifecycle, ranging from product conception, design, development, and production to commercialization and delivery. The need to consider highly complex scenarios results in involved and non-trivial decision-making. However, the advent of new technologies supports the successful development and systematization of new structures and frameworks for reaching informed, reasonable, and wise decisions. Therefore, this work aims to integrate new technologies systematically into the decision-making workflow of the process and manufacturing industries, and presents a framework for developing process- and product-related activities. The proposed model thus unveils the most efficient way towards integrating enterprise decision support systems and process intelligence into actual enterprise processes.

**Citation:** Muñoz, E.; Capon-Garcia, E.; Martinez, E.; Puigjaner, L. A Systematic Model for Process Development Activities to Support Process Intelligence. *Processes* **2021**, *9*, 600. https://doi.org/10.3390/pr9040600

Academic Editor: Mohd Azlan Hussain

Received: 30 January 2021; Accepted: 24 March 2021; Published: 30 March 2021

**Publisher's Note:** MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

**Copyright:** © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

As discussed in Section 1.1, companies have recently adopted decision support systems to handle the complexity of decision-making. However, such systems only tackle part of the complete enterprise structure, and it is necessary to understand the whole picture to reach sensible solutions, as pointed out in Section 1.2. Therefore, this work combines knowledge management (Section 1.3) and data management (Section 1.4) to propose a framework for integrating the different systems and efficiently applying new technological solutions for decision-making.

#### *1.1. Decision-Making in the Enterprise*

Process and manufacturing industries can be regarded as highly involved systems consisting of multiple business and process units. The organization of the different temporal and geographical scales in such units, as well as of the different enterprise decision levels, is crucial to understanding and analyzing their behavior. The key objectives are to gain economic efficiency, market position, product quality, flexibility, or reliability [1]. Recently, indicators related to sustainability and environmental impact have also been included as drivers for decision-making. The basis for solving an enterprise system problem and further implementing any action is the representation of the actual system in a model, which captures the features relevant to the observer. Such a model is the basis for decision-making, which is a highly challenging task in these industries due to their inherent complexity.

Therefore, companies have devoted efforts to reaching better decisions during the last decades. Indeed, they have invested a large amount of resources in exploiting information systems, developing models, and using data to improve decisions. Decision support systems (DSS) are responsible for managing the data and information necessary for making decisions. Thus, those systems aim to integrate data transactions with analytical models supporting the decision-making activity at different organizational levels. The work in [2] defines DSS as computer systems that aid the management level of an organization by combining data with advanced analytical models. The work in [3] presents four components supporting classic DSS: (i) a sophisticated database for accessing internal and external data, (ii) an analytical model system for accessing modeling functions, (iii) a graphical user interface allowing humans to interact with the models to make decisions, and (iv) an optimization engine based on mathematical algorithms or intuition/knowledge. Traditionally, DSS focus on a single enterprise unit and lack vision beyond its boundaries. Thus, DSS rely heavily on rigid data and model structures, and they are difficult to adapt to include new algorithms and technologies.
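The four classic DSS components from [3] can be sketched as cooperating objects. The following Python fragment is purely illustrative; all class and field names are invented for the example, not taken from the cited work.

```python
from dataclasses import dataclass, field

@dataclass
class Database:                          # (i) internal and external data
    records: dict = field(default_factory=dict)
    def query(self, key):
        return self.records.get(key)

@dataclass
class ModelSystem:                       # (ii) analytical modeling functions
    models: dict = field(default_factory=dict)
    def evaluate(self, name, data):
        return self.models[name](data)   # returns a scoring function over decisions

@dataclass
class OptimizationEngine:                # (iv) mathematical algorithms
    def best(self, candidates, score):
        return max(candidates, key=score)

@dataclass
class DSS:                               # (iii) the interface tying the parts together
    db: Database
    models: ModelSystem
    engine: OptimizationEngine
    def decide(self, model_name, data_key, candidates):
        data = self.db.query(data_key)
        score = self.models.evaluate(model_name, data)
        return self.engine.best(candidates, score)

dss = DSS(
    Database({"demand": [10, 40, 25]}),
    ModelSystem({"profit": lambda data: lambda plan: plan * min(data)}),
    OptimizationEngine(),
)
best_plan = dss.decide("profit", "demand", [1, 2, 3])
```

The rigidity criticized above shows up directly: each component assumes the others' data layout, so swapping in a new algorithm means touching the whole chain.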

#### *1.2. Enterprise Integration*

Current trends in the process industry outline the importance of being agile and fully integrated to improve decision-making at all scales in the company. Indeed, integration comprises all organizational activities, from operational to planning and strategic, which differ in physical and temporal scope but are directly related to each other, as decisions made at one level directly affect the others. Companies pursuing integration among different decision levels in the production management environment report substantial economic benefits [4,5]. Therefore, coordinating and integrating information and decisions among the various functions is crucial for improving global performance.

The use of standards is the primary method for enterprise integration. Groups, committees, and societies have developed those standards in the different geopolitical areas where they apply. The following paragraphs review the use of some of these standards and serve as a brief introduction to their content.

First, the European Committee for Standardization (CEN) and the European Committee for Electrotechnical Standardization (CENELEC) provide standards to characterize, guide, and rule SMEs' activities [6]. CEN deliverables comprise European Standards (ENs), draft standards (prENs), Technical Specifications (CEN TSs), Harmonization Documents (HDs), Technical Reports (TRs), and CEN Workshop Agreements (CWAs). Finally, CEN work is coordinated with the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC).

Next, the National Institute of Standards and Technology (NIST) created the Integrated Definition Methods (IDEF). IDEF comprises standards for Function Modeling (IDEF 0), Information Modeling (IDEF 1), Data Modeling (IDEF 1X), Process Modeling (IDEF 3), Object-Oriented Design (IDEF 4), and Ontology Description (IDEF 5), currently maintained by Knowledge Based Systems, Inc. (KBSI) [7]. The standards were funded by, and are now in use at, the United States Air Force and United States Department of Defense agencies. Moreover, many organizations use them for business process capture and improvement.

Next, the International Electrotechnical Commission (IEC) develops International Standards and Conformity Assessment covering areas such as industrial control programming (IEC 61131-3) or field device integration (IEC 61804-2). The standards aim at allowing interoperability, efficiency, and safety of electrical, electronic, and information systems [8].

The International Organization for Standardization (ISO) standards are well known and widely used, covering management systems, quality management, information security management, etc. [9]. Criteria for integration comprise Enterprise Modelling and Architecture (ISO TC184 SC5 WG1), Electronic Business Extensible Markup Language (ISO 15000), and the Asset Management System (ISO 55003), among others.

The Manufacturing Execution Systems Association (MESA) presents a set of best management practices and information technologies aiming to improve business. MESA focuses on asset performance management, lean manufacturing, product lifecycle management, manufacturing performance metrics, quality, regulatory compliance, and return on investment [10].

Next, the Machinery Information Management Open Systems Alliance (MIMOSA) presents the Open Standards for Physical Asset Management, covering information management (IM) and information technologies (IT) applied to manufacturing environments [11]. MIMOSA standards recently focus on enabling digital twins, big data, the industrial internet of things, and analytics specifications.

The Object Management Group (OMG) is dedicated to developing technological standards for enterprise integration and broad distributed interoperability [12]. The OMG standards comprise the Business Process Model and Notation (BPMN), Common Object Request Broker Architecture (CORBA), Common Warehouse Metamodel (CWM), Data Distribution Service for Real-Time Systems (DDS), Unified Modeling Language (UML), and Model Driven Architecture (MDA), applied to software visual design, execution, and support.

Next, the Process Industry Practices (PIP) consortium collaborates to define common industry standards and best practices focused on design, maintenance, and procurement activities [13]. Besides, PIP practices facilitate knowledge capture for process control, mechanical, and data management activities, as well as Piping and Instrumentation Diagrams.

A key element of integration directly points to enterprise models: computational applications within organizations aiming to represent processes, activities, resources, or physical phenomena. These models are essential for driving design, analysis, management, and prognosis in enterprise functions. Nevertheless, the spread of these models confronts several issues in practice. First of all, the systems supporting enterprise functions were created independently over past years, resulting in heterogeneous enterprise models; this is the so-called correspondence problem. When different enterprise models refer to the same concept, for example an activity, each model will probably use a different name, such as activity, operation, or task. Therefore, most of the time, interpreting agents are necessary to allow communication among those enterprise functions. However, no matter how rational the idea of renaming the concepts is, organizational barriers usually impede it. Furthermore, these representations lack an adequate specification of what the model objects mean; they lack an actual semantic definition of the terminology. Instead, concepts are poorly defined, and their interpretations overlap, leading to inconsistent understandings and uses of the knowledge. Finally, the cost of designing, building, and maintaining a model of the enterprise is high. Each model tends to be unique to the enterprise, and objects are enterprise-specific.
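The interpreting agent idea can be illustrated with a minimal vocabulary-mapping sketch. The synonym table below is an invented example of the activity/operation/task correspondence described above, not an actual enterprise vocabulary.

```python
# Illustrative "interpreting agent" for the correspondence problem: the same
# concept carries different names in different enterprise models.
SYNONYMS = {
    "activity": {"activity", "operation", "task"},
    "resource": {"resource", "asset", "equipment"},
}

def canonical(term: str) -> str:
    """Map a model-specific term to the shared canonical concept."""
    t = term.lower()
    for concept, names in SYNONYMS.items():
        if t in names:
            return concept
    raise KeyError(f"no canonical concept for {term!r}")

# Two heterogeneous models can now exchange records via the shared vocabulary.
print(canonical("Task"), canonical("equipment"))
```

The organizational barrier mentioned above is exactly why such an agent sits between systems: each model keeps its own names, and only the exchanged messages are translated.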

Therefore, some efforts have addressed the issues mentioned above through model standardization. In particular, the American National Standards Institute (ANSI) and the Instrumentation, Systems and Automation Society (ISA) developed the ANSI/ISA standards for automation and control within the enterprise [14], with wide recognition for process integration. Figure 1 presents the main integration aspects of these standards.

**Figure 1.** Instrumentation, Systems and Automation Society ISA-95 integration of information schema.

On the other hand, the Purdue reference model provides an "environment" for discrete parts manufacturing and serves as the basis for the other models [15]. In this case, certain activities are identified as directly related to shop floor production and organized in a six-level hierarchical model, as depicted in Figure 2. Specific applications may require more or fewer than six levels, but six was deemed sufficient for identifying where integration standards are needed. Each level has a name and a primary responsibility.



These activities apply to manual operations, automated operations, or a mixture of the two at any level. It is worth mentioning the possible subdivision of the six tasks into control enforcement, systems coordination and reporting, and reliability assurance. In the context of any large industrial plant, or an entire industrial company based in one location, the tasks would take place at each level of the hierarchy.

Thus, the Common Information Model (CIM) reference model stands as a reference for computer-integrated manufacturing. It consists of a detailed collection of generic information management and automatic control tasks and their necessary functional requirements for a manufacturing plant. Nevertheless, the CIM reference model scope is limited to the elements of the integrated information management and automation system. As a result, the company's management functions, including planning, finance, purchasing, research, development, engineering, and marketing and sales, are all treated as external influences.

The adoption of standard models is the basis for the integration of enterprise processes. Thus, decision-making heavily relies on both the process models and the technologies which tackle the problem. Therefore, this work considers the systematization of data and knowledge management to reach integration in decision-making.

#### *1.3. Knowledge Management*

The development of better practices, strategies, and policies is highly related to how organizations use experiences and ideas from customers, suppliers, and employees. Thus, capturing, storing, sharing, and applying knowledge enables the construction of organizational intelligence and intellectual assets. Two types of knowledge sources can generally be defined: tangible and intangible. On the one hand, intangible assets are related to skills, expertise, and human-resources knowledge. On the other hand, tangible assets are related to data, information, and historical records found in databases of customers, suppliers, and employees of the organization [16].

The bases of knowledge management tools can include distributed databases, ontologies, or network maps. This work focuses on the development of formal domain ontologies as the primary technology for knowledge management; the terms *Semantic Web* or *Web 3.0* are also used to refer to this technology. Ontologies and logic serve as conceptual graphs for knowledge representation in constructing computable models within a specific domain [17]. Additionally, ontologies are defined as formal structures facilitating acquiring, maintaining, accessing, sharing, and reusing information [18,19]. Over the last decades, the Semantic Web has pursued the theoretical bases for developing knowledge-based software applications.
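The core mechanics of a formal ontology — concepts, an is-a hierarchy, and inference over it — can be sketched without any semantic-web stack. The concept names below are invented process-industry examples.

```python
# Minimal ontology sketch: concepts, direct subclass assertions, and a tiny
# reasoner that infers transitive subsumption (is-a) relations.
class Ontology:
    def __init__(self):
        self.parents = {}                    # concept -> set of direct superclasses

    def add_subclass(self, child, parent):
        self.parents.setdefault(child, set()).add(parent)

    def is_a(self, concept, ancestor):
        """True if `concept` is (transitively) a subclass of `ancestor`."""
        if concept == ancestor:
            return True
        return any(self.is_a(p, ancestor) for p in self.parents.get(concept, ()))

onto = Ontology()
onto.add_subclass("BatchReactor", "Reactor")
onto.add_subclass("Reactor", "ProcessEquipment")

# "BatchReactor is-a ProcessEquipment" was never stated; it is inferred.
print(onto.is_a("BatchReactor", "ProcessEquipment"))
```

A full ontology language such as OWL adds properties, constraints, and richer reasoning, but the subsumption inference shown here is the basic service that makes shared terminology machine-checkable.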


Finally, knowledge management systems benefit from ontologies that semantically enrich information and precisely define the meaning of various information artifacts.

#### 1.3.1. Bloom's Cognition Taxonomy

Bloom's Taxonomy is a framework that presents how educational objectives can guide and structure learning goals. The framework's latest revision, entitled *A Taxonomy for Teaching, Learning, and Assessment*, defines the cognitive processes related to knowledge [20], as shown in Figure 3. The framework considers six major categories, with sub-activities for better understanding, as follows:


**Figure 3.** Bloom's taxonomy by Vanderbilt University Center for Teaching.

Finally, the framework defines four types of knowledge used in cognition:

- Factual Knowledge
	- Knowledge of terminology
	- Knowledge of specific details and elements
- Conceptual Knowledge
	- Knowledge of classifications and categories
	- Knowledge of principles and generalizations
	- Knowledge of theories, models, and structures
- Procedural Knowledge
	- Knowledge of subject-specific skills and algorithms
	- Knowledge of subject-specific techniques and methods
	- Knowledge of criteria for determining when to use appropriate procedures
- Metacognitive Knowledge
	- Strategic knowledge
	- Knowledge about cognitive tasks (appropriate contextual and conditional knowledge)
	- Self-knowledge
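For use inside a knowledge-management system, the taxonomy above can be encoded as a queryable structure. This is a minimal sketch; the lookup helper is an invented illustration, not part of the taxonomy itself.

```python
# Bloom's four knowledge types encoded as a queryable dictionary;
# the subtypes mirror the list above.
KNOWLEDGE_TYPES = {
    "Factual": ["terminology", "specific details and elements"],
    "Conceptual": ["classifications and categories",
                   "principles and generalizations",
                   "theories, models, and structures"],
    "Procedural": ["subject-specific skills and algorithms",
                   "subject-specific techniques and methods",
                   "criteria for determining when to use appropriate procedures"],
    "Metacognitive": ["strategic knowledge",
                      "knowledge about cognitive tasks",
                      "self-knowledge"],
}

def type_of(subtype: str) -> str:
    """Return the knowledge type a given subtype belongs to."""
    for ktype, subs in KNOWLEDGE_TYPES.items():
        if any(subtype.lower() in s for s in subs):
            return ktype
    raise KeyError(subtype)
```

Tagging knowledge artifacts with these types lets a knowledge base distinguish, for example, factual terminology entries from procedural know-how when retrieving answers.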

#### *1.4. Transactional System and Data Management*

The performance of enterprise processing activities highly depends on the transactional system's capacity and on how well the data are managed.

#### 1.4.1. Transactional System

A transactional system comprises multiple operations that collect, store, modify, and retrieve data transactions within an enterprise. These systems must support a high number of concurrent users and transaction types over time. Besides, enterprise data are identified by their purpose and type, comprising transactional, analytical, and master data. First, transactional data support the daily operations of an organization; they refer to data created or modified by the operational systems, such as time, place, number, date, price, payment method, etc. Next, analytical data are defined as numerical measurements that support activities such as decision-making, reporting, querying, or analysis. Thus, analytical data are stored and structured as numerical values in some dimensional model. Finally, master data represent the key business entities and involve creating a single view of the data in a master file or master record. Master data comprise data about sites, inventory levels, demand, products, batches, etc.
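The three data purposes can be sketched as distinct record shapes. All field names and values below are invented examples, chosen only to mirror the attributes listed above.

```python
from dataclasses import dataclass
import datetime

@dataclass
class TransactionalRecord:      # created/modified by operational systems
    timestamp: datetime.datetime
    place: str
    price: float
    payment_method: str

@dataclass
class AnalyticalFact:           # numerical measure in a dimensional model
    measure: str                # e.g., "units_sold"
    value: float
    dimensions: dict            # e.g., {"site": ..., "month": ...}

@dataclass
class MasterRecord:             # single view of a key business entity
    entity: str                 # e.g., "product"
    key: str                    # business-wide identifier
    attributes: dict

sale = TransactionalRecord(datetime.datetime(2021, 3, 30, 9, 0),
                           "Plant A", 12.5, "card")
fact = AnalyticalFact("units_sold", 40.0, {"site": "Plant A", "month": "2021-03"})
product = MasterRecord("product", "SKU-001", {"name": "Solvent X"})
```

The separation matters in practice: transactional records are written constantly by operations, analytical facts are aggregated for decision-making, and master records are deduplicated into one authoritative view.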

#### 1.4.2. Data Management

Enterprise data management aims to govern business data by retrieving, standardizing, storing, integrating, structuring, and disseminating requested data. The transactional system supports data management by enhancing data transaction features for control, analysis, and decision-making. Thus, an essential feature of data management is communicating all data from the different data sources (sensors) and fragmented control systems to all enterprise applications, processes, and entities that require them. Another critical aspect of data management is to securely store data and make them available when needed.
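The routing role described above — many sources feeding many consumers — is commonly realized as a publish/subscribe bus. The topic and value below are invented examples.

```python
from collections import defaultdict

# Sketch of data management's communication role: route data from fragmented
# sources (sensors, control systems) to every application that subscribed.
class DataBus:
    def __init__(self):
        self.subscribers = defaultdict(list)    # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, value):
        for cb in self.subscribers[topic]:      # fan out to every consumer
            cb(value)

bus = DataBus()
received = []
bus.subscribe("reactor/temperature", received.append)   # e.g., a monitoring app
bus.subscribe("reactor/temperature", lambda v: None)    # e.g., an archiver
bus.publish("reactor/temperature", 78.4)
```

Decoupling producers from consumers this way is what lets new enterprise applications tap into existing data sources without modifying the control systems that emit them.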

#### **2. Materials and Methods**

New technologies can accomplish their implementation life cycle with a robust base supported by the proposed architecture, named the Wide Intelligence Management Architecture (WIMa). The architecture offers three central systems comprising development activities: *process, knowledge, and transactional systems*, shown in Table 1. First, the process system model introduces seven development activities systematically ordered towards formalized process maturity. The activities range from process definition to process intelligence, where human and environmental behavior is taken into account to enrich development activities. Next, the knowledge system model aims to strengthen the integration through formalized knowledge from three main perspectives: the domain area, the expertise area (functional activities), and the experience area, which enhances expertise knowledge with success and failure cases. Finally, the third model is related to the transactional data system. This model comprises four main areas: data definition, data improvement, data standardization, and data feeding.


**Table 1.** Overall Intelligence Management Architecture for technology integration through process activities.

#### *2.1. Process System Model*

The Modular Process Reference Model aims to define a coherent and structured manner of process evolution to integrate new technologies and business activities. The reference model comprises seven modules defined by the use of analytical tools and data linked to enterprise activities, represented in Figure 4.

**Figure 4.** Maturity echelons of the process system model.

#### 2.1.1. Process Definition

This module provides a set of activities aiming to assess enterprise process performance and to support systematic formalization. On the one hand, this module verifies "if", and measures "how much", the existing formalized process follows the current enterprise activities; this is named the verification phase. Otherwise, the methodology aims to define, design, and standardize enterprise processes, called the definition phase. Thus, a process design phase takes place, considering the verification and validation phases performed. The steps mentioned above (verification and definition) must be applied to the enterprise transactional system in parallel with the processes.
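One simple way to quantify the "how much" of the verification phase is a set-overlap score between formalized process steps and observed activities. This is an invented illustration of such a metric, not the paper's method; the step names are hypothetical.

```python
# Hypothetical conformance measure for the verification phase: Jaccard overlap
# between the formalized process steps and the activities observed in practice.
def process_conformance(formalized: set, observed: set) -> float:
    """1.0 means the observed activities match the formalized process exactly."""
    if not formalized and not observed:
        return 1.0
    return len(formalized & observed) / len(formalized | observed)

formalized = {"weigh", "mix", "heat", "package"}
observed = {"weigh", "mix", "heat", "label", "package"}   # one undocumented step

score = process_conformance(formalized, observed)
print(score)    # 4 shared steps out of 5 distinct ones
```

A score below some threshold would trigger the definition phase, i.e., formally documenting the undocumented "label" step or removing it from practice.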

#### 2.1.2. Process Improvement

The process improvement module makes an exhaustive study of current processes to perform a re-design phase. This re-design phase considers new tendencies in standards, methods, and technologies. Moreover, good manufacturing practices (GMPs) and standard operating procedures (SOPs) are of paramount importance in the improvement task. At the same time, the transactional data system must pass through a re-design phase to support the process improvements realized. Finally, updates and documentation regarding enterprise processes, resources, and data improvements must follow, with a focus on management.

#### 2.1.3. Process Standardization

This module surveys standards and models strongly related to the main enterprise processes to consider their future implementation. Finally, as exposed in the previous module, data and structures from the transactional system are standardized accordingly.

#### 2.1.4. Process Optimization

The process optimization module addresses processes by using different rigorous and non-rigorous optimization approaches. The process optimization phase aims to provide the necessary data and information, through fundamental calculations based on engineering approaches, to decide on specific objectives and goals within processes. Thus, as a first step, knowledge, data, and information on processes and systems are crucial to understanding the problem. The model design is developed by defining a single- or multi-objective function, a single or multiple purposes, and single or multiple scenarios as convenient. Finally, WIMa's optimization solutions are enriched by semantic, mathematical, and process models, allowing easier integration within the enterprise.
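A minimal single-objective instance of the optimization step: choose a batch size that balances two competing costs. The cost model is an invented textbook-style example (EOQ-like), not taken from the paper, and exhaustive search stands in for the "non-rigorous" method family.

```python
# Toy single-objective process optimization: pick the batch size minimizing
# a simple cost model (setup cost falls, storage cost rises, with batch size).
def cost(batch_size: float) -> float:
    setup = 400.0 / batch_size      # fewer, larger batches -> less setup cost
    storage = 1.0 * batch_size      # larger batches -> more inventory cost
    return setup + storage

# Exhaustive search over candidate decisions (a non-rigorous approach);
# a rigorous approach would solve d(cost)/d(batch) = 0 analytically.
candidates = range(1, 101)
best = min(candidates, key=cost)

print(best, cost(best))    # analytic optimum is sqrt(400/1) = 20
```

Multi-objective variants replace `cost` with a vector of objectives and a trade-off rule (weighting, Pareto ranking), but the decision structure — candidates, objective, selection — stays the same.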

#### 2.1.5. Process Automation

The process automation module comprises applications such as business process automation (BPA), digital automation (DA), and robotic process automation (RPA). First, BPA makes use of advanced technologies to reduce human intervention in processing tasks across the enterprise. Thus, BPA aims to enhance efficiency by automating (initializing, executing, and completing) the whole or some parts of a complicated process. Next, DA builds on the BPA system, aiming to digitalize and improve process automation, thus meeting dynamic market and customer environments. Finally, RPA is carried out by software agents that mimic human actions within digital systems to optimize business processes by using artificial intelligence.
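The initialize/execute/complete pattern that BPA automates can be sketched as a trivial process runner. The step names are invented; real BPA engines add scheduling, error handling, and human-in-the-loop escape hatches.

```python
# Toy BPA-style runner: initialize, execute each part, and complete a process
# without human intervention, keeping an audit log of what happened.
class AutomatedProcess:
    def __init__(self, steps):
        self.steps = steps
        self.log = []

    def run(self):
        self.log.append("initialized")
        for step in self.steps:              # automated execution of each part
            self.log.append(f"executed:{step}")
        self.log.append("completed")
        return self.log

run_log = AutomatedProcess(["extract", "validate", "post"]).run()
print(run_log)
```

The audit log is the part that matters for the enterprise: automation is only trustworthy when every initialize/execute/complete transition is recorded and inspectable.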

#### 2.1.6. Process Digitalization

The digitalization module creates a digital integration of all the systems found in business processes. Digitalization encompasses process simulation, industrial augmented reality, predictive systems, proactive systems, the industrial internet of things, expert systems, and process virtual twins. Finally, WIMa's solutions facilitate the development of digitalization technologies thanks to the semantic structure that supports easy access to raw or structured data and to processes' formal knowledge.

#### 2.1.7. Process Intelligence

This module aims to understand the principles of human behavior and reasoning in order to develop programs that solve problems by machines, using artificial intelligence tools, computational intelligence systems, and formal knowledge models. One of the first tasks is to manage structured knowledge, which can facilitate and empower the system's understanding. Furthermore, this module is directly affected by the transactional system's efficiency, which covers data collection, structuring, and communication.

#### *2.2. Data System Model*

The transactional system architecture sets up the data management activity. Data management comprises five main activities: data system definition, data system improvement, data standardization, data integration and feeding, and data system dynamics, as shown in Figure 5.

**Figure 5.** Maturity echelons of the data system model.

### 2.2.1. Definition

This activity takes into account the process definition to create the data model. The data model establishes the relationship between the process model and the data generated by signal sources, such as process equipment, environment sensors, suppliers, or customers. Moreover, transaction data protocols are defined, and the supporting physical architecture must be capable of carrying those protocols. Finally, the data management plan is set, providing guidelines and procedures for enhancing security, compliance, quality, efficiency, and access.
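The relationship this activity establishes — process steps linked to signal sources, each with its transaction protocol — can be sketched as a small data model. All identifiers (tag names, protocol strings) are invented examples.

```python
from dataclasses import dataclass, field

# Sketch of the definition activity: a data model linking signal sources to
# process steps, recording the transaction protocol each source uses.
@dataclass
class SignalSource:
    name: str
    kind: str            # "equipment", "environment sensor", "supplier", ...
    protocol: str        # hypothetical examples: "OPC-UA", "MQTT"

@dataclass
class ProcessStep:
    name: str
    sources: list = field(default_factory=list)

    def attach(self, source: SignalSource):
        self.sources.append(source)

step = ProcessStep("reaction")
step.attach(SignalSource("TI-101", "environment sensor", "OPC-UA"))
step.attach(SignalSource("reactor-1", "equipment", "MQTT"))

# The data management plan can now enumerate the protocols each step requires,
# so the physical architecture can be checked against them.
protocols = {s.protocol for s in step.sources}
print(protocols)
```

Making the step-to-source links explicit is what allows the later standardization and integration activities to operate on the data model rather than on ad hoc wiring.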
