#### *2.6. Robots*

Production process automation began in the 1960s with the introduction of industrial robots into automotive manufacturing. The automation of production systems through industrial robots is an ongoing process that now proceeds in step with the evolution of information technology [7]. Industrial robots, as classified within Industry 4.0, are divided into the following two types [60]:


The area of collaborative robots has been extensively explored, but it is necessary to define precisely what type of robot can be classified as collaborative. Even with many products available [61] and after the completion of many research projects [62], the definition of a collaborative robot remains unclear. According to SICK AG (sensor intelligence), there are three types of human–robot interaction [63]: coexistence, cooperation, and collaboration. Robots play an important role in the modern manufacturing industry. Since 2004, the number of multipurpose industrial robots developed by enterprises in the 4.0 sector in Europe has almost doubled [15]. The number of installed industrial robots is calculated per 10,000 employees in the manufacturing industry. The highest robot densities in 2017, according to the International Federation of Robotics [62], were found in the Republic of Korea (710), Singapore (658), and Germany (322). The world average was 85 robots per 10,000 employees; moreover, during the period 2013–2017, global sales of industrial robots increased by 114%. The use of robots is expanding to include a variety of functions: production, logistics, office management (for document distribution), maintenance, and repair of manufacturing defects [64]. An autonomous robot is a robotic device that works independently (it is not controlled in real time by a human, but by a program). In the future, such robots will be based on artificial intelligence and will be capable of learning [65].

#### *2.7. M2M Communication*

Digital production includes a wide range of applied sciences. Studies in these fields attract considerable effort in both academia and industry, especially in connection with machine-to-machine (M2M) connectivity and communication, which is vitally important for machine collaboration and process optimization [66]. Machine-to-machine communication brings much greater efficiency and extraordinary security to production units, from factory halls to agriculture. Literally, machine-to-machine is synonymous with technology that communicates without human intervention. M2M communications change some processes by giving enterprises more data, and they will require companies to train employees for these purposes. In addition, the integration of M2M elements will require better integration capabilities and the creation of reliable complex networks with a higher level of security [67]. Rao [26] and his team described a farm with "no farmers", where cows can be detected by the feeding machines through sensors and M2M communication, and where digital sensor capsules inside a cow report that the cow is fertile. Worldwide, the automotive, energy, transportation, logistics, consumer electronics, and ultimately retail industries are becoming the focus of new M2M applications [66]. M2M communication offers autonomous communication between intelligent encoders and drives and delivers greater value in the transport sector [17]. M2M communication systems implement automated data communication between machine-type communication (MTC) devices, creating a basic communication infrastructure for the IoT and 5G networks [68,69]. M2M communication will be provided both between physical objects and between their cloud-based digital counterparts [70]. Depending on the location of the remote objects relative to the network, this is referred to as cloud computing technology [71]. In the future, cloud robotics will be used with real-time connectivity. A higher level of M2M communication is related to the Internet of things (IoT), a designation for a much more intelligent interconnection of various products, devices, etc. [72]. The key elements are miniature sensors, representing an almost ubiquitous image recognition technology capable of recognizing people, buildings, and other objects [73].

#### *2.8. Sharing and Using Data with Suppliers and Customers*

Enterprises face a precarious environment and strive to achieve greater cooperation in the supply chain to leverage the resources and knowledge of their suppliers and customers [74]. In such a chain, this cooperation takes place through electronic data interchange (EDI) [53]. Using and evaluating multidimensional process knowledge is considered an effective strategy to improve the competitiveness of enterprises [75]. Sharing forecasting information helps supply chain parties better match demand and supply [76]. The information is used to update variations in seasonal product demand [77]. Information needs to be shared to achieve an efficient supply chain [78]. Optimum supply chain performance requires manufacturers to truly inform other partners of their original forecast [79]. According to Croson and Donohue [80], it is useful for enterprises to share point-of-sale (POS) data, especially to reduce the bullwhip effect. Christopher, in connection with data sharing and supplier and customer integration, discussed "demand chain management", linking supply chain management with marketing and bringing agile and lean properties to chains [81]. The demand chain is defined by (1) managing integration between demand and supply processes, (2) managing the structure between integrated processes and customer segments, and (3) managing the working relationship between marketing and the supply chain [82]. In addition, enterprises are able to share product life-cycle information and focus on product design [83]. In practice, this involves co-design, visualization and production analysis, and joint research and design [84]. This creates a variety of systems for exchanging and sharing product information between users and platforms [85,86].

#### *2.9. Use of Virtual Reality, Simulation, and Digital Twins*

Simulation is defined as an imitation of a real thing, a state, or a process. Generally, it implies displaying or modeling key features and behavior of physical or abstract systems for testing, optimization, and education. Product and process simulations are used extensively in production, especially in processes of visualization, representation, simulation, modeling, and interpretation. Enriching digital simulations with sensor data brings them closer to reality and improves the accuracy of simulation results [87]. Virtual reality (VR) is broadly defined as a computer-generated three-dimensional (3D) world [88], an environment that simulates complex situations and contexts from real life and allows people to immerse themselves, navigate, and communicate [89]. A key feature of virtual reality is real-time interactivity. VR systems generally track the movement of hand-held objects and of the user's head and limbs, and the received data are used to determine the user's view, navigation, interaction with objects, and possible movement of the virtual body, known as an "avatar" [90]. Steuer [91] characterized virtual reality in terms of technological hardware, including a computer, an imaging helmet, headphones, and motion-sensing gloves. The main areas of VR application include healthcare [92].

The concept of augmented reality must be distinguished from the concept of virtual reality. Augmented reality (AR) is a special application providing its users with a direct or indirect view of the real world, parts of which are complemented, expanded, and enriched with additional digital visual elements [93,94]. Examples include end-to-end applications, viewing glasses, and the projection of information in a car directly onto the windshield. The use of simulation to control and optimize products and manufacturing systems in real time is a concept known as the digital twin [95], which is considered the next step in the modeling, simulation, and optimization of technologies [96]. Digital twins are defined as a digital replication of both living and inanimate entities that enables seamless data transfer between the physical and virtual worlds [97]. Digital twins are a mirror image of a real-time physical process [98]. The concept of using "twins" comes from NASA's (National Aeronautics and Space Administration) Apollo program; later, it was also used in aviation, as in the "Iron Bird" [96]. Digital twin devices offer a platform for the development, testing, improvement, and upscaling of the manufacturing environment [99]. Digital twin technology is considered a key technology for the realization of cyber-physical systems [100]. The application of simulation techniques brings digital twins to life and makes them experimentable; such digital twins are known as experimentable digital twins (EDTs). Initially, these EDTs communicate with each other purely in the virtual world. In this way, complete digital representations of the respective real assets and their behavior are created. Networking the EDTs of real assets leads to hybrid application scenarios in which EDTs are used in combination with real hardware, delivering complex control algorithms, innovative user interfaces, and smart models for smart systems [101].

#### **3. Materials and Methods**

The main aim of the paper was to analyze the readiness of enterprises to implement Industry 4.0 in the period 2018–2019. The first partial aim of the paper was to compile an index of evaluation of the level of Industry 4.0 in enterprises based on the results of a survey. The second partial aim was the statistical verification of the consistency of the index with further results from the questionnaire survey.

The preparation of the research sample firstly involved determining the number of enterprises to be used for the questionnaire survey. Based on CSU (Czech Statistical Office) data, it was found that, in 2017, there were 175,894 enterprises in the manufacturing industry in the Czech Republic, of which 7.1% were small, medium, and large enterprises, i.e., a total of 12,470 subjects [102]. Approximately 2500 enterprises were approached to ensure that the condition of a 95% confidence level at a 5% margin of error was met, assuming a 15% questionnaire return rate. The data were collected on the basis of interviews with business managers, who were first contacted electronically. Of the total number approached, 314 enterprises agreed to cooperate and participate in the questionnaire survey, a return rate of 12.5%. The authors of the paper and university students were present at the meetings with the enterprises and during the completion of the questionnaires. The research was conducted in two stages (two research waves): first in February–March 2018 and then in January–May 2019.
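The sampling arithmetic above can be sketched with Cochran's sample-size formula with finite population correction. The confidence level (95%), margin of error (5%), population size (12,470), and 15% return rate come from the text; the assumed proportion p = 0.5 (the most conservative choice) is our assumption, not stated in the paper:

```python
import math

def required_sample(N, margin=0.05, z=1.96, p=0.5):
    """Cochran's sample-size formula with finite population correction."""
    zsq = z**2 * p * (1 - p)
    return N * zsq / (margin**2 * (N - 1) + zsq)

N = 12_470                          # small, medium, and large enterprises [102]
n = math.ceil(required_sample(N))   # completed questionnaires needed: 373
approached = math.ceil(n / 0.15)    # contacts needed at a 15% return rate: 2487
print(n, approached)
```

Dividing the required completed sample by the expected return rate gives roughly the 2500 enterprises the authors report approaching.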

As part of the Industry 4.0 research, the research sample consisted of 276 enterprises reporting their data (38 of the 314 questionnaires were excluded based on the following criteria: at least 10 employees, at least one year on the market, and completeness of the survey). The obtained data were further broken down by business characteristics, i.e., size and technological demands (Table 1). The first wave of the research was used to create the Industry 4.0 index (VPi4), whereas the second wave was used to check and compare the results achieved. The characteristics of the research samples according to the research waves were as follows:



#### **Table 1.** Research sample characteristics.

Table 1 describes the research sample in terms of the size of the enterprises and their technological intensity.

Classification of the enterprises by size was based on the number of employees, as defined by the methodology of the European Commission [103]. Table 1 shows that the first-wave sample comprised 39.0% small enterprises (the most common category), 28.7% medium-sized enterprises, and 32.3% large enterprises. The composition of enterprises in the second wave of research was very similar.

Table 1 also shows the distribution of enterprises in terms of their technological intensity, distinguishing enterprises with higher technological intensity (HTI) from enterprises with lower technological intensity (LTI) according to the methodology of the Czech Statistical Office [102]. In the Czech Republic and in both waves of our research, the groups were comparable. The only difference was the representation of enterprises from the low-tech sector (LTS) and the medium low-tech sector (MLTS) within the lower-technological-intensity group in the first and second waves of research.


The questionnaire focused on the main groups of Industry 4.0 characteristics (observed phenomena). The items of the questionnaire were defined with the support of 34 managers and their expert evaluation within the framework of qualitative research. The main part of the questionnaire consisted of 17 variables characterizing the different technologies of Industry 4.0 used by the enterprises (data collection, cloud storage, data analysis, people capability, IT infrastructure, information systems, M2M, robots, mobile terminals, use of sensors, learning software, data sharing, virtual reality, additive manufacturing (i.e., 3D printing), nanotechnology, drones, and autonomous vehicles). These areas are described in detail in Section 2. In addition, four identification characteristics were measured for the enterprises: size according to the number of employees, field of activity, technological intensity, and type of owner. The questionnaire also included questions about whether the enterprises had a formulated strategy, whether they planned to invest in technology, and a subjective assessment of the level of Industry 4.0 in their organization.

#### *3.1. Exploratory Factor Analysis*

Factor analysis was chosen to classify into groups the most important variables affecting the level of enterprise readiness for Industry 4.0. The central aim of factor analysis is the orderly simplification [104] of several interrelated measures using mathematical procedures. The goal of the analysis is to reduce the number of variables to fewer common factors and to reveal the structure of relationships between the variables. Factor analysis in the broad sense comprises both a number of statistical models and a number of simplifying procedures for the approximate description of data [105]. The basis of factor analysis is the assumption that the observed covariances (relationships, i.e., correlations) between the variables result from the action of common factors rather than from direct interrelationships between the variables. Gorsuch [106] pointed out that the aim of factor analysis is to summarize the interrelationships among the variables in a concise but accurate manner as an aid to conceptualization. Each factor represents an area of generalization that is qualitatively distinct from that represented by any other factor. A measure of the degree of generalizability found between each variable and each factor is calculated and is referred to as a factor loading.
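The loading structure described above can be illustrated with a minimal exploratory factor analysis on synthetic survey-like data. The survey data themselves are not public, so the item count, the two-factor structure, and the loading values below are invented for illustration only:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
# two hypothetical latent factors driving six observed questionnaire items
# (a stand-in for the paper's 17 Industry 4.0 variables)
factors = rng.normal(size=(276, 2))
true_loadings = np.array([[0.8, 0.0], [0.7, 0.1], [0.9, 0.0],
                          [0.0, 0.8], [0.1, 0.7], [0.0, 0.9]])
X = factors @ true_loadings.T + 0.3 * rng.normal(size=(276, 6))

# fit a two-factor model with varimax rotation and read off the loadings:
# one coefficient per (variable, factor) pair, as described in the text
fa = FactorAnalysis(n_components=2, rotation="varimax").fit(X)
loadings = fa.components_.T
print(np.round(np.abs(loadings), 2))
```

With this clean structure, each item loads strongly on exactly one recovered factor, which is the "qualitatively distinct areas of generalization" idea in Gorsuch's terms.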

We used exploratory factor analysis (EFA) to explore the main dimensions and generate a new index of Industry 4.0. The items used in the factor analysis were assessed on a scale of 1–4, the same range used by Veza [107] to evaluate the Industry 4.0 maturity level of Croatian enterprises. This scale achieved better pilot research results than the 1–5 scale used by Frank [108] and Schumacher [109] to determine the implementation of different technologies in manufacturing companies.

The factor analysis helped in particular to determine the internal structure of the covariance of the variable indexes and to differentiate the groups of factors. The suitability of the data structure for factor analysis was assessed by Bartlett's test of sphericity [110] and the Kaiser–Meyer–Olkin (KMO) test [111]. Bartlett's test confirmed that the observed correlation matrix diverged significantly from the identity matrix at α = 0.05, with a *p*-value of 3.021 × 10<sup>−15</sup> (χ<sup>2</sup> = 96.243, degrees of freedom (df) = 12). Subsequently, the Kaiser–Meyer–Olkin sampling adequacy ratio was calculated; its value of 0.8495 was deemed high (higher than 0.7), making factor analysis very appropriate [112]. Tabachnick and Fidell [113] recommended inspecting the correlation matrix for correlation coefficients over 0.30. Many correlation coefficients did not meet this requirement, but almost all of them were statistically significant at the level α = 0.05.
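Both adequacy checks can be reproduced in a few lines using their standard formulas: Bartlett's χ² from the determinant of the correlation matrix, and KMO from the anti-image (partial) correlations derived from its inverse. The data below are synthetic, not the survey data:

```python
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(X):
    """Bartlett's test that the correlation matrix is an identity matrix."""
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)
    stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    return stat, chi2.sf(stat, p * (p - 1) / 2)

def kmo(X):
    """Kaiser-Meyer-Olkin measure of sampling adequacy."""
    R = np.corrcoef(X, rowvar=False)
    inv = np.linalg.inv(R)
    d = np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    partial = -inv / d                     # anti-image (partial) correlations
    np.fill_diagonal(R, 0)
    np.fill_diagonal(partial, 0)
    return (R**2).sum() / ((R**2).sum() + (partial**2).sum())

rng = np.random.default_rng(1)
base = rng.normal(size=(276, 1))
X = base + 0.8 * rng.normal(size=(276, 6))   # six correlated synthetic items
stat, pval = bartlett_sphericity(X)
k = kmo(X)
print(f"Bartlett p = {pval:.3g}, KMO = {k:.3f}")
```

For data with a strong common factor, Bartlett's test rejects sphericity and the KMO value lands well above the 0.7 threshold cited in the text.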

#### *3.2. Statistical Analysis*

The results of the research were further processed using statistical analysis. The aim of this analysis was to compare the results with the Industry 4.0 VPi4 index.

Firstly, the VPi4 index distribution in the first wave of the research was compared with the index results in the second wave. Because the data were not normally distributed, it was necessary to use the non-parametric Mann–Whitney–Wilcoxon test for independent samples. In this case, we expected the samples to be similar. The working hypotheses, verified at the 5% level of significance, were as follows:


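This comparison can be sketched with scipy's Mann–Whitney U test. The wave sizes and index values below are synthetic placeholders, since the survey data are not public:

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(42)
# hypothetical VPi4-like index values for the two research waves,
# drawn from the same distribution (the expected "similar waves" case)
wave1 = rng.beta(2, 5, size=140)
wave2 = rng.beta(2, 5, size=136)

# H0: the two waves come from the same distribution
stat, p = mannwhitneyu(wave1, wave2, alternative="two-sided")
print(f"U = {stat:.0f}, p = {p:.3f}")  # p > 0.05 would support similarity
```

The test ranks the pooled observations rather than using their raw values, which is why it remains valid for the non-normal index data described above.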
Furthermore, the dependence between the subjective perception of the Industry 4.0 level and the VPi4 index was tested using the Pearson and Spearman correlation coefficients. The index was expected to correlate to a certain extent with the subjective perception of the situation in the enterprise. The working hypotheses, verified at the 5% level of significance, were as follows:


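The correlation step can be sketched as follows. The data are synthetic: the 1–4 subjective scale mirrors the questionnaire, but the link between the index and the self-assessment is invented for illustration:

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(7)
vpi4 = rng.uniform(0, 1, size=276)        # hypothetical VPi4 index values
# subjective Industry 4.0 self-assessment on the questionnaire's 1-4 scale,
# loosely tied to the index plus noise
subjective = np.clip(np.round(1 + 3 * vpi4 + rng.normal(0, 0.5, 276)), 1, 4)

r_pearson, p_pearson = pearsonr(vpi4, subjective)
r_spearman, p_spearman = spearmanr(vpi4, subjective)
print(f"Pearson r = {r_pearson:.2f}, Spearman rho = {r_spearman:.2f}")
```

Reporting both coefficients is a reasonable design choice here: Spearman's rank correlation handles the ordinal 1–4 self-assessment, while Pearson's coefficient captures the linear association with the continuous index.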
Furthermore, hypotheses about the impact of the technological intensity of the industry on the level of Industry 4.0 in the enterprises (expressed through the VPi4 index) were tested. The Mann–Whitney test was used for this purpose. It was assumed that the index would reach higher values in the enterprises with higher technological demands. For this purpose, the analysis was carried out separately for high-tech and low-tech enterprises, and the results were then compared. The working hypotheses, verified at the 5% level of significance, were as follows:


Statistical evaluation of the tests was performed using Statistica 12 and R software.
