Article

Data Digitization in Manufacturing Factory Using Palantir Foundry Solution

1 Institute of Industrial Engineering, Management and Applied Mathematics, Technical University of Kosice, Park Komenského 5, 042 00 Košice, Slovakia
2 Institute of Logistics and Transport, Faculty of Mining, Ecology, Process Control and Geotechnology, Technical University of Kosice, Letná 9, 042 00 Košice, Slovakia
* Author to whom correspondence should be addressed.
Processes 2024, 12(12), 2816; https://doi.org/10.3390/pr12122816
Submission received: 11 November 2024 / Revised: 2 December 2024 / Accepted: 6 December 2024 / Published: 9 December 2024

Abstract: This research describes an online solution for the collection and processing of production data gathered from manufacturing and assembly processes at automotive companies. The solution covers live monitoring of production health and subsequent evaluation through reports, with the option to generate reports for up to six months. Since the data are located in multiple sources, it is challenging to monitor them live or generate reports on demand. The solution described in this research outlines applications that simplify users’ tasks and provide immediate insights into the processes and health of production lines. The research is divided into three applications delivered in one package, called Cycle Time Deviation (CTD): (i) the workshop application for live monitoring; (ii) the shift report application for evaluating data older than 24 h; and (iii) the before-and-after application, the Plant Improvement Tracker (PIT), for comparing and monitoring the impact of process changes. The aim of the research is to describe the proposed solution that was implemented in a multinational automotive corporation and to outline the benefits gained from the implementation.

1. Introduction

Digitization, as part of Industry 4.0, is an important trend that affects companies, businesses and households today and will continue to do so in the future. It uses digital technologies, such as tools, systems and solutions that use information and communication technologies and processes, including hardware, software, networks and other digital components [1,2,3,4]. According to [5,6], from the digital perspective of Industry 4.0, the premise of creating an intelligent and autonomous factory was established, enabling machines to communicate with each other using technologies, such as the Internet of Things, big data, digital twins and simulation, additive manufacturing, autonomous robots, cyber-physical systems, virtual reality, cloud computing and artificial intelligence.
Digitization is important for ensuring efficient production management. It makes it possible to increase the flexibility and performance of not only production but also logistics processes and the competitive advantage of the company [7]. The relationship between the use of digital technologies and different production environments and the size of enterprises was investigated by Buer et al. [8]. They concluded that large enterprises have a significantly higher level of digitization of workshops and organizational information technology competences than small and medium-sized enterprises. Regarding the difference between the production environments, it was not possible to find a statistically significant difference in the level of implementation of the investigated aspects of digitization. The relationship between digitization and business performance was investigated by Gao et al. [9]. They found that digitization has a positive effect on business performance, which improves by reducing external administrative costs and strengthening internal controls. Radicic and Petković [10] investigated the impact of digitization on technological (product and process) innovations in Small and Medium-sized Enterprises (SMEs). They considered digitization in production and logistics, digital value chains and big data analysis. Research has confirmed that the impact of digitization on innovation activities is heterogeneous among SMEs. Innovation effects are weak and depend on the form of digitization and the type of innovation.
Digitization of data enables clarity and transparency in production processes. The evolution of data management has moved from creating tools and methodologies for conventional structured data to processing big data. This opens the way for innovative applications of data derived from new technologies. The effort is to ensure effective data management together with their archiving for future use [11,12].
Big data includes a large volume and a wide variety of raw information that is obtained from various sources [13]. Data used in industrial production, including automotive production, can be divided into three groups in terms of their complexity. Common data collected from production orders, delivery notes, etc., are the basic data on which production is planned. They are processed using simple tools such as Excel or other data processing databases. Process data are obtained using data collection tools in the production environment. They are considered more complex data in terms of their proper organization and summarization, as they usually involve the integration of multiple systems. Basic information about the product and its quality can be documented in databases that are part of the corporate information system, while the process data themselves are documented in full operation in real time and are processed using Programmable Logic Controllers (PLCs). Analytical data are a combination of data from the common data and process data groups and their detailed analysis. This requires a good understanding of the process steps and proper filtering of the analyzed data. If the data are further analyzed and evaluated by data analytics or artificial intelligence, the input data are referred to as raw data. Proper sorting and linking of data scales them and allows the system to analyze these production data. With the right application, data from multiple sources can be stored on cloud servers and evaluated by artificial intelligence, allowing for real-time monitoring [14,15]. Manufacturing Execution Systems (MESs) play a key role in informed decision-making, providing a central platform for connecting various technologies within a smart manufacturing environment. 
They provide data insights because they are directly connected to the shop floor’s operational structure, which includes machinery, equipment, sensors, PLCs and Supervisory Control And Data Acquisition (SCADA) [16,17].
Currently, more and more companies are considering the use of big data and its analysis in order to improve their products and services and support intelligent decision-making [18]. In manufacturing, it is expected to facilitate and improve the monitoring of business processes, improve production quality, improve supply chain management and grow business innovation [19,20,21]. In addition, big data analysis can optimize prices, increase profit and maximize sales, productivity and market share [22]. Several authors [23,24,25,26] deal with the issue of big data analysis in their works.
The authors of [15] focused on the digital transformation of data from a production environment to a cloud application. Data transformation is carried out using analytical techniques with a focus on merging data from several sources and subsequently obtaining a single analytical output. A Cloud-based integrated Data Management System (CDMS) was proposed by the authors of [27], in which they used asynchronous data transfer, a distributed file system and wireless network technology to collect, manage and share information. Gittler et al. [28] proposed a multi-step approach to align digital transformation projects with their expected benefits during design, development and implementation. They emphasized that digitization would eliminate human errors when working with data through automatic data extraction and conversion. Adrita et al. [29] presented a methodology to identify automation opportunities to eliminate manual processes through the analysis of digitized data. The application of this methodology in the production of the implant achieved a reduction in cycle time of 3.76% and an increase in production per day of 4.48%.
Locklin et al. [30] emphasized the need to use real-time data to automate and optimize manufacturing and logistics tasks. They report on six areas of research in the area under study. Clancy et al. [31] proposed the methodology “The hybrid digitization approach to process improvement”, which offers a procedure for the digital transformation of traditional production processes with the aim of enabling quality management using data. Pfirrmann et al. [32] present opinions regarding requirements and possible approaches to process data management. They emphasize that the digitization of production processes in the aviation and space industry is important because high demands are placed on the reliability of means of transport. The prerequisite is to record production processes using sensors and analyze them using process data management.
The authors of [33] emphasize that achieving “top” performance when using big data is the result of a perfect combination of corporate resources. These are organizational resources (big data analytics management), physical resources (information, technological infrastructure) and human resources (analytical skills, abilities and knowledge).
Despite the benefits that big data can bring to a company, some companies have decided not to invest in big data analysis. This applies primarily to those companies that have successfully adopted business intelligence [18]. On the other hand, the authors of [34] claim that business intelligence is an integral part of most business projects that use big data analysis.
The aim of this article is to present a solution to the problem of live monitoring of the production status and its evaluation using reports in an automotive company and to outline the benefits that result from the implementation of this solution. Due to the fact that production data are located in several sources, it is difficult to monitor them live and create reports on demand. The solution to this problem is based on the implementation of three applications adapted to specific production conditions, the first of which is focused on live monitoring, the second on the evaluation of data older than 24 h and the third on the comparison and monitoring of the impact of process changes on the analysis.
This study describes procedures for a large automotive company; the tool used for this application is from Palantir, and the solution is called Foundry. The platform allows the development of multiple parallel applications with a front-end interface, which simplifies daily usage for a broad spectrum of users. The solution also brings another level of standardization that can be shared globally from monitoring and reporting perspectives.

2. Materials and Methods

The methodological proposal of this research is structured in two main stages. The first stage is focused on understanding the current-state procedures and how the process runs now. Based on the understanding of the current state and the recognition and definition of the problem, the second stage focused on a solution proposal that should simplify the daily routine and minimize errors.
Since the solution processes a large amount of data, ensuring data quality is crucial. To maintain data consistency and quality, the Pipeline Builder module is used to filter and clean data formats, guaranteeing accurate calculations. In the logical layer, numerous operations and processes run in the background of the applications to prepare high-quality data and ensure proper formatting.

2.1. Current State

In the current state, data from multiple sources are pushed to the corporate MES. Once stored, the data are processed using various analytical tools, which may vary between manufacturing plants, and are then compiled into reports. These reports can be exported as files for further analysis. However, these exports, derived from real production data, are manually analyzed and compared with manual data, a process that is both time-consuming and prone to errors. Only some conclusions can be drawn from the manual analysis (see Figure 1).
If engineers need to go through this process once, it is not a significant issue. However, challenges arise when the process must be repeated multiple times a week due to process optimization or rebalancing. Additionally, this requires the use of multiple tools to manually compare data, resulting in reduced transparency for the rest of the management. Current solutions do not provide the transparency or live overview necessary to monitor production effectively.
For these reasons, we sought to implement a new solution to simplify and automate all data computations and evaluations. The primary goal is to establish a single standard for real-time production monitoring across multiple sites within the corporation while enabling standardized, automated reporting and before-and-after state comparisons.

2.2. Solution Proposal

In the proposed solution, all data computations and analysis will be performed automatically using a cloud-based solution and fully automated computer-aided analysis. Manual comparisons will no longer be necessary. Instead, a single standard will be established and used across all manufacturing plants, bringing a higher level of standardization.
From the analysis, it will be possible to categorize the outcomes into three separate applications (see Figure 2)—the workshop application, the shift report and the Plant Improvement Tracker (PIT). The project addressing this problem is called Cycle Time Deviation (CTD).
To develop the proposed solution, we considered several available tools, including Grafana, Windows Report Builder, Power Business Intelligence (BI) and Asprova. All four options were internally tested for over a year across multiple locations. Based on user feedback and our trials, we decided to develop the proposed solution using the Palantir Foundry platform. This platform was chosen for its greater modularity and potential for future AI-assisted predictive analysis.

2.2.1. Workshop Application

The workshop application is designed for real-time production monitoring or, at most, for presenting data up to 24 h old.
Data collected from production lines by PLCs or other production equipment are stored in corporate MESs and Structured Query Language (SQL) databases before being pushed to the cloud. In the cloud, the system processes and evaluates the data using the Palantir Foundry solution. Real data are analyzed and compared with manual data, and all results are displayed online for users through a web front-end application.
When discussing “Cycle Time Deviation,” we are primarily focused on manufacturing times, known as cycle times. Using these statistics, we can compare scheduled times with actual times to assess how accurate and balanced the process is.
The statistics are generally divided into several groups—cycle punctuality (%), over target count and over target count in seconds.
Since our manufacturing processes are centered around assembly operations, we collect vast amounts of data. This allows us to also track statistics such as scanner RFT (Right First Time), fastener RFT, and scrap and rework.
Data are visualized in the shop floor overview in a simplified manner to enable faster recognition of potential problems (see Figure 3).
The data used for evaluations in this solution can be divided into two groups—real production data/live data/online data and manual data/offline data.
Real Production Data. As previously mentioned, real data are gathered directly from the manufacturing process and collected in the corporate MES, stored in an SQL database. Since manufacturing data are continuously updated and collected in the MES, the upload of these data to the cloud is sequenced and scheduled at different intervals, depending on data volume. Each portion of the data can be uploaded separately and at varying intervals based on system requirements.
Groups of triggers and factors used for collecting real-time manufacturing data include:
  • Production line and station information;
  • Process start and stop triggers—timestamps marking the start and end of a process, which define cycle duration;
  • Logs showing which station a manual worker is logged into (active stations);
  • Scanner information with statistics (e.g., mismatch, bad scan);
  • Fastener information with statistics (e.g., fastening results, tool statuses);
  • Defects logged at the station (e.g., defect location, defect quantity);
  • Rework information and rework duration;
  • Jobs per hour as an output of the production line.
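The start and stop triggers listed above define the duration of each cycle. As a minimal sketch of that step (the record layout and timestamp format below are assumptions, not the MES schema), the cycle duration can be derived as follows:

```python
from datetime import datetime

# Hypothetical cycle records: each pairs a start and an end trigger
# timestamp for one assembly unit at one station.
raw_triggers = [
    {"station": "ST-01", "start": "2024-10-07 06:00:05", "end": "2024-10-07 06:01:10"},
    {"station": "ST-01", "start": "2024-10-07 06:01:12", "end": "2024-10-07 06:02:40"},
]

FMT = "%Y-%m-%d %H:%M:%S"

def cycle_duration_seconds(record):
    """Cycle duration = end trigger minus start trigger, in seconds."""
    start = datetime.strptime(record["start"], FMT)
    end = datetime.strptime(record["end"], FMT)
    return (end - start).total_seconds()

durations = [cycle_duration_seconds(r) for r in raw_triggers]
# durations -> [65.0, 88.0]
```

In the actual solution this computation runs inside the cloud pipeline rather than on the edge devices; the sketch only illustrates the trigger-pair logic.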
As the application is beneficial for all departments directly involved in manufacturing processes, we can define its usage frequency as follows:
  • Non-stop usage is not only intended for line or zone leaders but also for other plant users. The application can be run continuously on production screens, providing a general overview so that all users can directly see the health status of the line. The same screens can be displayed in offices to provide essential information and station performance at a glance.
  • Hourly usage is primarily for line or zone leaders, allowing them to quickly identify underperforming stations during the shift. This includes monitoring scanners, fasteners, testers, defects, reworks and other analyses provided by the application. Engineers can also use hourly statistics to assess the impact of changes or modifications on the line and track progress.
  • Daily statistics are ideal for reporting at Gemba meetings, where results from the previous day can be discussed and shared.
Manual Data. All data used for reference are referred to as manual data. These data describe scheduled times and process operations that should be guaranteed if the process is precisely defined. They are accessible to authorized users and should be updated immediately if there are any changes to the process. Inaccurate data results in inaccurate outcomes for front-end users, as the system compares real data with manual (reference) data.
The following data can be considered as manual data:
  • Time studies (e.g., Maynard Operation Sequence Technique studies—MOST studies);
  • Shift calendars, including definitions of breaks, team assignments and production date information;
  • Information about production targets and limits (for system notifications);
  • Specifications for system mapping (e.g., mapping station names in the MES to manual data);
  • Images of defect zones (for scrap and rework monitoring).
After data are ingested into the system, they must be sorted through a process known as data cleaning, which consists of multiple stages. The data cleaning procedure utilizes a tool within Palantir Foundry called Pipeline Builder. This tool enables the connection and integration of production data with manual data.
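The mapping and join performed in Pipeline Builder can be illustrated with a simplified sketch. All names and values below (the station mapping, the scheduled times) are hypothetical; the real cleaning stages in Foundry are configured graphically, not hand-coded:

```python
# Hypothetical sketch of the cleaning/join step: map MES station names
# to manual-data station names, discard records that cannot be mapped,
# and attach the scheduled (manual) cycle time to each record.
station_mapping = {"A01_ST1": "Station 1", "A01_ST2": "Station 2"}  # MES -> manual name
manual_times = {"Station 1": 62.0, "Station 2": 75.0}               # scheduled cycle times (s)

production_records = [
    {"station": "A01_ST1", "cycle_s": 65.0},
    {"station": "A01_ST2", "cycle_s": 71.0},
    {"station": "GHOST",   "cycle_s": 10.0},  # unknown station, filtered out
]

def clean_and_join(records):
    cleaned = []
    for rec in records:
        manual_name = station_mapping.get(rec["station"])
        if manual_name is None:
            continue  # data cleaning: drop records with no manual-data match
        cleaned.append({**rec, "station": manual_name,
                        "scheduled_s": manual_times[manual_name]})
    return cleaned

rows = clean_and_join(production_records)
# rows contains two records, each enriched with its scheduled time
```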
Logic of calculations. Cycle time statistics and analysis are divided into two groups—over and under target cycle time and over and under MOST.
In both scenarios, we measure the real cycle times required to build one assembly unit, evaluating each unit separately. To accurately define cycle times, we use PLC triggers in production to identify the exact start and end of the process, referred to as start and end triggers. From the perspective of continuous assembly processes, we can categorize several groups of start and end triggers:
  • Palette in Place—PIP signal;
  • Scanning—scanning of the barcode on the label;
  • End of fastening operation;
  • Manual start or stop trigger—touching the screen or pressing the button.
Target cycle time is the time limit for a station, specifying the duration within which all assembly units should normally be assembled, assuming there are no obstacles. The target cycle time is a fixed system value; although it is an important metric, it should be used only for rough estimations. Each cycle that stays below this limit counts as punctual in the general statistics and contributes to the “cycle punctuality (%)” shown in the shop floor overview. Its calculation method is given by Equation (1). For a more precise analysis, the cycle analysis module is used to evaluate all over and under MOST statistics:
Cycle Punctuality (%) = ((Total Units Produced − Units Over Target) / Total Units Produced) · 100
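Equation (1) can be expressed as a small helper function (a sketch; the production system computes this inside the Foundry pipeline):

```python
def cycle_punctuality(total_units, units_over_target):
    """Cycle punctuality (%) per Equation (1): the share of units
    assembled within the target cycle time."""
    if total_units == 0:
        return 0.0
    return (total_units - units_over_target) / total_units * 100

# e.g. 480 units produced during a shift, 24 of them over target:
print(cycle_punctuality(480, 24))  # -> 95.0
```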
Over and under MOST analysis. The MOST methodology is employed to sequence all activities at each station separately, based on the combinations of variants that need to be assembled. This approach is often referred to as time studies, which are used to establish standard times for workers to perform tasks. Additionally, it functions as a Predetermined Motion Time System (PMTS), primarily used to compare with actual in-process times required to complete tasks. Since each variant typically represents a different level of complexity, it is necessary to calculate different assembly times for each requirement.
Variant variations are identified by an option code, which contains all the necessary information about the components (options) used to build the assembly unit. To accurately define the exact time required to assemble a unit, the option code must be merged with manual data—specifically, the time studies. To achieve this, we utilize regular expressions (RegEx codes) to translate the option code (see Table 1).
When Foundry recognizes the options assembled at the station using RegEx translation, the system will calculate assembly times based on the time studies. The difference between the real cycle time and the standard time will define the under or over MOST times.
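The RegEx translation and MOST comparison can be sketched as follows. The patterns, option names and times below are invented for illustration; the real option codes and time studies are plant-specific:

```python
import re

# Hypothetical MOST time studies: each entry pairs a RegEx that matches
# part of the option code with the standard time (s) that option implies.
time_studies = [
    (r"SEAT-H", 12.5),  # heated seat option
    (r"SEAT-M", 18.0),  # memory seat option
    (r"BASE",   40.0),  # base assembly content
]

def standard_time(option_code):
    """Sum the MOST standard times of all options matched in the code."""
    return sum(t for pattern, t in time_studies if re.search(pattern, option_code))

def over_under_most(real_cycle_s, option_code):
    """Positive = over MOST (slower than standard), negative = under MOST."""
    return real_cycle_s - standard_time(option_code)

# A unit with base content plus a heated seat, assembled in 55.0 s:
delta = over_under_most(55.0, "BASE|SEAT-H")
# standard time = 40.0 + 12.5 = 52.5 s, so the cycle is 2.5 s over MOST
```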
As processes continuously evolve or change, it is crucial to keep the MOST studies updated and ensure balanced workload distribution across all stations.

2.2.2. Shift Report

The shift report includes most of the same Key Performance Indicators (KPIs) as the workshop application but offers a longer data range. Within the shift report application, users can evaluate data up to six months old and generate reports at various intervals, such as during the shift, at the end of the shift, daily, weekly and monthly. The shift report (see Figure 4) is built as a foundational application using the contour application within the Foundry platform. It establishes a new reporting standard across the plant while remaining fully customizable, allowing trained users to modify, edit or extend existing report paths.
Each path represents a different group of analyses and may compute various portions of the data. From each path, we can select specific widgets that may later be displayed in the dashboard view (see Figure 5), which represents our front-end application.
The source of the data used in the shift report application is common for all plants. Because of this, there are filters to refine the data used in the analysis. In the application, it is possible to filter by plant, program (represents the production line), shift, team, start date and end date. After setting the filters and applying the changes, the report is automatically generated. Of course, the system can only explore and evaluate data that are available on the platform.
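The filtering step that precedes report generation can be sketched in a few lines. The field names and values are assumptions for illustration; in Foundry, the filters are set interactively in the contour application:

```python
from datetime import date

# Hypothetical shared dataset rows; "program" represents the production
# line and "jph" is jobs per hour, as described for the workshop data.
records = [
    {"plant": "KE", "program": "L1", "shift": "A", "date": date(2024, 10, 8), "jph": 42},
    {"plant": "KE", "program": "L1", "shift": "B", "date": date(2024, 10, 8), "jph": 39},
    {"plant": "BA", "program": "L2", "shift": "A", "date": date(2024, 10, 9), "jph": 44},
]

def filter_report(records, plant, program, shift, start, end):
    """Narrow the shared dataset by plant, program, shift and date range."""
    return [r for r in records
            if r["plant"] == plant and r["program"] == program
            and r["shift"] == shift and start <= r["date"] <= end]

subset = filter_report(records, "KE", "L1", "A",
                       date(2024, 10, 1), date(2024, 10, 31))
# subset holds the single matching shift-A record for plant KE, line L1
```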

2.2.3. Plant Improvement Tracker

The data sources used in the plant improvement tracker application are the same as those in the shift report. This application is dedicated to monitoring the “before” and “after” states—prior to the implementation of Foundry (along with all self-developed applications) and afterward. For evaluation, multiple filters need to be set, such as plant, line, station (optional), start date, end date, implementation date, number of weeks before implementation and number of weeks after implementation.
After correctly setting up the filters, it is possible to view comparisons and evaluate the data. With the plant improvement tracker application, we can assess multiple aspects (see Table 2).
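The before/after partitioning that the PIT filters describe can be sketched as follows (record layout and KPI name are hypothetical; the real application evaluates the KPIs listed in Table 2):

```python
from datetime import date, timedelta

def before_after(records, key, implementation, weeks_before, weeks_after):
    """Split records around the implementation date and report the
    average KPI value before, after, and the percentage change."""
    start = implementation - timedelta(weeks=weeks_before)
    end = implementation + timedelta(weeks=weeks_after)
    before = [r[key] for r in records if start <= r["date"] < implementation]
    after = [r[key] for r in records if implementation <= r["date"] <= end]
    avg_b = sum(before) / len(before)
    avg_a = sum(after) / len(after)
    change_pct = (avg_a - avg_b) / avg_b * 100
    return avg_b, avg_a, change_pct

records = [
    {"date": date(2024, 9, 30), "build_time_s": 80.0},
    {"date": date(2024, 10, 14), "build_time_s": 72.0},
]
impl = date(2024, 10, 7)
b, a, pct = before_after(records, "build_time_s", impl, 2, 2)
# b = 80.0, a = 72.0, pct = -10.0 (build time improved by 10%)
```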

2.2.4. Data Processing and Users Groups

As we push data into a cloud solution, we can store them for later analysis. Based on our agreements and available capacity, we have decided to retain the data for a maximum of six months. These data can be used, for example, to generate shift reports or conduct other analyses to monitor changes.
To facilitate reporting at the end of each shift, we have created a shift report skeleton application that is shared with all plants that have been rolled out on the platform. Additionally, to create statistics before and after the implementation of any changes, we have developed the plant improvement tracker application.
Due to the volume of data and the complexity of the logic, we are unable to view all data and production changes “live” in real time; instead, there is a delay caused by periodic updates. The data for the cycle time analysis application are sourced from short-range datasets that are updated approximately every 15 min.
This application provides end users with a brief overview of the processes. Since all data are stored and evaluated in one centralized location, users can easily detect weak points or sub-processes and initiate corrective actions to bring the process back to optimal conditions.
Based on these data, the application can be utilized across all departments in any production environment, from management to line leaders. The solution is fully transparent, allowing all users to access and view the statistics.

3. Results

All three developed applications are connected and create a complex package. These applications aim to help manufacturing plants and their employees better understand the processes and address daily operational issues. By analyzing real application usage, we can identify how users interact with all applications and where there are opportunities for improvement.

3.1. Workshop Application

The workshop application is dedicated to real-time monitoring and can identify deviations in discipline, recognize issues with operators’ performance on the line or highlight other technical problems that may occur during the shift. In the shop floor overview, the application uses color management to help users easily understand where the problems are.
The application displays only statistics and indicates where problems have been detected (see Figure 6—red, yellow and green fields), but it does not specify the nature of the problems. Actions must be taken based on the statistics for issue investigation.
After a few months of usage across multiple plants in European countries, we can see the first benefits from the application. Initially, it was crucial to establish a daily routine for users regarding system usage. Feedback from users indicates that each plant uses the application slightly differently. Some plants focus on scanner and fastener RFT, while others prioritize cycle punctuality and real-time defect monitoring.
It is important to mention that data quality is crucial when taking actions based on cycle punctuality, as real data are compared with data manually ingested into the system. This system will effectively highlight all deviations and mark them for future investigation.

3.2. Shift Report

The tool for monitoring and evaluating all shift data in one place is called the shift report application (see Figure 7). This application provides plant users with all the necessary information, and if any specific portion of the statistics is missing, it is fully modular and open to modifications.
After a few months of usage, we can see several benefits from using this application, such as preparing all necessary statistics for daily production meetings, tracking results from actions taken on production equipment and monitoring stations during process rebalancing, among others. Another significant advantage is the creation of a standard process. Previously, data had to be exported from multiple sources—such as production information, production orders and scrap rates—and then aggregated using tools like Microsoft Excel or Power BI to generate reports. The new solution is user-friendly and fully automated, streamlining this entire process.

3.3. Plant Improvement Tracker Usage and Example

The implemented solution is evaluated and monitored on a weekly basis, and the general KPIs used for evaluation are shown in Figure 8:
All improvements are later calculated into cost reduction, focusing on headcount reduction and process optimization, with the potential to generate higher output on the line in a shorter time.
For the evaluation of Foundry indicators, we are considering the following (see Figure 9).
The analytical tool for this monitoring is the Plant Improvement Tracker (PIT) application. Data are directly sourced from the available statistics and overwritten in the provided file. Each manufacturing plant is evaluated separately.
As an example, we can consider the assembly line for the production of car seats. The evaluated timeframe will be from 23 September to 23 October, with the implementation date set for 7 October 2024. The monitoring range includes two weeks before and two weeks after the implementation date.
In the production statistics (see Figure 10), we observe a significant improvement of 16.34%. While this figure suggests a considerable enhancement, it may also reflect varying volumes ordered by the customer. Build time can also be affected by the variation of assembly variants; more complex variants typically require longer assembly times (see Figure 11).
The average build time in opportunities for rebalance (see Figure 12) highlights the areas to focus on for rebalancing across all stations, calculated as the difference between the process time on the station and the MOST standard time. Attention should be directed to areas with a negative value, indicating processing times that exceed standard times.
The module quality (see Figure 13) focuses on two main indicators. The number of defects box compares the number of defects found before and after implementation, along with the percentage change. A red box for percentage change indicates an undesirable increase in the number of defects, while a green box represents a desirable reduction in defects post-implementation.
IPPM stands for Internal Parts Per Million, which is calculated as follows in Equation (2):
IPPM = (Total Number of Defective Parts in the Selected Time Range / Total Number of Parts Produced) · 1,000,000
The IPPM box also shows a comparison before and after implementation with the percentage change. A red box for percentage change indicates an undesirable increase in IPPM, while a green box reflects a desirable reduction in IPPM post-implementation.
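Equation (2) is straightforward to express in code (a sketch with example figures, not values from the study):

```python
def ippm(defective_parts, parts_produced):
    """Internal Parts Per Million per Equation (2)."""
    return defective_parts / parts_produced * 1_000_000

# e.g. 12 defective parts out of 48,000 produced in the selected range:
print(ippm(12, 48_000))  # -> 250.0
```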
The rework module (see Figure 14) indicates the percentage of reworks before and after implementation. The graph displays the number of repairs per station prior to and following the implementation.
The scanning module (see Figure 15) indicates the total amount of completed and not completed scans, evaluating it as the RFT% (Right First Time statistic). The graph below presents data from before and after the analysis of the selected period, along with the percentage increase or decrease.
The fasteners module (see Figure 16) presents statistics similar to those of the scanners module. This module indicates the total number of completed and not completed fastenings, evaluating it as the RFT%. The graph below displays data from before and after the analysis of the selected period, including the percentage increase or decrease.
The tester module (see Figure 17) indicates the number of completed tests with no defects, the number of incomplete tests with defects and their total percentage. The graph below displays data from before and after the analysis of the selected period, including the percentage increase or decrease.
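The RFT statistic shared by the scanning, fastener, and tester modules can be computed uniformly; a minimal sketch with hypothetical shift totals (the helper name is illustrative):

```python
def rft_percent(completed: int, not_completed: int) -> float:
    """Right First Time: share of operations completed without a defect/retry."""
    total = completed + not_completed
    return round(completed / total * 100, 2) if total else 0.0

# Hypothetical shift totals for the scanning module:
print(rft_percent(completed=1940, not_completed=60))  # 97.0
```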

4. Discussion

This research presents an in-depth case study on digitizing production data within a multinational automotive company using Palantir Foundry. Through the CTD initiative, the research highlights the practical benefits of three key applications—workshop, shift report and PIT—for efficient monitoring, reporting and improvement of production processes.
The workshop application enables real-time monitoring, flagging deviations in cycle times and other process metrics that impact efficiency. By eliminating manual comparisons, the application supports rapid responses to production issues, with metrics like cycle punctuality and RFT serving as primary indicators of performance.
Shift reports compile production data across shifts, streamlining daily and weekly evaluations of KPIs and enabling a structured overview of long-term production trends. The PIT application then facilitates in-depth analysis of production changes before and after implementing the Foundry solution. This feature is essential for identifying successful adjustments and areas for further optimization, with metrics such as IPPM for quality and rework ratios highlighting performance improvements.
Overall, the digitized approach presented simplifies data handling, ensures standardized reporting across plants and minimizes the potential for human error. The success of the CTD project illustrates how digitized, modular solutions like Foundry can enhance operational efficiency, support dynamic improvements and provide a scalable model for other departments and companies aiming to modernize and optimize production environments. However, the solution described is primarily effective for large enterprises. For micro, small or medium-sized enterprises, the high initial and operating costs may render this solution less accessible.
The Palantir Foundry platform, with its full modularity, offers significant potential for a wide range of industries. It provides robust capabilities for big data evaluations, which can extend to additional production analyses and evaluations, such as tool monitoring for maintenance operations, logistics management, health and safety evaluations and more. Live data can be processed and analyzed in real time using Foundry applications and stored in the cloud for long-term analysis, enabling comprehensive insights across various operational domains.

5. Conclusions

The Palantir Foundry solution described in this document demonstrates significant potential as a versatile tool for data collection and evaluation, with extensive modular capabilities. This solution enables users to move away from conventional analysis methods, saving time on activities that can be automated using advanced technology. Additionally, its ability to identify and address inefficiencies in real-time through automated systems not only boosts productivity but also offers substantial financial savings.
As a fully modular platform, Foundry allows developers to create diverse applications beyond production health monitoring, such as financial tracking, tool maintenance monitoring and more. This flexibility makes it an invaluable resource for improving processes across multiple departments and functions within manufacturing environments. The Foundry platform thereby supports a streamlined, data-driven approach that is both adaptable and scalable across various operational needs.

Author Contributions

Conceptualization, P.K. and J.J.; methodology, P.K. and J.J.; software, P.K.; validation, P.K., J.J. and J.F.; formal analysis, P.K., J.J. and J.F.; investigation, P.K. and J.J.; resources, J.J. and J.F.; writing—original draft preparation, P.K., J.J. and J.F.; writing—review and editing, P.K., J.J. and J.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Data are contained within the article.

Acknowledgments

This paper was developed within the project implementations KEGA 038TUKE-4/2024, Increasing the applicability of graduates on the labor market by implementing practice requirements into the teaching process of the environmental engineering study program; and KEGA_010ŽU-4/2023, Innovative approaches in teaching in the field of transport studies focused on railway transport management with the support of risk and crisis management.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Bartel, A.; Ichniowski, C.; Shaw, K. How does information technology affect productivity? Plant-level comparisons of product innovation, process improvement, and worker skills. Q. J. Econ. 2007, 122, 1721–1758.
2. Sarbu, M. The impact of industry 4.0 on innovation performance: Insights from German manufacturing and service firms. Technovation 2021, 113, 102415.
3. Denicolai, S.; Zucchella, A.; Magnani, G. Internationalization, digitalization, and sustainability: Are SMEs ready? A survey on synergies and substituting effects among growth paths. Technol. Forecast. Soc. Chang. 2021, 166, 120650.
4. Ashouri, S.; Hajikhani, A.; Suominen, A.; Pukelish, L. Measuring digitalization at scale using web scraped data. Technol. Forecast. Soc. Chang. 2024, 207, 123618.
5. Rossi, A.H.G.; Marcondes, G.B.; Pontes, J.; Leitão, P.; Treinta, F.T.; De Resende, L.M.M.; Mosconi, E.; Yoshino, R.T. Lean Tools in the Context of Industry 4.0: Literature Review, Implementation and Trends. Sustainability 2022, 14, 12295.
6. Folgado, F.J.; Calderón, D.; González, I.; Calderón, A.J. Review of Industry 4.0 from the Perspective of Automation and Supervision Systems: Definitions, Architectures and Recent Trends. Electronics 2024, 13, 782.
7. Gobble, M.M. Digitalization, digitization, and innovation. Res. Technol. Manag. 2018, 61, 56–59.
8. Buer, S.V.; Strandhagen, J.W.; Semini, M.; Strandhagen, J.O. The Digitalization of Manufacturing: Investigating the Impact of Production Environment and Company Size. J. Manuf. Technol. Manag. 2021, 32, 621–645.
9. Gao, D.; Yan, Z.; Zhou, X.; Mo, X. Smarter and Prosperous: Digital Transformation and Enterprise Performance. Systems 2023, 11, 329.
10. Radicic, D.; Petković, S. Impact of Digitalization on Technological Innovations in Small and Medium-Sized Enterprises (SMEs). Technol. Forecast. Soc. Chang. 2023, 191, 122474.
11. Maroufkhani, P.; Ismail, W.K.W.; Ghobakhloo, M. Big data analytics and firm performance: A systematic review. Information 2019, 10, 226.
12. Storey, V.C.; Woo, C. Data challenges in the digitalization era. In Proceedings of the 28th Workshop on Information Technologies and Systems, Santa Clara, CA, USA, 16–18 December 2018.
13. Shi, P.; Cui, Y.; Xu, K.; Zhang, M.; Ding, L. Data Consistency Theory and Case Study for Scientific Big Data. Information 2019, 10, 137.
14. Gillon, K.; Brynjolfsson, E.; Mithas, S.; Grin, J.; Gupta, M. Business analytics: Radical shift or incremental change? Commun. Assoc. Inf. Syst. 2014, 34, 287–296.
15. Krajný, P.; Janeková, J. Data Digitalization in an Industrial Enterprise. In Proceedings of the 21st International Scientific Postgraduate Conference for Faculties of Mechanical Engineering of Technical Universities and Colleges Novus Scientia 2024, Košice, Slovakia, 25 January 2024.
16. Tabim, V.M.; Ayala, N.F.; Marodin, G.A.; Benitez, G.B.; Frank, A.G. Implementing Manufacturing Execution Systems (MES) for Industry 4.0: Overcoming buyer-provider information asymmetries through knowledge sharing dynamics. Comput. Ind. Eng. 2024, 196, 110483.
17. Benitez, G.B.; Ghezzi, A.; Frank, A.G. When technologies become Industry 4.0 platforms: Defining the role of digital technologies through a boundary-spanning perspective. Int. J. Prod. Econ. 2023, 260, 108858.
18. Ashrafi, A.; Zare Ravasan, A. How market orientation contributes to innovation and market performance: The roles of business analytics and flexible IT infrastructure. J. Bus. Ind. Mark. 2018, 33, 970–983.
19. Papadopoulos, T.; Singh, S.P.; Spanaki, K.; Gunasekaran, A.; Dubey, R. Towards the next generation of manufacturing: Implications of Big Data and Digitalization in the context of Industry 4.0. Prod. Plan. Control 2021, 33, 101–104.
20. Ji-fan Ren, S.; Wamba, S.F.; Akter, S.; Dubey, R.; Childe, S.J. Modelling quality dynamics, business value and firm performance in a big data analytics environment. Int. J. Prod. Res. 2016, 55, 5011–5026.
21. de Oliveira, N.J.; Bruno, L.F.C.; Santiago, S.B.; de Oliveira, M.C.; de Lima, O.P. The impact of digitalization on the plastic injection production process. Rev. De Ges. E Sec.-Ges. 2023, 14, 332–346.
22. Schroeck, M.; Shockley, R.; Smart, J.; Romero-Morales, D.; Tufano, P. Analytics: The real-world use of big data. IBM Glob. Bus. Serv. 2012, 12, 1–20.
23. Mikalef, P.; Pappas, I.O.; Krogstie, J.; Giannakos, M. Big data analytics capabilities: A systematic literature review and research agenda. Inf. Syst. E Bus. Manag. 2018, 16, 547–578.
24. Rialti, R.; Marzi, G.; Ciappei, C.; Busso, D. Big data and dynamic capabilities: A bibliometric analysis and systematic literature review. Manag. Decis. 2018, 57, 2052–2068.
25. Ardito, L.; Scuotto, V.; Del Giudice, M.; Petruzzelli, A.M. A bibliometric analysis of research on Big Data analytics for business and management. Manag. Decis. 2019, 57, 1993–2009.
26. Wamba, S.F.; Mishra, D. Big data integration with business processes: A literature review. Bus. Process Manag. J. 2017, 23, 477–492.
27. Chen, H.Q.; Xin, H.W.; Teng, G.H.; Meng, C.Y.; Du, X.D.; Mao, T.T. Cloud-based data management system for automatic real-time data acquisition from large-scale laying-hen farms. Int. J. Agric. Biol. Eng. 2016, 9, 106–115.
28. Gittler, T.; Plümke, L.; Silani, F.; Moro, P.; Weiss, L.; Wegener, K. People, Process, Master Data, Technology: Data-Centric Engineering of Manufacturing Management Systems. In Proceedings of the 8th International Conference on Competitive Manufacturing, Stellenbosch, South Africa, 9–10 March 2022; pp. 447–462.
29. Adrita, M.M.; Brem, A.; O'Sullivan, D.; Allen, E.; Bruton, K. Methodology for Data-Informed Process Improvement to Enable Automated Manufacturing in Current Manual Processes. Appl. Sci. 2021, 11, 3889.
30. Locklin, A.; Jazdi, N.; Weyrich, M.; Przybysz-Herz, K.; Libert, R.; Ruppert, T.; Jakab, L. Tailored digitization with real-time locating systems: Ultra-wideband RTLS for production and logistics. ATP Mag. 2021, 3, 76–83.
31. Clancy, R.; O'Sullivan, D.; Bruton, K. Data-driven quality improvement approach to reducing waste in manufacturing. TQM J. 2023, 35, 51–72.
32. Pfirrmann, D.; Voit, M.; Eckstein, M. Quality control of a milling process using process data management in the aerospace industry. MM Sci. J. 2019, SI, 3067–3070.
33. Li, W.C.; Yang, X.Q.; Yin, X.Q. Digital transformation and labor upgrading. Pac.-Basin Financ. J. 2024, 83, 102280.
34. Beneventano, D.; Vincini, M. Foreword to the Special Issue: Semantics for Big Data Integration. Information 2019, 10, 68.
Figure 1. Current state of data collection and evaluation.
Figure 2. Proposed solution for the collection and evaluation of production data.
Figure 3. Dashboard view for shop floor overview in CTD application.
Figure 4. Shift report paths overview in contour application.
Figure 5. Example of dashboard widget for over takt time monitoring.
Figure 6. Color management on shop floor overview.
Figure 7. Shift report—dashboard view for long-term analysis.
Figure 8. KPIs used for evaluation and monitoring of application usage.
Figure 9. Indicators monitored using CTD application.
Figure 10. PIT statistics of monitoring of Jobs Per Hour (JPH) calculations.
Figure 11. Build time calculations.
Figure 12. PIT opportunities for rebalancing.
Figure 13. PIT and module for quality monitoring.
Figure 14. Rework module.
Figure 15. PIT scanners module.
Figure 16. PIT fasteners module.
Figure 17. PIT testers module.
Table 1. Option code and regular expression example.
Option Code Example | Regular Expression Example | Description | Variant
A1PG041xx5xxxx.RB4MNNN00P10001 | ^.{16}(B4M).+ | starts with 16 arbitrary characters, then looks for "B4M" | Sport
A1PG041xx5xxxx.RB4MNNN00P10001 | ^.{25}[1].+ | starts with 25 arbitrary characters, then looks for "1" | Leather
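The variant detection in Table 1 can be reproduced with a standard regex engine; a sketch using Python's re module, with the option code and both patterns taken directly from the table:

```python
import re

option_code = "A1PG041xx5xxxx.RB4MNNN00P10001"

# Patterns from Table 1: skip a fixed number of characters from the start
# of the option code, then look for the variant marker.
sport_pattern = re.compile(r"^.{16}(B4M).+")    # Sport variant
leather_pattern = re.compile(r"^.{25}[1].+")    # Leather variant

print(bool(sport_pattern.match(option_code)))    # True
print(bool(leather_pattern.match(option_code)))  # True
```

The example option code encodes both markers, so both variants match; in practice each pattern would be evaluated against the code of the unit currently in production.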
Table 2. Plant Improvement Tracker modules and their characteristics.
Module | Characteristics
Production | Focused on job statistics: total output of the station/line and average Jobs Per Hour (JPH) statistics.
Build time | Focused on average build time per station and total build time.
Opportunities for rebalance | Evaluates in-process time per station before and after and shows the potential for rebalancing the process.
Quality | Focused on comparison of the number of defects before and after in multiple formats, such as Internal Parts Per Million (IPPM), defects per week, defects per weekday and defects per hour.
Rework | Rework statistics are divided into multiple groups: rework ratio before and after, number of reworks per station, and duration of repairs per station.
Scanners and fasteners | The main indicator is RFT, together with performance and the completed and not completed scans or fastenings.
Testers | Tester RFT is evaluated with information about completed and not completed tests.
Share and Cite

Krajný, P.; Janeková, J.; Fabianová, J. Data Digitization in Manufacturing Factory Using Palantir Foundry Solution. Processes 2024, 12, 2816. https://doi.org/10.3390/pr12122816