Article

Leveraging Quantity Surveying Data and BIM to Automate Mechanical and Electrical (M & E) Construction Planning

1 Laboratoire d’Innovation Numérique pour les Entreprises et les Apprentissages au service de la Compétitivité des Territoires LINEACT CESI, 8 Rue Isabelle Autissier, 17140 Lagord, France
2 EQUANS, 15 Rue Nina Simone, 44000 Nantes, France
3 Ecole Nationale Supérieure des Arts et Métiers ENSAM, 151 Boulevard de l’Hôpital, 75013 Paris, France
4 Laboratoire d’Innovation Numérique pour les Entreprises et les Apprentissages au service de la Compétitivité des Territoires LINEACT CESI, la Défense FR, 1 Avenue du Président Wilson, 92074 Paris, France
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(9), 4546; https://doi.org/10.3390/app12094546
Submission received: 6 April 2022 / Revised: 21 April 2022 / Accepted: 27 April 2022 / Published: 29 April 2022
(This article belongs to the Topic Industrial Engineering and Management)

Abstract

Despite the great potential of LPS and BIM to improve construction project productivity, the full integration of these modern production and information management systems at the data processing level is not yet achieved. After matching the literature to empirical studies in a Constructive Research Approach, it emerged that very few studies have investigated how buildings’ data could be preserved and continuously evolve during the project lifecycle. Accordingly, we underline the potential role of data warehousing in rendering operational data as a strategic asset for decision making. These findings motivate the present research, which aims to capitalize on quantity surveying data in order to automate the generation of M & E installation schedules. This paper first introduces the system functional requirements. Then, it proposes a conceptual scheme for the planning data mart (a data warehouse subset dedicated to planning subject area). Furthermore, we shed light on the M & E fragnet standardization procedure and how data have been processed. Finally, we present the current software developments to demonstrate the feasibility of this concept.

1. Introduction

Schedule slippages and massive cost overruns are typical failures of construction projects across the globe. Professionals underline the correlation between project failures and labor productivity, which can be measured by time-on-tool indicators [1]. The latter consist of the effective working time of crews, excluding anything that prevents laborers from working, such as waiting times, movements, planning, etc. Time-on-tool analysis has shown that fewer than 4 out of 10 h in a day are productive [1]. Thus, removing the bulk of distractions and constraints that are known to impact labor productivity is key to improving construction project performance. To achieve this, installation crews must have all the information and resources needed to complete their work. This is the aim of Workface Planning (WP), which refers to the definition, creation, execution and tracking of Installation Work Packages (IWPs) by Workface Planners. IWPs are small groupings of work tasks for execution by a single crew in a short time period [2].
There are many similarities between Workface Planning and the Last Planner System® (LPS). On one hand, LPS is a collaborative planning process that follows Lean Construction (LC) principles. It involves trade foremen and design team leaders in planning in greater and greater detail as the work deadline gets closer. It encompasses multiple components: master planning and phase planning to define what SHOULD be done, lookahead planning based on constraint identification and removal to establish what CAN be done, weekly work planning based on reliable promises of what WILL be done and learning based on analysis of the Planned Percent Complete (PPC) and reasons for variance [3]. On the other hand, WP is the last component of an overall project management methodology, Advanced Work Packaging, which focuses more on engineering and procurement prioritization and early project sequencing efforts. Workface Planning includes IWP creation, constraint and backlog management, progress monitoring, etc. Workface Planners or Last Planners perform the function of production management, so they should be skilled enough to break down the work scope of each discipline into installation operations. Along with their inherent experience, planners rely primarily on engineering information such as the materials list, drawings, specifications and vendor information to build executable plans. Likewise, each project stakeholder counts on the preceding knowledge to carry out their activities [4].
It is important to note that a piece of information becomes knowledge when it is useful and relevant to a specific subject. Knowledge sits higher in the hierarchy than information; it is also perceived to be a more valuable and competitive resource in most construction organizations. As a general rule, knowledge can be either explicit or implicit [5]. Explicit knowledge includes information contained in databases and documents. This information is quite shallow and does not contain deep experience-based knowledge. On the other hand, implicit knowledge refers to “know-how” information that employees have already learned through experience. This kind of information is hard to communicate or transfer between people and information management systems.
Since construction project processes are interdependent, efficient and relevant information flows are the backbone of all successful construction projects. Unfortunately, the dynamic and fragmented nature of the construction industry hinders the exchange of valuable information between the different actors in the project environment. The incompatibility between project stakeholders’ systems prevents them from rapidly sharing accurate project information and results in numerous data consistency problems and added costs. Within this context, Workface Planners are unable to access and use commercial and engineering data to develop consistent installation schedules. Hence, poor information and knowledge management implies a weak production management system that leads to various forms of waste in construction, including rework, waiting, over-design and extension of the overall project duration [6]. Moreover, construction is often criticized as a sector that lacks standardization. On most projects, Workface Planners are left to devise their own production management and control system [7]. These issues are mainly addressed by innovative production management concepts, namely the Lean philosophy.

1.1. Applying Lean Construction Principles to Waste Management

The last few years have witnessed the emergence of a new production management philosophy termed Lean Construction. Naqib Daneshjo defined production management as the “planning, organizing, directing and controlling of production activities. Production management deals with converting raw materials into finished goods or products. Production management also deals with decision making regarding the quality, quantity, cost, etc., of production” [8]. Thus, Lean Construction is the application of production management to deliver optimal construction projects by maximizing customer value and minimizing waste. The desired outcomes are attained through methodical, collaborative and continuously improved design and building processes and flows. Ultimately, Lean is about moving ever closer to uninterrupted flow in the sequence of operations that delivers perfect quality. In fact, it concerns the continuous learning and improvement of production operations. The term “flow” refers not only to the physical products and services but also to the information necessary to run operations [9]. It is worth noting that a company’s Lean transformation is a unique journey. Even if there is no silver bullet to follow in a Lean transformation, Lean House frameworks may help to decide the approach, set the priorities and select the appropriate tools. For instance, the conventional “House of Lean” developed by Toyota rests on two pillars: Just-in-Time and Jidoka (Flow and Quality, or “Go” and “Stop”). It also suggests building the foundations by means of Gemba walks, Standardization, Visual Management, 5S, etc., before dealing with the aforementioned pillars [9]. Surprisingly, the Last Planner System® (LPS) and Just-in-Time (JIT) are the most practiced Lean techniques in the construction industry, both addressing the flow pillar [10,11]. Aside from the steady adoption of 5S and Visual Management by construction companies, standardization is still embryonic in the construction sector. Yet standardized work is fundamental to keeping production as close as possible to continuous flow [12]. For this reason, the construction industry must endeavor to standardize its processes.
While the Lean Construction techniques already in practice have had positive impacts on construction project performance, many firms find Lean Construction methodologies challenging to implement. Many barriers to Lean Construction implementation have been identified, including the lack of top management support, reluctance to change and the budget needed to establish new processes and tools. Moreover, the lack of information sharing appears to be among the top five barriers to successful Lean production management in the construction industry [13]. Therefore, it is important to capture and share the knowledge that is generated throughout the project life cycle in order to enable improvements in decision making based on continuous learning. To this end, technology can accelerate the implementation of Lean Construction practices, especially BIM, which appears to be the architecture, engineering and construction industry’s key shift toward data-driven construction.

1.2. BIG BIM as a Support to Continuous Information Flow

Firstly, BIM, which in a narrow sense is known as “little BIM,” is advocated as a central information management hub of the physical and functional characteristics of a facility. Within this definition, the acronym BIM stands for the Building Information Model. Secondly, BIM also refers to Building Information Modeling, which is the set of processes and technologies that enable architects, engineers and construction professionals to generate a digital model [14]. These actors enrich, update and share the compilation of structured data hosted in the building model. Building Information Modeling software allows building components to be modeled as parametric objects with geometric and non-geometric attributes that represent functional, semantic or topological information [15]. Lastly, BIM designates Building Information Management, which refers to the definition, organization and supervision of data exchange processes throughout the building life cycle [14]. The definitions proposed in the literature must be used with caution, since there is not a single satisfactory description of what BIM is. For the purpose of this study, BIM is considered to be a way of managing information to improve understanding throughout the building life cycle, synonymous with the BIG BIM view or the broader sense of BIM. According to Finith Jernigan [16], “BIG BIM promotes real sustainability. It connects the dots, improves information flow, and supports integration. It interacts with the world we inhabit […] by assembling knowledge and integrating a long view of our environment.” Table 1 displays the differences between little BIM and BIG BIM.
However, the BIG BIM concept—which promotes continuous information flow throughout the building life cycle—is not yet mature in practice [19]. In addition, there is little consideration of how experience-based knowledge, which, as explained earlier, is a company’s most valuable competitive asset, can be effectively captured and used for continuous improvement with BIM. Accordingly, researchers are invited to emphasize work on BIM-based knowledge management throughout the project life cycle and to try to solve interoperability issues between different BIM tools as well as other information management systems [20].

1.3. BIM and Lean Synergy

In the last decade, research in LC and BIM has demonstrated a considerable synergy between the two concepts. Indeed, they show clear intersections in their objectives; both BIM and LC focus on understanding client requirements and reducing waste to bring greater added value through the design and construction process [21]. While it is assumed that the joint application of LC and BIM can foster construction project performance, the two concepts have been developed and mostly applied separately. Sacks et al. [22] first laid out an interaction matrix identifying 56 points of intersection between Lean Construction principles and BIM functionalities. Later, Ouskouie et al. [23] expanded the number of interactions and possibilities in the matrix as BIM use quickly developed over the project life cycle. The authors of this study also conducted a literature review on the LC and BIM interactions most explored by researchers. The proposed weighting matrix showed that 4D BIM-based visualization of construction schedules produced and updated during LPS rituals is the most prominent and promising LC and BIM interaction at the construction stage [24]. Consistent with Schimanski et al.’s [25] results, a deep exploration of BIM-based production management systems revealed a lack of full integration between BIM and LPS at the data processing level [24]. In addition, the authors highlighted the relevance of automating the generation of phase schedules as part of a true BIM and LPS integration [24]. Automation is also a way to ensure the continuous use and improvement of building information and to reduce data loss. This implies that each actor in the construction chain should become an intermediate customer for the preceding stages. In this way, every project member will be able to “pull” valuable information generated at previous phases to support their own activities, which aligns with LC and BIG BIM goals. One can even say that Lean Construction is achieved through BIG BIM implementation. Construction schedule automation in real-life projects, which is the issue addressed in this paper, is only one aspect of BIG BIM implementation.

1.4. Problem Relevance Preliminary Study

The preceding literature review details the theoretical foundation of the Workface Planning issue in the construction industry. As the problem area includes both theoretical and practical concerns, empirical evidence from real-life building projects was needed to legitimize the relevance of the selected problem. Hence, we conducted a survey involving cross-functional teams within the organization with which we collaborated for the purpose of this research. The company specializes in heating, ventilation and air conditioning, and the participants interviewed included representatives, operations directors, project managers, design technicians, foremen and laborers. We collected 100 answers through the internal company network “DigitaLean,” and we believe that the number of responses was representative enough to reveal common trends and needs. The participants were asked about their familiarity with LPS and BIM practices as well as the functionalities most expected as part of a BIM approach. The questionnaire revealed that the firm employees’ acquaintance with LC techniques, especially LPS, was very limited. Indeed, 14 respondents had tried the LPS methodology, 62 did not know about it and 20 had just heard of it. In addition, the organization had acquired 3D modeling expertise in recent years. However, some discrepancies in 3D Building Information Modeling skills were observed between local agencies. Concerning future needs, almost 70% of the interviewees were looking forward to deploying the 4th dimension of BIM related to planning capabilities. The main desired features (Figure 1) were: 3D model visualization, supply chain management, progress monitoring and project scheduling.
Cross-checking the aforementioned theoretical and practical setbacks revealed the importance of Workface Planning automation for moving forward with Lean and BIM integration. Therefore, this paper sheds light on how to leverage quantity surveying data and BIM to automate the generation of mechanical and electrical (M & E) Workface schedules.

2. Research Objectives and Methodology

2.1. Research Objectives

The present research seeks to improve the production management of M & E installation works by proposing an automated planning method that enables detailed installation tasks to be defined and estimated more easily. Specifically, we aimed to design an IT artifact that harnesses project estimates, design information and M & E expert knowledge to automate the generation of mechanical and electrical schedules (Figure 2). The idea was to offer M & E managers and foremen early insight into the construction schedule using project estimate information and then to update that schedule as the Building Information Model is enriched. Furthermore, the generated output must form the basis for the progress tracking of M & E on-site construction operations. Throughout this research, standardized M & E work templates were defined for each material category using expert know-how. Rendering this know-how explicit is crucial for producing an automated schedule. Furthermore, this study presents the users’ functional requirements, the architecture of the automated M & E planning Web application and its conceptual data model, as well as the current development state of the user interface.
This proposed solution will relieve M & E managers and foremen of the extremely laborious and time-consuming Workface Planning activities. It will also replace the sub-optimal ad hoc decisions made by those professionals to determine such tasks. Above all, the main driver for this concept is ensuring data continuity between quantity surveyors and design and execution teams. This paper only focuses on information that is relevant to the detailed planning of M & E installations.

2.2. Research Methodology

The underlying research strategy consisted of the Constructive Research Approach (CRA), described by Lukka [27] as a methodology that creates innovative constructions (or artifacts, as they are called in Design Science Research (DSR)) to solve real-world problems. CRA and DSR are quite similar from a process perspective; both go from problem awareness and definition to solution proposition, artifact development and evaluation. In terms of application domains, DSR is predominantly applied in the information science field, while CRA has been adopted in both information science and general management disciplines. However, CRA and DSR follow different trajectories in problem solving. The basic logic of DSR is deductive; it relies on the application of previous knowledge through a specific kernel theory in the design. CRA reasoning, by contrast, is abductive; it follows a softer, more intuitive and creative approach, which implies going “back and forth” between theory and empirical study [28]. The use of fundamental theories is not forbidden in CRA, but it is not a requirement, either. Moreover, the process description of CRA puts more weight on the collaboration between the researcher and practitioner with the aim of learning through experience. The CRA guidelines require that the artifact be developed in close collaboration with the target organization, whereas DSR does not prescribe a definitive mode of collaboration, even though it requires that the resulting artifact be evaluated rigorously. The CRA process proposed by Lukka [29] outlines the steps and main activities conducted in this research. These steps are mapped according to the abductive process in Figure 3:
  • Find a practically relevant problem with the potential for theoretical foundation: The problem is selected through the researcher’s personal experience in the field, expert interviews and literature search. For this study, the research topic dealt with the automation of construction Workface Planning as part of a true BIM and LPS integration at the data processing level. The issue was selected based on a literature review of Lean Construction and Building Information Modeling interactions (Section 1.3) as well as a qualitative survey conducted within the hosting company (Section 1.4).
  • Examine the potential for cooperation with the target organization(s): A project team is organized around the problem, and a formal agreement outlining the research activities, schedule, key milestones, funding and access to information is established to ensure the commitment of the stakeholders. This research is undertaken as an industrial fellowship program; thus, it implies a contractual partnership between the research candidate and the academic and industrial practitioners. This partnership led us to focus on the organization’s M & E discipline. Moreover, the organization’s interest in data continuity drove our reasoning.
  • Obtain a deep understanding of the topic area: The objective is to gain a profound understanding of the organization’s practices. The researcher should be well-informed about prevailing theories through a literature review and be able to place their research in the context of existing knowledge. To meet the objectives of this phase, we conducted a thorough literature review on construction schedule automation techniques (Section 3.1), carried out interviews with competent practitioners and completed a 3-month immersion in an MEP (Mechanical, Electrical and Plumbing) project to depict the current planning processes within the target organization. Moreover, we examined the firm’s quantity surveying and design tools and technologies in order to identify the relevance and availability of the project’s estimate and design data for Workface Planning (Section 3.2). The latter analysis revealed many challenges regarding information consistency and loss in real-life projects.
  • Innovate a solution and develop a problem-solving construction: Here, the researcher develops a conceptual solution and studies its feasibility. If it is not feasible, the research is either dropped or significantly changed; otherwise, a prototype of the solution is developed in an iterative (trial and error) fashion. In this phase, we mapped an ambitious technical architecture for data transfer along the MEP business process based on a data-warehousing concept (Section 4.1). The feasibility study revealed some data availability issues that would require substantial development efforts in third-party estimating software, the standardization of MEP article codification and nomenclature throughout the project life cycle, and change management work. All of these are preliminary issues that must be solved before the full implementation of the proposed solution. Despite this, the firm was still motivated to carry out a proof of concept for project scheduling automation that reckons with the current data consistency constraints. In parallel to this research study, the company had also launched a project to rationalize its estimating processes. Under these circumstances, the research team tailored the architecture of the proposed solution, formulated the minimum viable product functional requirements and started the software prototyping (Section 4.2, Section 4.4 and Section 4.5). In addition, the M & E work sequences that form the basis for an automated Workface schedule were needed at this stage. Thus, we modeled standard M & E fragnets in collaboration with competent M & E project managers (Section 4.3).
  • Implement and test the solution: The solution needs to be tested from the technical and process perspectives in a real-life project within the chosen organization. This step was carried out simultaneously with the application development, following a test-and-learn logic (Section 4.5). Realistic mock data were used during the development stage for testing the functionalities under conditions that closely simulated a production environment.
  • Ponder the scope of applicability of the solution: An analysis of the solution’s implementation is carried out. If the application is successful, a diffusion of the concept to the wider industry must be performed.
  • Identify and analyze the theoretical contribution: At this stage, the researcher analyzes the findings and decides the implications for the original theories.

3. State of the Art

3.1. Automation of Construction Scheduling Techniques and Previous Works

On a theoretical level, a large body of research has addressed construction planning automation over the last three decades. The suggested scheduling engines have evolved from user-driven to model-driven systems with respect to project scope quantification [30]. The earlier tools required the manual entry of activities or components into a list, while modern tools take advantage of product modeling software advances such as computer-aided design (CAD) and BIM software. For instance, De Vries and Harink [31] derived vertical and horizontal relationships between CAD model components and used an external database to calculate activity durations and select appropriate equipment and labor depending on the component type. Likewise, Kim et al. [32] extracted Building Information Model materials, locations and quantities, then calculated activity durations using production rates from the RSMeans database. Furthermore, Liu et al. [33] used both topological adjacencies embedded in the Building Information Model and process patterns sourced from a Work Breakdown Structure database to perform construction sequencing. Accordingly, research developments that reason about the project scope from a CAD or a BIM model are mature enough. In spite of all those advances, most—if not all—construction projects still rely on fully manual scheduling practices. Amer et al. [30] revealed three issues that hinder the wide adoption and scaling of automated planning engines. The first insight relates to the rigidity of activity and sequencing knowledge representation. Indeed, the examination of activity modeling and sequencing techniques showed that a substantial amount of manual work is required to create and maintain the know-how information bases. A large number of artificial intelligence-driven planning systems do not support learning from historical knowledge related to previous projects. Thus, it is of utmost importance to generalize activity and sequencing knowledge templates by parsing through previous project knowledge records and automatically learning from them without the need for extensive human input. The second insight is that current automated planning systems are rarely validated on real-world projects or across the entirety of a project. Scheduling engines have been tested on a few types of buildings, mainly offices, universities and residential buildings, with a limited set of building components that were primarily structural. Conversely, healthcare facilities, industrial buildings and MEP installations have received little attention in previous planning automation trials. Finally, the third insight concerns the decoupling of automated planning and schedule optimization research, which leaves those solutions suboptimal and consequently less appealing in real-world projects. For example, Isaac et al. [34] proposed a method for the automated scheduling of M & E systems by taking advantage of the components’ topological relationships in BIM. The generated schedule was optimized to conform to multiple constraints prescribed by practitioners and was represented using a Line of Balance diagram. The graphic format explicitly shows the components that need to be installed, their locations, required resources and other relevant data needed to execute the work.
However, the representation does not allow tracking subsidiary activities related to conversion tasks, such as material reception, handling and pipe connecting, which require a high level of activity modeling granularity. In the future, the ability of automated scheduling systems to perform hierarchical planning at different levels of granularity throughout the project life cycle should support Lean workflows [30]. In this way, short-term and long-term plans will be aligned through the automatic updating and continuous detailing of schedules from the master schedule level, through the lookahead planning level, to the weekly work planning level.

3.2. Ongoing Planning Practices within the Partner Organization

As part of the Constructive Research Approach, a deep understanding of the target organization’s challenges is necessary to highlight the issues that are usually not reported or emphasized in the literature. This step of the research was conducted through a 3-month immersion in an MEP project and numerous interviews with experienced practitioners. The analysis revealed that there was no unique and homogeneous planning method within the firm. MEP managers were left to devise their own scheduling and control systems. Naturally, planners used estimate data generated at the tendering phase of the project in order to quantify their scope workload. Those estimates included lists of materials for each discipline, per location and MEP system, and comprised estimated working times and material costs per unit as well. Material man-hours and costs per unit were retrieved from the company’s internal article database. Given the lack of a common formalization of activities and sequencing in the firm, every planner defined each task and its dependencies at a level of detail that seemed suitable. Then, the planner manually and approximately mapped the relationships between the tasks and material installation durations. The practitioners also confirmed that they usually use the master schedule milestones, defined by the construction manager, to constrain task start and end dates. The produced Workface schedule was represented via a Gantt diagram in an Excel worksheet that showed only installation tasks. A macro view of the installation schedule was sometimes nested in a long-term plan that included the milestones of design, drawing coordination, MEP system installation and commissioning. In addition to the MEP Gantt chart schedule, the most advanced planners used a simple and visual Microsoft Office 365 tool called Planner to organize and synchronize multiple MEP actors’ tasks, including design, procurement, security, administrative and commissioning activities as well as unforeseen on-site incidents. During the project’s installation stage, MEP foremen reported the progress and man-hour consumption for each task on the Gantt schedule. Those downstream reports enabled the project manager to forecast the cost at completion of the project (estimate at completion (EAC)). Figure 4 summarizes the project management process within the target company.
In the best circumstances, the Workface Planning exercise takes a full working week for an MEP project with a EUR 10 million turnover. Usually, planners struggle to assemble pieces of data collected from multiple tendering documents and quantity surveying tools. The main problem reported at this stage was the loss of the location and MEP system attributes of materials when drawing up the customer quote from quantity take-off sheets. Strictly speaking, one part of the relevant location-based planning information—namely, the location and system attributes—remained in quantity take-off sheets, while another valuable set of attributes—specifically, material installation time per unit—was generated within the estimating software. Considering this fragmented dataflow, planners needed to cross-reference the retrieved information with their own project knowledge in order to build accurate MEP Workface schedules. Because of the complexity and tediousness of this assignment, Workface Planning could be neglected by some MEP managers.
After all, it is known that all plans are forecasts and all forecasts are wrong. So, the key is planning how to “counterpunch”—how to respond effectively to a plan failure. Thus, planning is a vital process that involves constant corrections. This explains why building model information is as important as estimate data, since it allows the schedule to be updated. This conclusion led us to investigate the partner firm’s current modeling practices and tools. Today, almost 30% of the company’s MEP projects are modeled using Revit building information software at a level of detail (LOD) of 300; model elements are realistically represented as objects or assemblies, and their quantities, shapes, locations and orientations are accurate. Nevertheless, the BIM model alone does not allow for scheduling, since no installation processes are included in the objects’ details. In addition, we observed that the object codification in BIM models was not consistent with the article codification used in quantity surveying within the company. As a result, the definition of a standard and unique article codification for both quantity surveying and design teams was a big issue that the company had to deal with in order to reconcile estimate and building model information.
All things considered, the partner organization practiced little BIM in the narrow sense defined above. We noticed discarded information, incomplete building models and a disconnection from real-world work methods and costs. Therefore, we recommended applying BIM as a carrier to transfer the knowledge contained in different knowledge management tools throughout the project life cycle. This does not mean that everything has to be in a single model, as BIM software vendors may claim. It would be more accurate to describe BIM as a series of interconnected models and databases. Such an integrated framework can be achieved using a project data warehouse.

4. Development of a Prototype for Automated Generation of MEP Schedules

4.1. Software Principle and Its Integration in the Project Life Cycle

The main motivation for data warehousing is to transform operational data into strategic decision-making information. In general terms, a data warehouse is a database that is periodically filled with data retrieved from multiple databases for the purpose of analysis [35]. Easy access to adequately prepared data is beneficial in different decision support applications, such as management reporting, queries, decision support systems and executive information systems [35]. Essentially, a data warehouse ensures that the appropriate data are available to the appropriate end user at the appropriate time. The current trend in data warehousing is to develop a data warehouse with several smaller related data marts. A data mart is simply a subset of the data warehouse designed for a particular line of business or team, such as purchasing, sales or inventory [36]. As part of our project, a scheduling decision support system is built using a combination of raw data from project estimates and design as well as expert knowledge on activity sequencing. Thus, we designed a data mart model called “schedule_data” dedicated to translating complex business data into structured and usable computer information. In line with the firm’s strategic vision, the modeled data mart will be embedded in the organization’s data warehouse “db_AxiBIM”.
In an ideal scheme, the modeled data mart is first populated with quantity take-off sheet data that include a list of materials with their respective quantities, location (building and level name), MEP system name and technical features (diameter, air flow, fluid pressure, power, etc.). Secondly, quantity surveyors retrieve this information from the data mart to produce the service quote at the level of detail required in the cost breakdown template provided by the customer. At this stage, the latter information is enriched with estimated working time and cost attributes that must be saved into the data mart. Consequently, the first iteration of the installation schedule can automatically be created from the data mart records and an MEP fragnet database. In the meantime, the data mart provides a set of preliminary hypotheses for the aeraulic and hydraulic network balancing conducted by design teams using specific tools (Optimiz and Climawin, for this company). Following the same data continuity logic, the Revit modeling environment must use the data mart information to offer the designer a catalog of components with predefined properties. This functionality could reduce errors and shorten the project modeling duration, which is reportedly 20% longer using Revit than ordinary computer-aided design software within the firm. Once the MEP model is sufficiently mature, the data warehouse should be fed up-to-date project data so as to allow the installation plan to be adjusted to all design modifications. The proposed scheduling software is intended to become an operational system within the company for planning MEP works, estimating the weekly workload, tracking activity progress and reporting man-hour consumption. Figure 5 presents an integrated information framework mapping information flows between project processes within the company, motivated by the idea of data continuity.
Given that the envisioned information flow architecture is very ambitious, the research team prioritized the flows to be processed within the research prototyping scope.
The research prototype scope is limited to the information flows between the quantity take-off sheets, the modeled data mart, the M & E information embedded in the Revit model and the scheduling software to be developed (represented by yellow arrows). In addition, the research scope includes the creation of a standardized know-how information base on MEP work and sequencing as a data source, enabling the transfer of expert human planners’ comprehensive knowledge to automated planning systems.
The extension of the quote creation tool “Quick Devis” from a data producer to a data consumer requires more sophisticated studies and developments with a third-party partner of the organization. Thus, the research prototype will address neither this flow nor the one related to the interdependency between the aeraulic and hydraulic balance tools and the data mart. The progress reporting software will not be developed at this research project phase, either.

4.2. Functional Requirements

In this section, we undertook requirement engineering for the automated scheduling software in collaboration with a lead software engineer. The requirement engineering practice involved diverse tasks including stakeholder identification, requirement elicitation, business rules analysis and change analysis. In particular, the requirement elicitation was facilitated by the immersion period and interviews with future software users who participated in M & E fragnets standardization as well [37]. A high-level overview of the system functionalities is presented through a use case diagram, a behavioral Unified Modeling Language (UML) diagram (Figure 6). A use case diagram is a visual representation of users’ possible interactions with a system. It consists of actors shown as stick figures and use cases represented by circles or ellipses [38]. On one hand, actors are users that interact with the system. An actor can be a person, an organization or an outside system that produces or consumes data from the system to be developed. On the other hand, a use case is a set of actions, services and functions that the system needs to perform [39]. While actors and use cases are linked by association relationships, use cases are linked together by three basic types of relationships: generalization and two standard stereotypes, i.e., «include» and «extend». In UML modeling, a generalization relationship between use cases indicates that the child use cases inherit the properties of the parent use case. The «include» relationship indicates that the base use case requires the included use case to be completed. Finally, an «extend» relationship is used when a use case conditionally adds steps to another first-class use case [40].
Because experts have recommended supplementing the use case diagram with more descriptive textual functional requirements [41], we formulated more detailed system functionalities in Table 2. From a process perspective, the table describes how data should be processed to generate the desired outcome. It also elucidates the graphical representations and services required from a user perspective. The following requirement list is not exhaustive; it is evolving through prototyping iterations in line with eXtreme Programming software development.

4.3. Standardization of Construction Sequences

Transferring planning knowledge from human planners to computers using computer-interpretable representations is mandatory for creating automated planning systems. The challenge is that construction planning know-how is often ingrained through exposure to multiple project experiences. This knowledge lacks formal definition and structure, which makes transferring expert planners’ knowledge to scheduling systems very complex. Consequently, the project team set up workshops with experienced M & E managers to standardize the business planning sequences, using the tendering materials’ nomenclature as a yardstick. This nomenclature enumerates a large number of articles used to build customer quotes at the tendering phase. Each article within this reference list belongs, successively, to a group, a family and a super family (Figure 7).
First, the team defined, drawing on expert background and experience, the Bill of Process (BOP) related to each material group. In manufacturing, the BOP comprises detailed plans explaining the manufacturing processes for a particular product [42]. In construction, the BOP refers to the construction processes for installing a particular material. For instance, the stainless black steel distribution pipe group TAD belongs to the stainless black steel pipe family TUA, and its BOP includes the following operations (Table 3): controlling the arrival of painted pipes, pipe handling, placing the pipe hangers and supports, running pipes and fittings, testing of pipes, connecting pipes to massive equipment, connecting pipes to terminals, assembly control of hydraulic networks and final impoundment and water quality control. These operations will be instantiated for each project building and level within the automated M & E planning system. Furthermore, the task force noticed that several material groups could share the same BOP sequences; in this case, identical rules are applied to each of these material groups. For example, we observed that the stainless black steel manifold pipe group TAC and the distribution pipe group TAD belong to the same family TUA and share the same rules. Second, predecessors were identified for each operation (Table 3); these include prerequisites related either to M & E activities or to other construction project disciplines. Finally, each BOP step was assigned a rate that allows the installation duration of the material to be broken down into its respective operations (Table 3). For example, since the unit installation time of a TAD 10-065 article is 1.3 h per meter, the “running pipes and fittings” task will require 50% of the total duration of TAD 10-065 installation in a working area. To date, the project team has formalized the Bill of Process for 55 groups out of 467.
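To give a feel for how such a fragnet can be encoded in a computer-interpretable form, the sketch below expresses a truncated TAD Bill of Process as plain Java records. Only the TAD/TUA codes, the operation names and the 50% rate for “running pipes and fittings” come from the paper; the remaining rates, the cross-discipline predecessor and the record structure are illustrative assumptions, not the firm’s actual knowledge base.

```java
import java.util.List;

// Minimal sketch of a computer-interpretable Bill of Process (fragnet) template.
// Rates other than the 50% quoted for "running pipes and fittings" are placeholders.
final class BillOfProcessExample {

    /** One step of a fragnet: operation label, share of the unit installation time, predecessors. */
    record BopStep(String operation, double rate, List<String> predecessors) {}

    /** Standardized fragnet attached to a material group of the tendering nomenclature. */
    record BillOfProcess(String groupCode, String familyCode, List<BopStep> steps) {}

    static final BillOfProcess TAD = new BillOfProcess("TAD", "TUA", List.of(
            new BopStep("controlling the arrival of painted pipes", 0.05, List.of()),
            new BopStep("pipe handling", 0.05,
                    List.of("controlling the arrival of painted pipes")),
            new BopStep("placing the pipe hangers and supports", 0.15,
                    List.of("pipe handling", "working area released by structural works")), // cross-discipline prerequisite (placeholder)
            new BopStep("running pipes and fittings", 0.50,
                    List.of("placing the pipe hangers and supports")),
            new BopStep("testing of pipes", 0.10,
                    List.of("running pipes and fittings"))));
    // Truncated: the full TAD fragnet also covers connection to equipment and terminals,
    // assembly control of hydraulic networks, and final impoundment and water quality control.
}
```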

4.4. System Data Modeling

In this section, we modeled the “schedule_data” database that supports the scheduling automation process. Data modeling is defined as “the process of creating a visual representation of either a whole information system or parts of it to communicate connections between data points and structures” [43]. Like any design process, data modeling begins with a high level of abstraction for the conceptual model, progresses to a logical model and concludes with a more concrete physical model. The conceptual data model defines the system entities and their relationships. Then, the logical data model goes beyond the conceptual model by detailing the entities’ attributes as well as their primary and foreign keys. Finally, a physical data model is usually derived from a logical model, considering all technology-specific details of a particular relational database management system (RDBMS), such as table name, column name, datatype, constraints, indexes, primary key, etc. [44]. There are two main modeling methodologies for database design: UML and Merise. At the conceptual level of the modeling process, the class diagram and entity/relationship schema are used in UML and Merise, respectively [45]. The conceptual model of the data mart “schedule_data” is represented in the UML class diagram below. It comprises six main classes and three link classes:
  • t_level: a class containing the building level attributes such as starting and ending availabilities.
  • t_article: a class hosting the tendering materials’ nomenclature.
  • t_level_article: an associate class comprising the project’s Bill of Material in each level.
  • t_group: a class representing the groups and the classification of material during the tendering phase.
  • t_operation: a class containing a standardized list of M & E activities.
  • t_group_operation: a link class mapping material group operations with their respective rates.
  • t_level_model: a class that consists of the modeled levels in the Revit model at the execution phase.
  • t_level_element_model: this class comprises the material quantity take-off per level from the 3D BIM model.
Moreover, the semantic relationships between classes are specified in Figure 8 by means of the UML association concept. For instance, the classes t_level and t_article have a many-to-many association, realized through the link class t_level_article. During the conceptual database design, the researchers encountered a recursive association on the t_operation class, since each operation may have many prerequisite operations and these prerequisites are themselves operations. The physical database model of “schedule_data” was derived from the UML class diagram and implemented in the MySQL RDBMS.
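As a rough illustration of how this conceptual model translates into code, the sketch below mirrors the classes above as plain Java records. The class names follow the UML model described in the paper, but the attribute names and types are assumptions made for the sketch; they do not reproduce the firm’s physical MySQL schema.

```java
import java.time.LocalDate;
import java.util.List;

// Illustrative Java mirror of the "schedule_data" conceptual model.
// Class names follow the UML classes; attributes are assumptions, not the physical MySQL schema.
final class ScheduleDataModel {

    record Level(int id, String building, String name,
                 LocalDate availableFrom, LocalDate availableUntil) {}          // t_level
    record Article(String code, String label, String groupCode,
                   double unitInstallTimeHours) {}                              // t_article
    record LevelArticle(int levelId, String articleCode, double quantity) {}    // t_level_article (bill of material per level)
    record Group(String code, String family, String superFamily) {}             // t_group
    record Operation(int id, String label,
                     List<Integer> prerequisiteOperationIds) {}                 // t_operation (recursive association)
    record GroupOperation(String groupCode, int operationId, double rate) {}    // t_group_operation
    record LevelModel(int id, String building, String name) {}                  // t_level_model (levels in the Revit model)
    record LevelElementModel(int levelModelId, String articleCode,
                             double quantity) {}                                // t_level_element_model (take-off from the 3D model)
}
```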

4.5. Current Implementation Status

The latest release of the automated M & E scheduling engine has so far fully implemented a total of 12 out of 19 functional requirements from Table 2. The planning update using Revit model data has not been tackled yet. The current IT artifact is a Web application that uses a 3-tier architecture concept. The latter architecture separates the application into three layers: a presentation layer or user interface, a business logic layer and a data access layer [46] (Figure 9).
The presentation layer is built as a single-page client application with the Angular framework, using TypeScript, HTML and CSS. It communicates with the other layers through API (application programming interface) calls. First, the user interface displays the list of levels per building as defined by the quantity surveying team at the tendering stage. It also exhibits the list of non-M & E operations that constrain the start of M & E tasks. This content is loaded from a back-end service over the HTTP protocol. Then, the user is invited to enter the levels’ start and end availability dates as well as the prerequisite milestones related to other disciplines’ operations (Figure 10 and Figure 11). Validating that information triggers the calculation of the early start and finish dates of M & E activities on the server side, in the business logic layer, which supports the application’s core functions and calculations and is written in the Java programming language.
Indeed, the business logic layer itself retrieves the prerequisites list from the data layer, which is populated by standardized Bill of Process information as well as the project bill of material at the tendering phase. Moreover, M & E operations are first retrieved and then grouped by building name and level name. Their respective durations are calculated using the following model. We first define our variables:
  • $D_{ij}(x)$ is the total run time of the M & E operation $x$ performed in building $i$ at level $j$.
  • $z_{ij}$ is a material $z$ located in building $i$ at level $j$; we denote by $D_x(z_{ij})$ the run time of the operation $x$ associated with the material $z_{ij}$.
  • $\tau_x(z)$ denotes the rate of the operation $x$ for a material $z$.
  • $t(z)$ is the unit installation time associated with the material $z$.
  • $q(z_{ij})$ is the required quantity of the material $z$ in building $i$ at level $j$.
These variables are related to each other by the following equations:
$$D_{ij}(x) = \sum_{z_{ij}} D_x(z_{ij}),$$
$$D_x(z_{ij}) = t(z) \times \tau_x(z) \times q(z_{ij}).$$
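The minimal Java sketch below applies these two equations to an illustrative bill of material; the article code, rate and quantity are hypothetical values in the spirit of the TAD example above, not project data.

```java
import java.util.List;
import java.util.Map;

// Sketch of the duration model: D_ij(x) = sum over z_ij of t(z) * tau_x(z) * q(z_ij).
final class OperationDurationCalculator {

    /** One material line z_ij of the bill of material of building i, level j. */
    record MaterialLine(String articleCode, double unitInstallTimeHours, double quantity) {}

    /** Total run time D_ij(x) of operation x in the working area, in hours. */
    static double totalRunTime(Map<String, Double> rateOfOperationPerArticle, List<MaterialLine> billOfMaterial) {
        return billOfMaterial.stream()
                .mapToDouble(z -> z.unitInstallTimeHours()                              // t(z)
                        * rateOfOperationPerArticle.getOrDefault(z.articleCode(), 0.0)  // tau_x(z)
                        * z.quantity())                                                 // q(z_ij)
                .sum();
    }

    public static void main(String[] args) {
        // Hypothetical: 40 m of TAD 10-065 pipe at 1.3 h/m, "running pipes and fittings" rate 50%.
        var billOfMaterial = List.of(new MaterialLine("TAD 10-065", 1.3, 40.0));
        var rates = Map.of("TAD 10-065", 0.50);
        System.out.println(totalRunTime(rates, billOfMaterial)); // 26.0 hours
    }
}
```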
Finally, the prerequisites of each M & E operation are identified for the calculation of the predicted start and end dates associated with each operation. We used the following model:
  • Let $D_s(x)$ (respectively, $D_f(x)$) be the predicted start date (respectively, the predicted end date) of the operation $x$.
  • Let $P_x$ be the set of the prerequisites of the operation $x$.
  • We define $D_f(p_x)$ as the predicted end date of the prerequisite $p_x$ of the operation $x$.
  • $T_s(j_x)$ (respectively, $T_f(j_x)$) is the day from which the level $j_x$ (i.e., the level in which the operation $x$ occurs) is available for installation (respectively, the day from which it is no longer available).
We have:
$$D_s(x) = \begin{cases} \max\limits_{p_x \in P_x} D_f(p_x) & \text{if } P_x \neq \emptyset \text{ and } \max\limits_{p_x \in P_x} D_f(p_x) > T_s(j_x), \\ T_s(j_x) & \text{otherwise,} \end{cases}$$
$$D_f(x) = \begin{cases} D_s(x) + \frac{D_{ij}(x)}{d} & \text{if } D_s(x) + \frac{D_{ij}(x)}{d} < T_f(j_x), \\ T_f(j_x) & \text{otherwise,} \end{cases}$$
where $d$ is the number of daily working hours, set by default to 8 h per day.
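A minimal Java sketch of these two rules is given below. It assumes that dates are handled at day granularity and that run times are converted to days by dividing by $d$ and rounding up; both assumptions go beyond what the paper specifies and are made only for the illustration.

```java
import java.time.LocalDate;
import java.util.Set;

// Sketch of the start/end date rules D_s(x) and D_f(x); day granularity is an assumption.
final class OperationDateCalculator {

    static final double DAILY_WORKING_HOURS = 8.0; // d

    /** D_s(x): latest prerequisite end date if it is later than the level availability T_s(j_x). */
    static LocalDate predictedStart(Set<LocalDate> prerequisiteEndDates, LocalDate levelAvailableFrom) {
        return prerequisiteEndDates.stream()
                .max(LocalDate::compareTo)                        // empty when P_x is empty
                .filter(maxEnd -> maxEnd.isAfter(levelAvailableFrom))
                .orElse(levelAvailableFrom);                      // otherwise T_s(j_x)
    }

    /** D_f(x): start plus run time expressed in days, capped by the end of availability T_f(j_x). */
    static LocalDate predictedEnd(LocalDate start, double runTimeHours, LocalDate levelAvailableUntil) {
        LocalDate computed = start.plusDays((long) Math.ceil(runTimeHours / DAILY_WORKING_HOURS));
        return computed.isBefore(levelAvailableUntil) ? computed : levelAvailableUntil;
    }

    public static void main(String[] args) {
        LocalDate start = predictedStart(Set.of(LocalDate.of(2022, 3, 7)), LocalDate.of(2022, 3, 1));
        System.out.println(start);                                                // 2022-03-07
        System.out.println(predictedEnd(start, 26.0, LocalDate.of(2022, 6, 30))); // 2022-03-11
    }
}
```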
Running the algorithm for the operation start and end date calculation required a topological sort of the operations list. In computer science, a topological sort of a directed acyclic graph is a linear ordering of its vertices such that for every directed edge (u, v), vertex u comes before v in the ordering [47]. The usual topological sorting algorithms are Breadth First Search (BFS) and Depth First Search (DFS). While BFS visits and explores key nodes of a graph in a breadthwise fashion, DFS traverses a graph in the depthward direction. As the search tree is deep in a scheduling problem, DFS was performed in order to reach deep nodes faster while using less memory [48]. The implementation of the DFS algorithm requires a graph data structure; thus, an additional algorithm was developed to transform the aforementioned M & E operation data into an adjacency list representation of a graph. In mathematics, and more specifically in graph theory, a graph is a structure amounting to a set of objects in which some pairs of the objects are “related.” The objects correspond to mathematical abstractions called vertices (also called nodes or points), and each of the related pairs of vertices is called an edge (also called a link or line) [49].
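The sketch below shows the kind of depth-first topological sort described, applied to an adjacency list that maps each operation to the operations depending on it. The operation identifiers are illustrative, and cycle detection is omitted, since the fragnet graph is assumed to be acyclic; this is a minimal illustration, not the application’s actual implementation.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Depth-first topological sort over an adjacency list (prerequisite -> dependent operations).
// Assumes a directed acyclic graph; cycle detection is omitted in this sketch.
final class TopologicalSort {

    static List<String> sort(Map<String, List<String>> adjacency) {
        Set<String> visited = new HashSet<>();
        Deque<String> ordered = new ArrayDeque<>();
        for (String node : adjacency.keySet()) {
            visit(node, adjacency, visited, ordered);
        }
        return List.copyOf(ordered); // every prerequisite comes before the operations that depend on it
    }

    private static void visit(String node, Map<String, List<String>> adjacency,
                              Set<String> visited, Deque<String> ordered) {
        if (!visited.add(node)) return;                          // already explored
        for (String next : adjacency.getOrDefault(node, List.of())) {
            visit(next, adjacency, visited, ordered);
        }
        ordered.addFirst(node);                                  // post-order push yields a topological order
    }

    public static void main(String[] args) {
        var graph = Map.of(
                "pipe handling", List.of("running pipes and fittings"),
                "running pipes and fittings", List.of("testing of pipes"),
                "testing of pipes", List.<String>of());
        System.out.println(sort(graph)); // [pipe handling, running pipes and fittings, testing of pipes]
    }
}
```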
Finally, an Angular Gantt component was used to display the M & E spatiotemporal schedule in the user interface. Figure 11 presents the building and level milestones in a master schedule view per quarter. A detailed view of the schedule is obtained by opening level panels and zooming in on the schedule timeline (Figure 12 and Figure 13). The hierarchical schedule layout enables multilevel views of tasks, which are iteratively grouped in sub-networks. Through the interface, we separated two time concepts: the run time of an operation and the time available to execute this operation (“Durée utile” and “Durée disponible” in Figure 12). A parent run time corresponds to the sum of its child run times. Likewise, the start and end dates of a parent are the respective minimum and maximum of its children’s start and end dates. Task information is also accessible via the task property panel, which includes general information and dependencies as well as resources (Figure 14). In the resource section, the checked materials refer to the related equipment with the required quantities. For example, the task “Pose des radiateur,” referring to radiator installation on the level RDJ, requires 38 units of radiators.
The assessment of the daily workload and the labor needed to execute the generated plan (Figure 15) is key information for the decision maker when defining the appropriate installation strategy. Accordingly, the project manager can suggest to the construction manager either starting some tasks earlier, based on factual schedule simulation, or speeding up some task paths by increasing team sizes so as to meet the milestones. The ultimate decision must lead to a smooth workload. The load curve provides an overview of the daily workload and labor estimates (Figure 15).
Finally, further studies and developments are required to address the remaining key functional requirements of the application (7 functional requirements from Table 2), including the schedule update feature based on the Revit model.

5. Conclusions and Next Steps

In this paper, we highlighted the relevance of information and knowledge management for promoting Lean production management in the construction industry. We urged the company to establish a true LPS and BIM integration at the data processing level, under the assumption that BIM is viewed as a series of interconnected databases and models. This broader view of BIM implies that data are ever-evolving assets that need to be preserved. Thus, we suggest that a project data warehouse can help achieve this aim. Furthermore, we provide an empirical demonstration of a single LPS and BIM interaction, namely, the automated generation of construction tasks for the mechanical and electrical (M & E) field. The developed Web application makes use of tendering data and standardized installation fragnets to build a Workface Planning schedule with a high level of detail. The proposed solution aims to relieve project managers of tedious planning activities, to harmonize planning practices within the partner organization and to act as a decision support system. So far, 12 out of 19 functional requirements have been implemented, and the prototype uses data from real-life projects. As the implementation of the prototype required the consolidation of sequencing know-how, preliminary work was carried out in collaboration with experienced M & E project managers. Finally, further studies and developments are required to address the remaining key functional requirements of the application. These encompass updating the schedule as the Building Information Model advances. At this stage, 3D object information that is relevant to planning will be exported into the scheduling data mart using a Dynamo script. Then, the application business logic layer and user interface will be supplemented with the unprocessed functionalities.

Author Contributions

Conceptualization, M.S.; methodology, M.S.; software, M.S.; validation, K.B. and D.B.; formal analysis, M.S.; investigation, M.S.; data curation, M.S.; writing—original draft preparation, M.S.; writing—review and editing, M.S., K.B. and D.B.; supervision, B.M.; project administration, B.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The organization know-how information is confidential owing to intellectual property rights.

Acknowledgments

This thesis is part of an industrial fellowship program between the LINEACT Laboratory and EQUANS Company. Special thanks are directed to all the staff who contributed to this work.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Concord Project Technologies. AWP and Field Productivity. 2020. Available online: https://academy.tconglobal.com/courses/advanced-work-packaging-fundamentals-1 (accessed on 15 November 2021).
  2. Concord Project Technologies. Workface Planning. 2020. Available online: https://academy.tconglobal.com/courses/advanced-work-packaging-fundamentals-1 (accessed on 15 November 2021).
  3. Project Production Systems Laboratory. P2SL Glossary. Available online: https://p2sl.berkeley.edu/glossary/knowledge-center-glossaryatoz/ (accessed on 12 January 2021).
  4. Iqbal, K.; Khan, A.R.; Flanagan, R.; Lu, S. Managing the complexity of information flow for construction small and medium-sized enterprises (CSMEs) using system dynamics and collaborative technologies. In Proceedings of the 31st ARCOM Conference, Lincoln, UK, 7–9 September 2015. [Google Scholar]
  5. Surbhi, S. Difference Between Information and Knowledge. 20 January 2018. Available online: https://keydifferences.com/difference-between-information-and-knowledge.html (accessed on 23 December 2021).
  6. Ahankoob, A.; Abbasnejad, B.; Wong, P. The Support of Continuous Information Flow Through Building Information Modeling (BIM). In 10th International Conference on Engineering, Project, and Production Management; Springer: Singapore, 2020. [Google Scholar]
  7. Dave, B. Developing a Construction Management System Based on Lean Construction and Building Information Modeling. Ph.D. Thesis, University of Salford, Salford, UK, 2013. Available online: http://usir.salford.ac.uk/id/eprint/30820/ (accessed on 1 May 2021).
  8. Naqib, D. Production management systems. Transf. Inovácií. 2013. Available online: https://www.sjf.tuke.sk/transferinovacii/pages/archiv/transfer/28-2013/pdf/036-038.pdf (accessed on 23 December 2021).
  9. Bicheno, J.; Holweg, M. The Lean Toolbox: A Handbook for Lean Transformation, 5th ed.; PICSIE Books: Buckingham, UK, 2016. [Google Scholar]
  10. Babalola, O.; Ibem, E.O.; Ezema, I.C. Implementation of lean practices in the construction industry: A systematic review. Build. Environ. 2019, 148, 34–43. [Google Scholar] [CrossRef]
  11. Castiblanco, F.M.; Castiblanco, I.A.; Cruz, J.P. Qualitative Analysis of Lean Tools in the Construction Sector in Colombia. In Proceedings of the 27th Annual Conference of the International Group for Lean Construction (IGLC), Dublin, Ireland, 3–5 July 2019. [Google Scholar]
  12. Liker, J.K. The Toyota Way: 14 Management Principles from the World’s Greatest Manufacturer, 1st ed.; McGraw-Hill: New York, NY, USA, 2004. [Google Scholar]
  13. Demirkesen, S.; Wachter, N.; Oprach, S.; Haghsheno, S. Identifying Barriers in Lean Implementation in the Construction Industry. In Proceedings of the 27th Annual Conference of the International Group for Lean Construction (IGLC), Dublin, Ireland, 3–5 July 2019. [Google Scholar]
  14. Biblus. Modeling, Model and Management: The three M’s of BIM and the right BIM tools. 1 April 2021. Available online: https://biblus.accasoftware.com/en/modeling-model-and-management-the-three-ms-of-bim-and-the-right-bim-tools/ (accessed on 2 January 2022).
  15. Volk, R.; Stengel, J.; Schultmann, F. Building Information Modeling (BIM) for existing buildings—Literature review and future needs. Autom. Constr. 2014, 38, 109–127. [Google Scholar] [CrossRef] [Green Version]
  16. Jernigan, F. BIG BIM Little Bim: The Practical Approach to Building Information Modeling, 1st ed.; 4Site Press: Limerick, Ireland, 2007. [Google Scholar]
  17. SkyBIM. SkyBIM Cloud Based Management & Real-Time Costing of BIM Projects. 2012. Available online: https://www.slideshare.net/SkyBIM/sky-bim-presentation-august-2012 (accessed on 2 January 2022).
  18. Borrmann, A.; König, M.; Koch, C.; Beetz, J. Building Information Modeling: Why? What? How?: Technology Foundations. In Building Information Modeling, 1st ed.; Springer: Berlin/Heidelberg, Germany, 2018; pp. 1–24. [Google Scholar]
  19. O’Malley, A. BIM Adoption in Europe: 7 Countries Compared. 21 June 2021. Available online: https://www.planradar.com/gb/bim-adoption-in-europe/ (accessed on 3 August 2012).
  20. Wang, H.; Meng, X. Improving information/knowledge management through the use of BIM: A literature review. In Proceedings of the 32nd Annual ARCOM Conference, Manchester, UK, 5–7 September 2016. [Google Scholar]
  21. Onyango, A.F. Interaction between Lean Construction and BIM. How Effectiveness in Production can Be Improved If Lean and BIM Are Combined in the Design Phase: A Literature Review. Master’s Thesis, KTH Royal Institute of Technology, Stockholm, Sweden, 2016. [Google Scholar]
  22. Sacks, R.; Koskela, L.; Dave, B.A.; Owen, R. Interaction of Lean and Building Information Modeling in Construction. J. Constr. Eng. Manag. 2010, 136, 968–980. [Google Scholar] [CrossRef] [Green Version]
  23. Oskouie, P.; Gerber, D.J.; Alves, T.; Becerik-Gerber, B. Extending the interaction of building information modeling and lean construction. In Proceedings of the 20th Annual Conference of the International Group for Lean Construction, San Diego, CA, USA, 18–20 July 2012. [Google Scholar]
  24. Sbiti, M.; Beddiar, K.; Beladjine, D.; Perrault, R.; Mazari, B. Toward BIM and LPS Data Integration for Lean Site Project Management: A State-of-the-Art Review and Recommendations. Buildings 2021, 11, 196. [Google Scholar] [CrossRef]
  25. Schimanski, C.P.; Marcher, C.; Monizza, G.P.; Matt, D.T. The Last Planner® system and building information modeling in construction execution: From an integrative review to a conceptual model for integration. Appl. Sci. 2020, 10, 821. [Google Scholar] [CrossRef] [Green Version]
  26. Sbiti, M.; Beddiar, K.; Beladjine, D.; Perrault, R. Field management solutions supporting foreman executive tasks. In Proceedings of the 23rd International Conference on Lean Construction, Lean Principles and Theory (ICLCLPT 2021), Amsterdam, The Netherlands, 8 February 2021. [Google Scholar]
  27. Lukka, K. The Constructive Research Approach. Appl. Soc. Sci. Philos. 2006, 1, 111–133. [Google Scholar]
  28. Kovács, G.; Spens, K.M. Abductive reasoning in logistics research. Int. J. Phys. Distrib. Logist. Manag. 2005, 35, 132–144. [Google Scholar] [CrossRef]
  29. Lukka, K. The Constructive Research Approach. In Case Study Research in Logistics; Turku School of Economics and Business Administration: Turku, Finland, 2003. [Google Scholar]
  30. Amer, F.; Koh, H.Y.; Golparvar-Fard, M. Automated Methods and Systems for Construction Planning and Scheduling: Critical Review of Three Decades of Research. J. Constr. Eng. Manag. 2021, 147, 03121002. [Google Scholar] [CrossRef]
  31. De Vries, B.; Harink, J.M. Generation of a construction planning from a 3D CAD model. Autom. Constr. 2007, 16, 13–18. [Google Scholar] [CrossRef]
  32. Kim, H.; Anderson, K.; Lee, S.; Hildreth, J. Generating construction schedules through automatic data extraction using open BIM (building information modeling) technology. Autom. Constr. 2013, 35, 285–295. [Google Scholar] [CrossRef]
  33. Liu, H.; Al-Hussein, M.; Lu, M. BIM-based integrated approach for detailed construction scheduling under resource constraints. Autom. Constr. 2015, 53, 29–43. [Google Scholar] [CrossRef]
  34. Isaac, S.; Shimanovich, M. Automated scheduling and control of mechanical and electrical works with BIM. Autom. Constr. 2021, 124, 103600. [Google Scholar] [CrossRef]
  35. Ahmad, I. Data Warehousing in the construction industry: Organizing and processing data for decision making. In Proceedings of the CIB W078 Workshop, Rotterdam, The Netherlands, 1999. [Google Scholar]
  36. JavaTPoint. Components or Building Blocks of Data Warehouse. Available online: https://www.javatpoint.com/data-warehouse-components (accessed on 15 January 2022).
  37. Malviya, S.; Vierhauser, M.; Cleland-Huang, J.; Ghaisas, S. What Questions do Requirements Engineers Ask? In Proceedings of the IEEE 25th International Requirements Engineering Conference (RE), Lisbon, Portugal, 4–8 September 2017. [Google Scholar]
  38. Klimek, R.; Szwed, P. Formal Analysis of Use Case Diagrams. Comput. Sci. 2010, 11, 115–131. [Google Scholar]
  39. Cockburn, A. Writing Effective Use Cases, 1st ed.; Addison-Wesley Professional: Boston, MA, USA, 2000. [Google Scholar]
  40. Creately. Use Case Diagram Relationships Explained with Examples. 16 July 2021. Available online: https://creately.com/blog/diagrams/use-case-diagram-relationships/ (accessed on 3 August 2012).
  41. Lucidchart. UML Use Case Diagram Tutorial. Available online: https://www.lucidchart.com/pages/uml-use-case-diagram/#discovery__top (accessed on 28 January 2022).
  42. Littlefield, M. The Evolution of MOM and PLM: Enterprise Bill of Process. 31 May 2012. Available online: https://blog.lnsresearch.com/bid/141670/the-evolution-of-mom-and-plm-enterprise-bill-of-process (accessed on 3 August 2012).
  43. IBM Cloud Education. Data Modeling. 25 August 2020. Available online: https://www.ibm.com/cloud/learn/data-modeling (accessed on 3 August 2012).
  44. Fernigrini, L. What Are Conceptual, Logical, and Physical Data Models? 9 February 2021. Available online: https://vertabelo.com/blog/conceptual-logical-physical-data-model/ (accessed on 3 August 2012).
  45. Arab, I.; Bourhnane, S.; Kafou, F. Unifying Modeling Language-Merise Integration Approach for Software Design. Int. J. Adv. Comput. Sci. Appl. 2018, 9. [Google Scholar] [CrossRef] [Green Version]
  46. Luchaninov, Y. Web Application Architecture in 2021: Moving in the Right Direction. 30 July 2021. Available online: https://mobidev.biz/blog/web-application-architecture-types (accessed on 3 August 2012).
  47. GeeksforGeeks. Topological Sorting. 18 January 2022. Available online: https://www.geeksforgeeks.org/topological-sorting/ (accessed on 3 August 2012).
  48. Simic, M. Depth-First Search vs. Breadth-First Search. 20 October 2021. Available online: https://www.baeldung.com/cs/dfs-vs-bfs (accessed on 3 August 2012).
  49. Trudeau, R.J. Introduction to Graph Theory, 2nd ed.; Dover Publications Inc.: New York, NY, USA, 2003. [Google Scholar]
Figure 1. (a) Urgent-important matrix of BIM dimensions; (b) main features required within the partner organization [26].
Figure 2. Research roadmap and scope (green color) regarding the automation of the Workface schedule.
Figure 3. Mapping the CRA steps according to the abductive research process [28], i–vii refers to the CRA steps explained above.
Figure 4. The project management process of the target company.
Figure 5. Future architecture of information flows between project processes within the target organization.
Figure 6. Use case diagram for the automated M & E scheduling software.
Figure 7. Codification structure of tendering material nomenclature.
Figure 8. The “schedule_data” UML class diagram.
Figure 9. The 3-tier architecture of the automated M & E planning Web application.
Figure 10. Level view for entering each level’s start and end availability dates.
Figure 11. Non-M & E operations view for milestones definition.
Figure 12. M & E master schedule per quarter.
Figure 13. M & E phase schedule view.
Figure 14. Task property panel view. (a) General task information. (b) Task dependencies. (c) Material resources from tendering data related to the task.
Figure 15. Labor estimate curve.
Table 1. Little BIM vs. BIG BIM [17,18].
Little BIM | BIG BIM
CAD on steroids | BIM on steroids
2D, 3D | 4D, 5D, 6D
Model is incomplete | Model is complete
Modeling expertise | Collaboration expertise (IPD)
Silo mentality | Data sharing
Information throwaway | Data is an ever-evolving asset
No real-world recipes and costs | Real-world recipes fully integrated
Benefits internalized to company | Benefits entire ecosystem
Little change in how you do business | New business opportunities
Internet is a reference | Internet is pivotal
Table 2. List of functional requirements (FKs) for the automated M & E scheduling software prototype and mapping to use cases (UCs), priorities and current implementation status.
No. | Functional Requirement (FK) | UC | Priority | Status
1 | Entering the building levels’ milestones as prescribed in the construction manager master schedule. | 2 | High |
2 | Displaying a list of prerequisite tasks related to other disciplines that constrain M & E processes. | 2 | High |
3 | Entering a completion deadline for the prerequisite tasks. | 2 | High |
4 | Instantiating appropriate M & E fragnets for each building level according to material category. | 3.1 | High |
5 | Grouping operations’ instances by building name, level name, material superfamily and operation name successively, then calculating the total duration of operation groups. | 3.1 | High |
6 | Identifying each operation’s prerequisites. | 3.1 | High |
7 | Calculating the predicted starting date and ending date of each operation. | 3.1 | High |
8 | Displaying a GANTT chart of M & E operations by building localization. | 3 | High |
9 | Displaying a GANTT chart of M & E operations by M & E system. | 3 | Medium |
10 | Organizing hierarchically the GANTT view from parent to child attributes (Building name → Level name → Parent → Operation). | 3 | High |
11 | Calculating a sum of durations, a minimum starting date and a maximum ending date for hierarchical summary tasks (Building name → Level name → Parent). | 3.1 | High |
12 | Enabling material list consulting per task, level and building. | 4.1 | Medium |
13 | Estimating labor needed to complete a daily plan. | 3.2 | High |
14 | Modifying/adding/deleting operations and their respective durations, dates, predecessors and successors (implies recalculating tasks’ start and end dates). | 5 | High |
15 | Requesting an automatic update of the schedule from Revit model data and displaying discrepancies with previous estimates data. | 4 | High |
16 | Comparing the material quantities of estimates and the Revit model for each building and level. | 4.1.1 | High |
17 | Saving and allowing viewing of the history of the generated schedule versions. | 6 | Low |
18 | Allowing the administration of M & E fragnets. | 1 | Medium |
19 | Enabling the configuration of planning parameters such as daily working hours and days off. | 2 | Medium |
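Functional requirements 10 and 11 describe a classic hierarchical rollup: each summary node of the Building name → Level name → Parent hierarchy carries the sum of its children's durations, their earliest start and their latest end. The following Python sketch (illustrative only; field names and sample values are assumptions, not the prototype's data model) shows one way to compute such a rollup from a flat list of scheduled operations.

```python
from datetime import date
from collections import defaultdict

def rollup(operations, keys=("building", "level", "parent")):
    """Aggregate duration, earliest start and latest end per hierarchy node."""
    summary = defaultdict(lambda: {"duration": 0, "start": None, "end": None})
    for op in operations:
        # Each prefix of the key tuple is one summary level (Building, then
        # Building -> Level, then Building -> Level -> Parent).
        for depth in range(1, len(keys) + 1):
            node = tuple(op[k] for k in keys[:depth])
            s = summary[node]
            s["duration"] += op["duration"]
            s["start"] = op["start"] if s["start"] is None else min(s["start"], op["start"])
            s["end"] = op["end"] if s["end"] is None else max(s["end"], op["end"])
    return dict(summary)

# Hypothetical operations generated from tendering quantities and fragnets.
ops = [
    {"building": "B1", "level": "L1", "parent": "Hydraulic networks",
     "duration": 3, "start": date(2022, 3, 1), "end": date(2022, 3, 3)},
    {"building": "B1", "level": "L1", "parent": "Hydraulic networks",
     "duration": 5, "start": date(2022, 3, 4), "end": date(2022, 3, 10)},
]
for node, agg in sorted(rollup(ops).items()):
    print(" > ".join(node), agg)
```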
Table 3. Bill of Process for the stainless black steel distribution pipe group TAD.
Process: Installation of hydraulic networks (spans all operations below).
Operation | Rate | Predecessors
Controlling the arrival of painted pipes | 2% | Premises and terraces waterproofing; Shoring props removal
Pipe handling | 5% | Controlling the arrival of painted pipes
Placing the pipe hangers and supports | 15% | Shoring props removal
Running pipes and fittings | 50% | Controlling the drilled holes for pipe running; Pipe handling; Placing the pipe hangers and supports
Testing of pipes | 5% | Connecting pipes to massive equipment
Connecting pipes to massive equipment | 5% | Installation of massive equipment; Running pipes and fittings
Connecting pipes to terminals | 10% | Installation of terminals; Running pipes and fittings
Assembly control of hydraulic networks | 3% | Connecting pipes to massive equipment; Connecting pipes to terminals; Connecting condensate drains of terminals; Connecting the gas line of terminals
Final impoundment and water quality control | 5% | Assembly control of hydraulic networks
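To illustrate how a standardized fragnet such as the one in Table 3 can be instantiated and ordered, the Python sketch below builds the operation precedence graph, treats predecessors that belong to other disciplines as external constraints, and orders the internal operations with Kahn's topological sort; allocating durations as rate × total group man-hours is an assumption made for the example, not a rule stated in the paper.

```python
from collections import deque

# Fragnet of Table 3: operation -> (rate, predecessors). Predecessors that are
# not operations of this fragnet (e.g., "Installation of terminals") are
# treated as external constraints handled elsewhere.
FRAGNET = {
    "Controlling the arrival of painted pipes": (0.02, ["Premises and terraces waterproofing", "Shoring props removal"]),
    "Pipe handling": (0.05, ["Controlling the arrival of painted pipes"]),
    "Placing the pipe hangers and supports": (0.15, ["Shoring props removal"]),
    "Running pipes and fittings": (0.50, ["Controlling the drilled holes for pipe running", "Pipe handling", "Placing the pipe hangers and supports"]),
    "Connecting pipes to massive equipment": (0.05, ["Installation of massive equipment", "Running pipes and fittings"]),
    "Connecting pipes to terminals": (0.10, ["Installation of terminals", "Running pipes and fittings"]),
    "Testing of pipes": (0.05, ["Connecting pipes to massive equipment"]),
    "Assembly control of hydraulic networks": (0.03, ["Connecting pipes to massive equipment", "Connecting pipes to terminals", "Connecting condensate drains of terminals", "Connecting the gas line of terminals"]),
    "Final impoundment and water quality control": (0.05, ["Assembly control of hydraulic networks"]),
}

def topological_order(fragnet):
    """Kahn's algorithm restricted to the fragnet's internal precedence links."""
    indegree = {op: 0 for op in fragnet}
    successors = {op: [] for op in fragnet}
    for op, (_, preds) in fragnet.items():
        for pred in preds:
            if pred in fragnet:          # ignore external constraints
                indegree[op] += 1
                successors[pred].append(op)
    queue = deque(op for op, deg in indegree.items() if deg == 0)
    order = []
    while queue:
        op = queue.popleft()
        order.append(op)
        for succ in successors[op]:
            indegree[succ] -= 1
            if indegree[succ] == 0:
                queue.append(succ)
    if len(order) != len(fragnet):
        raise ValueError("The fragnet contains a precedence cycle")
    return order

# Example: instantiate the fragnet for one level with an assumed 400 man-hours.
total_man_hours = 400
for op in topological_order(FRAGNET):
    rate, _ = FRAGNET[op]
    print(f"{op}: {rate * total_man_hours:.0f} man-hours")
```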
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
