Article

An AprilTags-Based Approach for Progress Monitoring and Quality Control in Modular Construction

1 Shanghai Airport Authority, Shanghai 201207, China
2 Department of Civil Engineering, Tongji University, Shanghai 200092, China
3 Department of Civil and Urban Engineering, New York University, New York, NY 11021, USA
* Author to whom correspondence should be addressed.
Buildings 2024, 14(7), 2252; https://doi.org/10.3390/buildings14072252
Submission received: 19 May 2024 / Revised: 30 June 2024 / Accepted: 8 July 2024 / Published: 22 July 2024
(This article belongs to the Section Construction Management, and Computers & Digitization)

Abstract

Modular construction imposes stringent and tight tolerances on both on-site and off-site assembly processes. Traditional approaches to progress monitoring and quality control are usually based on 3D laser scanning, but the high equipment cost of acquiring point clouds has economic ramifications. This paper details a new, inexpensive method that integrates AprilTags with an ordinary phone. By using AprilTags instead of QR codes to label modules, progress management is achieved through the rapid identification and association of multiple modules from a single image. Moreover, a virtual multi-view vision algorithm based on AprilTags is proposed to generate 3D reverse models of the construction site; the quality result is acquired by comparing the offset and rotation values of the reverse model against the BIM model. Finally, all the algorithms are validated by comparing the reverse models with reference models made with 3D printing and 3D laser scanning, which verifies the accuracy and efficiency of the proposed method.

1. Introduction

As prefabrication becomes increasingly acceptable in the construction industry, modular construction involving highly integrated prefabrication of modules has been increasingly used around the world [1,2]. In general, modular construction is a factory-based construction process in which modules are produced in the factory and transported to construction sites. While the scope of preassembling can vary from panels (i.e., a set of workpieces preassembled at factories that are not volumetric, such as facade wall panels) to volumetric units [3] (i.e., workpieces preassembled at factories that form a 3D bounding box), the modules that are evaluated within the scope of this work are preassembled, fully integrated (i.e., include all trade components) volumetric units that form the actual structure and fabric of a building [4,5], and can have steel, precast concrete, a timber frame, or a composite material as their primary structural core [6]. Given these fully integrated volumetric units, on-site work typically includes stacking modules vertically and horizontally at construction sites.
The application of modular construction can have many benefits for the whole construction process, such as improving production efficiency, reducing construction time, and protecting the urban environment [7]. Recent modular projects have already established a solid track record of accelerating project timelines by 20–50%, and modular construction projects in Europe and the United States could deliver annual cost savings of up to $22 billion by 2030 under moderate assumptions of penetration [8]. Modular construction has also been shown to result in higher-quality buildings [9], reductions in the cost of on-site work [10], and improvements in construction safety [11]. Given the gradually more reliable structural performance of high-rise buildings [12], modular construction is no longer confined to houses or low-story building markets and is becoming a new trend in the construction industry to improve the overall efficiency of projects.
While modular construction aims to minimize on-site work, congestion, and safety concerns, it also comes with contingent requirements on tolerance and the vertical/horizontal alignment of modules to be stacked on site. Considering the storage in the factory, shipment, and assembly, modular construction requires more refined progress monitoring to avoid on-site assembly mistakes and minimize the whole project cost due to rework [13]. Building information modeling (BIM) has been deeply rooted in the AEC/FM industry, and semantically accessible building information models provide design and fabrication data on the modules. Therefore, to realize more precise monitoring progress throughout the whole life cycle, it is important to connect construction information such as the progress and results of assembly with the design data collected in BIM [14,15]. During on-site assembly, any undetected tolerance and alignment issues will result in module misalignment and lack of fit in the allocated spaces [16]. At traditional construction sites, quality control and defect detection are usually performed by acquiring data on ongoing work and comparing the data to tolerances [17,18,19]. The data acquired can be manually or semiautomatically captured and processed, and at best, reality capture technologies (e.g., laser scanners) can be used to capture these data and compare it to the digital model to determine the actual deviations from design models.
However, these approaches always require time and effort to capture and process the data from these reality capture technologies [20], along with high investment and training costs. There is still no research on the automatic tracking of the construction process of modular buildings using ordinary mobile phones. To keep pace with the rapid installation, quality control, and tolerance checking that modular construction projects demand, a method is needed to rapidly monitor the progress and quality of work at sites without post-processing of captured data. Therefore, this study aims to develop an inexpensive method for modular construction progress monitoring and quality control. In addition, this paper provides the details of this method along with the results of the algorithm tests performed. In summary, this method automatically acquires the as-is condition and installation data of stacked modules on site using no-cost AprilTags and a common phone camera. By using AprilTags instead of QR codes to label modules, progress management is achieved through the rapid identification and association of multiple modules from a single image. Moreover, a virtual multi-view vision algorithm based on AprilTags is proposed to generate 3D reverse models of the construction site; the quality result is acquired by comparing the offset and rotation values of the reverse model against the BIM model. The feasibility and precision of the proposed method were tested with synthetic modules and compared to the traditional approach based on QR tags and 3D laser scanning. Since the component is the basic unit associated with an AprilTag, the proposed method can be extended to the construction management of all prefabricated buildings.
This paper is organized as follows: A literature review of the existing approaches to progress monitoring and quality control of modular construction, along with AprilTags-based applications, follows the introduction section. In Section 3, the major concepts and issues, including the reasons for using AprilTags and the resources of modular construction tolerance, are explained. In Section 4, the processing framework for construction management and methodologies for generating actual modular buildings are explained, including the sources of error throughout the whole detection process. Validation work has been performed considering two distinct scenarios that could occur at sites while using this method for progress and quality monitoring of modular construction assemblies at sites. The first scenario represents the ideal cases where any discrepancy between the digital replica of modules and the actual assembly is attributed to a quality problem only due to construction errors. The second evaluates a more probable case that contains composite errors. During validation, reconstruction models, which represent the actual construction conditions, were generated automatically and compared with reference models generated based on 3D printing and 3D laser scanning technologies to evaluate whether the AprilTags-based method is feasible for modular buildings.

2. Literature Review

This section provides an overview and synthesis of existing approaches to construction progress and quality monitoring. The typical process for progress monitoring at construction sites is a labor-intensive and manual process. Construction progress at sites has traditionally been captured using daily or weekly reports, and these reports can provide several types of information, including information on assembly progress, project delays, and the delivery of modules to the sites [17]. With the wider utilization of BIM technology in construction projects, models have been used for progress and quality monitoring. In recent projects, QR code [21] tags have been attached to modules and/or prefabricated panels to enable workers to input information about components, as shown in Figure 1a. Integrating progress data on components/modules with the corresponding digital replicas has been a practice for progress monitoring as a reliable way to capture the as-built status accurately and compare it to the planned progress. Regarding the quality control of assemblies at sites, current industry practice leverages technologies such as total-station instruments, prototyping, and 3D laser scanning technology [16]. The coordinates of several feature points with a total station or the point clouds from a laser scanner are captured and compared to the design data to ensure that the deviations are within the allowable tolerances. Due to the advantages of rapidly capturing the real world in 3D with high accuracy, laser scanning technology has been an important element of practice for capturing high-precision data on objects worldwide (Figure 1b) [22]. However, laser scanners and total stations always need proactive scan planning, appropriate positioning, and relocation to collect a view of the structure being captured for progress monitoring and quality control.
AprilTag is a robust and flexible visual fiducial system that was developed by the April Robotics Laboratory at the University of Michigan. AprilTags are used to calculate the exact position, orientation, and identity of a marker relative to a camera by searching for linear segments and detecting squares [23]. The system is applicable to a wide range of tasks, including camera calibration, robotics, and augmented reality. The basic process of AprilTag detection can be summarized in four steps, as shown in Figure 2.

2.1. Existing Approaches to Construction Progress and Quality Monitoring

2.1.1. Progress Monitoring at Construction Sites

An analysis of the related work on progress monitoring at construction sites revealed that automated progress monitoring approaches fall into two categories.
One category overlays real-time construction progress data on as-planned BIMs using some data-updating media (such as QR code tags). Lin et al. [25] presented a novel system called the Mobile 2D Barcode/RFID-based Maintenance Management system to improve laboratory equipment and instrument maintenance management using 2D barcode and radio frequency identification (RFID) technologies. Zhao et al. [26] introduced a method that incorporates RFID and LoRa technologies, sensor networks, the BIM model, and cloud computing to automatically collect, analyze, and display real-time information about PC components.
The other category reconstructs a 3D model from images/point clouds and compares it to the as-planned model. Han et al. [27] presented a geometry- and appearance-based reasoning method for detecting construction progress that first applies geometry-based filtering to detect the construction state of BIM elements and then applies appearance-based reasoning to operation-level activities. Shirowzhan et al. [28] suggested appropriate building change detection algorithms based on a comparative evaluation of five selected algorithms, including three pixel-based algorithms (DSMd, SVM, and ML) plus C2C and M3C2.

2.1.2. Quality Control at Construction Sites

The differences in the heights of the walls of a module tend to be specific to the system and installation method, but in extreme cases, this can cause stepping out in the height of modules or vertical out-of-alignment effects where modules undergo progressive rotation. Therefore, high-quality modular construction requires minimal horizontal or vertical offset and rotation. Regarding quality control at construction sites, the literature clearly shows that applications of high-precision cameras, 3D laser scanners, and other high-precision equipment in construction quality monitoring have been very widespread without considering the economic cost. Golparvar-Fard et al. [29] compared two methods for obtaining point cloud models for the detection and visualization of the as-built status of construction projects. Bhatla et al. [30] investigated a technology to generate as-built 3D point clouds using photos taken using handheld digital cameras and compared them against the original as-built 3D models. Marzouk et al. [31] used 3D modeling techniques and image processing algorithms to generate a system that is able to detect cracks and defective areas in finishing works and calculate the percentage of the defective area.
To reduce the cost of acquiring images and point clouds, Nahangi et al. [32] investigated the data acquisition capabilities of two low-cost range cameras for the eventual purpose of pipe-fitting process monitoring in a smart fabrication facility environment. Guo et al. [20] conducted a comparative study to quantitatively research the time and cost-effectiveness of TLS-based geometric quality assessment for structural columns.
Existing progress monitoring and quality control approaches based on laser scanning have the main disadvantage of a high upfront cost, due to the high price of 3D laser scanners or scanning services and the follow-up processing and registration of the point clouds. Moreover, there are many other restrictions on using laser scanning at construction sites: for example, the professional training needed to capture progress data properly, congestion/obstructions at sites, and scanning failures on highly reflective surfaces (e.g., glass curtain walls). On the other hand, approaches that leverage QR/RFID tags for progress monitoring and quality control necessitate one-at-a-time scanning of tags, which is time-consuming and inconvenient at construction sites. In addition, such tags serve only to bind a URL, so their potential value is not fully exploited. As for the abovementioned low-cost digital methods, their accuracy does not meet the quality control requirements of modular buildings, and more importantly, the features of module assembly have not been well explored and utilized during detection and data processing.

2.2. Applications and Features of AprilTags

In recent years, with the rise of remote and automatic systems in many industries, the AprilTag has been used as a position and attitude reference for robots and unmanned aerial vehicles (UAVs). For example, Feng et al. [33] proposed a vision-based solution using a network of cameras and AprilTag markers to automatically estimate the pose for articulated machines. Liang et al. [34] studied a tracking scheme based on AprilTag identification for the problems of real-time performance and occlusion, which improved the tracking accuracy.
Aiding their detection at long range, visual fiducials comprise far fewer data cells: the alignment markers of a QR tag comprise approximately 268 pixels (not including required headers or the payload), whereas an AprilTag ranges from approximately 49 to 100 pixels, including the payload [35]. Compared to other tags such as ARToolKit and ARTag, the AprilTag shows more robust performance under changes in lighting, projection scale, and tag clarity. With this advantage, the AprilTag has the potential for use in construction progress monitoring and quality control.
However, there is still no research on automatic tracking of the construction process using ordinary mobile phones with AprilTags. During BIM 4D construction management, QR code tags can be replaced by AprilTag markers, since AprilTag detection can be completed far from the building, whereas a QR code must be scanned at close range. In addition, the actual construction status can be recorded using the position and orientation data from the AprilTags. The detailed application conditions and data processing methodology are introduced in the following sections.

3. Challenges for Progress and Quality Monitoring in Modular Construction

The modular construction process starts when the modules are manufactured in the factory and shipped to the construction site. In general, project management at a site needs to consider quality, time, cost, delivery, etc. [36]. For this research, construction progress and quality, which specifically refer to the assembly progress and quality of modular buildings, are chosen as the main content of construction management.
Challenge 1: The assembling progress of the actual modules needs to be associated with the schedule plan of the BIM model.
Challenge 2: The quality of the modular building, including the offset/rotation and dimensional variation of the modules, needs to be detected.

3.1. Progress Monitoring for Modules at the Site

From the perspective of the actual application, the most practical strategy to monitor project progress with BIM is to use tags. QR code tags are widely used to help constructors update information at sites via the BIM database [37]. Although various factors, such as the properties of the modules, materials, project size, and other structural components, contribute to the QR link, if the constructors need to track the assembly quality at the site during construction, other equipment, such as a laser scanner and a total station, will be indispensable. AprilTags can play the same role in modular construction management to connect with the BIM database and update information via their IDs, as shown in Figure 3. Furthermore, another major role that AprilTags can play in modular construction is controlling construction tolerance at the site. The position data returned from the tags can be used to estimate the assembly quality of the modules; consequently, the value of the tags can be maximized.

3.2. Quality Control for Modules at the Site

During the assembly of an entire modular building, construction deviations will almost inevitably occur at the site for modules of all types [9]. Even when module quality is examined and guaranteed in the factory, the modules may be slightly out of place, which can cause an accumulated tolerance when the modules are lifted to their final positions and stacked on site. To fit together with other modules, a module may require some offset and rotation during construction.
For multistory modular buildings, vertical tolerances diverge and become very important for high-rise applications. The differences in the heights of the walls of a module tend to be specific to the system and installation method, but in extreme cases, this can cause stepping out in the height of modules or vertical out-of-alignment effects where modules undergo progressive rotation. Generally, the positions of modules can be shifted slightly to maintain the overall verticality of the extremities of the façade, but an offset cross-section will result [2]. This may be coupled with problems in maintaining the horizontality of floors in taller buildings, as illustrated in Figure 4. As the height of the building increases, the risks from the module assembly tolerances will also increase considerably.
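The progressive-rotation effect described above can be made concrete with a small-angle model: each tilted story adds roughly h_i·tan(θ_i) of horizontal drift at the top of the stack. This is an illustrative model of tolerance accumulation, not a formula from the paper:

```python
import numpy as np

def stacked_offset_mm(story_heights_mm, tilts_rad):
    """Cumulative horizontal offset at the top of a module stack,
    assuming each story i is tilted by a small angle theta_i and
    contributes h_i * tan(theta_i) of drift (illustrative model)."""
    h = np.asarray(story_heights_mm, dtype=float)
    t = np.asarray(tilts_rad, dtype=float)
    return float(np.sum(h * np.tan(t)))
```

For ten 3 m stories each tilted by only 1 mrad in the same direction, the top of the stack already drifts about 30 mm, which illustrates why vertical tolerances diverge for high-rise applications.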
In more general cases, the manufacturing process for all types of modules could potentially cause differences in the geometry, particularly for bending or torsional components in the modules. Furthermore, dimensional variations may also occur due to long-distance shipments. Some timber-based module geometry errors will arise from long-term shrinkage, and for concrete modules, from casting to lifting in a modular factory, the concrete strength fluctuates over time. All of these factors can have a certain effect on the final geometry. Therefore, a generic quality detection method needs to capture three-dimensional data at the site instead of using design data.

4. An AprilTag-Based Approach to Monitoring Progress and Quality in Modular Construction

4.1. Overview of the Approach

This approach focuses on detecting the geometric deviations of modules and analyzing their cascading effect on vertical structures through the integration of BIM, AprilTags, and reconstruction. To acquire construction progress and quality data in a low-cost and convenient way, the performance of the proposed method can be substantially equivalent to that of conventional tools, such as QR codes used in 4D BIM management [14] and a 3D laser scanner used to obtain quality data [38,39]. The flowchart of the approach is presented in Figure 5. Before applying the proposed approach, tags need to be attached to the corners of a surface or other specific locations of volumetric units during manufacturing at manufacturing plants.
As shown in Figure 5, the approach works as follows.
  • First, two or more photos of the modular building under inspection need to be captured at the construction site. Then, there are two sections corresponding to the two challenges presented in Section 3.
  • For the progress monitoring part, the captured photos from the site will be used to compare the actual progress with respect to the planned progress that is represented in BIM 4D. In this step, the photo time will be submitted to the BIM database to extract the current BIM model information, and the ID of the tags will be recognized. The modules are subsequently matched and compared with the current BIM information.
  • For the quality control part, the construction quality data are automatically processed under two main scenarios using the tags on the modules, which will be introduced in Section 4.3. Through image processing, an actual model representing the modular building at the site can be generated and compared with the design model from the BIM database. The resulting assembly and geometric quality measurements can be compared with the design specifications for decision-making regarding rework or acceptance.
It should be noted that the extra step with laser scanning is only used in generating the reference model to investigate the performance of the developed approach in this study and is not part of the approach. For quantitative analysis, a simplified cubic model, which represents scaled modular units, was built. Each module had an AprilTag on its outer surface when it was ready for shipping in the factory. For this purpose, we generated a reference model from point clouds to compare the performance of the approach with respect to the state of the art, including efficiency and accuracy. The details of the approach will be provided in the subsequent sections in two parts based on the purpose it serves.

4.2. Part 1: Progress Monitoring Using the Developed AprilTag-Based Approach

In conventional 4D-based progress monitoring systems, dynamic monitoring always includes two main inputs: a 3D as-planned model and construction progress data. In addition to these two inputs, unique IDs of the modules or components are needed in the AprilTag-based approach. When the design work is finalized for a modular building, the ID of each volumetric module is assigned to a tag and recorded in the BIM model of that modular building. The AprilTags that include the IDs corresponding to the digital replicas of the modules in the BIM are used in the factory to label the physical modules. Due to the low storage capacity of AprilTags, URLs connecting tags to remote BIM servers cannot be embedded in AprilTags (unlike QR code tags), so IDs are the only mechanism connecting the digital and physical correspondences of the modules.
In contrast to ordinary QR code tags, module IDs need to be bound to the IDs of tags from an AprilTag family such as Tag36h11. Therefore, the data processing involves acquiring the IDs of the tags and checking them against the BIM database instead of accessing information stored in the tags, as shown in Figure 6. In addition, the tags need to be larger than ordinary QR code tags because the quality control process refers to the position of these tags. To obtain more precise tag information, the tag size should be chosen according to the distance between the phone and the modules, with the camera parameters also taken into account.
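The dependence of tag size on shooting distance and camera parameters can be sketched with the pinhole relation s_px ≈ f_px · s_m / d. The 40-pixel detection threshold below is an illustrative assumption, not a value from the paper:

```python
def min_tag_side_m(f_px, distance_m, min_pixels=40):
    """Smallest physical tag side (metres) that still spans `min_pixels`
    on the sensor at a given shooting distance, from the pinhole model
    s_px ~= f_px * s_m / d. The 40 px threshold is an illustrative
    detection requirement, not a figure from the paper."""
    return min_pixels * distance_m / f_px
```

For instance, with a calibrated focal length of 3000 px and a 15 m shooting distance, this model suggests tags of at least roughly 0.2 m per side.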
However, in practice the process flow is similar to that when using QR code tags. First, site personnel take photos of the assembled modules at a construction site and update the progress information on the linked pages based on the AprilTags in the photo instead of scanning the QR codes one by one, which can significantly improve management efficiency. Subsequently, given the updated tag list received in the progress data, the approach checks the ID of the tags in the corresponding digital model (BIM) and updates the status of the modules in the digital model. Finally, a comparison of the progress between the actual and planned progress is provided.
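The comparison step described above can be sketched as matching the tag IDs detected in one photo against the planned schedule pulled from the BIM database. The dictionary schema here is hypothetical, since the paper does not specify the database layout:

```python
from datetime import date

def progress_report(detected_tag_ids, bim_schedule, photo_date):
    """Compare tag IDs detected in a single site photo against the
    planned module schedule from the BIM database.

    bim_schedule maps tag_id -> planned installation date
    (a hypothetical schema for illustration)."""
    installed = set(detected_tag_ids)
    planned_by_now = {tid for tid, d in bim_schedule.items() if d <= photo_date}
    return {
        "on_schedule": sorted(installed & planned_by_now),
        "behind": sorted(planned_by_now - installed),  # planned, not yet seen
        "ahead": sorted(installed - planned_by_now),   # installed early
    }
```

Because one photo can contain many tags, a single call replaces the one-at-a-time scanning that QR code workflows require.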

4.3. Part II: Quality Control Using the Developed AprilTag-Based Approach

To assess the quality of modular construction assemblies at construction sites, actual modular models need to be generated; the variations between the design model and the actual model can then be calculated and analyzed. Therefore, it is necessary to obtain the actual construction model efficiently. Considering that each module carries a tag at a specific position on its surface, which has been recorded in the BIM database, the proposed method of assessing the quality of the modular construction is presented in Figure 7.
Although for most modular systems, a routine predelivery inspection for the modules will be applied to ensure that the modules meet the construction specifications, some unexpected deformation can still appear in the modules at the site unless each module has enough rigidity to ignore the minor variations in itself. To account for both, the developed approach takes into account two scenarios—one corresponding to the ideal situation with only assembly errors at sites and the other corresponding to the general situation where modules can undergo deformation—which is suitable for modules with different materials and structures. Therefore, scenario 1 refers to “deviations during assembly at sites only”, and scenario 2 refers to “deviations during assembly with deformations in individual modules”.

4.3.1. Scenario 1: Deviations Due to Assembly Errors Only

The main purpose of checking this condition individually, which considers only assembly errors, is to improve the efficiency of the reconstruction process for highly rigid modules (such as concrete modules or modules of other structurally stiff materials), because the geometric data from the digital replica can be invoked directly, as shown in Figure 8. When the design of the whole modular building is finished, the binding of each AprilTag to a module is recorded in the BIM database. First, a photo containing the modules under construction needs to be taken, as shown in Figure 8a. Subsequently, the tags on the surfaces of the modules can be detected, and the tag position data can be transferred to the BIM database.
After receiving the tag data from the photo, the BIM database checks the ID of each tag and matches it with the model of the module to transmit the geometric information corresponding to each tag, such as the length, width, and height. With the geometric information and the position of the tag, the reconstruction method is presented in Figure 8b.
There are two reference frames in this scenario: the camera coordinates (xc, yc, and zc), with origin at the center of the camera lens and the optic axis as the z-axis, and the tag coordinates (xt, yt, and zt), with origin at the center of the AprilTag. To improve the accuracy of tag detection, the camera intrinsic matrix K needs to be acquired through calibration instead of using the parameters reported by the phone. In addition, the coordinate system (u, v) of the image frame measures pixel locations in the image plane. After processing the photo pixels, the edge and corner points of the AprilTag are identified, which means that the position of the tag (ut, vt) in the image frame is recognized. Referring to the imaging principle of cameras, the projective mapping from the camera coordinate system and the tag coordinate system to the image coordinate system is shown below:
$$ z_c \begin{bmatrix} u_t \\ v_t \\ 1 \end{bmatrix} = K \begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = K \begin{bmatrix} R & T \end{bmatrix} \begin{bmatrix} x_t \\ y_t \\ z_t \\ 1 \end{bmatrix} $$
During the processing of the photos, the size of the tag in its own coordinates (xt, yt, and zt) is set as "unit 1", and the scale factor (zc) is removed using the tag size information from the database. Then, the extrinsic parameters can be calculated. Note that the extrinsic parameters include the rotation array (R) and translation array (T), which describe the transformation from the tag coordinate system to the 3D camera coordinate system. In this case, the distance of the tag relative to the edge of the module surface is determined. Therefore, each actual module in construction can be modeled based on the 3D geometry data of the module in the tag coordinates and the extrinsic parameters (R, T). After reconstructing all the actual modules, the assembly quality can be estimated by comparing the reconstruction model with the BIM model via the corresponding IDs.
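The projection relation above can be exercised with a short numpy sketch: given the calibrated intrinsics K and the extrinsics (R, T) recovered from a tag, tag-frame points map to pixel coordinates after division by the scale factor z_c. This is a minimal illustration of the mapping, not the paper's full reconstruction pipeline:

```python
import numpy as np

def project_tag_points(K, R, T, pts_tag):
    """Project points given in the tag frame into pixel coordinates,
    following z_c [u, v, 1]^T = K [R | T] [x_t, y_t, z_t, 1]^T.
    pts_tag is an (N, 3) array of tag-frame points."""
    p_cam = pts_tag @ R.T + T            # tag frame -> camera frame
    p_img = p_cam @ K.T                  # apply intrinsic matrix K
    return p_img[:, :2] / p_img[:, 2:]   # divide out the scale factor z_c
```

In practice this relation is inverted: the four detected corner pixels of a tag of known size are used to solve for (R, T).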

4.3.2. Scenario 2: Deviations during Assembly with Deformations in Individual Modules

Considering a more general case, there could be deformations in the modules combined with the assembly deviations, as with light steel modules or modules made of structurally less stiff materials. The main difference from the ideal condition is that the design geometry data cannot be used in reconstructing the actual modules. Therefore, the actual 3D geometric data need to be generated based only on the photos. It should be noted that the deformation of a module remains small relative to its overall size. Therefore, even if there is slight variation in the surface, the local area where the tag is located can still be treated as a plane, which means that AprilTags can still be attached to the surface.
Step 1: Acquire multiple photos at sites
The distance information is generally uncertain in photos taken with only one camera. In this case, photos containing the same modules under inspection need to be taken at least twice, as shown in Figure 9a. Note that a single camera movement should not be so large that the method becomes ineffective; an appropriate step size depends on the distance between the camera and the modular building. Subsequently, the tags on the surfaces of the modules can be acquired, and the tag position data in the different photos can be calculated.
To resolve the uncertain distance information, the main idea of the algorithm is to simulate a binocular stereo-vision system based on AprilTags and a phone; the actual module models can then be extracted from the generated point cloud. A simplified binocular stereo-vision system with one tag and two photos is presented in Figure 9b. There are three reference frames in this method: the first camera coordinates (x1, y1, and z1), the moved camera coordinates (x2, y2, and z2), and the tag coordinates (x3, y3, and z3). The origins of the two camera frames are the centers of the camera lens at the two capture positions, the origin of the tag frame is the center of the AprilTag, and in each camera frame the optic axis is the z-axis. The camera intrinsic matrix K needs to be acquired through calibration instead of using the parameters reported by the phone. In addition, the camera intrinsic matrix K and the coordinate system (u, v) of the image frame are the same as in the ideal condition.
Step 2: Utilize a binocular stereo system
After detecting the tag in photo 1 and photo 2, the extrinsic parameters (R, T) of each photo can be acquired, as shown in Figure 9c. For the same tag, the relative position of the camera is calculated from the extrinsic parameters (R1, T1) in photo 1 and the extrinsic parameters (R2, T2) in photo 2. Note that the rotation matrix (R) and translation vector (T) map the tag coordinates into the camera coordinates. Therefore, the relative position (Rs, Ts) of the camera can be calculated as follows (the translation vector T′₂ is the translation of camera 2 relative to camera 1):
R_S = R₁R₂⁻¹
T′₂ = −R₁R₂⁻¹T₂
T_S = T₁ + T′₂ = T₁ − R₁R₂⁻¹T₂
A single photo typically captures several modules, and the same modules appear in both photos. Each of these modules carries a tag on its surface, so more than one tag (n in total) is available after processing. In this case, the average over the tags is used to improve accuracy. Finally, the extrinsic parameters (Ronce, Tonce) are calculated as follows:
R_once = (1/n) Σᵢ₌₁ⁿ R_S,ᵢ;         T_once = (1/n) Σᵢ₌₁ⁿ T_S,ᵢ
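The relative-pose and averaging formulas above can be sketched in a few lines of NumPy (the paper's algorithms were implemented in MATLAB and Python); the function names and the plain element-wise rotation averaging are illustrative choices, not the authors' exact implementation:

```python
import numpy as np

def relative_pose(R1, T1, R2, T2):
    """Pose of camera 2 relative to camera 1 from one tag's extrinsics.

    (R1, T1) and (R2, T2) map tag coordinates into each camera frame,
    i.e. p_cam = R @ p_tag + T.
    """
    Rs = R1 @ R2.T            # R2 is orthonormal, so R2^-1 = R2^T
    Ts = T1 - Rs @ T2         # T1 + T'2, with T'2 = -R1 R2^-1 T2
    return Rs, Ts

def average_pose(poses):
    """Average (Rs, Ts) over the n tags visible in both photos.

    Element-wise averaging as in the formula; adequate only when the
    per-tag estimates are close to one another.
    """
    Rs = np.mean([p[0] for p in poses], axis=0)
    Ts = np.mean([p[1] for p in poses], axis=0)
    return Rs, Ts
```

A quick sanity check: building two camera extrinsics from a known relative pose and feeding them to `relative_pose` recovers that pose exactly.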
Combining the extrinsic parameters (Ronce, Tonce) with the camera intrinsic parameters obtained through calibration, the simulated binocular stereo system can work. Note that the phone cannot use the autofocus mode—otherwise, the camera intrinsics may change between shots—and other settings, such as the photo size, should also remain unchanged. With binocular stereo photos, distance information can be acquired, as shown in Figure 9d.
First, the photos need to be rectified onto a common image plane using the parameters of the simulated binocular stereo system in such a way that the corresponding points have the same row coordinates. This photo rectification makes the image appear as though the two cameras are parallel [40]. The pixel disparity from the pair of rectified stereo photos is subsequently calculated via the semiglobal matching (SGM) method [41]. Photo 1 from the camera is used as the reference image for computing the disparity map with photo 2. Note that the range of disparity must be determined to cover the minimum and maximum amount of horizontal shift between the corresponding pixels in the rectified stereo pair image in the disparity calculation.
Then, a point cloud containing three-dimensional coordinates corresponding to the pixels in the disparity map can be generated. The coordinates of the point cloud relative to the optical center of the camera in a simulated binocular stereo system are the same as the camera coordinates (x1, y1, and z1).
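The back-projection from disparity to 3D points can be sketched as follows; `disparity_to_points` and the ideal pinhole model used here are simplifying assumptions (a full implementation would work from the rectified projection matrices):

```python
import numpy as np

def disparity_to_points(disp, K, baseline):
    """Back-project a disparity map into camera-1 coordinates.

    disp     : HxW disparity map in pixels (values <= 0 mark invalid pixels)
    K        : 3x3 intrinsic matrix from calibration
    baseline : distance between the two shooting positions
    Returns an Nx3 array of points (x1, y1, z1).
    """
    f, cx, cy = K[0, 0], K[0, 2], K[1, 2]
    v, u = np.indices(disp.shape)          # pixel row (v) and column (u) grids
    valid = disp > 0
    z = f * baseline / disp[valid]         # depth from the stereo pair
    x = (u[valid] - cx) * z / f
    y = (v[valid] - cy) * z / f
    return np.column_stack([x, y, z])
```

With a constant disparity map, every point lands at the same depth f·B/d, and the pixel at the principal point maps onto the optical axis.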
Step 3: Extract modules
The point clouds generated from a simulated binocular stereo system always contain many noisy points. To acquire the actual model of the modules, the relative points that belong to the modules in the point cloud need to be extracted. The extraction of the modules is completed in the following two substeps.
(1) Coarse extraction. The point cloud contains a large number of points, which makes processing difficult; the first extraction step therefore selects all the points belonging to the modules and deletes the background noise. Because each module in the photo has a tag on its surface, the detected module can be identified. For each tag, an ideal space (Ai) containing the module can be calculated from its extrinsic parameters (R, T) and the design model information, similar to the process in the construction tolerance-only condition. The final ideal space (A) is then obtained by combining the ideal spaces calculated from all the tags (A = ∪Ai). The coarse extraction based on the tags on the modules is presented in Figure 9e. Note that the selected space (ΔX, ΔY, and ΔZ) is made slightly larger than the ideal space so that the points of the modules remain enclosed even when the modules show some tolerance.
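The coarse extraction step can be sketched as a simple crop of the cloud; for illustration, the combined ideal space A is approximated here by a single axis-aligned box enlarged by the (ΔX, ΔY, ΔZ) margin, a simplification of the per-tag spaces Ai described above:

```python
import numpy as np

def coarse_extract(points, ideal_min, ideal_max, margin):
    """Keep points inside the ideal space A, enlarged by a safety margin.

    points    : Nx3 point cloud from the stereo pair
    ideal_min : 3-vector, lower corner of the (approximated) ideal space A
    ideal_max : 3-vector, upper corner of A
    margin    : per-axis enlargement so tolerant modules stay enclosed
    """
    lo = np.asarray(ideal_min) - margin
    hi = np.asarray(ideal_max) + margin
    mask = np.all((points >= lo) & (points <= hi), axis=1)
    return points[mask]
```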
(2) Fine extraction. After the coarse extraction, most of the noisy points will have been deleted, including the background noise. To acquire an accurate point cloud of the modules, fine extraction is then completed for each module. The fine extraction method for the modules is presented in Figure 9f. With the ideal area (Ai) for each module based on the tag, a clipping space larger than Ai is determined to extract the points in this region. However, these points still contain some noise. Principal component analysis (PCA) is used to continue clipping the points. A best-fit space that is smaller than the last clipping space can be determined based on the PCA corner and an increment on each principal axis. The increment should be constantly adjusted considering the different modules and project conditions to obtain a better extraction result. Note that it is not practical to clip the points by detecting the edges of the point cloud because the module is tightly connected with the others next to it, and there will be cases where one or two edges have no features.
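The PCA-based fine extraction can be sketched as follows; the percentile-based extent rule is an illustrative stand-in for the best-fit space described in the text, and `increment` is the project-specific tuning value:

```python
import numpy as np

def fine_extract(points, increment):
    """Clip a module's points to a PCA-aligned best-fit box.

    The principal axes come from the SVD of the centered, coarse-extracted
    points; the box half-extent along each axis is a robust (90th-percentile)
    spread plus the increment that the text says must be tuned per project.
    """
    centered = points - points.mean(axis=0)
    _, _, axes = np.linalg.svd(centered, full_matrices=False)
    proj = centered @ axes.T                       # coordinates in the PCA frame
    half = np.percentile(np.abs(proj), 90, axis=0) + increment
    mask = np.all(np.abs(proj) <= half, axis=1)
    return points[mask]
```

On a synthetic cloud, a distant noise point is discarded while the bulk of the module's points survive.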
After the extraction process, the point cloud for the modular building is acquired. In some cases, the modular building under construction cannot be captured in a single photo; for example, the photo in Figure 9a includes only one-third of the whole building under construction. The point clouds generated from different photos then need to be registered together. Because of the tags on the modules, there is no need to find features in the points for registration, such as with the iterative closest point algorithm [42]. Different modules can be registered through the tag transformations, as shown in Figure 9c. The relative position of each point cloud is calculated from the extrinsic parameters (R1, T1) in photo 1 and the extrinsic parameters (Rn, Tn) in the other photos. The points of the other modules can therefore be transformed into the frame of the point cloud in photo 1 with Formulas (2) and (4), after which the integrated point cloud is acquired. In this study, the algorithms were implemented in MATLAB and Python 3.6 environments.
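The tag-based registration can be sketched as below; the function name is illustrative, and the relative pose follows the same tag-to-camera convention as the formulas above:

```python
import numpy as np

def register_to_photo1(points_n, R1, T1, Rn, Tn):
    """Map a point cloud built from photo n into photo 1's camera frame.

    (R1, T1) and (Rn, Tn) are the extrinsics of the same tag in photo 1
    and photo n, so no feature matching (e.g. ICP) is required.
    """
    Rs = R1 @ Rn.T          # relative rotation of camera n w.r.t. camera 1
    Ts = T1 - Rs @ Tn       # relative translation
    return points_n @ Rs.T + Ts
```

Points expressed in tag coordinates and viewed from two poses register back onto each other exactly.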

4.3.3. Precision Analysis of the Quality Control Method

The quality of the modular construction can be assessed by comparing the reconstruction modules with the design model in the BIM database. The differences between the design model and the reconstruction model are quantitatively analyzed based on the construction requirements and design documents. However, in the establishment of mathematical models for automatic reconstruction, errors and inaccuracies will inevitably occur. Therefore, it is essential to discuss the reliability of the entire process to verify the credibility of the method.
In Scenario 1, the main source of reconstruction errors is a tag error (e1) when detecting the AprilTags. Three factors contribute to this error (e1): the precision of the phone camera, the shooting angle from the phone to the tags, and the size of the tags. In general, if the tag has a larger relative size in the photo, i.e., more pixels, the tag-detection result will be more accurate. Therefore, the size of the tags should be adjusted in different projects considering the design of the modules’ facades.
In Scenario 2, the errors introduced in reconstructing the point cloud (e2) also need to be considered, and they have two sources: point cloud generation and clipping. (1) Point cloud generation errors arise from the binocular stereo system. On the one hand, the movement of the phone between photos of the same modules, as in the situation shown in Figure 9b, must not be so large that point cloud generation fails; on the other hand, a longer baseline of the binocular stereo system gives a better result. The movement distance should therefore balance the module size against the shooting distance, and an appropriate value should be found for each project. (2) Clipping errors come from the fine extraction process. The last clipping space contains an increment that is adjusted for the different modules to obtain a better extraction result, and this increment causes a small amount of error in the final point cloud of the actual modules. The total error for the composite tolerance condition consists of the tag error (e1) and the reconstruction error (e2).

5. Validation

To demonstrate the feasibility of the proposed AprilTags-based approach for progress monitoring and quality control in modular construction, tests replicating the scenarios and conditions described in Section 4 were designed and analyzed. During the tests, a QR code was used as the reference for the progress monitoring part, while models from a 3D printer and a 3D laser scanner were used as references for the quality control part. Because the purpose of this paper is to provide a convenient and economical construction management method, only a common phone was used during the whole process.

5.1. Progress Monitoring Using AprilTags and QR Codes

Progress monitoring in this test meant that the constructor could update the assembly information through the phone by scanning the tags on the modules. During the manufacturing of the modules, paper tags containing AprilTags and QR codes were pasted on the surface of the modules. When the tests started, the constructor photographed the two kinds of tags separately, and the progress information of the modules was recorded in the BIM database, as shown in Figure 10 and Figure 11.
Figure 10 shows the process of updating the assembly information with a QR code. The constructor scanned one tag at a time with a phone. The URL in the tag was then read and used as a link to the server side, the constructor filled in the stage information of the modules on the website, and a new record was added to the BIM database. Figure 11 shows the process of updating the assembly information with AprilTags. The constructor scanned four tags at a time with a phone, and the four AprilTag IDs were recognized. Because the module ID and AprilTag ID had been bound in the database, the website for inputting the stage information could be accessed based on this association, and four new records were added to the BIM database. Therefore, AprilTags can not only replace QR codes in progress monitoring but also greatly improve detection efficiency.
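The multi-tag update flow of Figure 11 can be sketched with a toy in-memory database; all names here (`ProgressDB`, `update_from_photo`, the module IDs) are illustrative stand-ins for the actual BIM database, not the authors' system:

```python
from dataclasses import dataclass, field

@dataclass
class ProgressDB:
    """Toy stand-in for the BIM database used in the progress test.

    tag_to_module holds the tag-ID/module-ID binding created during
    manufacturing; records accumulates one stage entry per detected tag.
    """
    tag_to_module: dict
    records: list = field(default_factory=list)

    def update_from_photo(self, detected_tag_ids, stage):
        """One photo can contain several tags, so several modules are
        updated from a single shot -- the advantage over QR codes."""
        for tag_id in detected_tag_ids:
            module_id = self.tag_to_module[tag_id]
            self.records.append({"module": module_id, "stage": stage})
        return len(detected_tag_ids)

# Four tags recognized in one shot update four module records at once.
db = ProgressDB(tag_to_module={7: "M-01", 8: "M-02", 9: "M-03", 10: "M-04"})
n = db.update_from_photo([7, 8, 9, 10], stage="assembled")
```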

5.2. Quality Control Using AprilTags and 3D Laser Scanner

5.2.1. Scenario 1: Deviations Due to Assembly Errors Only

Before the test started, the BIM model of the modules used to simulate the modular construction was finished and the design data stored. In this test, each module was a standard cube (80 × 80 mm), and a mobile phone (Samsung Galaxy Note 8) was used as the camera. Although the manufacturing of current CMOS camera modules (CCMs) has reached a high degree of precision, errors may still appear as the device ages. In addition, current phones usually carry more than one camera. Therefore, the phone must be calibrated first, and the automatic mode of the phone camera cannot be used because the focal length would change. A calibration target and sixteen photos were used for the phone camera calibration, as shown in Figure 12. Through the calibration, the camera intrinsic matrix (K) was acquired.
To guarantee that the reference model in this test (RMT1) was the same as the BIM model so that the BIM data could be used in the reconstruction process, the modules were fabricated by a 3D printer, as shown in Figure 13. Each module of the whole BIM model in the upper right corner was individually designed and printed. A 3D printer (Ultimaker 3 Extended [43]) was used to print the modules, and the detailed technical parameters are listed in Table 1.
(a) Reference model (RMT1) design
During the printing of each module, to simulate the offset and rotation of the modular construction, the BIM model of each module was designed in detail, as shown in Figure 14b. The RMT1 modules are hollow boxes, and printing all the modules at 0.25 mm resolution took approximately 10 h. For example, for the rotated modules, the hook and brace were accurately designed and printed, so that the assembly matched the design, as shown in Figure 14a. Therefore, the assembly result of RMT1 is exactly the same as that of the BIM model, as shown in Figure 14c. The value of this fine design is that the reconstruction model can be directly compared with the BIM model even though the reconstruction process is carried out on the reference model.
(b) Test procedure and model comparisons
On the surface of each module, a standard-size label containing an AprilTag was pasted at a specific position. Moreover, the ID of each tag was bound to the module information in the BIM database. The calibrated mobile phone was used for taking pictures, as shown in Figure 15a. After that, using the camera intrinsic matrix (K), all the tags in the picture were detected, and their corner points were marked, as shown in Figure 15b. When the tags in the picture were detected, the ID information and spatial coordinates were sent to the BIM database. For each module, the reconstruction model was generated based on its spatial coordinates and the geometric information of the BIM bound to the ID, which can reflect the actual assembly error, as shown in Figure 15c.
After the reconstruction model was generated, the test results were acquired. Since there was no deformation in the modules during construction, the reconstruction model could be directly compared with the BIM model, as shown in Figure 16. In the BIM model, the offset was set to 5 mm, and the rotation was 3.58 degrees; in the reconstruction model, the measured offset was 4.76 mm, and the rotation was 3.41 degrees. Therefore, in the current experimental scene and equipment, the offset error was 0.24 mm and the rotation error was 0.17 degrees. The module assembly information obtained by the proposed method had at least 95% credibility, so this method can be considered feasible for quality inspection in modular construction.

5.2.2. Scenario 2: Deviations during Assembly with Deformations in Individual Modules

In this test, a more general situation was simulated, namely, the actual modules contained deformation, so the BIM data were not available for use in model reconstruction. Before the start of the test, the BIM model of the modules used to simulate the assembly conditions was designed. The modules were standard cubes (120 × 120 mm), and the offset and rotation were designed in detail, as shown in Figure 17a. It should be noted that the mobile phone, the tag size and the photographing distance were consistent with those in Test 1 because only in this way could the recognition error in Test 1 be referenced. The actual module was a nonstandard cube in which the box contained deformation, as shown in Figure 17b.
(1) Reference model (RMT2) design
Since the modules in this test were nonstandard cubes, the reference model representing the actual assembly situation was different from the BIM model. Therefore, high-precision 3D laser scanning technology was used to obtain the reference model, as shown in Figure 18a. A handheld laser scanner [44] was used during the test, and the specific parameters of the scanner are shown in Table 2. First, the actual modular construction situation was scanned by the laser to obtain the point cloud, and the reference point cloud could be obtained by manually deleting the noise points, as shown in Figure 18b. Taking one of the modules as an example, the reference point cloud was meshed for subsequent detection, as shown in Figure 18c.
(2) Test procedure
In this case, the reconstruction process could not use the data in the BIM database. Following Section 4.3.2, a photo was taken with only a mobile phone, and a second photo was taken after the phone was moved, as shown in Figure 19a. It should be noted that the entire shooting process was performed handheld. From the obtained photos, the extrinsic parameters of the two shots were calculated. Taking one of the modules as an example, the extrinsic parameters (R1, T1) in photo 1 and the extrinsic parameters (R2, T2) in photo 2 are shown below:
(R₁ | T₁) = [0.979 0.013 0.202 | 79.009; 0.001 0.998 0.061 | 56.840; 0.203 0.059 0.977 | 342.35];   (R₂ | T₂) = [0.978 0.016 0.207 | 68.013; 0.004 0.998 0.058 | 55.780; 0.208 0.056 0.976 | 343.06]
Because there were four modules in one photo, the final extrinsic parameters were the average over the four tags (Rs, Ts). The extrinsic parameters (Ronce, Tonce) were then calculated, and the undistorted image is shown in Figure 19b.
(R_once | T_once) = [0.999 0.003 0.001 | 10.91; 0.003 0.999 0.001 | 0.397; 0.001 0.001 0.999 | 0.679]
By calculating the disparity between the two pictures, the disparity point cloud could be obtained, as shown in Figure 19c. In addition, a color point cloud could be obtained by assigning the corresponding RGB value to each point where the disparity was successfully calculated, as shown in Figure 19d. Note that since the disparity point cloud was based on the first picture, the RGB matrix of the first photo was used. After obtaining the point cloud, coarse extraction was applied to the overall points, and then fine extraction was applied to each module sequentially. By combining the extracted individual modules, the final reconstruction model was generated, as shown in Figure 19e.
(3) Offset and rotation comparison
To compare the offset and rotation of the point cloud obtained via laser scanning with those of the point cloud obtained via the proposed method, principal planes were generated, and the points were projected onto each plane, as shown in Figure 20. In the main plane, the offset of the reference model (RMT2) was 15.99 mm, and the rotation was 6.41 degrees. Correspondingly, the offset of the reconstruction model was 15.08 mm, and the rotation was 7.23 degrees. Therefore, with the current experimental scene and equipment, the offset error was 0.91 mm and the rotation error was 0.82 degrees.
(4) Model comparisons
Since the reconstruction model was realized entirely from the AprilTags in this test, the reconstruction point cloud included the deformation. Therefore, it was necessary to check the agreement between the reconstruction model and the reference model in the deformed region. Although the reconstruction model had been extracted and processed, many noisy points remained in the thickness direction. The module meshed in the reference model was chosen for processing and comparison, as shown in Figure 21a. First, a NURBS surface was fitted to the point cloud to obtain a free-form surface covering the meshed part of the point cloud. The four corner points around the point cloud were then used to construct a plane, and the plane was extruded into a cube serving as a clipping volume, as shown in Figure 21b. By cutting the NURBS surface with this clipping volume, the final reconstruction surface was obtained, as shown in Figure 21c.
After generating the final surface, the obtained reconstruction surface shown in Figure 21c was compared with the meshed surface shown in Figure 18c, which was the reference model in this test. First, the corners of the two surfaces were aligned, and then the surfaces were fine-tuned using the nearest neighbor algorithm. The distribution of the difference between the two surfaces is shown in Figure 22. The detailed error values are shown in Table 3. According to the table, the average error was +0.4970/−0.6213 mm and the probability that the error was less than 1.5 mm reached 95%. Therefore, the method proposed in this paper can replace laser scanning for quality inspection of the modular construction process.
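The nearest-neighbor comparison between the reconstruction and reference surfaces can be sketched in NumPy; the brute-force search and the 1.5 mm threshold mirror the statistic reported from Table 3, but this is an illustrative sketch rather than the actual pipeline (which aligned corners first and compared meshed surfaces):

```python
import numpy as np

def surface_error_stats(recon_pts, ref_pts, threshold=1.5):
    """Nearest-neighbor distances from reconstruction to reference points
    and the fraction of points within a tolerance (1.5 mm in the test).

    Brute-force all-pairs search; adequate for the small clouds used here.
    """
    # Pairwise squared distances via |a - b|^2 = |a|^2 + |b|^2 - 2 a.b
    d2 = (np.sum(recon_pts**2, axis=1)[:, None]
          + np.sum(ref_pts**2, axis=1)[None, :]
          - 2.0 * recon_pts @ ref_pts.T)
    dist = np.sqrt(np.maximum(d2, 0.0).min(axis=1))
    return dist.mean(), float(np.mean(dist <= threshold))
```

For a reconstruction offset uniformly by 0.5 mm from the reference grid, the mean error is 0.5 mm and 100% of the points fall within the tolerance.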

5.3. Precision Discussions

Through the above quantitative tests, the feasibility of the proposed method has been proven. The main purpose of the proposed method is to provide a new approach to the management of modular construction projects alongside QR codes and 3D laser scanning technology. When applying the proposed method, several key points need to be noted: (1) the ID and location of the tag posted on the module should be associated with the BIM model; (2) the mobile phone camera must not use the automatic mode; (3) the distance between the two shots can be neither too small nor too large, because too small a movement reduces the accuracy of the generated point cloud, while too large a movement causes the disparity calculation to fail.
There is no doubt that this method has the advantages of economy and convenience, but its accuracy is affected by many factors in actual project applications, for example, the CMOS accuracy of the on-site mobile phone camera, the manufacturing flatness of the local module area where the tag is located, the distance and angle when photographing the modules, the sticking error when placing the tag, and the clipping error in the process of reconstructing the model. These parameters will vary from project to project. To improve accuracy, the tags should be photographed as perpendicularly as possible, and a larger tag size should be used to ensure that the tag remains flat. In addition, the shooting distance should be close to the calibration distance to obtain high-precision photos.

6. Conclusions and Future Work

Modular buildings maximize the prefabrication of components in factories, so only the connection and assembly processes are carried out on site. Therefore, modular construction imposes strict management and installation quality requirements, including limits on the offset and rotation generated during on-site assembly. BIM management methods based on QR codes and 3D laser scanning have been widely used in industry, but the high equipment costs have also put economic pressure on the promotion of BIM technology. In this study, a novel, economical modular construction management procedure is introduced. First, photos containing the target modular building are taken, and the AprilTags in the photos are detected. After the tag IDs are identified, progress management is realized by matching them against the BIM database, covering the role of QR codes. In terms of quality control, there are two conditions: Scenario 1, deviations due to assembly errors only; and Scenario 2, deviations during assembly with deformations in individual modules. For modules with high structural stiffness or high manufacturing accuracy, only construction tolerances are considered, and the BIM data of the modules are used to assist in reconstruction. For the composite tolerance condition, where BIM data cannot be used, the reverse point cloud is generated based on AprilTags, and the final reconstruction model is completed through coarse and fine extraction.
Two validation tests were performed to evaluate the capabilities, as well as the limitations, of the methodology discussed in this paper. In the first test, the reconstruction model was compared with the reference model fabricated by 3D printing to simulate the construction tolerance-only condition; in the second test, the reconstruction model was compared with the reference model generated by 3D laser scanning to simulate the composite tolerance condition. By comparing the offset and rotation in the reconstruction model and the reference model, it has been shown that the proposed method is feasible for quality inspection in the modular construction process. In addition, the reconstruction point cloud was further precisely clipped and analyzed by alignment with the reference model, and the probability that the error was less than 1.5 mm reached 95%, illustrating the performance of the proposed algorithm on the deformation surface.
In summary, this method automatically acquires the as-is condition and installation data of stacked modules on sites using no-cost AprilTags and a common phone camera. By using AprilTags instead of QR codes to label modules, progress management is achieved through rapid identification and association of multiple modules based on a single image. Moreover, a virtual multi-view vision algorithm based on AprilTags is proposed to generate 3D reverse models of the construction site; the quality result can be acquired by comparing the offset and rotation values of the reverse model and the BIM model. Since the component is the basic unit of the associated AprilTag, the proposed method can be extended to the construction management of all prefabricated buildings.
Next, the proposed method will be applied to different actual projects to collect application data. The influence of lens distortion, moving distance, and distance to the module on the accuracy of the reverse model will be the focus of future research. To further quantify the optimal accuracy settings outlined in the discussion section, the authors are developing a parameter system that combines project features and mobile phone intrinsic parameters to automatically determine the optimal settings. Based on the data collected from different projects, artificial intelligence (AI) will be used in this parameter system to quickly analyze the reliability of different mobile phones, construction sites, and modular buildings, helping asset managers take appropriate actions accordingly. Finally, following the proposed method, a BIM management and quality control system for prefabricated construction based on AprilTags and mobile phones has been developed.

Author Contributions

Conceptualization, J.L.; Methodology, J.L.; Software, Q.Z.; Validation, J.L. and S.E.; Resources, Q.Z.; Data curation, S.E.; Writing—original draft, J.L.; Writing—review & editing, S.E. All authors have read and agreed to the published version of the manuscript.

Funding

This research was sponsored by Shanghai Pujiang Programme under grant no. 23PJC054.

Data Availability Statement

Data are contained within the article.

Acknowledgments

The authors also thank all team members involved in the Future Building Informatics and Visualization Lab (biLAB) of NYU for their help in developing this article.

Conflicts of Interest

Author Jindian Liu was employed by the company Shanghai Airport Authority. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

  1. Jones, S.A.; Laquidara-Carr, D. Prefabrication and Modular Construction 2020, Dodge Data & Analytics. 2020. Available online: https://www.construction.com/toolkit/reports/prefabrication-modular-construction-2020 (accessed on 27 January 2020).
  2. Lawson, M.; Ogden, R.; Goodier, C. Design in Modular Construction; CRC Press: London, UK, 2014; Available online: https://www.amazon.com/Design-Modular-Construction-Mark-Lawson/dp/0415554500 (accessed on 14 February 2014).
  3. Smith, R.E.; Timberlake, J. Prefab Architecture: A Guide to Modular Design and Construction; Wiley: New Jersey, NJ, USA, 2011; Available online: https://www.abbeys.com.au/book/prefab-architecture-a-guide-to-modular-design-and-construction.do (accessed on 19 November 2010).
  4. Elnaas, H.; Gidado, K.; Philip, A. Factors and drivers effecting the decision of using off-site manufacturing (OSM) systems in house building industry. J. Eng. Proj. Prod. Manag. 2014, 4, 51–58. [Google Scholar] [CrossRef]
  5. Goh, M.; Goh, Y.M. Lean production theory-based simulation of modular construction processes. Autom. Constr. 2019, 101, 227–244. [Google Scholar] [CrossRef]
  6. Lacey, A.W.; Chen, W.; Hao, H.; Bi, K. Structural response of modular buildings—An overview. J. Build. Eng. 2018, 16, 45–56. [Google Scholar] [CrossRef]
  7. Aitchison, M. Prefab Housing and the Future of Building: Product to Process; Lund Humphries: London, UK, 2018; Available online: https://books.google.com/books?id=bPyCAQAACAAJ (accessed on 1 March 2018).
  8. Bertram, N.; Fuchs, S.; Mischke, J.; Palter, R.; Strube, G.; Woetzel, J. Modular Construction: From Projects to Products, McKinsey & Company. 2019. Available online: https://www.mckinsey.com/industries/capital-projects-and-infrastructure/our-insights/modular-construction-from-projects-to-products# (accessed on 18 June 2019).
  9. Sharafi, P.; Mortazavi, M.; Samali, B.; Ronagh, H. Interlocking system for enhancing the integrity of multistory modular buildings. Autom. Constr. 2018, 85, 263–272. [Google Scholar] [CrossRef]
  10. Lopez, D.; Froese, T.M. Analysis of costs and benefits of panelized and modular prefabricated homes. Procedia Eng. 2016, 145, 1291–1297. [Google Scholar] [CrossRef]
  11. Arashpour, M.; Wakefield, R.; Lee, E.W.M.; Chan, R.; Hosseini, M.R. Analysis of interacting uncertainties in on-site and off-site activities: Implications for hybrid construction. Int. J. Proj. Manag. 2016, 34, 1393–1402. [Google Scholar] [CrossRef]
  12. Lawson, R.M.; Ogden, R.G.; Bergin, R. Application of modular construction in high-rise buildings. J. Archit. Eng. 2012, 18, 148–154. [Google Scholar] [CrossRef]
  13. Arashpour, M.; Kamat, V.; Bai, Y.; Wakefield, R.; Abbasi, B. Optimization modeling of multiskilled resources in prefabrication: Theorizing cost analysis of process integration in off-site construction. Autom. Constr. 2018, 95, 1–9. [Google Scholar] [CrossRef]
  14. Lorenzo, T.M.; Benedetta, B.; Manuele, C.; Davide, T. BIM and QR-code. A Synergic Application in Construction Site Management. Procedia Eng. 2014, 85, 520–528. [Google Scholar] [CrossRef]
  15. Charef, R.; Alaka, H.; Emmitt, S. Beyond the third dimension of BIM: A systematic review of literature and assessment of professional views. J. Build. Eng. 2018, 19, 242–257. [Google Scholar] [CrossRef]
  16. Enshassi, M.S.A.; Walbridge, S.; West, J.S.; Haas, C.T. Integrated risk management framework for tolerance-based mitigation strategy decision support in modular construction projects. J. Manag. Eng. 2019, 35. [Google Scholar] [CrossRef]
  17. Puri, N.; Turkan, Y. Bridge construction progress monitoring using lidar and 4D design models. Autom. Constr. 2020, 109, 102961. [Google Scholar] [CrossRef]
  18. Kim, J.; Chi, S. Multicamera vision-based productivity monitoring of earthmoving operations. Autom. Constr. 2020, 112, 103121. [Google Scholar] [CrossRef]
  19. Han, D.; Zhao, Y.; Pan, Y.; Liu, G.; Yang, T. Heating process monitoring and evaluation of hot in-place recycling of asphalt pavement using infrared thermal imaging. Autom. Constr. 2020, 111, 103055. [Google Scholar] [CrossRef]
  20. Guo, J.; Yuan, L.; Wang, Q. Time and cost analysis of geometric quality assessment of structural columns based on 3D terrestrial laser scanning. Autom. Constr. 2020, 110, 103014. [Google Scholar] [CrossRef]
  21. QR Code. Available online: https://www.qrcode.com/en/about/ (accessed on 1 January 1994).
  22. Czerniawski, T.; Leite, F. Automated digital modeling of existing buildings: A review of visual object recognition methods. Autom. Constr. 2020, 113, 103131. [Google Scholar] [CrossRef]
  23. Shabalina, K.; Sagitov, A.; Sabirova, L.; Li, H.; Magid, E. ARTag, AprilTag and CALTag fiducial systems comparison in a presence of partial rotation: Manual and automated approaches. In Informatics in Control, Automation and Robotics; Gusikhin, O., Madani, K., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 536–558. [Google Scholar] [CrossRef]
  24. Abbas, S.M.; Aslam, S.; Berns, K.; Muhammad, A. Analysis and Improvements in AprilTag Based State Estimation. Sensors 2019, 19, 5480. [Google Scholar] [CrossRef] [PubMed]
  25. Lin, Y.C.; Cheung, W.F.; Siao, F.C. Developing mobile 2D barcode/RFID based maintenance management system. Autom. Constr. 2014, 37, 110–121. [Google Scholar] [CrossRef]
  26. Zhao, L.L.; Liu, Z.S.; Mbachu, J. Development of intelligent prefabs using IoT technology to improve the performance of prefabricated construction projects. Sensors 2019, 19, 4131. [Google Scholar] [CrossRef]
  27. Han, K.; Degol, J.; Golparvar-Fard, M. Geometry- and appearance-based reasoning of construction progress monitoring. J. Constr. Eng. Manag. 2018, 144, 04017110. [Google Scholar] [CrossRef]
  28. Shirowzhan, S.; Sepasgozar, S.M.E.; Li, H.; Trinder, J.; Tang, P. Comparative analysis of machine learning and point-based algorithms for detecting 3D changes in buildings over time using bitemporal lidar data. Autom. Constr. 2019, 105, 102841. [Google Scholar] [CrossRef]
  29. Golparvar-Fard, M.; Bohn, J.; Teizer, J.; Savarese, S.; Peña-Mora, F. Evaluation of image-based modeling and laser scanning accuracy for emerging automated performance monitoring techniques. Autom. Constr. 2011, 20, 1143–1155. [Google Scholar] [CrossRef]
  30. Bhatla, A.; Choe, S.Y.; Fierro, O.; Leite, F. Evaluation of accuracy of as-built 3D modeling from photos taken by handheld digital cameras. Autom. Constr. 2012, 28, 116–127. [Google Scholar] [CrossRef]
  31. Marzouk, M.; Hassouna, M. Quality analysis using three-dimensional modeling and image processing techniques. Constr. Innov. 2019, 19, 614–628. [Google Scholar] [CrossRef]
  32. Nahangi, M.; Czerniawski, T.; Haas, C.T.; Walbridge, S. Pipe radius estimation using Kinect range cameras. Autom. Constr. 2019, 99, 197–205. [Google Scholar] [CrossRef]
  33. Feng, C.; Dong, S.; Lundeen, K.; Xiao, Y.; Kamat, V. Vision-based articulated machine pose estimation for excavation monitoring and guidance. In ISARC. Proceedings of the International Symposium on Automation and Robotics in Construction; IAARC Publications: Oulu, Finland, 2015; Volume 32, p. 1. Available online: https://www.iaarc.org/publications/ (accessed on 17 June 2015).
  34. Liang, X.; Chen, G.D.; Zhao, S.R.; Xiu, Y.W. Moving target tracking method for unmanned aerial vehicle/unmanned ground vehicle heterogeneous system based on AprilTags. Meas. Control. 2020, 53, 427–440. [Google Scholar] [CrossRef]
  35. Olson, E. AprilTag: A robust and flexible visual fiducial system. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; pp. 3400–3407. [Google Scholar]
  36. Eadie, R.; Browne, M.; Odeyinka, H.; McKeown, C.; McNiff, S. BIM implementation throughout the UK construction project lifecycle: An analysis. Autom. Constr. 2013, 36, 145–151. [Google Scholar] [CrossRef]
  37. Innella, F.; Arashpour, M.; Bai, Y. Lean methodologies and techniques for modular construction: Chronological and critical review. J. Constr. Eng. Manag. 2019, 145, 04019076. [Google Scholar] [CrossRef]
  38. Tang, P.; Huber, D.; Akinci, B.; Lipman, R.; Lytle, A. Automatic reconstruction of as-built building information models from laser-scanned point clouds: A review of related techniques. Autom. Constr. 2010, 19, 829–843. [Google Scholar] [CrossRef]
  39. Sanhudo, L.; Ramos, N.M.M.; Martins, J.P.; Almeida, R.M.S.F.; Barreira, E.; Simões, M.L.; Cardoso, V. A framework for in situ geometric data acquisition using laser scanning for BIM modeling. J. Build. Eng. 2020, 28, 101073. [Google Scholar] [CrossRef]
  40. Bradski, G.; Kaehler, A. Learning OpenCV: Computer Vision with the OpenCV Library; O’Reilly Media, Inc.: Sebastopol, CA, USA, 2008; Available online: https://dl.acm.org/doi/book/10.5555/2523356 (accessed on 9 September 2009).
  41. Hirschmüller, H. Accurate and efficient stereo processing by semi-global matching and mutual information. In Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Diego, CA, USA, 20–25 June 2005; Volume 2, pp. 807–814. [Google Scholar] [CrossRef]
  42. Besl, P.J.; McKay, N.D. A method for registration of 3-D shapes. IEEE Trans. Pattern Anal. Mach. Intell. 1992, 14, 239–256. [Google Scholar] [CrossRef]
  43. Ultimaker. Ultimaker 3 Extended. Available online: https://ultimaker.com/3d-printers/ultimaker-3 (accessed on 1 January 2019).
  44. SHINING 3D TECH. FreeScan X3. Available online: https://www.shining3d.com/ (accessed on 1 January 2020).
Figure 1. Existing approaches: (a) using a QR code tag and (b) acquiring data with a laser scanner.
Figure 2. AprilTag detection algorithm with an input image of class-36H10 AprilTag [24].
Figure 3. Comparison between using a QR code and an AprilTag.
Figure 4. Problems in modular construction: (a) offset and rotation; (b) dimensional variation of modules.
Figure 5. The whole framework of this paper: flowchart of the approach and performance comparison.
Figure 6. Processing difference between using a QR code and an AprilTag.
Figure 7. The quality control process when using the AprilTags-based approach.
Figure 8. Reconstructing the actual model in Scenario 1.
Figure 9. Reconstructing the actual model under composite tolerance conditions: (a) take photos at least twice; (b) generate a simplified binocular stereo-vision system with one tag; (c) acquire the extrinsic parameters; (d) calculate the distance between the camera positions before and after movement; (e) apply coarse extraction; and (f) apply fine extraction.
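In steps (b)–(d) of Figure 9, the two phone positions form a temporary stereo pair whose baseline is recovered from the shared AprilTag's extrinsic parameters; depth then follows the standard rectified-stereo triangulation relation Z = fB/d. A minimal sketch of that relation, where the focal length, baseline, and disparity values are illustrative assumptions rather than measurements from the paper:

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Rectified-stereo triangulation: depth Z = f * B / d."""
    if disparity_px <= 0.0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# e.g. an 800 px focal length, a 0.20 m baseline between the two shots,
# and a 40 px disparity place the point 4.0 m from the camera
z = depth_from_disparity(800.0, 0.20, 40.0)
```

The larger the baseline recovered from the tag, the smaller the depth error for a given disparity quantization, which is why the two capture positions should not be too close together.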
Figure 10. The assembly information updated with a QR code.
Figure 11. The assembly information updated with AprilTags.
Figure 12. Calibration of the phone camera.
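The calibration in Figure 12 recovers the intrinsic matrix K (focal lengths and principal point) on which the later reconstruction steps depend. As a sketch of what the estimated K does, the pinhole projection below maps a camera-frame 3D point to pixel coordinates; the fx, fy, cx, cy values are illustrative assumptions, not the phone's calibrated parameters:

```python
import numpy as np

# Illustrative intrinsics; real values come from the checkerboard calibration.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(point_cam: np.ndarray, K: np.ndarray) -> np.ndarray:
    """Project a 3D point in camera coordinates to pixel coordinates."""
    uvw = K @ point_cam          # homogeneous image coordinates
    return uvw[:2] / uvw[2]      # perspective division

# A point 2 m in front of the camera and 0.5 m to the right:
pixel = project(np.array([0.5, 0.0, 2.0]), K)
```

Inverting this mapping (pixel plus depth back to a 3D point) is what the virtual multi-view reconstruction relies on, so calibration accuracy bounds the accuracy of the reverse model.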
Figure 13. 3D printing of the modules.
Figure 14. The fine design of RMT1 to simulate offset and rotation: (a) the hook and brace; (b) BIM model with the hook and brace and (c) the reference model with the hook and brace.
Figure 15. The reconstruction process of the actual model in Test 1: (a) taking pictures of the reference model; (b) marking the corner points and (c) generating the reconstruction model.
Figure 16. The offset and rotation in (a) the reference model (RMT1) and (b) the reconstruction model.
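Quantifying the offset and rotation in Figure 16 amounts to a rigid-body alignment of corresponding corner points between the reconstruction and the reference model. The paper does not publish its comparison code; the sketch below is a standard least-squares solution (Kabsch algorithm via SVD) under that framing, not the authors' implementation:

```python
import numpy as np

def rigid_align(src: np.ndarray, dst: np.ndarray):
    """Least-squares rigid transform: find R, t such that dst ≈ src @ R.T + t."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)              # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

def rotation_angle_deg(R: np.ndarray) -> float:
    """Magnitude of the rotation encoded by R, in degrees."""
    return float(np.degrees(np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))))
```

Here t is the translational offset and rotation_angle_deg(R) the rotational deviation of a module, which can then be checked against the assembly tolerance.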
Figure 17. The design model (a) and the actual model with deformation in Test 2 (b).
Figure 18. 3D laser scanning of the modules (a), processing the point cloud data (b), and generating the reference model RMT2 (c).
Figure 19. The reconstruction process of the actual model in Test 2: (a) take pictures with an ordinary phone; (b) generate an undistorted image; (c) obtain the disparity point cloud; (d) create the color point cloud; (e) use coarse extraction to obtain the final reconstruction model.
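Step (b) of Figure 19 removes lens distortion from the raw photo before stereo matching. A common lens model for this is the Brown–Conrady radial polynomial, which maps a normalized image coordinate as x_d = x(1 + k1·r² + k2·r⁴); the coefficients below are illustrative, not the phone's calibration output:

```python
def radial_distort(x: float, y: float, k1: float, k2: float):
    """Apply the radial terms of the Brown-Conrady model to normalized coords."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# with zero coefficients the mapping is the identity;
# undistortion numerically inverts this forward mapping per pixel
xd, yd = radial_distort(0.3, -0.4, 0.1, 0.0)
```

Undistorting first matters because the disparity computation in step (c) assumes straight epipolar lines, which distortion bends.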
Figure 20. The offset and rotation in (a) the reference model (RMT2) and (b) the reconstruction model.
Figure 21. The reconstruction point cloud (a), generating a NURBS surface to fit the point cloud (b), and reconstructing the final surface (c).
Figure 22. Comparison of the reconstruction surfaces.
Table 1. Parameters of the 3D printer.
Property                Value
Layer resolution        0.25 mm nozzle: 150–60 micron
XYZ resolution          6.9, 6.9, 2.5 micron
Build speed             <24 mm³/s
Print technology        Fused filament fabrication (FFF)
Table 2. Parameters of the laser scanner.
Property                Value
Scan rate               350,000 times/s
Scanning area           300 × 250 mm
Resolution              0.100 mm
Measurement accuracy    0.030 mm
Volume accuracy         0.020 + 0.080 mm/m
Table 3. The comparison results of the reconstruction surfaces.
Result                  Reconstruction Error (mm)
Mean Error              +0.49 / −0.62
Standard Deviation      0.75
RMS Estimate            0.77
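The figures in Table 3 are the usual surface-comparison statistics over the signed point-to-surface deviations between the reconstruction and the reference surface. A sketch of how such statistics are computed; the deviation samples below are illustrative, not the test data:

```python
import numpy as np

# Illustrative signed deviations (mm) from reconstruction to reference surface.
dev = np.array([0.5, -0.6, 0.4, -0.7, 0.3])

mean_pos = dev[dev > 0].mean()       # mean of positive (outward) deviations
mean_neg = dev[dev < 0].mean()       # mean of negative (inward) deviations
std = dev.std()                      # standard deviation of the deviations
rms = np.sqrt(np.mean(dev ** 2))     # root-mean-square error
```

Reporting the positive and negative means separately, as Table 3 does, distinguishes a surface that bulges outward from one that sinks inward, which a single signed mean would hide.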
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Liu, J.; Ergan, S.; Zhang, Q. An AprilTags-Based Approach for Progress Monitoring and Quality Control in Modular Construction. Buildings 2024, 14, 2252. https://doi.org/10.3390/buildings14072252


