Article

Documentation for Architectural Heritage: A Historical Building Information Modeling Data Modeling Approach for the Valentino Castle North Wing

Department of Architecture and Design, Polytechnic University of Turin, Viale Pier Andrea Mattioli 39, 10125 Torino, Italy
* Author to whom correspondence should be addressed.
ISPRS Int. J. Geo-Inf. 2025, 14(4), 139; https://doi.org/10.3390/ijgi14040139
Submission received: 27 December 2024 / Revised: 7 March 2025 / Accepted: 18 March 2025 / Published: 25 March 2025

Abstract

Although HBIM (Historical Building Information Modeling) excels in geometric data acquisition and modeling within Scan-to-BIM (Building Information Modeling) workflows, its application in digital documentation faces persistent challenges, such as balancing precision and efficiency, ambiguous information structures, and the absence of standardized protocols. To address these issues, this study refines key steps from the systematic Scan-to-BIM process and proposes a documentation-oriented HBIM workflow. The workflow is designed to tackle data complexity and semantic alignment challenges through detailed strategic planning, standard data collection, efficient geometric modeling, and ontology-based information integration. Validated in the Valentino Castle’s north wing digital archiving project, the proposed framework emphasizes archival management and standardization, reducing reliance on high-precision point cloud data and complex geometric modeling. Instead, it adopts low-precision geometric models as information storage containers, employing standardized information structures to manage and transfer heterogeneous data. Key contributions include the following: (1) establishing a requirements-driven and model-level-based framework for standardized project management; (2) introducing a component alignment concept to harmonize IFC (Industry Foundation Classes) standards and traditional terminologies; and (3) developing a four-level information structure to enhance HBIM parameter and database management. The workflow significantly reduces data acquisition and modeling time while offering a replicable methodology for heritage documentation, promoting cross-disciplinary collaboration and standardization in digital preservation practices.

1. Introduction

Architectural heritage documentation faces numerous challenges due to its interdisciplinary nature, cultural significance, complex information, and vulnerability to degradation. While digital innovations, particularly Heritage Building Information Modeling (HBIM), have revolutionized conservation practices, bottlenecks still persist in documentation workflows. HBIM extends traditional BIM to address heritage-specific needs, including reverse engineering, damage mapping, and lifecycle management [1,2]. By combining advanced surveying technologies with parametric modeling, HBIM enables accurate geometric representation and systematic documentation of structural conditions [2]. Despite progress in Scan-to-BIM workflows, automated modeling techniques, and standardized data formats (e.g., IFC, COBie) [3], several issues limit the full potential of HBIM:
  • Fragmented Methodologies: Current approaches to standardized documentation frameworks vary widely, from five-phase workflows [4] and hierarchical classification systems [5] to four-tier architectures for facility management interoperability [6]. Achieving universal adaptability for different heritage structures remains a challenge.
  • Semantic Interoperability: Although ontology-driven solutions [7,8,9] and relational databases [10] have improved semantic structuring, issues such as inconsistent metadata annotation and cross-platform knowledge integration persist.
  • Modeling Fidelity Trade-offs: Balancing geometric accuracy (Level of Geometry, LOG) with semantic depth (Level of Information, LOI) remains challenging. While methods like stratified modeling [11], adaptive components [12], and path-objects for monitoring [13] offer partial solutions, technical and operational constraints hinder scalable implementation.
These challenges are compounded by insufficient workflow standardization, fragmented collaboration, and mismatched data schemas. This study proposes a documentation-centric HBIM framework emphasizing technical coherence, semantic adaptability, and interdisciplinary operability to address these issues.

2. Research Objective

Building on the challenges outlined above, this study aims to advance HBIM documentation methodologies through four interconnected objectives:
  • Standardizing Workflows: Develop a framework for documentation that integrates phased requirements analysis, standardized data collection protocols, and archival management strategies, addressing inconsistencies in information acquisition and storage.
  • Adaptive Geometric–Semantic Modeling: Propose efficient 3D modeling methods for scenarios with limited geometric precision (e.g., without accurate point cloud data). Implement a dual-layer structure to balance geometric accuracy (LOG) and semantic depth (LOI).
  • Enhancing Cross-Platform Interoperability: Design a model-based semantic management system compatible with various software environments, including BIM platforms, databases, and web interfaces. This system will leverage ontology-driven approaches [7,8,9] and structured query mechanisms [10] to improve data integration.
  • Developing a Collaborative Data Ecosystem: Evaluate software-independent data integration methods, such as spreadsheets, APIs, and cloud platforms, to facilitate cross-disciplinary collaboration. Modular metadata schemas and stakeholder engagement protocols will ensure effective information sharing.

3. Materials and Methodology

3.1. Establishment of Workflow

Scan-to-BIM has become a well-established workflow in HBIM, characterized by distinct phases such as the creation of a Common Data Environment (CDE), measurement surveys, point cloud processing, geometric modeling, semantic enrichment, and conservation measures [4]. This workflow is particularly suited for conservation and restoration projects.
This study synthesizes multiple Scan-to-BIM workflows and HBIM guidelines to develop a comprehensive Scan-to-BIM paradigm [4,10,14]. From this paradigm, essential operations are extracted to construct an efficient documentation workflow. The primary goals are to enhance reliability and adaptability by adopting a validated framework, and to improve HBIM management efficiency by simplifying the workflow, reducing reliance on precise measurements and complex geometric modeling. The proposed systematic paradigm also provides a reference for HBIM documentation, aiding project comparison, summarization, and metadata generation for future work. The workflow is divided into six stages (Figure 1):
  • Stage 1. Ontological Knowledge Structure: In this stage, strategic planning defines information requirements and workflow standards. It involves gathering data on project objectives, tasks, and data formats, while establishing protocols for interdisciplinary teams. Additionally, it proposes an integrated information management strategy based on HBIM frameworks and cultural heritage ontologies (e.g., CIDOC or IFC) to standardize geometric modeling and interoperability.
  • Stage 2. Preliminary Information Collection: This stage involves gathering geometric and semantic information on architectural heritage, such as historical drawings, point cloud scans, and asset management records. Establishing standardized protocols for information collection ensures clarity in methods, metadata, and classification systems.
  • Stage 3. Geometric Processing: High-precision surveys are not always needed for documentation-focused HBIM projects, but existing point clouds and drawings remain valuable. Key steps include filtering, cleaning, registration, and fusion of point clouds, along with scanning and digitizing historical documents. Proper metadata documentation is critical to ensure future users can assess the reliability of geometric models.
  • Stage 4. Geometric Modeling: The primary goal is not high-precision modeling but ensuring alignment between semantic information and 3D objects. This stage adopts simplified 3D modeling strategies, enabling appropriate storage of semantic data while maintaining model flexibility.
  • Stage 5. Semantic Information Enrichment: In the documentation project, data are categorized into architectural, heritage, asset, and project information. This stage focuses on effectively managing these interconnected data categories within the HBIM platform.
  • Stage 6. Interoperability: Given the involvement of various stakeholders, predefined standards for information exchange are necessary. These may include spreadsheet models, external databases, or web-based platforms.

3.2. Requirements and Workflow Planning

3.2.1. Information Requirement Analysis

This workflow aims to engage various stakeholders, such as construction managers and researchers, through Information Requirements Management [15]. Information needs are categorized into Organizational Information Requirements (OIR), Asset Information Requirements (AIR), and Project Information Requirements (PIR).
OIR defines the information necessary for strategic goals, policy decisions, compliance, asset management, and business operations. AIR establishes the foundation for asset information production, including standards and methods. PIR specifies the information needed to meet the client’s goals throughout the project lifecycle, including HBIM-related rules and content.
The project utilizes OIR, PIR, and AIR templates (XLSX) from the 2021 Construction Industry Council guidelines, ensuring that team members align with the client’s information requirements and maintain consistency in data delivery and verification. These templates can be modified to support project updates.

3.2.2. BIM Execution Plan Development

After obtaining the requirements, particularly the completion of PIR, this study developed a comprehensive BIM Execution Plan (BEP) encompassing the following:
  • Schedule;
  • Roles and responsibilities;
  • Model usage;
  • BIM tools;
  • Simplified modeling specifications;
  • Data formats and exchange methods in CDE;
  • Naming conventions;
  • Coordination among model alliances;
  • Decomposition structure and information requirement levels for each model element.
Additionally, a shared data environment was set up using the Dalux BOX platform and Google Drive, enabling data and model sharing, team collaboration, and improved interoperability.

3.3. Data Collection and Geometric Preprocessing

In a documentation-oriented HBIM project that does not rely on a high level of geometry, preliminary information collection poses a significant challenge. A structured framework was developed to enhance efficiency and improve team collaboration, dividing the information collection and analysis process into three distinct steps:
  • Step 1: Information classified by sources [16]
    • Transfer of information and data from existing projects, archives, or research.
    • Recognition or relabeling of existing data and information stores, such as GIS and databases.
    • Collection of new or updated information and data from surveys and new research.
  • Step 2: Information classified by categories [17]
    • Archeological and Historical Data: focused on archeological investigations, historical context analysis, and the study of the building’s morphology and functional evolution over time.
    • Geometry: focused on recording, surveying, and visualizing the exact shape and characteristics of the building’s fabric in its current state.
    • Pathology: aimed at identifying and surveying potential damage or decay of the historic building’s fabric over time, whether material decay or structural deterioration.
    • Performance Data: focused on understanding and analyzing the current operability and performance of the building in various aspects, such as energy performance, thermal performance, system performance, and safety and security performance.
  • Step 3: Information classified by storage [18]
    • Documentation: the original archives and materials.
    • Alphanumerical Information: information on various parameters.
    • Geometrical Information: geometric information from various modeling software or CAD.
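The three classification axes above (source, category, and storage) can be combined into a single record schema for each collected item. A minimal Python sketch follows; the class name, field names, and example values are illustrative, not the project's actual schema:

```python
from dataclasses import dataclass

@dataclass
class CollectedItem:
    """One piece of collected heritage information, tagged along the
    three classification axes (all names here are illustrative)."""
    identifier: str
    source: str      # "existing_project" | "relabelled_store" | "new_survey"
    category: str    # "historical" | "geometry" | "pathology" | "performance"
    storage: str     # "documentation" | "alphanumerical" | "geometrical"
    notes: str = ""

item = CollectedItem(
    identifier="VC-N-0001",
    source="existing_project",
    category="geometry",
    storage="geometrical",
    notes="Point cloud from a previous survey campaign",
)
```

Tagging every item on all three axes at intake is what later allows filtering by any axis independently (e.g., all "pathology" items regardless of where they came from or how they are stored).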
Geometric preprocessing follows the standardized Scan-to-BIM workflow [19], focusing on point cloud registration, cleaning, classification, and segmentation, using tools like CloudCompare (version 2.13.alpha.230227) and Autodesk ReCap 2024.

3.4. Identification, Classification, and Grading of HBIM Objects

While advanced point cloud acquisition and high-precision modeling techniques are effective, they may not always be necessary. Overly detailed models can waste resources, so the focus in HBIM documentation is on establishing a robust framework, consistent modeling strategies, and precise information management. Lower geometric precision is acceptable for projects emphasizing semantic information management, as long as it follows defined standards.
In BIM, classification systems organize building information by grouping objects with similar characteristics or behaviors, creating a standardized coding system that facilitates information retrieval by people, software, and machines. Common systems include OmniClass and UniClass. However, classification methods for historical buildings differ due to variations in materials, construction techniques, functions, and decorative elements. Many heritage components feature complex internal structures, presenting two key challenges:
  • Classification Mismatch: modern BIM classification systems are poorly aligned with heritage building components.
  • Granularity Issues: defining the smallest unit for modeling heritage elements is difficult.
To address these challenges and align classification systems with traditional architectural terminology, this study proposes the following methodology (Figure 2):
  • Standardization of Historical Component Terminology: Using Hopkins [20] for standardized building component names.
  • Semantic Classification: Introducing De Luca et al.’s [21] three semantic levels: “Finalized Group”, “Morphological Entities”, and “Reference Marks”.
  • BIM Classification Alignment: Mapping the sixth level of the BIM classification system (based on BIM FORUM LOD Specification 2023) to the “Finalized Group”. Complex structural details are categorized as “Morphological Entities” and “Reference Marks”.
  • Model Simplification Norms: Developing simplification guidelines based on the classification system and creating a Valentino Castle classification database in CSV format for updates and queries.
  • Model Level for Geometric Simplification: Introducing a model level to describe geometric simplification. Italian regulations define the Level of Development (LOD) as a combination of the Level of Geometry (LOG) and Level of Information (LOI), ranging from A to G. For historical buildings, LOD F (as-is) and LOD G (maintenance phase) are common [22]. This study also adapts the Level of Detail concept into a Level of Knowledge (LOK) to distinguish geometric from informational granularity, referencing UNI 10838:1999, UNI 8290:1981, and UNI 11337:2017.
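A CSV classification database of the kind described above can be queried with standard tooling. The sketch below uses Python's csv module on a toy excerpt; the rows, column names, and level assignments are invented for illustration and are not the actual Valentino Castle database:

```python
import csv
import io

# Toy excerpt of a component classification table (contents illustrative).
CSV_DATA = """term,semantic_level,model_level
vault,Finalized Group,LOD F
rib,Morphological Entities,LOD F
keystone,Reference Marks,LOD G
"""

# Parse the CSV and index it by component term for quick lookup.
rows = list(csv.DictReader(io.StringIO(CSV_DATA)))
by_term = {row["term"]: row for row in rows}
```

Keeping the database as plain CSV, as the methodology proposes, means it stays readable by people, spreadsheets, and scripts alike, and can be versioned and updated without specialized software.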

3.5. Semantic Information Management

3.5.1. Information Structure Design

Based on the work of Bruno, N., and Roncella, R. [5], this study proposes a dual-track information storage method and designs a semantic classification system compatible with HBIM and external database structures.
This system connects 3D geometric models with external relational databases to interpret multidisciplinary data. Each HBIM object is assigned a unique identification code (ID) to ensure the connection between the 3D model and the database. The system organizes information related to architectural heritage using the performance requirements methods from UNI 10838:1999 and UNI 8290:1981.
The system is divided into four levels: Building, Zone, Architectural Component A, and Architectural Component B:
  • Level 1. Building: This level encompasses the entire building, handling information shared with the whole structure and not specific to any individual building element. It includes global project information (OIR, AIR, PIR, etc.), original documents and archives, and information that is difficult to classify, such as performance indicators;
  • Level 2. Zone: This level describes and manages rooms or spaces with specific functions or characteristics, adding a dimension for retrieving and managing any building element or information. It includes building elements, facilities, equipment, and assets within the rooms or spaces;
  • Level 3. Architectural Component A: This level describes building components and elements, aiming for simplified geometric modeling;
  • Level 4. Architectural Component B: This level involves more detailed semantic information of components, such as damage information, historical information, and finer geometric details not captured in the geometric representation.
This classification aligns the geometric and semantic information in HBIM with external database structures. Each building element has a graphical representation in the 3D model and a corresponding description in the database. Each level contains three main types of data: survey metadata, modeling metadata, and descriptive data. All operations and steps must be meticulously recorded and described to ensure users understand the quality, methods, instruments, accuracy, and validation of the modeling or documentation, meeting the needs of various disciplines, including pre-conservation, conservation–restoration, and monitoring information.
This strategy defines these four levels as BIM entities in the database to link the BIM 3D model with database information [5]. These four-level entities form a primary relational table, which can be connected to independent tables that store other detailed information grouped by information type. This data structure ensures database flexibility and creates independent tables for new information in any field, facilitating linkage and integration with the existing database. The links between the primary and subsidiary tables can be established in various ways.
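One way to realize this primary/subsidiary table design is sketched below with an in-memory SQLite database: a primary table holding one row per BIM entity at any of the four levels, plus an independent damage table linked back by the entity ID. Table and column names are invented for illustration and do not come from the project:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Primary relational table: one row per BIM entity at any of the four levels.
cur.execute("""
CREATE TABLE bim_entity (
    id TEXT PRIMARY KEY,           -- unique HBIM object ID
    level INTEGER NOT NULL,        -- 1=Building, 2=Zone, 3=Component A, 4=Component B
    name TEXT NOT NULL,
    parent_id TEXT REFERENCES bim_entity(id)
)""")

# Independent subsidiary table, grouped by information type and linked
# back to the primary table via the entity ID.
cur.execute("""
CREATE TABLE damage_record (
    entity_id TEXT REFERENCES bim_entity(id),
    deterioration_category TEXT,
    deterioration_type TEXT,
    urgency_grade TEXT
)""")

cur.execute("INSERT INTO bim_entity VALUES ('B1', 1, 'Valentino Castle', NULL)")
cur.execute("INSERT INTO bim_entity VALUES ('W1', 3, 'North wing wall', 'B1')")
cur.execute("INSERT INTO damage_record VALUES ('W1', 'Crack', 'Hairline crack', 'low')")

# Join the primary and subsidiary tables through the shared ID.
row = cur.execute("""
SELECT e.name, d.deterioration_category
FROM bim_entity e JOIN damage_record d ON d.entity_id = e.id
""").fetchone()
```

Because new information types only add subsidiary tables keyed on the same entity ID, the primary table (and the 3D model linked to it) never needs restructuring.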

3.5.2. Ontology Design

Simultaneously, this study draws on the work of Previtali et al. [7] to design a universal “foundation” ontology system for cultural heritage information, aligning it with four levels (Figure 3). The “foundation” ontology can be expanded across domains to develop specific ontologies. This system establishes relationships among building entities, spaces, events, assets, and other cultural heritage information, enhancing the interoperability of HBIM. The “foundation” ontology employs seven elements to provide conceptual knowledge of buildings, serving as a basis for developing domain-specific ontologies. The developed ontology suggests concepts that can be easily integrated into ifcOWL.
This “foundation” ontology maintains consistency with the existing cultural heritage ontology, CIDOC-CRM. Its concepts for buildings, rooms, spaces, and architectural elements (A and B) correspond broadly, though not exactly, to the four levels of the HBIM information system. The diagram illustrates the main concepts and relationships of the “foundation” ontology implemented in RDF/OWL. Each domain creates its own domain ontology within this framework by adding domain-specific attributes and relationships to the “foundation” elements. The software platform used is Protégé, an ontology editing and knowledge management system developed by Stanford University; it can construct large-scale visual ontology models in various representation formats, facilitating an understanding of the connections between components and concepts.
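To make the RDF/OWL idea concrete without reproducing the authors' actual ontology, the sketch below models a few foundation-style concepts and relations as plain (subject, predicate, object) triples. All class and property names here are invented stand-ins, not the published “foundation” vocabulary:

```python
# RDF-style triples: (subject, predicate, object). Names are illustrative.
FOUNDATION = {
    ("Building", "rdf:type", "owl:Class"),
    ("Space", "rdf:type", "owl:Class"),
    ("ArchitecturalElement", "rdf:type", "owl:Class"),
    ("Event", "rdf:type", "owl:Class"),
    ("Asset", "rdf:type", "owl:Class"),
    # Hypothetical relations between foundation concepts.
    ("Space", "foundation:isPartOf", "Building"),
    ("ArchitecturalElement", "foundation:isLocatedIn", "Space"),
    ("Event", "foundation:affects", "ArchitecturalElement"),
}

def objects(subject, predicate, triples=FOUNDATION):
    """Return all objects matching a (subject, predicate) pair,
    mimicking a basic triple-pattern query."""
    return {o for s, p, o in triples if s == subject and p == predicate}
```

A domain ontology would extend this core by adding its own classes and predicates while reusing the foundation terms, which is what keeps the domain ontologies interoperable with each other.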

3.6. Case: North Wing of the Valentino Castle

After its design, the workflow was implemented in the north wing of the Valentino Castle. The Valentino Castle, now the campus of the School of Architecture and Design at the Politecnico di Torino, is a renowned cultural heritage site (Figure 4).
Politecnico di Torino has divided the Valentino Castle campus into four areas. Area A includes the central part of the castle, such as the Hall of Honor, two main towers, two front towers (North and South), and the 19th-century wing. Area B consists of the Chevalley wing along the Po River. Area C, the Aloisio block built in 1890, serves as the university’s public service area. Area D is the new wing on the south side (Figure 5).
Over the past two decades, a comprehensive digital framework for the Valentino Castle has been developed, incorporating Metric Survey, Scan-to-BIM, and GIS. Starting with the “Progetto Castello” project in the 1980s, efforts have included digitizing historical data, conducting surveys, and creating the first web platform for castle management [23]. Key contributions include Costamagna and Spanò’s work on GIS standards [24], Raineri’s early HBIM concepts [25], and Chiabrando et al.’s methods for integrating historical documents into BIM [26]. Since 2018, there has been increased focus on using GIS and BIM for long-term documentation and restoration, with new technologies like MMS and photogrammetry enhancing survey methods [27,28,29].
Clearly, HBIM-related studies on the Valentino Castle show both relevance and continuity. Most previous research has focused on geometric acquisition and modeling, accumulating extensive data such as point clouds, drawings, and documents. These complex archives form a rich information foundation, making them an excellent case for HBIM documentation.

3.6.1. Data Collection and Management

During the project implementation phase, the research team conducted an initial needs assessment for participants and stakeholders. Based on their roles and specific requirements, the team pre-developed digital and paper-based requirement forms and distributed them to relevant personnel. Upon collecting the completed forms, the team organized and analyzed the data. To address additional geometric data needs, the team conducted a three-day on-site survey, assessing the current condition of the heritage site and the castle’s north wing. The project systematically classified heterogeneous information, focusing on datasets containing multi-type raw data. Through data filtering, classification, and re-annotation, the team ensured accurate information retrieval and effective semantic management. To address challenges in reading, collecting, and organizing historical documents and complex heterogeneous information, the team integrated manual retrieval with AI tools. For digital files (e.g., PDF and TXT formats), AI tools like ChatGPT-4.0 and Copilot were used to extract key information and link it to the Revit model, utilizing methods such as pre-training, contextual understanding, semantic analysis, and response generation.

3.6.2. Geometric Processing and Modeling

During the information collection phase, the project leveraged existing point clouds and other geometric data sources to boost efficiency and cut costs. To overcome inherent issues with these datasets, the team implemented several key measures: all original point cloud files were cataloged and recorded with detailed metadata and annotations regarding their state, operations, and usage; CloudCompare software (version 2.12) was employed for the registration, fusion, and cleaning of the point clouds. Owing to the volume of data, the team categorized the files by acquisition projects, processed them in batches, and managed various coordinate systems using Autodesk ReCap before aligning the multiple point cloud files in Revit. In cases where point clouds lacked specific geometric details, such as the roof truss structure of the castle’s north wing or details in certain rooms, the team supplemented the data using existing drawings and manual surveys. This additional geometric information was then directly modeled during the geometric modeling phase and recorded in the corresponding metadata.
This study developed a multi-tiered modeling strategy for HBIM by first defining simplification strategies and then categorizing geometric objects into four types: structural elements, architectural elements, damages and decay, and zones and rooms (Figure 6).
For structural components, the Faro As-Built plugin was employed to automatically generate floors, walls, columns, and vaults from point cloud data using algorithms (including RANSAC) to detect horizontal planes and extract geometric features, while omitting non-critical details such as bricks and mortar. In cases where key elements (e.g., roof timber truss systems or stairs) were absent from the point clouds, in-place components were used to represent these systems, with their positions and semantic attributes duly recorded.
For architectural elements, a combination of in-place components and loadable families was used—columns were modeled retaining essential features, while doors, windows, and decorations were generated semi-automatically using X-ray orthographic projections and subsequent extrusion of basic shapes. For components like the vaults in the north wing of the castle, this study tested four methods for surface modeling, allowing the selection of appropriate strategies based on different needs (Figure 7).
This study considered two primary types of deterioration—localized damage (e.g., cracks, spalling, efflorescence) and global structural damage (e.g., wall tilting, roof truss deformation, collapse) (Figure 8). Given the project’s focus on documentation rather than intervention, a comprehensive survey was not undertaken; instead, existing damage records from previous projects were reviewed and supplemented with brief visual inspections to verify significant new deterioration. For damage modeling, a simplified, patch-based approach was adopted based on the HBIM method proposed by Barontini et al. [30]. In this method, the damage features of components and materials are abstracted into independent, regularly shaped “patches” whose dimensions are determined from orthographic projections of point cloud data, with manual additions made for damage not captured in the point cloud. Predefined cubic patches with a fixed thickness of 0.01 m were used to visualize damage, and these patches were categorized as independent landmark components to differentiate them from architectural elements and to enable export in IFC format. Additionally, historical damage reports were organized and, after consulting glossaries from ICOMOS-ISCS, the Syrian Heritage Archive, and EwaGlos [31,32,33,34], a damage classification table was developed categorizing damage into three hierarchical levels (Deterioration Category, Deterioration Type, and Deterioration Phenomenon) with seven distinct categories.
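The three-level damage hierarchy and the fixed-thickness patch records described above can be sketched as simple Python structures. The hierarchy entries, class fields, and example values below are illustrative placeholders, not the project's actual classification table:

```python
from dataclasses import dataclass

PATCH_THICKNESS_M = 0.01  # fixed thickness of the cubic damage patches, in meters

# Three-level hierarchy: Deterioration Category -> Type -> Phenomenon.
# Entries are invented examples, not the full seven-category table.
DAMAGE_HIERARCHY = {
    "Crack & Deformation": {
        "Crack": ["hairline crack", "star crack"],
        "Deformation": ["wall tilting", "truss deformation"],
    },
    "Detachment": {
        "Spalling": ["flaking", "delamination"],
    },
}

@dataclass
class DamagePatch:
    """A regularly shaped 'patch' abstracting one damage feature."""
    entity_id: str       # ID of the affected HBIM component
    category: str
    dtype: str
    phenomenon: str
    width_m: float       # measured on the orthographic projection
    height_m: float
    thickness_m: float = PATCH_THICKNESS_M

patch = DamagePatch("W1", "Crack & Deformation", "Crack", "hairline crack",
                    width_m=0.4, height_m=0.02)
```

Fixing the patch thickness and keeping patches as independent landmark components, as the text describes, is what lets damage be filtered and exported (e.g., to IFC) separately from the architectural elements it annotates.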
In this study, zones were categorized into two distinct types using the “COBie zones” module of Revit interoperability tools (Figure 9): current functional zones and historical partitions. The current functional zones included fire zones, circulation zones, office areas, and classrooms, which were designed to aid in asset management, facility management, and daily operational activities. Historical partitions, on the other hand, divided the entire north wing into zones such as terraces, the north tower, exhibitions, and the chapel, managing building elements within the semantic context of the historical building. The modeling of the center and south wings of the Valentino Castle was also performed, where solids were extruded from built volumes, maintaining a low level of detail appropriate for establishing building locations and related data, without intricate architectural detailing.
Model accuracy and deviation analyses were conducted to evaluate the precision of the parametric models and identify any issues caused by geometric simplification. The As-Built plugin’s offset analysis was utilized to quantify the deviation between the point cloud data and the modeled objects. Two types of surface analyses were carried out: a global offset analysis (Figure 10), which covered the walls, floors, and columns of the entire north wing with a calculation range of 50 millimeters; and a local offset analysis for key components and decorations (e.g., arches, doors, windows) with a finer calculation range of 10 to 50 millimeters. To visualize these deviations, a pseudo-color scalar field was overlaid on the model, where red indicated positive deviation (overestimation) and blue indicated negative deviation (underestimation). This methodology helped detect modeling tolerances and artifact deformations. The global offset analysis focused on larger structural components, while the local analysis examined finer details, allowing for a more comprehensive evaluation of the model’s overall accuracy.
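The underlying offset computation can be illustrated independently of the plugin. The plain-Python sketch below computes the signed distance from scanned points to a modeled planar surface and maps its sign to the red/blue scheme described above; it is a simplified illustration with an invented wall plane, not the As-Built plugin's algorithm:

```python
def signed_offset(point, plane_point, plane_normal):
    """Signed distance from a scanned point to a modeled plane
    (plane_normal must be a unit vector). Positive values lie in
    front of the surface, negative values behind it."""
    return sum((p - q) * n for p, q, n in zip(point, plane_point, plane_normal))

def classify(offset_m, calc_range_m=0.050):
    """Map an offset to the pseudo-color scheme within the calculation range."""
    if abs(offset_m) > calc_range_m:
        return "out of range"
    return "red (+)" if offset_m >= 0 else "blue (-)"

# A wall modeled as the plane x = 0, with its normal along +x (invented data).
scanned_points = [(0.012, 1.0, 2.0), (-0.030, 0.5, 1.8), (0.090, 2.0, 0.4)]
offsets = [signed_offset(p, (0, 0, 0), (1, 0, 0)) for p in scanned_points]
labels = [classify(o) for o in offsets]
```

Points beyond the calculation range are excluded from the scalar field, which is why the global (50 mm) and local (10 to 50 mm) analyses surface different classes of modeling deviation.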

3.6.3. Semantic Information Enrichment

This study employs a structured approach to enriching the semantic information associated with the Valentino Castle North Wing project, categorizing data into multiple levels to facilitate effective management and interoperability.
At Level 1, Building, project information is managed using external spreadsheets, which are linked to Revit schedules/quantities tables via the Table Gen tool within the DiRootsOne plugin. This tool enables automatic synchronization of data between Excel and Revit, ensuring that any updates in the spreadsheets are reflected in Revit each time the project is opened. Documents related to the Valentino Castle, such as PDFs, images, and videos, are converted into digital formats and stored at this level. Specific information from these documents is extracted and refined into semantic or image data, which can then be attached to component parameters. However, Revit’s limited file linking capabilities—restricted to formats like Revit/DWG/point cloud/PDF/image—make it difficult to manage annotations directly within these documents.
At Level 2, Zone, each room or space is assigned an independent information sheet, organized into four sections: Basic Information (e.g., area, height), Room Component Information (e.g., walls, doors, windows), Asset Information (e.g., furniture and facilities), and User Information (e.g., operations and interventions specific to the room). Redundant information, such as multiple records for internal walls in different rooms, is avoided by designating a single external room location for each wall, while other room-specific information is managed via room description parameters. This sheet-based system connects data to BIM parameters using plugins or Dynamo for greater flexibility and management.
At Level 3, Architectural Component A, architectural components are defined and categorized. Level 3 focuses on the basic information, simplified modeling descriptions, and metadata related to components. Despite lower modeling precision, the metadata indicate the model’s purpose and its intended level of detail. Information management at this level relies on Revit’s schedules/quantity takeoffs, which are based on COBie standards to ensure interoperability. For finer control, shared parameters (such as LOK and LOG) are added. This study also tested two methods for diachronic analysis: using Revit’s built-in phase functionality (which is cumbersome) and using BIM One Inc.’s ColorSplasher plugin for custom filtering and color schemes, though the plugin has since been modified and made available as an open-source project.
Level 4, Architectural Component B, stores more detailed damage and deterioration information for architectural components, extending the Level 3 data or being managed independently depending on the component type. A damage classification system based on several key references (e.g., ICOMOS [31], EwaGlos Classification [32], Fitzner, B., & Heinrichs, K. (2001) [34]) is used to assess the severity and urgency of damage. The damage grading system classifies damage by its extent (condition grade) and treatment priority (urgency grade). A damage information table is established for each damaged object, categorized into five sections: Basic and Classification Data, Inspection Data (damage changes over time), Geometric Data (area, quantity, extent), Symptoms and Diagnosis (cause, symptoms, grades), and Intervention Analysis (recommendations for treatment).

3.6.4. Interoperability and Data Transfer

This section evaluates three mainstream approaches for information transfer in HBIM: spreadsheets, external databases, and web-based BIM solutions (Figure 11). The spreadsheet tools utilized are Microsoft Excel 2024 and WPS Office (Version 12.1.0.19770). For database management, SQL Server 2022 is employed, with the DB Link and Database Manager plugins facilitating the connection between the SQL database and Revit 2024. Web-based BIM platforms, including both open-source and commercial solutions, were tested by importing Revit files to assess their performance and accuracy in information transfer.
Spreadsheets are widely used for information management in BIM projects, particularly for organizing quantities and schedules. Their familiarity and flexibility make them suitable for HBIM information transfer. This study explored three methods to connect spreadsheets with Revit parameters in the project.
  • Method 1: Dynamo Plugin for Visual Programming Language (VPL)
    • Component Filtering: Extract “ifcGUID” parameters from Revit project objects and filter the corresponding entries in Excel sheets.
    • Data Matching: Match and assign values based on ifcGUIDs and parameter names in the spreadsheet.
    • Simultaneous Linking: Connect Revit data and spreadsheet tables concurrently.
  • Issues: This strategy requires advanced VPL skills and custom scripts tailored to the spreadsheet design.
  • Method 2: Microsoft Excel or WPS Office Integration
    • Data Export: Use the DiRootsOne plugin to export Revit schedule/quantity tables as primary tables.
    • Component Queries: Create sub-tables for individual components and input additional parameters.
    • Data Linking: Apply the VLOOKUP function to integrate sub-table information with the primary table.
  • Issues:
    • Manual formula input limits scalability for extensive datasets.
    • Formula-generated values in the primary table are not recognized by Revit. Therefore, formula results must be converted to static values for re-import.
  • Method 3: WPS Office “Data/Get Data/Cross-Table Connection” Function
    • Data Import: Import individual workbook files from the same folder into a consolidated table.
    • Data Alignment: Use the “Existing Connections” feature to align information to specific cells.
    • Automation with VBA: Employ VBA scripts to automate data integration for large datasets.
  • Advantages: Real-time data synchronization eliminates redundant input and output operations.
  • Potential Enhancements: Further performance evaluation is required, with specific metrics on synchronization speed and user effort.
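The core linking logic shared by Methods 1 and 2, matching spreadsheet rows to model elements by ifcGUID and writing the looked-up values as static values (since Revit’s re-import ignores live formulas), can be sketched in plain Python. The dicts below stand in for Revit elements and Excel rows; the GUIDs, parameter names, and values are illustrative.

```python
# Plain-Python sketch of the ifcGUID-based matching step. In the real
# workflow the element dicts would be Revit elements reached via Dynamo
# or a plugin; here they are ordinary dicts so the logic runs standalone.

sheet_rows = [  # as read from the spreadsheet (one dict per row)
    {"ifcGUID": "0a1B...", "Condition Grade": "3", "Material": "brick"},
    {"ifcGUID": "2c3D...", "Condition Grade": "1", "Material": "stone"},
]
elements = {  # ifcGUID -> parameter dict (stand-in for a Revit element)
    "0a1B...": {}, "2c3D...": {}, "4e5F...": {},
}

def match_and_assign(rows, elems, params):
    """Copy the listed parameters from each spreadsheet row onto the
    element whose ifcGUID matches; returns how many were updated."""
    updated = 0
    for row in rows:
        elem = elems.get(row["ifcGUID"])
        if elem is None:       # GUID present in the sheet but not the model
            continue
        for p in params:
            elem[p] = row[p]   # a static value, not a live formula
        updated += 1
    return updated

n = match_and_assign(sheet_rows, elements, ["Condition Grade", "Material"])
```

Keying every transfer on the ifcGUID is what keeps the spreadsheet, the database, and the model rows referring to the same physical component.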
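Method 3’s cross-table consolidation (the step the VBA scripts automate) can likewise be sketched with standard-library CSV handling. This is only a stand-in for WPS’s “Data/Get Data/Cross-Table Connection” feature: the file layout and column names are illustrative assumptions.

```python
import csv
import glob
import os
import tempfile

def consolidate(folder):
    """Merge all per-component CSV files in one folder into a single
    table keyed by ifcGUID; later files extend earlier rows per GUID."""
    merged = {}
    for path in sorted(glob.glob(os.path.join(folder, "*.csv"))):
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                merged.setdefault(row["ifcGUID"], {}).update(row)
    return merged

# Demo with two illustrative workbook files in a temporary folder.
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "a_walls.csv"), "w", newline="") as f:
    f.write("ifcGUID,Material\n0a1B...,brick\n")
with open(os.path.join(tmp, "b_damage.csv"), "w", newline="") as f:
    f.write("ifcGUID,Condition\n0a1B...,cracked\n")
table = consolidate(tmp)
```

The same folder-scan-and-merge pattern is what a VBA macro would implement inside WPS for large datasets.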
To enhance information management, an external database strategy was adopted using SQL database management tools.
  • Database Structure: The relational database was organized into multiple tables connected through primary and foreign keys, with data models illustrating interrelationships.
  • Data Integration: Revit data were linked to the external database using DB LINK and CodeMill Manager.
  • Model Object Linkage: Unique object IDs ensured consistent connections between database entries and Revit model elements, enabling flexible data access and editing.
  • Performance Evaluation: The system demonstrated adaptability in interfacing with other RDBMS solutions and commercial/open-source BIM software, providing robust data management capabilities.
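The key-linked relational structure can be sketched with SQLite standing in for SQL Server. Table and column names here are illustrative, not the project schema; the point is the primary/foreign-key chain and the ifcGUID column that plays the role of the unique object ID tying database rows back to Revit model elements.

```python
import sqlite3

# Illustrative three-table schema: zones, components, and damage records,
# linked by primary/foreign keys. The ifc_guid column is the unique object
# ID that connects database rows to model elements.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE zone (
    zone_id INTEGER PRIMARY KEY,
    name    TEXT
);
CREATE TABLE component (
    ifc_guid TEXT PRIMARY KEY,               -- links the row to a model element
    zone_id  INTEGER REFERENCES zone(zone_id),
    category TEXT
);
CREATE TABLE damage (
    damage_id INTEGER PRIMARY KEY,
    ifc_guid  TEXT REFERENCES component(ifc_guid),
    condition_grade INTEGER,
    urgency_grade   INTEGER
);
""")
con.execute("INSERT INTO zone VALUES (1, 'Room 2.03')")
con.execute("INSERT INTO component VALUES ('0a1B...', 1, 'Wall')")
con.execute("INSERT INTO damage VALUES (1, '0a1B...', 3, 2)")

# Typical query: all damage grades in a given room, traversed via the keys.
rows = con.execute("""
    SELECT d.condition_grade, d.urgency_grade
    FROM damage d
    JOIN component c ON c.ifc_guid = d.ifc_guid
    JOIN zone z ON z.zone_id = c.zone_id
    WHERE z.name = 'Room 2.03'
""").fetchall()
```

Because the linkage lives entirely in the keys, the same schema can be hosted by any RDBMS that the chosen Revit plugin can reach.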
The potential applications of various web-BIM platforms were evaluated for the documentation project of the north wing of the Valentino Castle. The focus was on adaptability, information management efficiency, and information exchange capabilities.
  • DALUX: Provides a Revit plugin for direct RVT format uploads to the cloud, preserving views, parameters, and level mappings. It supports real-time comment updates between the web-BIM and local Revit project.
  • BIMData: Notable for its development tools (SDK, system design, and API), enabling custom function development. However, it requires advanced programming skills.
  • Performance Metrics: Comprehensive evaluation requires data on upload times, synchronization speed, and user feedback.

4. Results

4.1. Data Collection and Management

The three-day on-site survey revealed changes since the last conservation intervention, including new restoration projects and damage conditions, especially in the north wing of the castle. The team successfully categorized the heterogeneous datasets, shifting towards effective semantic management through classification and re-annotation. A significant challenge was the low efficiency of organizing and retrieving information from the vast body of historical documents and raw data. AI tools demonstrated their potential in extracting content and connecting it with the Revit model, which improved information organization. This study assessed the performance of AI tools such as ChatGPT and Copilot, specifically their ability to pre-train, comprehend contextual data, and perform semantic analysis to enable more efficient data integration.

4.2. Geometric Processing and Modeling

Utilizing existing point clouds significantly increased efficiency and reduced both time and costs compared to planning new data acquisitions. However, the initial datasets presented several challenges: the original files were numerous and varied in instrument and accuracy, with some lacking geographic references; many had not undergone necessary registration, cleaning, and fusion, resulting in bulky data; and inspections revealed missing information in critical areas—such as the roof truss structure of the castle’s north wing and certain rooms, along with portions of the north facade being obscured by vegetation. The measures implemented, as described above (Figure 12), effectively addressed these issues, ensuring more reliable and streamlined geometric data integration for subsequent modeling.
The application of the modeling strategy resulted in an efficient and coherent geometric representation of the heritage structure. Automatic extraction of structural elements and the use of in-place components successfully compensated for incomplete point cloud data, ensuring that key architectural features were clearly represented. The semi-automatic generation of doors, windows, and decorative elements produced recognizable volumes that could be accurately identified in external databases.
When applied, the simplified patch-based approach effectively captured surface-level damage across the heritage structure. For modeling global or partial structural damage—such as wall tilting, roof truss deformation, and collapse—yellow patches with appropriately enlarged dimensions were utilized to ensure clear visibility within the HBIM environment. These patches were supplemented with multi-category labels for annotation and dynamically linked to primary objects, allowing automatic updates when the parameter values changed. This approach provided an efficient overview of structural damage, facilitating quick identification and retrieval by operators, though it was noted that the method does not support detailed structural analysis.
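The dynamic link between a patch and its primary object can be sketched as follows. This is a plain-Python stand-in under stated assumptions: class and parameter names are illustrative, and the “automatic update” is modeled as lookup-on-read through a stored reference rather than Revit’s actual parameter mechanism.

```python
class PrimaryObject:
    """Stands in for a model element: a GUID plus a mutable parameter dict."""
    def __init__(self, guid, params):
        self.guid = guid
        self.params = params

class Patch:
    """A damage patch holding a reference to its primary object plus
    multi-category annotation labels."""
    def __init__(self, primary, labels):
        self.primary = primary   # dynamic link, not a copied snapshot
        self.labels = labels

    def describe(self):
        # Reading through the link always reflects the host's current state,
        # so the patch "updates automatically" when parameters change.
        return {"host": self.primary.guid,
                "phase": self.primary.params["phase"],
                "labels": self.labels}

wall = PrimaryObject("0a1B...", {"phase": "17th c."})
patch = Patch(wall, ["structural", "wall tilting"])
wall.params["phase"] = "18th c."   # a change on the primary object is
summary = patch.describe()         # immediately visible via the patch
```

Storing a reference instead of copied values is the design choice that keeps patch annotations consistent with their hosts without any synchronization step.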
The categorization of zones into current functional zones and historical partitions provided a comprehensive framework for managing both modern and historical building elements within the HBIM environment. The application of the COBie zones module enabled efficient asset and facility management, particularly in relation to functional zones. The modeling of the center and south wings as solids with a low level of detail was successful in establishing spatial relationships and providing an overall context for the building layout, though it did not provide high-detail geometric representation. This approach facilitated an understanding of the building’s overall structure and data management needs, particularly for future reference and use in ongoing heritage documentation and management.
The deviation analysis revealed that the global offset analysis, which covered larger building components like walls and floors, produced more accurate results compared to the local analysis of finer architectural elements. This discrepancy was largely due to the higher accuracy of the walls and floors, which were generated using the AsBuilt plugin, in contrast to the local key components that were manually simplified, omitting many geometric details. The application of the pseudo-color scalar field provided clear visualization of deviations, helping to identify areas where simplifications had led to inaccuracies. Overall, the analysis confirmed that while the global structural elements were modeled with relatively high accuracy, the simplifications in finer details, such as decorative elements, led to more significant deviations from the point cloud data.
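The pseudo-color deviation check can be sketched as a simple binning of point-to-model distances. The thresholds below are illustrative assumptions, not the project’s tolerance values.

```python
# Sketch of the deviation visualization: absolute distances from
# point-cloud samples to the simplified model surface are binned into
# color classes, mimicking a pseudo-color scalar field that highlights
# where simplification drifted from the point cloud.

def classify_deviation(dist_m, bins=(0.01, 0.03, 0.05)):
    """Map a signed deviation (m) to a color-class index 0..len(bins)."""
    d = abs(dist_m)
    for i, threshold in enumerate(bins):
        if d <= threshold:
            return i          # 0 = within 1 cm, 1 = within 3 cm, ...
    return len(bins)          # worst class: beyond the last threshold

# Example: signed point-to-model distances for a patch of cloud points.
deviations = [0.004, -0.012, 0.047, 0.090]
classes = [classify_deviation(d) for d in deviations]  # -> [0, 1, 2, 3]
```

Coarser elements generated by AsBuilt would cluster in the low classes, while manually simplified decorative details would fall into the higher ones, which is exactly the pattern the analysis reported.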

4.3. Semantic Information Enrichment

The semantic information enrichment process facilitated efficient information management throughout the project, ensuring that data were categorized and linked to relevant BIM components for effective analysis and retrieval. At Level 1, the use of external spreadsheets for data transfer and synchronization improved the coordination between project participants. However, the limitations in file annotation within Revit’s linking system were noted, which might hinder full documentation management.
At Level 2, the independent room information sheets allowed for detailed data organization, although issues of redundancy were effectively minimized by the strategic design of wall and room descriptions. The integration of user-specific data in the sheets provided a flexible way to capture operations and interventions relevant to the project.
At Levels 3/4, the metadata associated with architectural components improved both model accuracy and interoperability. The addition of damage classification and deterioration analysis in Level 4 enriched the model’s capability to track and manage the condition of specific architectural elements over time. The damage grading system and intervention analysis provided a systematic approach for monitoring deterioration and prioritizing conservation efforts. The modifications to the ColorSplasher plugin helped streamline diachronic analysis by offering greater customization in visualizing component changes across different project phases.

5. Discussion

This study presents key findings that highlight the value and advantages of the proposed method within the context of project execution:
The HBIM documentation method presented in this study is built upon a predefined Scan-to-BIM workflow framework [4,7,29], offering two main advantages: ensuring both methodological reliability and adaptability, while providing a benchmark for assessing completeness during the documentation process. By restructuring the four-level information architecture proposed by Bruno and Roncella (2019) [5], this study optimizes classification. It integrates the classification and spatial segmentation of architectural elements into the lv3 Component A level, while detailing components through an independent three-tier identification system [11,21]. In contrast to the original framework, which distributed architectural component grading between lv3 Component A and lv4 Component B, this adjustment allows lv4 Component B to focus specifically on damage characteristics and monitoring equipment, creating an entity that is both associated with and independently managed from the main components. While this approach improves the clarity of information management, the addition of the identification system may introduce cognitive complexity.
Furthermore, this study enhances the semantic richness of the information framework by integrating the “Foundation” ontology theory proposed by Khan et al. (2022) [6]. By categorizing Rooms and Spaces within the Zones domain and subdividing Elements into Component A and B categories, precise mapping is established between HBIM information levels and database/ontology structures. This alignment mechanism not only optimizes data interaction efficiency but also offers a semantic modeling paradigm for historical building components [8]. The core value of this framework lies in constructing an extensible semantic conceptualization that forms the foundation for future intelligent diagnosis and knowledge reasoning.
This study builds on the BIM Execution Plan (BEP) proposed by Martinelli et al. (2022) [4] to establish a Common Data Environment (CDE) that facilitates interdisciplinary collaboration. By integrating standardized methodologies and deliverables, this framework enhances project adaptability and improves the reusability of core data across various research contexts. Within this framework, this study organizes information requirements systematically at the organizational (OIR), asset (AIR), and project (PIR) levels [6], defining structured attribute allocation rules. These rules range from specifying discrete parameters to implementing intelligent linking mechanisms for external documents such as technical reports and as-built drawings. The management system employs a dual-mode interaction design, supporting a fully digital BIM workflow while maintaining traceable paper-based interfaces for field operations constrained by technical limitations.
The Valentino Castle project illustrates the system’s efficiency: although the initial deployment of the requirement management framework takes about one week, its structured approach offers substantial benefits in later phases. By using structured questionnaires to integrate diverse data sources—such as historical archives, conservation reports, and the academic literature—the system reduces preliminary data collection and processing time by 50% compared to traditional methods. Notably, the established model is highly transferable within the BEP framework, and its modular design enables direct adaptation to other heritage projects, significantly lowering the initialization costs for similar studies. This underscores the sustainable value of systematic information management frameworks in the digitalization of cultural heritage.
This HBIM workflow enhances geometric modeling efficiency through a dual optimization strategy: simplifying data complexity and balancing modeling accuracy. First, it develops a building element recognition method compatible with a four-level information architecture, based on the theoretical framework by De Luca et al. (2011) [21]. This method is particularly effective for the automated recognition and simplified modeling of Level 3 architectural components (Architectural Component A). Second, it applies differentiated modeling strategies to various component types. Specifically, the “Patches” parameterized management system, designed for deterioration (Decay/Damage), significantly improves modeling efficiency while maintaining data integrity.
In comparison to the high-precision deterioration modeling method by Santoni et al. (2021) [35], this study’s geometric simplification strategy reduces modeling time for structural elements and stratigraphic units by 90%. While Santagati et al. (2021) [12] addressed the compatibility between component geometry simplification and quantity extraction using “Filled Region” families and Dynamo scripts, their solution still requires the time-consuming manual projection of outlines. Additionally, the hidden properties of “Filled Region” families in 3D visualization could lead to potential information management gaps. In contrast, the “Patches” system efficiently manages deterioration data by coupling non-geometric attributes with spatial references. The limitations of the “Patches” strategy are also evident: excessive geometric simplification may negatively impact subsequent geometric studies and conservation efforts focused on deterioration.
However, it is important to note that the choice of geometric modeling methods represents a trade-off between accuracy and efficiency, and must be dynamically adjusted based on the model’s specific application scenario (Table 1).
Notably, this parameterized approach based on HBIM entity management has been extended to the modeling of monitoring equipment. By associating equipment attributes, spatial coordinates, and maintenance logs topologically, this method demonstrates its universal advantages in managing specific object information [13,36]. This offers a new technical pathway for the systematic integration of multi-dimensional data in historic buildings.
In the semantic enrichment phase, this study enhances system performance through two key improvements based on the framework from Bruno, N., and Roncella, R. [5]: (1) the restructuring of the Components A/B hierarchy, where Components A focus on managing component form and materials, while Components B specialize in handling damage features and monitoring data. This reorganization significantly improves search efficiency by optimizing data classification and segmentation; (2) the refinement of Zones into two categories: Space (functional space units) and Room (physical room entities), which allows for the spatial localization of building components and other items.
At the data transfer level, this study adopts a hybrid management architecture. On the one hand, a cross-platform compatible spreadsheet solution is used, offering significant advantages in the requirements analysis and information traceability stages. On the other hand, an enhanced database architecture is integrated, combining the Codemill plugin with Revit’s native DBLink tool to create a lightweight database within the Revit environment. This approach enables the automated synchronization of component parameters and spatial information, while reducing technical barriers through a graphical user interface (GUI), allowing modelers to manage complex data without the need for expertise in query languages such as SQL (Table 2).
The case study validates that this hybrid database solution presents three core values in historic building information modeling: First, the complementarity between the flexibility of the front-end spreadsheet and the rigor of the back-end database effectively balances ease of operation and data integrity; second, the dual spatial–functional analysis system significantly improves the machine-readability of architectural semantics; third, the visualized database interface greatly optimizes routine maintenance task efficiency, particularly demonstrating its technical advantages in multidisciplinary collaboration scenarios. These features provide an expandable solution framework for the deep application of HBIM technology in heritage conservation.
The research leveraged AI tools to enhance efficiency in semantic retrieval, information extraction, and data management. ChatGPT facilitated semantic inquiries, while OCR technology extracted text and images from scanned documents. AI-driven scripts automated data transfers between spreadsheets, reducing database skill requirements. These tools minimized manual data processing, enabling faster assessment and organization of information, contributing to project efficiency. However, limitations included OCR’s dependency on document quality, the need for user proficiency in AI operations, and the instability of AI-generated content. These challenges highlight the need for further optimization to improve reliability and practical application in HBIM processes.

6. Conclusions

This study proposes a documentation-oriented HBIM system as an innovative solution for the digital preservation and management of architectural heritage. The approach establishes a data foundation through standardized documentation processes and implements a three-tier semantic classification strategy to define and segment architectural components, along with a simplified quantification for geometric modeling. A four-tier information structure is applied to the HBIM environment, database, and ontology, minimizing errors and misalignments during data transfer. Unlike other studies, this research emphasizes information storage and management, demonstrating unique advantages in the accessibility and collaborative efficiency of heritage archives. The aim is to create an open public data environment for potential pre-preservation, planned conservation, or other future research while optimizing the lifecycle management of heritage archives through a structured information framework.
Despite these advancements, this study has several limitations: (1) the high dependency on manual input for geometric modeling and semantic interpretation, which is susceptible to subjective influence and affects model consistency; (2) the lack of quantifiable execution standards for existing LOD/LOI models, leading to insufficient reuse across multi-scale models; (3) the untapped potential of AI tools (whether for geometric processing, inference, or language models) in documentation, modeling, and retrieval; and (4) the absence of a spatial geographic information management system, limiting multi-dimensional data analysis capabilities.
Future research will focus on the following optimizations: the development of AI-based geometric simplification algorithms using visual programming, the establishment of parameter-driven LOD/LOI quantification models, the creation of a structured semantic encoding rule library, the integration of large language models to enhance machine interpretation of heterogeneous text, the incorporation of GIS platforms and Web 3.0 technologies for spatiotemporal topological management and cross-platform interaction, the use of multimodal training to improve complex document recognition robustness, and the development of human–machine collaborative quality assurance mechanisms. These improvements will further refine the technical ecosystem of the documentation-oriented HBIM system, offering scalable digital solutions for unprotected architectural heritage and driving the paradigm shift in cultural heritage preservation from an experience-driven to a data-driven approach.

Author Contributions

Conceptualization, Xiang Li, Lorenzo Teppati Losè, and Fulvio Rinaudo; data curation, Xiang Li and Lorenzo Teppati Losè; formal analysis, Xiang Li; funding acquisition, Fulvio Rinaudo; investigation, Xiang Li and Lorenzo Teppati Losè; methodology, Xiang Li, Lorenzo Teppati Losè, and Fulvio Rinaudo; project administration, Xiang Li; resources, Xiang Li and Fulvio Rinaudo; software, Xiang Li and Lorenzo Teppati Losè; supervision, Fulvio Rinaudo; validation, Xiang Li, Lorenzo Teppati Losè, and Fulvio Rinaudo; visualization, Xiang Li; writing—original draft, Xiang Li; writing—review and editing, Lorenzo Teppati Losè and Fulvio Rinaudo. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Data sharing is not applicable.

Acknowledgments

The authors would like to thank Annalisa Dameri, the Rector’s delegate for the Castello del Valentino, for her support during the research and for making the archive material available.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Dore, C.; Murphy, M. Integration of Historic Building Information Modeling (HBIM) and 3D GIS for recording and managing cultural heritage sites. In Proceedings of the 2012 18th International Conference on Virtual Systems and Multimedia, Milan, Italy, 2–5 September 2012; pp. 369–376.
  2. Pepe, M.; Costantino, D.; Alfio, V.S.; Restuccia, A.G.; Papalino, N.M. Scan to BIM for the digital management and representation in 3D GIS environment of cultural heritage site. J. Cult. Herit. 2021, 50, 115–125.
  3. Antonopoulou, S.; Bryan, P. BIM for Heritage: Developing a Historic Building Information Model; Historic England: London, UK, 2017.
  4. Martinelli, L.; Calcerano, F.; Gigliarelli, E. Methodology for an HBIM workflow focused on the representation of construction systems of built heritage. J. Cult. Herit. 2022, 55, 277–289.
  5. Bruno, N.; Roncella, R. HBIM for conservation: A new proposal for information modeling. Remote Sens. 2019, 11, 1751.
  6. Khan, M.S.; Khan, M.; Bughio, M.; Talpur, B.D.; Kim, I.S.; Seo, J. An integrated HBIM framework for the management of heritage buildings. Buildings 2022, 12, 964.
  7. Previtali, M.; Brumana, R.; Stanga, C.; Banfi, F. An ontology-based representation of vaulted system for HBIM. Appl. Sci. 2020, 10, 1377.
  8. Colucci, E.; Xing, X.; Kokla, M.; Mostafavi, M.A.; Noardo, F.; Spanò, A. Ontology-based semantic conceptualisation of historical built heritage to generate parametric structured models from point clouds. Appl. Sci. 2021, 11, 2813.
  9. Parisi, P.; Lo Turco, M.; Giovannini, E.C. The value of knowledge through H-BIM models: Historic documentation with a semantic approach. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 42, 581–588.
  10. Bacci, G.; Bertolini, F.; Bevilacqua, M.G.; Caroti, G.; Martínez-Espejo Zaragoza, I.; Martino, M.; Piemonte, A. HBIM methodologies for the architectural restoration. The case of the ex-church of San Quirico all’Olivo in Lucca, Tuscany. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 42, 121–126.
  11. Moyano, J.; Carreno, E.; Nieto-Julián, J.E.; Gil-Arizón, I.; Bruno, S. Systematic approach to generate Historical Building Information Modelling (HBIM) in architectural restoration project. Autom. Constr. 2022, 143, 104551.
  12. Santagati, C.; Papacharalambous, D.; Sanfilippo, G.; Bakirtzis, N.; Laurini, C.; Hermon, S. HBIM approach for the knowledge and documentation of the St. John the Theologian cathedral in Nicosia (Cyprus). J. Archaeol. Sci. Rep. 2021, 36, 102804.
  13. Garcia-Gago, J.; Sánchez-Aparicio, L.J.; Soilán, M.; González-Aguilera, D. HBIM for supporting the diagnosis of historical buildings: Case study of the Master Gate of San Francisco in Portugal. Autom. Constr. 2022, 141, 104453.
  14. Bruno, S.; De Fino, M.; Fatiguso, F. Historic Building Information Modelling: Performance assessment for diagnosis-aided information modelling and management. Autom. Constr. 2018, 86, 256–276.
  15. ISO 19650:2018; Organization and Digitization of Information About Buildings and Civil Engineering Works, Including Building Information Modelling (BIM): Information Management Using Building Information Modelling. ISO: Geneva, Switzerland, 2018.
  16. Historic England. BIM for Heritage: Developing the Asset Information Model; Historic England: London, UK, 2019.
  17. Khalil, A.; Stravoravdis, S.; Backes, D. Categorisation of building data in the digital documentation of heritage buildings. Appl. Geomat. 2021, 13, 29–54.
  18. Cheng, J.C.; Zhang, J.; Kwok, H.H.; Tong, J.C. Thermal performance improvement for residential heritage building preservation based on digital twins. J. Build. Eng. 2024, 82, 108283.
  19. Andrews, D.; Bedford, J.; Young, G. Geospatial Survey Specifications for Cultural Heritage; Historic England: Swindon, UK, 2024.
  20. Hopkins, O. Reading Architecture Second Edition: A Visual Lexicon; Laurence King Publishing: London, UK, 2023.
  21. De Luca, L.; Busayarat, C.; Stefani, C.; Véron, P.; Florenzano, M. A semantic-based platform for the digital analysis of architectural heritage. Comput. Graph. 2011, 35, 227–241.
  22. Pavan, A.; Mirarchi, C.; Giani, M. BIM: Metodi e Strumenti. Progettare, Costruire e Gestire Nell’era Digitale; Tecniche Nuove: Milan, Italy, 2017.
  23. Guardini, N. Sistemi Informativi Spaziali per la Valorizzazione: Sperimentazione in Ambiente GIS per il Castello del Valentino. Doctoral Dissertation, Politecnico di Torino, Turin, Italy, 2009.
  24. Costamagna, E.; Spanò, A. Semantic models for architectural heritage documentation. In Progress in Cultural Heritage Preservation, Proceedings of the 4th International Conference, EuroMed 2012, Limassol, Cyprus, 29 October–3 November 2012; Springer: Berlin/Heidelberg, Germany, 2012; pp. 241–250.
  25. Raineri, P. L’approccio BIM (Building Information Modeling) per la Documentazione dell’Architettura Storica: Una Proposta Applicativa al Castello del Valentino. Doctoral Dissertation, Politecnico di Torino, Turin, Italy, 2015.
  26. Chiabrando, F.; Sammartano, G.; Spanò, A. Historical buildings models and their handling via 3D survey: From points clouds to user-oriented HBIM. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 41, 633–640.
  27. Gasbarri, P. I restauri del Castello del Valentino: Proposte per la Raccolta, la Gestione e la Consultazione dei Documenti d’Archivio su Modelli Grafici Digitali Informati = The Restorations of the Valentino Castle: Proposals for the Collection, the Management and the Consultation of Archive Documents on Informed Digital Graphic Models. Doctoral Dissertation, Politecnico di Torino, Turin, Italy, 2019.
  28. Adamopoulos, E.; Colombero, C.; Comina, C.; Rinaudo, F.; Volinia, M.; Girotto, M.; Ardissono, L. Integrating multiband photogrammetry, scanning, and GPR for built heritage surveys: The façades of Castello del Valentino. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2021, 8, 1–8.
  29. Tanduo, B.; Teppati Losè, L.; Chiabrando, F. Documentation of complex environments in cultural heritage sites. A SLAM-based survey in the Castello del Valentino basement. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2023, 48, 489–496.
  30. Barontini, A.; Alarcon, C.; Sousa, H.S.; Oliveira, D.V.; Masciotta, M.G.; Azenha, M. Development and demonstration of an HBIM framework for the preventive conservation of cultural heritage. Int. J. Archit. Herit. 2022, 16, 1451–1473.
  31. Anson-Cartwright, T.; Bourguignon, E.; Bromblet, P.; Cassar, J.; de Witte, E.; Delgado-Rodrigues, J.; Fassina, V.; Fitzner, B.; Fortier, L.; Franzen, C.; et al. ICOMOS-ISCS: Illustrated Glossary on Stone Deterioration Patterns; International Council on Monuments and Sites: Paris, France, 2008.
  32. Vergès-Belmin, V. Illustrated Glossary on Stone Deterioration Patterns; ICOMOS: Charenton-le-Pont, France, 2008.
  33. Weyer, A.; Roig Picazo, P.; Pop, D.; Cassar, J.; Özköse, A.; Vallet, J.M.; Srša, I. EwaGlos—European Illustrated Glossary of Conservation Terms for Wall Paintings and Architectural Surfaces; Michael Imhof Verlag: Petersberg, Germany, 2015; Volume 17.
  34. Fitzner, B.; Heinrichs, K. Damage diagnosis at stone monuments—weathering forms, damage categories and damage indices. Acta Univ. Carol. Geol. 2001, 1, 12–13.
  35. Santoni, A.; Martín-Talaverano, R.; Quattrini, R.; Murillo-Fragero, J.I. HBIM approach to implement the historical and constructive knowledge. The case of the Real Colegiata of San Isidoro (León, Spain). Virtual Archaeol. Rev. 2021, 12, 49–65.
  36. Mora, R.; Sánchez-Aparicio, L.J.; Maté-González, M.Á.; García-Álvarez, J.; Sánchez-Aparicio, M.; González-Aguilera, D. An historical building information modelling approach for the preventive conservation of historical constructions: Application to the Historical Library of Salamanca. Autom. Constr. 2021, 121, 103449.
Figure 1. The documentation-oriented HBIM workflow in this project.
Figure 2. Three-dimensional graph providing a semantic description of a part of a terrace (ontology) and classification strategy of a column in the Valentino Castle project.
Figure 3. General view of the “foundation” ontology used to share a general view of the building among different domains involved in the historical building information modeling (HBIM) process.
Figure 4. Timeline of the Valentino Castle that includes part of the archival documentation collected during the research (left). Chronological development of the Valentino Castle (right).
Figure 5. (A) Orthoimages of the north wing, generated from the point cloud (above, the north façade; below, the south façade). The missing part of the north façade is due to vegetation near the building. (B) Ground floor plan of the Valentino campus (Area A in blue, Area B in red, Area C in green, and Area D in yellow), and first-, second-, and third-floor plans (north wing).
Figure 6. Exploded diagram of the geometric modeling of different types of architectural elements in HBIM.
Figure 7. Four modeling approaches for curved vaults.
Figure 8. South facade decay representation and patch-type object (in-place component/signage) to represent detachment (blue).
Figure 9. COBie interoperability tool: Zone Manager.
Figure 10. (a) Deviation analysis (distinguishing between errors and simplifications). (b) Exploded axonometry that shows the complex semantic decomposition of each single architectural subcomponent.
Figure 11. Three basic information transfer strategies: 1. Dynamo; 2. Office Spreadsheet; 3. Database [5].
Figure 12. (a) Example of the different data integrated. (b) The missing point cloud data in the north façade were supplemented by scanned drawing information and other documents (black frame).
Table 1. Comparison and summary of three approaches to deterioration modeling.

| Decay Modeling | Santoni, A. et al. [35] | Santagati, C. et al. [12] | Patches Method [30] |
| Description | Uses structural elements and stratigraphic units to organize deterioration documentation; temporal organization shapes the structure and links to information systems | Applies "Filled Region" and Dynamo scripts; introduces adaptive components for curved surfaces | Models decay and damage as separate HBIM entities, managing geometric and semantic information separately |
| Level of geometry | High | Medium | Low |
| Level of information | High | High | High |
| Modeling process time | High | Medium | Low |
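To illustrate the "Patches Method" row of Table 1, in which decay is modeled as its own HBIM entity with geometry and semantics managed separately, the following minimal Python sketch keeps a registry of decay patches linked to host elements by ID. All identifiers and attribute names (e.g., `DecayPatch`, `WALL-S-01`) are hypothetical and are not drawn from the project described here.

```python
from dataclasses import dataclass

@dataclass
class DecayPatch:
    """A decay phenomenon modeled as its own entity, separate from the host element."""
    patch_id: str
    host_element_id: str   # ID of the wall/column the patch documents
    glossary_term: str     # e.g., an ICOMOS-ISCS term such as "Detachment"
    area_m2: float
    notes: str = ""

# A flat registry keeps the patch geometry references and their semantic
# attributes loosely coupled to the architectural model.
registry: dict[str, DecayPatch] = {}

def record_patch(patch: DecayPatch) -> None:
    registry[patch.patch_id] = patch

def patches_on(host_id: str) -> list[DecayPatch]:
    """Query all decay patches documented on a given architectural element."""
    return [p for p in registry.values() if p.host_element_id == host_id]

record_patch(DecayPatch("P-001", "WALL-S-01", "Detachment", 2.4))
record_patch(DecayPatch("P-002", "WALL-S-01", "Biological colonization", 0.8))
print([p.glossary_term for p in patches_on("WALL-S-01")])
```

Because the patch objects carry only an ID reference to their host, the geometric level of the host model can stay low while the semantic record remains rich, which is the trade-off the table summarizes.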
Table 2. Main features of the methods/tools for data transfer in HBIM documentation.

| Method | Dynamo | Microsoft Excel/WPS Office | Microsoft Excel/WPS Office | WPS Office | Microsoft Excel/WPS Office | Database | Database |
| Content | Excel data to Revit family parameters script | VLOOKUP/XLOOKUP formula | VBA | Predefined formula "Query and input" | Database link/link to other sheets | Revit DB Link | External database |
| Demand for skills | High | Medium | High | Low | Medium | Low | High |
| Transfer efficiency | Low | Low | Medium | High | High | High | High |
| Accuracy | High | High | Low | Low | Medium | High | High |
| Time cost (established) | Low | High | Medium | Low | Medium | High | High |
| Commercial/free | Free | Free | Free | Commercial/free | Free | Free | Commercial/free |
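The core operation shared by the transfer methods in Table 2 is matching spreadsheet rows to model elements by ID and writing values into their parameters. The sketch below imitates that step outside of any BIM platform, using an inlined CSV as the "spreadsheet" and a plain dictionary as the "model"; the element IDs and parameter names are invented for illustration only.

```python
import csv
import io

# A toy "model": element IDs mapped to their parameter dictionaries.
model = {
    "COL-01": {"Material": "", "Decay": ""},
    "COL-02": {"Material": "", "Decay": ""},
}

# Spreadsheet export (inlined as CSV) carrying the documentation data.
sheet = io.StringIO(
    "ElementID,Material,Decay\n"
    "COL-01,Brick,Detachment\n"
    "COL-02,Stone,Crack\n"
)

def transfer(reader, model):
    """Match rows to elements by ID and write values into their parameters."""
    matched = 0
    for row in csv.DictReader(reader):
        params = model.get(row["ElementID"])
        if params is None:
            continue  # unmatched rows are skipped, like a VLOOKUP miss
        for key in ("Material", "Decay"):
            params[key] = row[key]
        matched += 1
    return matched

print(transfer(sheet, model))    # prints 2 (both rows matched)
print(model["COL-01"]["Decay"])  # prints Detachment
```

The row/table comparisons in Table 2 then come down to who performs this loop: a visual script (Dynamo), a spreadsheet formula (VLOOKUP/XLOOKUP), or a database link, each with the skill, efficiency, and accuracy trade-offs listed above.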

Li, X.; Teppati Losè, L.; Rinaudo, F. Documentation for Architectural Heritage: A Historical Building Information Modeling Data Modeling Approach for the Valentino Castle North Wing. ISPRS Int. J. Geo-Inf. 2025, 14, 139. https://doi.org/10.3390/ijgi14040139