Review

The Discrepancy between As-Built and As-Designed in Energy Efficient Buildings: A Rapid Review

Curtin University Sustainability Policy Institute, School of Design and the Built Environment, Curtin University, Perth 6102, Australia
* Author to whom correspondence should be addressed.
Sustainability 2020, 12(16), 6372; https://doi.org/10.3390/su12166372
Submission received: 24 June 2020 / Revised: 5 August 2020 / Accepted: 6 August 2020 / Published: 7 August 2020

Abstract

Energy efficient buildings are viewed as one of the solutions to reduce carbon emissions from the built environment. However, studies worldwide indicate that there is a significant gap between building energy targets (as-designed) and the actual measured building energy consumption (as-built). Several underlying causes for the energy performance gap have been identified at all stages of the building life cycle. Research generally focuses on the post-occupancy stage of the building life cycle; however, issues relating to the construction and commissioning stages are also a major concern, though seldom researched. There is uncertainty about how to address the as-designed versus as-built gap. The objective of this review article is to identify causes for the energy performance gap in buildings in relation to the post-design and pre-occupancy stages and to review proposed solutions. The methodology applied in this research is the rapid review, a variant of the systematic literature review method. Findings suggest that causes for discrepancies between as-designed and as-built energy performance during the construction and commissioning stages relate to a lack of knowledge and skills, a lack of communication between stakeholders and a lack of accountability for building performance post-occupancy. Recommendations to close the gap during this period include better training, improved communication standards, collaboration, energy evaluations based on post-occupancy performance, transparency of building performance, improved testing and verification, and revised building standards.

1. Introduction

The building sector is responsible for 32% of global energy use and 19% of energy-related greenhouse gas emissions [1]. The energy use and emissions from buildings might double or triple by 2050 unless energy efficiency measures are implemented and best practices mainstreamed [1]. Most countries and jurisdictions have regulations in place that require new buildings to meet minimum energy efficiency standards. For instance, the European Union requires that all new buildings are built to nearly zero-energy standards starting in 2021 as part of the wider goal to decarbonise the building sector by 2050 [2]. The jurisdiction of California (USA) requires homes built in 2020 and beyond to include renewable energy generation to cover the expected annual electricity needs of buildings [3]. Australia also has legislation in place that requires all new residential homes to comply with energy efficiency standards, although these are less stringent than the European regulations, currently focusing only on thermal performance.
Despite the energy efficiency measures implemented worldwide, international research indicates that there is a significant gap between building energy targets, modelled during the design stage, and the actual measured building energy consumption once the building is occupied. The operational performance of buildings varies significantly across studies [4], but some have found that actual energy use can be up to 2.5 times higher than that modelled during the building design stage [5]. This phenomenon, known as the regulatory energy performance gap (EPG), is a concern because it undermines global energy conservation efforts and prevents them from being fully realised.
In most building codes adopted across jurisdictions, building energy performance is calculated through simulation software [6], which projects energy use during the building operational stage based on assumptions about occupancy, behaviour, technology operation and maintenance, and climate. Building design, construction techniques and materials are also modelled at this stage to enable the building in question to meet set targets. One common explanation for the regulatory EPG lies in the erroneous interpretation of the performance assessment during design. This line of thought argues that energy modelling tools are not intended to predict actual performance, but rather to act as a general guide to inform design [7]. Other types of performance gap measurements, such as the static EPG or dynamic EPG, could potentially produce more positive results, as they allow for a calibration of the baseline based on modelling assumptions [8]. The static EPG compares predictions from performance modelling (rather than compliance modelling) with measured energy use; the dynamic EPG compares calibrated predictions from performance modelling, longitudinally, with measured energy use.
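For illustration, these three measurements can be expressed with a simple notation (an informal formalisation used here for clarity rather than a standardised definition), where E denotes annual energy use:
EPG_regulatory = (E_measured − E_compliance model) / E_compliance model
EPG_static = (E_measured − E_performance model) / E_performance model
EPG_dynamic(t) = (E_measured(t) − E_calibrated model(t)) / E_calibrated model(t), tracked longitudinally over successive periods t
A positive value indicates that the building uses more energy than the respective model predicted.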
Regardless of which definition of EPG is used and how it is interpreted, there is empirical evidence showing that there are shortcomings in the compliance and enforcement of building regulations [1]. Underlying causes for the EPGs have been identified at all stages of the building life cycle [9]. These include the planning stage, the design stage, the construction stage, the commissioning stage and the occupancy stage, comprising building maintenance and operation. During the planning and design stages, common issues revolve around misunderstanding of design performance targets, complexity of the building design, incorrect modelling assumptions, shortcomings in the energy rating software, modelling illiteracy or assessor dishonesty [6,10]. During the construction stage, common problems are improper documentation, improper installation and poor construction quality [6]. In Australia, industry reports such as the National Energy Efficiency Building Project [11] highlight that energy efficiency regulations are often not complied with, there are no measures in place for building verification and there is a lack of accountability. It has also been reported that some builders may not have the required energy knowledge, have poor construction practices, and make product substitutions that differ from the approved building design. At the occupancy stage, discrepancies are usually attributed to occupant behaviour, poor technology control by building managers and users, insufficient maintenance that reduces the efficiency of the technology and variations in the weather that influence temperature and sunlight in the building's environment [12].
The impact of user behaviour on energy performance has been widely researched and is said to account for between 10% and 80% of the EPG [9]. On the other hand, the EPG relating to the construction and commissioning stages has not been as extensively studied, being the subject of only 7.9% of research [13]. Some of the concerns mentioned above are well known amongst researchers and building practitioners; however, there are few provisions to address them. Recently, the Australian Sustainable Built Environment Council (ASBEC) [14] proposed a pathway to achieve zero energy buildings in Australia, suggesting approaches to addressing some of the EPGs. However, the successful implementation of energy efficient buildings in Australia and elsewhere depends on a deeper understanding of the root causes for the regulatory EPG; in particular, the causes for the gap between as-designed (i.e., once energy modelling and construction drawings are complete) and as-built (i.e., when the building fabric is ready and technologies are installed, but before occupation). It is also important to understand how this problem can be addressed effectively based on international evidence-based research.
This rapid review examines international literature on the EPG, focusing on the gaps relating to the construction and commissioning stages of energy efficient buildings and collates information on how to address this gap. The next section details the methodology used to carry out this review, followed by a description of the articles analysed, a summary of the main findings, gaps identified for future research and a conclusion.

2. The Rapid Review Methodology

The rapid review methodology was chosen to conduct this research. The rapid review is a variation of the systematic literature review; the main difference is that rapid reviews are conducted in a shorter timeframe, generating trustworthy evidence faster for timelier adoption by policy makers [15]. Whilst traditional systematic literature reviews can take between one and two years to complete, rapid reviews can be achieved in less than six months, usually around five weeks, with only a fraction of the resources [16]. This approach has been recommended for situations that require quick decision-making based on scientific empirical evidence. The rapid review methodology is encouraged by the World Health Organisation [17] and has recently been used to inform policy during crises such as COVID-19 in 2020 [18,19]. Although rapid reviews have mostly been carried out in the health sector, the methodology has also been adopted in building research to inform policy [20,21]. Research questions are based on relevance for industry and government, who can help to shape the research scope and are also adopters of the review recommendations.
As with systematic literature reviews, rapid reviews follow a rigorous process for article selection that aims to be replicable and minimize the risk of bias [22]. However, some shortcuts are applied to reduce the length of the review. For instance, rapid reviews usually limit article selection to academic literature within specific databases and set timeframes [22]. Studies analysed in rapid reviews can also be restricted to review articles only [22]. Reviewing reviews enables the inclusion of a wider number of articles, captured through the original review papers, without the need to analyse them individually. These shortcuts limit the conclusions of rapid reviews when compared to systematic literature reviews, but the findings are still more robust than those of non-systematic reviews [20].
The approach used to conduct this rapid review followed the protocol proposed in Lagisz et al. [20], which includes the following steps:
(a) Problem definition in conjunction with industry stakeholders
(b) Development of a suitable search string and filtering process
(c) Screening according to predefined eligibility criteria
(d) Data extraction and synthesis
(e) Quality assessment of selected studies
Each of these steps was followed in this rapid review; they are described in detail below.

2.1. Problem Definition

The research question that guided this rapid review was set by the research team following consultation with professionals from the building industry who are also potential adopters of the findings of this review. This method ensures that the research is relevant and fills an existing need. The stakeholders involved in the process consisted of an Australian land developer and two city councils. In separate interviews, the parties individually raised concerns around the EPG and practical ways to address it. The concerns revolved mainly around the discrepancies between as-designed and as-built and the fact that there are no provisions to address these in the current Australian National Construction Code.
Following industry engagement and an initial scoping of the literature, the research question for this review was refined to “how can the gap between as-designed and as-built energy efficient buildings be addressed?”
In this article, the term as-designed vs. as-built refers to the period between post-design, once energy modelling and construction drawings are complete, and pre-occupancy, before occupants move in and operate the building fabric and technologies. In other words, the differences between as-designed and as-built originate in the stages in which design decisions are implemented in practice through procurement, construction and commissioning. Issues relating to the accuracy of simulation software, energy modelling assumptions (such as building thermal properties, weather, occupancy, etc.), occupant behaviour (including operation of the building technologies) and maintenance, whilst important factors to consider, are not part of this review. However, cases where planning and design outcomes directly impact decision-making during procurement, construction and commissioning are addressed.

2.2. Searching and Filtering Process

Three academic databases were primarily selected for this research, consisting of Scopus, Web of Science and ProQuest. These are cross-disciplinary and deemed suitable for research in the built environment [20].
Two search strings were devised for this rapid review. The first search (Search 1) was an attempt to capture review articles about the EPG in low carbon buildings, focusing specifically on the early stages of the building life cycle (pre-occupancy). The second search (Search 2) focused on mechanisms to ensure building compliance. The researchers attempted to combine Search 1 and Search 2 in a single search string. However, this combined search string was too limiting and only returned a small number of articles in all databases.
Both searches were conducted on 1 April 2020 in the three academic databases mentioned above and the results were combined to answer the research question. Searches were applied to article titles, abstracts and keywords. The search was limited to articles written in English, studies published in peer-reviewed academic journals, research published since 2010, review articles and articles with full-text availability.
Search 1 combined synonyms of the following keywords: ‘energy performance gap’, ‘buildings’, ‘low carbon’, ‘pre-occupancy’ and ‘review’. The specific string used for this search was the following:
((“energy performance gap” OR “energy gap” OR “performance gap”) AND (building* OR hous* OR home) AND (“low carbon” OR “low-carbon” OR “energy efficien*” OR green OR “sustainab*” OR “net zero energy” OR “zero energy” OR “high efficien*” OR “passive”) AND (“construction” OR “commission*” OR “pre-occupancy” OR “life cycle” OR “life-cycle”) AND ( “systematic review” OR “systematic literature review” OR review OR “meta analysis” OR “meta-analysis”))
Search 2 combined synonyms of the following keywords: ‘energy performance gap’, ‘compliance’ and ‘review’. The specific string used for this search was the following:
((“energy performance gap” OR “energy gap” OR “performance gap”) AND (“cause*” OR “verification” OR “compliance” OR “assess*” OR “solution*” OR polic* OR “clos* the gap”) AND (“systematic review” OR “systematic literature review” OR review OR “meta analysis” OR “meta-analysis”))
Search strings and specific filters applied to the different academic databases for both searches 1 and 2 can be found in Appendix A.
An additional search on Google Scholar and Google was conducted to capture relevant industry reports and additional academic articles of interest that may not have been found through the primary academic databases. In both Google Scholar and Google, the query ‘energy performance gap’ buildings review was used. Given that the results in Google Scholar and Google are sorted by relevance as well as number of citations, only the first three pages of results were screened. Articles and reports were selected according to their scope, the study eligibility criteria (described in Section 2.3) and whether they consisted of reviews.
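To illustrate how the keyword groups above are combined into a boolean database query, the short Python sketch below assembles a string equivalent to Search 1 from lists of synonyms. The variable and function names are illustrative only and were not part of the review protocol; only the keywords themselves come from the searches described above.

# Illustrative sketch: assembling the Search 1 boolean string from keyword groups.
gap = ['"energy performance gap"', '"energy gap"', '"performance gap"']
building = ['building*', 'hous*', 'home']
low_carbon = ['"low carbon"', '"low-carbon"', '"energy efficien*"', 'green', '"sustainab*"',
              '"net zero energy"', '"zero energy"', '"high efficien*"', '"passive"']
stage = ['"construction"', '"commission*"', '"pre-occupancy"', '"life cycle"', '"life-cycle"']
review = ['"systematic review"', '"systematic literature review"', 'review',
          '"meta analysis"', '"meta-analysis"']

def or_group(terms):
    # Join synonyms with OR and wrap the group in parentheses.
    return '(' + ' OR '.join(terms) + ')'

# AND the five groups together; the result is pasted into the database search field
# (e.g., inside TITLE-ABS-KEY(...) for Scopus or TS=(...) for Web of Science).
search_1 = '(' + ' AND '.join(or_group(g) for g in (gap, building, low_carbon, stage, review)) + ')'
print(search_1)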

2.3. Screening According to Eligibility Criteria

All records from Search 1, Search 2 and additional Google Scholar and Google articles were exported to the Endnote reference management software. Duplicates were excluded and titles and abstracts were screened for relevance. Articles deemed eligible consisted of either academic review articles or selected grey literature reviews (government and industry reports) that met the study scope, providing answers to the research question. Case study articles were excluded. The articles included addressed at least one of the following topics:
  • EPG in energy efficient or low carbon buildings, discussing specifically the early stages of the building life cycle; that is, the pre-occupancy stages and in particular the construction and commissioning stages. Articles that were purely about occupant behaviour and did not mention the pre-occupancy stages were excluded.
  • Studies that provided recommendations on how the EPG can be addressed in low carbon buildings.
The remaining articles, deemed relevant after title and abstract screening, were read in full and further screened according to the same criteria above. At this stage, specific reasons for article exclusion were recorded (Appendix B). A total of 151 original records were identified through the search, nine of which were included for analysis in this rapid review after the screening process and eligibility evaluation. The search and screening processes are summarized in the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) diagram in Figure 1.
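As a minimal illustration of the deduplication and title/abstract screening steps described above, the sketch below operates on records exported as simple title/abstract dictionaries. The record structure, field names and eligibility keywords are assumptions made for the example; the actual screening was performed manually in Endnote against the eligibility criteria above.

# Illustrative sketch of deduplication and a first-pass title/abstract screen.
records = [
    {"title": "A review of the energy performance gap during construction", "abstract": "..."},
    {"title": "A Review of the Energy Performance Gap During Construction", "abstract": "..."},  # duplicate
    {"title": "Occupant behaviour and household energy use", "abstract": "..."},
]

def deduplicate(items):
    seen, unique = set(), []
    for record in items:
        key = record["title"].strip().lower()
        if key not in seen:
            seen.add(key)
            unique.append(record)
    return unique

def passes_first_screen(record):
    # Keep reviews that mention the pre-occupancy stages; full-text screening follows.
    text = (record["title"] + " " + record["abstract"]).lower()
    return "review" in text and any(k in text for k in ("construction", "commissioning", "pre-occupancy"))

shortlisted = [r for r in deduplicate(records) if passes_first_screen(r)]
print(len(records), "records exported;", len(shortlisted), "retained for full-text review")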

2.4. Data Extraction and Synthesis

Results synthesis in rapid reviews is usually descriptive [16,23]. Although a quantitative summary of the data is sometimes possible, this depends on the nature and quality of the articles analysed [24]. In this rapid review, data were synthesised qualitatively, as most of the included articles did not present quantitative analysis.
For each of the nine articles included in this rapid review, the following characteristics were extracted: authors and year of publication, study title, country of origin, location of case studies reviewed, review type, number of articles reviewed, study funding, conflicts of interest, study theme and scope.
Specific data relevant to the research question, relating to the EPG during the construction and commissioning stages of the building, were extracted and synthesised. Data included causes for the EPG affecting the construction and commissioning stages of the building life cycle, and recommendations to close the gap. The different stages of the building life cycle affect each other; for instance, inadequate planning and poor design documentation affect decisions made during procurement, construction and commissioning. In the case of flow-on effects such as these, EPG causes relating to earlier stages of the building life cycle (i.e., planning and design) were also acknowledged.
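The extraction template can be thought of as a simple record structure; the sketch below captures the characteristics listed above as fields. The dataclass and its field names are an illustrative representation, not a tool used in the review.

# Illustrative record structure for the data extracted from each included article.
from dataclasses import dataclass, field
from typing import List

@dataclass
class IncludedStudy:
    authors: str
    year: int
    title: str
    country_of_origin: str
    case_study_locations: List[str]
    review_type: str              # e.g., systematic, narrative, review of case studies
    n_articles_reviewed: int
    funding: str
    conflicts_of_interest: str
    theme_and_scope: str
    # Data specific to the research question:
    construction_commissioning_causes: List[str] = field(default_factory=list)
    recommendations_to_close_gap: List[str] = field(default_factory=list)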

2.5. Quality Assessment of Selected Studies

Assessment of the quality of the included studies is necessary to provide robust results and verification of the rapid review. This was done through the A Measurement Tool to Assess Systematic Reviews version 2 (AMSTAR2) checklist [25], which consists of 16 questions that were answered for each of the articles (Appendix C). These questions address the studies' methodologies, search strategies, risk of bias assessment and quality of the interpretation of results. The answers to the 16 questions were color-coded and reported in a table, enabling the quality of the articles to be verified visually.
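As an illustration of how checklist answers can be summarised into a colour-coded table, the sketch below scores a set of responses to the 16 AMSTAR2 questions. The per-question weighting and colour mapping shown are simplified assumptions made for this example (the adapted scoring used in this review, with a maximum of 13, is not reproduced here) and are not the official AMSTAR2 rating rules.

# Simplified illustration of summarising AMSTAR2 answers; not the official scoring scheme.
WEIGHTS = {"yes": 1.0, "partial": 0.5, "no": 0.0}
COLOURS = {"yes": "green", "partial": "yellow", "no": "red"}

def summarise(answers):
    # 'answers' holds one response per AMSTAR2 question.
    assert len(answers) == 16, "AMSTAR2 has 16 questions"
    score = sum(WEIGHTS[a] for a in answers)
    colours = [COLOURS[a] for a in answers]
    return score, colours

score, colours = summarise(["yes", "partial", "no"] * 5 + ["yes"])
print("score =", score, "out of 16;", colours.count("green"), "questions fully satisfied")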

3. Articles Overview

This section provides an overview of the nine studies included in this rapid review. It starts with an assessment of their quality and risk of bias, then outlines the articles' general characteristics.

3.1. Quality and Risk of Bias Assessment

All nine articles are reviews, which was part of the predetermined eligibility criteria. However, the quality of the reviews varies. Only two of the articles were systematic literature reviews; six others consisted of narrative reviews of the academic and/or grey literature (i.e., unpublished research such as industry reports, working papers, government documents, theses, fact sheets, etc.); one article reviewed unpublished case studies. Systematic literature reviews are usually considered of higher quality and lower risk of bias when compared to narrative reviews and case studies. For instance, narrative reviews do not follow a transparent and methodical approach to article selection, which means the search cannot be replicated [22]. Literature deemed relevant by the authors is chosen based on their personal judgement and is potentially biased. However, articles within the same category can still range significantly in quality and risk of bias [20]. The AMSTAR2 risk assessment [25] carried out for the nine articles revealed that all articles are of medium quality. The average score for the articles was 4.1 out of 13, with the highest scoring paper achieving 6.5 (50%) (Table 1). Most studies failed to provide sufficient detail on the literature search and data extraction procedures (Table 1). Furthermore, most did not consider the quality and risk of bias of the studies they reviewed. Only three articles explicitly discussed the review methods and selection criteria applied, while only two of them discussed the risk of bias in the articles they reviewed.
Systematic literature reviews are to date relatively uncommon in the field of the built environment and sustainability, which is confirmed by the findings of this rapid review. Even the reviews by Shi et al. [4] and Zou et al. [13], which are presented as systematic literature reviews, still contain major flaws, showing that there is significant scope for improvement, especially in relation to the reporting of methodologies and study outcomes. This means that the results and conclusions presented in this rapid review, although important and relevant, should be understood within these limitations.

3.2. Study Characteristics

The articles included in this rapid review (Table 2) were published between 2014 and 2019 and collectively reviewed more than 500 articles and/or case studies. They had global coverage, reporting results from Asia, Europe, North America, Africa and Australia.
All articles reviewed causes of EPGs in the building sector. Five of them explored all stages of the building life cycle, from planning and design to operation, occupancy and maintenance [4,7,9,13,26], while the other four focused entirely on the early stages of the building life cycle. Alencastro et al. [27] explore exclusively the EPG caused by construction quality defects and McElroy et al. [28] explore the impact of poor installation and commissioning of building technologies. The two reports by the Zero Carbon Hub [29,30] identify the EPG relating to the planning, design, construction and commissioning stages.
The articles discuss EPG causes and recommendations for both residential and non-residential buildings: four articles focus on both building typologies [4,7,13,27], four on residential uses [26,28,29,30] and one on non-residential buildings [9]. All articles review and/or propose policy steps to address the EPG based on the literature. All articles also discuss areas needed for future research.

4. Causes for the As-Designed vs. As-Built Performance Gap during the Construction and Commissioning Stage

Despite the review articles originating from a variety of locations worldwide, they largely agree on the main causes of discrepancy between buildings' as-designed and as-built performance and on how these discrepancies affect energy performance. Root causes for the EPG are found across all stages of the building life cycle, starting at the planning stage. Whereas this rapid review focuses on the post-design stages, flow-on effects from planning and design affect construction and commissioning. Where this is the case, these flow-on effects were acknowledged and discussed. This section reviews the causes for the gaps between as-designed and as-built originating from each of the building stages between planning and verification. Appendix D summarises the key findings of the articles analysed in accordance with the scope of this paper.

4.1. Planning and Design

Building design involves many stakeholders with competing interests, and energy efficiency is often not the top priority [26]. Each party focuses on its own area of expertise, and energy considerations may end up deprioritised [26]. The various aspects of building energy performance are highly dependent on the teams involved in the early planning and design stages and how much dedication is put into the definition of parameters and identification of future scenarios [9].
Usually, professionals involved in building concept and planning, where decisions about energy targets are made, are not included in the design and modelling stages. At the same time, planners do not necessarily understand the implications of aesthetic choices on energy performance and may make decisions that go against energy efficiency [29].
Any decision made during the design stage affects construction. However, the design team often lack construction experience and may not understand the repercussions of early design decisions on implementation [9,13,29]. Designers also do not necessarily possess an adequate understanding of energy efficiency or building physics. Energy illiteracy can result in thermal bridges and air leakages, affecting building thermal performance [27].
If design solutions are too complex, there is a chance that they will not be properly executed at construction [9,13,26]. Site constraints also need to be addressed at this stage, but this is usually not the case [9]. There is generally a lack of clarity in design documentation, in particular how different layers of the building (fabric and services) are supposed to integrate in practical terms [13,29]. Uncertainties and inadequate information can lead to incorrect material specification, design errors and damage to the building fabric. If certain elements are not detailed in the construction drawings, decisions have to be made on-site, leading to faulty construction which in turn affects building performance [13,27].

4.2. Procurement

During procurement, the emphasis is often placed on cost rather than skills or quality. This results in the engagement of contractors who may lack knowledge of energy efficiency and related skills, which, in turn, leads to inadequate installation of services and building fabric [29].
Change orders also tend to occur at this stage, whether for cost reduction [26], site constraints, delivery delays, time savings or a lack of knowledge [29]. The consequences of substitutions can be lower quality equipment and materials or a complete change to the design intent that affects energy efficiency [13,29]. Building owners, who often have inadequate knowledge of energy and construction, tend to endorse such changes.

4.3. Construction

Defects during construction are a common occurrence and important contributors to the underperformance of buildings. The number of defects per dwelling usually averages between 2.29 and 28.3, and rectification can cost between 3.23% and 23% of the project budget [27]. The most common building defects are incorrect installation, accounting for 24% to 40% of defects, and missing items (also referred to as incomplete installation), accounting for 20% to 55% of defects [27]. The latter is normally detected at the post-handover stage. In terms of where defects occur, external walls, partitions, openings (doors and windows), floors and roofs are the most common locations [27]. Gaps in the building fabric and poor installation of insulation are widespread, greatly impacting thermal performance [27,29]. Other frequent issues affecting thermal performance are thermal bridging, air leakage and gaps in vapour and air barriers [9,27]. Over the longer term, sealing degradation, moisture retention and the short lifespan of certain low quality building and insulating materials cause thermal performance to decline over time [27].
Changes, errors, omissions and damage have all been identified as possible causes of building defects [27]. Changes affecting construction quality can be due to redesign, or a change in construction process, plans, specifications, operational capability of the building or contracting scope [27]. Changes to building design are usually the main cause of building defects, accounting for around 55% of all defects [27]. As discussed in Section 4.1, changes to design specifications are likely to occur when there is insufficient information in the construction drawings or site constraints are not accommodated, leading to on-site decision-making [13]. However, specification changes can also originate from clients' requests, a change in the supply chain or material availability, or cost reduction requirements [9]. None of these changes are likely to be fed back to the design team for performance re-evaluation and thus the impacts on building energy performance are unknown [26]. Common substitutions include window and door models, insulation types and thickness, walling types and ductwork materials [29].
Errors, another cause of building defects, can originate from design as well as from manufacturing and incorrect installation. Once again, errors in design are usually the main factor, causing between 30% and 60% of building defects [27]. Omissions are elements that have been forgotten during design, manufacturing or construction. Damage to an element of the building can also be a cause of defects, although it occurs less frequently.
The causes of the building defects listed above (i.e., changes, errors, omissions and damage) are prompted by poor workmanship, forgetfulness, inefficient project management, poor site management and supervision, deficient communication processes, insufficient planning and inadequate inspection processes [13,27]. A lack of knowledge and skills among small contractors and installer companies is also an issue identified in several studies, especially in relation to energy efficient materials and installation practices [13,26,27]. Some contractors tend to cut corners or carry on with business as usual rather than following design specifications [13,27]. This needs to be addressed as building regulations become more stringent and introduce new requirements such as air tightness [9]. Zou et al. [13] also identify owners as a key stakeholder affecting construction processes: they are major decision-makers, yet often have limited experience and knowledge and do not necessarily understand building energy requirements or potential energy savings.

4.4. Technology Installation and Commissioning

EPGs originate not only from building fabric defects, but also from technology that is incorrectly sized and/or installed and does not match design assumptions or specifications. It is estimated that poor installation and commissioning can affect building energy use by up to 20% [9].
All technologies reported in McElroy et al. [28] presented an EPG. Energy efficiency issues were a common problem encountered for technologies such as domestic wind turbines, condensing boilers, solar thermal hot water systems and ground source heat pumps. High parasitic loads were also found for both solar thermal hot water systems and domestic wind turbines. Another issue was the output temperature of some hot water systems, which was much lower than the recommended and safe range. Incorrect setting parameters have also been observed with regard to other technologies such as ventilation and extraction fans [9]. The Zero Carbon Hub [29] found that low carbon technologies are particularly susceptible to poor installation practices. These include heat pumps and solar systems, renewable technologies, heating systems and insulation of pipework and ductwork [29].
Factors affecting the EPG of technologies in the pre-occupancy stage can be either contextual or caused by poor installation. Contextual factors are operational conditions that are different to the modelled assumptions, such as weather or water temperature input [28]. In terms of installation, oversizing of the technology is a common issue. It has been suggested that installers select the technology size based on personal beliefs rather than actual dwelling/room size and likely demand [28]. Designers could also be responsible for installation errors through incorrect specification of technology systems or insufficient guidance [13,29]. Lack of experience and skills from installation contractors is also a common problem.
Another issue is the fact that the installation of building services is done separately from, and subsequent to, the building fabric. This uncoordinated approach by contractors often leads to damage to the building fabric or thermal bridges, compromising air tightness and insulation [9,28]. Inadequate communication between stakeholders and poor sequencing of the building processes are issues contributing to material damage and consequently to the EPG [13].

4.5. Verification

Construction defects such as incorrect installation of insulation and gaps in the building fabric are hidden once construction is complete [13]. During post-construction, equipment such as thermal imaging cameras and blower door systems are needed to uncover these defects. However, defects can be identified visually and promptly rectified during construction. Early verification processes are not only cheaper, but they can also prevent project delays and unnecessary costs due to rework [27].
Ideally, performance verification, identification of defects and malfunctions should be carried out throughout the construction process, or at least upon completion [26]. However, building verification is still uncommon, especially in residential buildings. Building performance testing is often not completed due to time and/or budget constraints [13]. When verification of the built form is carried out, testing protocols may not always be followed, and energy efficiency may not be prioritized. Additionally, there is concern over the methods used for as-built tests and interpretation of results. Methods vary greatly across practitioners, who use different definitions of system boundaries and apply different parameters to the same test [29]. Moreover, tools are not properly calibrated, correction factors are not adequately applied, and some equipment is of low quality. These issues mean that the results of verification tests are not uniformly reliable and cannot always be trusted [29].

4.6. Issues Found across All Pre-Occupancy Stages

Two further issues identified in the literature affect all pre-occupancy stages of the building life cycle: a lack of accountability for building performance and a lack of collaboration between stakeholders [13]. These two problems go hand in hand, as stakeholders' involvement in a project stops as soon as their work is complete, without further interest in the overall building delivery and performance.
The various stakeholders involved in building construction are usually responsible only for their own piece of work; the overall quality of the building tends to be no one's responsibility. Unless there is a main contractor or project manager responsible for the building delivery, changes to building design are not promptly identified [26]. A lack of site supervision and project management means that quality control processes are generally not adequate [7]. Managers are usually reliant on subcontractors and do not focus on energy performance [29]. The responsibility for building faults ends up becoming a burden for the occupants alone, who also incur the costs associated with higher than expected energy bills.
Project documentation is also kept at a team level, and not at a project level. Stakeholders do not tend to communicate or collaborate due to a lack of common interests [13]. This means that decisions made during construction, for instance, are not reported back to the design and planning teams. This creates a discrepancy between design plans and the final built form. Obstacles for collaboration and information transfer include a lack of life cycle thinking and integrated delivery methods [13].
Information integrity is also an issue that has been encountered in construction. Since there is no perceived need for collaboration between stakeholders, information is not always recorded or is insufficient, causing misunderstandings or misinterpretations between different parties [13].

5. Recommendations from the Articles to Address the As-Designed vs. As-Built Performance Gap

Most issues identified in the reviewed articles and discussed above originate from a lack of knowledge and skills by different stakeholders, lack of communication and collaboration, lack of accountability for building performance post-occupancy and insufficient protocols or standards. The most common recommendations in the reviewed articles to close the gap between as-designed and as-built tend to directly address the points above. Key recommendations are discussed below, classified under the four main themes of training, collaboration, performance accountability and standards. Figure 2 provides a summary of the main recommendations from the articles reviewed to address key EPG causes in each of the building stages.

5.1. Training

A general lack of knowledge and skills was identified across all stakeholders involved in building construction, including planners, designers, procurement staff, managers, builders, testers, inspectors and other contractors. It is suggested that education and upskilling are needed across the industry, for both new and current professionals [30]. Training should focus on energy efficiency requirements and the impact that technologies, materials, construction methods and quality have on the final building thermal performance and energy use [27,30]. Alencastro et al. [27] suggest the use of photographic tools to illustrate the most common building defects and how these can be avoided. For designers, emphasis should be placed on raising awareness about the EPG and generating an understanding of how to create design solutions that are both cost-effective and robust [9].
It has been suggested that certification schemes, or an industry recognized card scheme, should be part of the training strategy to minimize the number of contractors without relevant qualifications [28,30]. Only adequately qualified professionals should be able to conduct building energy modelling, assessments, testing and building performance verification [30]. Government should lead by example, requiring energy certified professionals for development on government land [30].

5.2. Collaboration

The communication of information across different stakeholders is essential to ensure that the final built form reflects the design and energy model [4]. For instance, comprehensive design detailing needs to consider input from builders about site constraints and other practical concerns in order to prevent changes or damage during construction. Yet poor communication is a major issue faced by most building construction projects. Communication could be improved through better protocols, communication guidelines and better management [7].
The lack of a building or facilities manager, or inadequate management, means that teams may work on the building construction in isolation, with no one to oversee the building delivery and take responsibility for its quality. Better management can enhance collaboration and reduce the EPG [4].
Often the project manager does not have an adequate understanding of energy requirements. It has been suggested that an energy or sustainability champion could work with the facilities manager to ensure the quality of the building when it comes to energy performance [4,27,30]. This individual would make sure the building meets energy compliance requirements through the design, construction and handover stages, facilitating communication and closing feedback loops [27].
Finding appropriate tools to record and share information has also been a focus of research, but solutions are still mostly at an embryonic stage and require further development. Shi et al. [4] discuss a semantic web that was developed to improve integration of building data sources, including building management systems (BMS) data. Collaboration platforms such as clouds and Building Information Modelling (BIM) have also been attempted and show potential, although they are still relatively new and untested technologies in the construction sector [13].

5.3. Performance Accountability

Under the current building policies across most jurisdictions, building energy ratings and certifications are based on energy simulations completed during the design stage. The performance of the built envelope is usually not verified at building completion and the post-occupancy energy performance of buildings is also not evaluated against predictions. Responsibilities end with each stage of the building construction and no one is held accountable for the overall results.
It is recommended that stakeholders continue their involvement past the delivery of the building and provide longer warranty periods [7]. The Soft Landings framework, implemented in the UK, provides an example of how this can be done [28]. This process aims to keep designers and contractors involved in the building performance beyond completion, also providing up to three years of maintenance services [5]. As part of this strategy, monitoring is carried out and operational building performance results are used to inform future projects [9,28]. However, Soft Landings has drawbacks: stakeholders are reluctant to participate, they do not see value in learning from building performance, and building owners may not want to pay for this service [13].
Most studies examined in this rapid review recommend that buildings are rated according to their actual energy performance post-occupancy rather than their predicted performance [9,13,26,28]. This would mean appointing a responsible party for building performance, introducing penalties for non-performance and ensuring that buildings are assessed upon completion and through occupation [9,26]. Performance guarantees would need to be agreed upon, as well as a detailed plan of how building commissioning would be carried out [26].
Gram-Hanssen et al. [26] suggest that voluntary post-occupancy energy classes could be adopted in building regulations to incentivize stakeholders to consider actual, rather than theoretical, building performance. For instance, Display Energy Certificates, adopted in the UK, rate buildings according to their actual energy use [13]. Pay-for-performance (PFP) programmes are another strategy to reward actual rather than predicted energy use [28]. PFP approaches are mostly applicable to retrofits, where subsidies are paid to project owners based on long term savings calculated from metered data or utility bills from before and after the retrofit [28]. PFP programmes have mostly been adopted in the commercial and public sectors given the cost of obtaining long term data from energy monitoring systems; however, with smart meters becoming increasingly accessible, there is potential for adoption in the residential sector [28]. An alternative approach is the taxation of excess energy used over the regulatory limit set out for specific buildings [9]. Outcome codes are another approach in early implementation in a few countries such as China and Sweden, where energy budgets are established for different buildings in different climate zones [7]. Penalties and different market mechanisms are currently being tested in conjunction with these novel approaches.
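To make the pay-for-performance principle concrete, the subsidy can be thought of as proportional to verified savings. The formulation below is illustrative only, as actual programmes define baselines, normalisation and payment rates differently:
subsidy = r × (E_baseline − E_metered), paid only if E_metered < E_baseline
where E_baseline is the (weather-normalised) energy use before the retrofit, taken from utility bills or metered data; E_metered is the measured energy use after the retrofit over the same period; and r is the agreed payment rate per unit of verified savings.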
It has been suggested that post-occupancy energy performance data should be made accessible not only to ensure the transparency of the rating process, but also to provide feedback to design teams and gather further evidence on the EPG [9,30]. For instance, benchmarking and transparency policies have been implemented in the USA, where metered energy performance in buildings need to be publicly disclosed on an annual basis [7].

5.4. Standards

To ensure building compliance, construction quality and proper technology installation, a number of articles suggest the development of new guidelines and standards. These include guidelines for common construction processes, equipment maintenance, product installation and commissioning processes [28]. It is recommended that manufacturers improve and/or develop specifications and installer standards for their products as well as a guide on how to measure the performance of products post-installation [30]. The Zero Carbon Hub [30] suggests the development of an industry owned Construction Detail Scheme, available as a public database, where best practices would be described for certain challenging installations. These would list major fabric junctions for mainstream construction types, including masonry, steel, timber and concrete frames.
Standards for residential building monitoring and verification are also required. Building verification is not frequently performed and, when it is, it is not consistent, as different contractors employ different equipment, methods and parameters. As part of ensuring building quality and compliance, testing should be made mandatory during the construction process, and specific standards should be followed so that results are reliable [9]. Standards for verification exist, such as the International Performance Measurement and Verification Protocol, but these are targeted at larger projects and do not apply to small buildings [28]. A verification standard for buildings should include specific checkpoints during the construction process at which building performance should be verified [30]. Verification and commissioning should be carried out by an independent subcontractor [7,30].

6. Areas Identified for Future Research from the Articles Reviewed

All articles reviewed identified gaps where further research is required. These include the areas of training, stakeholder collaboration, the use of post-occupancy data for building energy verification, standards and life-cycle thinking. This section will discuss the further research identified in the articles reviewed.
In terms of training, further research is needed on effective strategies to increase energy performance awareness amongst building professionals as well as building owners [27]. It is also important to investigate ways to shift the box-ticking culture in which only minimum compliance is targeted [29].
In terms of collaboration, methods to achieve better communication amongst stakeholders, effective tools to promote collaboration and the effects that it will have on the EPG require investigation [4]. Similarly, approaches to achieving information integrity should also be researched. This should include the development of an information collection and transmission system as well as a framework to ensure that stakeholders have a common understanding of the information being communicated [13]. Insight into current stakeholder decision-making criteria, interaction [13] and identification of communication breakdowns [29] would help shape this communication system.
Further research is also needed on the implementation of post-occupancy evaluation mechanisms for buildings. Several differing examples have been implemented worldwide, but they are still being tested and conclusions about their effectiveness cannot yet be drawn [7,26]. It is also likely that greater transparency of building energy data will cause a shift in the EPG, but results are yet to be observed [7]. The definition of responsibilities for energy performance also needs to be explored [29].
In relation to standards, it is necessary to investigate how current building and installation standards are addressing building performance and how these can be extrapolated to also tackle the EPG [28]. The potential for reducing the EPG through better testing standards and methodologies also needs further investigation [29].
More generally, the lack of life-cycle thinking in the building sector was identified by Zou et al. [13] as an issue, since each stage is seen as independent and connections between them are not well understood and not usually a theme of research. However, it has been shown that issues from one stage affect decision-making in the next. It is recommended that future studies focus more on those connections and flow-on effects for the causes of EPGs.

7. Conclusions

The contribution of this article is two-fold. Firstly, it adopts the rapid review methodology, which, despite being regularly employed in the health sector, is a novel approach in buildings research. This methodology allows for a thorough review of the literature in a fraction of the time required to conduct the more traditional systematic literature review. The research question and research scope were partly informed by industry and government, ensuring their relevance, while also investigating issues raised in the literature as requiring further inquiry. The relevance of the research topic, coupled with a quick but rigorous review process, means that recommendations can be adopted in a timely manner by industry as well as policy makers, with the potential to make a significant impact.
Most importantly, this article contributes to the buildings' EPG literature, specifically in relation to the construction and commissioning stages of the building life cycle. The causes for discrepancies between buildings as-designed (i.e., post-design) and as-built (i.e., post-commissioning but prior to occupation), leading to an EPG, have been researched less extensively than EPGs originating from the building design (including modelling) and occupancy phases [13]. Nevertheless, construction and commissioning equally impact buildings' energy efficiency and need to be understood and addressed [9,13].
Discrepancies between buildings as-designed and as-built are a problem occurring worldwide, and root causes are found across all stages of the building life cycle. Whilst most causes originate during construction and commissioning, as discussed in this article, flow-on effects also result from inadequate planning and design detailing. Overall, this review suggests that most causes for discrepancies between buildings as-designed and as-built during the construction and commissioning stages relate to a general lack of knowledge and skills, insufficient communication and collaboration between stakeholders, and a lack of accountability for building performance post-occupancy. Unless these underlying issues are addressed, more stringent energy efficiency regulations alone are therefore unlikely to lead to energy reductions in the building sector [25].
The most common recommendations to close the gap between as-designed and as-built during the construction and commissioning stage identified in the articles reviewed tend to directly address the points above as well as improved standards. Key recommendations include:
  • Training and upskilling all new and current industry professionals
  • Adopting appropriate tools to record, maintain and share information between stakeholders, also closing feedback loops
  • Improving management processes, including the appointment of a sustainability champion
  • Extending stakeholders’ involvement past the delivery of the building
  • Evaluating buildings’ energy efficiency based on post-occupancy performance rather than theoretical performance. This means appointing a responsible party for building performance, introducing penalties for non-performance, and ensuring that buildings are assessed upon completion and through occupation
  • Making energy performance data accessible to promote transparency, to provide feedback to design teams and gather further evidence on the EPG
  • Mandating testing and verification during the construction process
  • Developing new guidelines and standards for common construction processes, equipment maintenance, product installation and building monitoring and verification
Future areas for research could include training for contractors and designers that highlights the codes regulating the building industry and how they can fail to be met in the final built product for a variety of reasons, some of which have been outlined in this review. Such training could also highlight how post-occupancy factors (including technology management and operation, occupant actions and weather variations) influence building performance.

Author Contributions

Conceptualization, J.K.B. and C.E.; Methodology, J.K.B. and C.E.; Validation, G.M.M. and J.B.; Formal Analysis, C.E. and J.K.B.; Investigation, C.E. and J.K.B.; Resources, C.E. and J.K.B.; Data Curation, C.E. and J.K.B.; Writing—Original Draft Preparation, C.E.; Writing—Review and Editing, J.K.B., J.B., G.M.M.; Visualization, C.E.; Supervision, J.B., G.M.M.; Project Administration, G.M.M.; Funding Acquisition, G.M.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the CRC for Low Carbon Living Ltd. (grant number NR2002), supported by the Cooperative Research Centres program, an Australian Government initiative.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Search String and Filtering Criteria Applied to the Academic Databases, for Search 1 and Search 2

Search/Database [Number of Results] | Search String | Filters
Search 1/Scopus
[109 results]
TITLE-ABS-KEY((“energy performance gap” OR “energy gap” OR “performance gap”) AND (building* OR hous* OR home) AND (“low carbon” OR “low-carbon” OR “energy efficien*” OR green OR “sustainab*” OR “net zero energy” OR “zero energy” OR “high efficien*” OR “passive”) AND (“construction” OR “commission*” OR “pre-occupancy” OR “life cycle” OR “life-cycle”) AND (“systematic review” OR “systematic literature review” OR review OR “meta analysis” OR “meta-analysis”)) AND (LIMIT-TO (PUBYEAR, 2020) OR LIMIT-TO (PUBYEAR, 2019) OR LIMIT-TO (PUBYEAR, 2018) OR LIMIT-TO (PUBYEAR, 2017) OR LIMIT-TO (PUBYEAR, 2016) OR LIMIT-TO (PUBYEAR, 2015) OR LIMIT-TO (PUBYEAR, 2014) OR LIMIT-TO (PUBYEAR, 2013) OR LIMIT-TO (PUBYEAR, 2012) OR LIMIT-TO (PUBYEAR, 2011) OR LIMIT-TO (PUBYEAR, 2010)) AND (LIMIT-TO (DOCTYPE, “re”)) AND (EXCLUDE (SUBJAREA, “CHEM”) OR EXCLUDE (SUBJAREA, “MATE”) OR EXCLUDE (SUBJAREA, “PHYS”) OR EXCLUDE (SUBJAREA, “CENG”) OR EXCLUDE (SUBJAREA, “BIOC”) OR EXCLUDE (SUBJAREA, “MEDI”) OR EXCLUDE (SUBJAREA, “PHAR”) OR EXCLUDE (SUBJAREA, “MATH”) OR EXCLUDE (SUBJAREA, “AGRI”) OR EXCLUDE (SUBJAREA, “NURS”)) AND (LIMIT-TO (LANGUAGE, “English”)
Search 1/Web of Science
[7 results]
TS=((“energy performance gap” OR “energy gap” OR “performance gap”) AND (building* OR hous* OR home) AND (“low carbon” OR “low-carbon” OR “energy efficien*” OR green OR “sustainab*” OR “net zero energy” OR “zero energy” OR “high efficien*” OR “passive”) AND (“construction” OR “commission*” OR “pre-occupancy” OR “life cycle” OR “life-cycle”) AND (“systematic review” OR “systematic literature review” OR review OR “meta analysis” OR “meta-analysis”))
Filters: Timespan: All years. Indexes: SCI-EXPANDED, SSCI, A&HCI, CPCI-S, CPCI-SSH, BKCI-S, BKCI-SSH, ESCI, CCR-EXPANDED, IC.
Search 1/ProQuest
[11 results]
noft(((“energy performance gap” OR “energy gap” OR “performance gap”) AND (building* OR hous* OR home) AND (“low carbon” OR “low-carbon” OR “energy efficien*” OR green OR “sustainab*” OR “net zero energy” OR “zero energy” OR “high efficien*” OR “passive”) AND (“construction” OR “commission*” OR “pre-occupancy” OR “life cycle” OR “life-cycle”) AND (“systematic review” OR “systematic literature review” OR review OR “meta analysis” OR “meta-analysis”)))
Filters: Last 10 Years; Scholarly Journals
Search 2/Scopus
[6 results]
TITLE-ABS-KEY((“energy performance gap” OR “energy gap” OR “performance gap”) AND (“cause*” OR “verification” OR “compliance” OR “assess*” OR “solution*” OR polic* OR “clos* the gap”) AND (“systematic review” OR “systematic literature review” OR review OR “meta analysis” OR “meta-analysis”)) AND (LIMIT-TO (PUBYEAR, 2020) OR LIMIT-TO (PUBYEAR, 2019) OR LIMIT-TO (PUBYEAR, 2018) OR LIMIT-TO (PUBYEAR, 2017) OR LIMIT-TO (PUBYEAR, 2016) OR LIMIT-TO (PUBYEAR, 2015) OR LIMIT-TO (PUBYEAR, 2014) OR LIMIT-TO (PUBYEAR, 2013) OR LIMIT-TO (PUBYEAR, 2012) OR LIMIT-TO (PUBYEAR, 2011) OR LIMIT-TO (PUBYEAR, 2010)) AND (EXCLUDE (SUBJAREA, “MATE”) OR EXCLUDE (SUBJAREA, “PHYS”) OR EXCLUDE (SUBJAREA, “CHEM”) OR EXCLUDE (SUBJAREA, “MEDI”) OR EXCLUDE (SUBJAREA, “CENG”) OR EXCLUDE (SUBJAREA, “BIOC”) OR EXCLUDE (SUBJAREA, “MATH”) OR EXCLUDE (SUBJAREA, “NURS”) OR EXCLUDE (SUBJAREA, “PHAR”) OR EXCLUDE (SUBJAREA, “AGRI”) OR EXCLUDE (SUBJAREA, “PSYC”) OR EXCLUDE (SUBJAREA, “IMMU”)) AND (LIMIT-TO (DOCTYPE, “re”)) AND (LIMIT-TO (LANGUAGE, “English”)) AND (EXCLUDE (EXACTKEYWORD, “Photocatalysis) OR EXCLUDE (EXACTKEYWORD, “Solar Cells”) OR EXCLUDE (EXACTKEYWORD, “Titanium Dioxide”) OR EXCLUDE (EXACTKEYWORD, “Light Absorption”) OR EXCLUDE (EXACTKEYWORD, “Photocatalysts”) OR EXCLUDE (EXACTKEYWORD, “Wide Band Gap Semiconductors”) OR EXCLUDE (EXACTKEYWORD, “Catalysis”) OR EXCLUDE (EXACTKEYWORD, “Electrode”) OR EXCLUDE (EXACTKEYWORD, “Fuel Cell”) OR EXCLUDE (EXACTKEYWORD, “Dye-sensitized Solar Cells”) OR EXCLUDE (EXACTKEYWORD, “Electric Drives”) OR EXCLUDE (EXACTKEYWORD, “Gallium Nitride”) OR EXCLUDE (EXACTKEYWORD, “Hydrogen Production”) OR EXCLUDE (EXACTKEYWORD, “III-V Semiconductors”) OR EXCLUDE (EXACTKEYWORD, “Light”) OR EXCLUDE (EXACTKEYWORD, “Power Converters”) OR EXCLUDE (EXACTKEYWORD, “Silicon Carbide”) OR EXCLUDE (EXACTKEYWORD, “Absorption”) OR EXCLUDE (EXACTKEYWORD, “Absorption Spectroscopy”) OR EXCLUDE (EXACTKEYWORD, “Cadmium”) OR EXCLUDE (EXACTKEYWORD, “Cadmium Sulfide”) OR EXCLUDE (EXACTKEYWORD, “Catalyst”) OR EXCLUDE (EXACTKEYWORD, “Copper”) OR EXCLUDE (EXACTKEYWORD, “Electrodes”) OR EXCLUDE (EXACTKEYWORD, “Electrolyte”) OR EXCLUDE (EXACTKEYWORD, “Heterojunctions”) OR EXCLUDE (EXACTKEYWORD, “Hydrogen”) OR EXCLUDE (EXACTKEYWORD, “Hydrogen Production Rate”) OR EXCLUDE (EXACTKEYWORD, “Indium”) OR EXCLUDE (EXACTKEYWORD, “Integrated Motor Drives”) OR EXCLUDE (EXACTKEYWORD, “Nanocrystals”) OR EXCLUDE (EXACTKEYWORD, “Nanostructures”) OR EXCLUDE (EXACTKEYWORD, “Organic Pollutants”) OR EXCLUDE (EXACTKEYWORD, “Parasitic Inductances”) OR EXCLUDE (EXACTKEYWORD, “Photocatalyst”) OR EXCLUDE (EXACTKEYWORD, “Photochemistry”) OR EXCLUDE (EXACTKEYWORD, “Semiconductor Doping”) OR EXCLUDE (EXACTKEYWORD, “Semiconductor Quantum Dots”) OR EXCLUDE (EXACTKEYWORD, “Solar Radiation”) OR EXCLUDE (EXACTKEYWORD, “Solar Spectrum”) OR EXCLUDE (EXACTKEYWORD, “Transparency”) OR EXCLUDE (EXACTKEYWORD, “Water Absorption”) OR EXCLUDE (EXACTKEYWORD, “Water Pollution”) OR EXCLUDE (EXACTKEYWORD, “Wide Band Gap”) OR EXCLUDE (EXACTKEYWORD, “Zinc Oxide”) OR EXCLUDE (EXACTKEYWORD, “Absorber Layers”) OR EXCLUDE (EXACTKEYWORD, “Absorption Co-efficient”) OR EXCLUDE (EXACTKEYWORD, “Absorption Coefficient”) OR EXCLUDE (EXACTKEYWORD, “Absorption Spectrum”) OR EXCLUDE (EXACTKEYWORD, “Alloy”) OR EXCLUDE (EXACTKEYWORD, “Analogous Structures”) OR EXCLUDE (EXACTKEYWORD, “Automotive Applications”) OR EXCLUDE (EXACTKEYWORD, “Automotive Industry”) OR EXCLUDE (EXACTKEYWORD, “Azo Dyes”) OR EXCLUDE (EXACTKEYWORD, “Back Surface Fields”) OR EXCLUDE (EXACTKEYWORD, “Band Gap”) 
OR EXCLUDE (EXACTKEYWORD, “Band Gap Energy”) OR EXCLUDE (EXACTKEYWORD, “Band Notch”) OR EXCLUDE (EXACTKEYWORD, “Band Structure Engineering”) OR EXCLUDE (EXACTKEYWORD, “Band-notch Characteristics”) OR EXCLUDE (EXACTKEYWORD, “Binding Energy”) OR EXCLUDE (EXACTKEYWORD, “Biological Materials”) OR EXCLUDE (EXACTKEYWORD, “Bipolar Semiconductor Devices”) OR EXCLUDE (EXACTKEYWORD, “Black TiO2”) OR EXCLUDE (EXACTKEYWORD, “Cadmium Compounds”) OR EXCLUDE (EXACTKEYWORD, “Cadmium Telluride”) OR EXCLUDE (EXACTKEYWORD, “Capacitors”) OR EXCLUDE (EXACTKEYWORD, “Carbon Nitride”) OR EXCLUDE (EXACTKEYWORD, “Carrier Concentration”) OR EXCLUDE (EXACTKEYWORD, “Carrier Diffusion Length”) OR EXCLUDE (EXACTKEYWORD, “Carrier Selection”) OR EXCLUDE (EXACTKEYWORD, “Catalyst Activity”) OR EXCLUDE (EXACTKEYWORD, “Chalcopyrite”) OR EXCLUDE (EXACTKEYWORD, “Charge Carriers”) OR EXCLUDE (EXACTKEYWORD, “Charge Collection Efficiency”) OR EXCLUDE (EXACTKEYWORD, “Chemical Compound”) OR EXCLUDE (EXACTKEYWORD, “Chromium Compounds”) OR EXCLUDE (EXACTKEYWORD, “Circuit Oscillations”) OR EXCLUDE (EXACTKEYWORD, “Co-doping”) OR EXCLUDE (EXACTKEYWORD, “Conductivity Modulation”) OR EXCLUDE (EXACTKEYWORD, “Conjugated Polymers”) OR EXCLUDE (EXACTKEYWORD, “Conjugated Structures”) OR EXCLUDE (EXACTKEYWORD, “Connectors”) OR EXCLUDE (EXACTKEYWORD, “Connectors (structural)”) OR EXCLUDE (EXACTKEYWORD, “Contamination”) OR EXCLUDE (EXACTKEYWORD, “Conventional Capacitors”) OR EXCLUDE (EXACTKEYWORD, “Conversion Efficiency”) OR EXCLUDE (EXACTKEYWORD, “Copper Vanadate”) OR EXCLUDE (EXACTKEYWORD, “Crystalline Silicons”) OR EXCLUDE (EXACTKEYWORD, “Dissolved Organic Matter”) OR EXCLUDE (EXACTKEYWORD, “Dissolved Organic Matters”) OR EXCLUDE (EXACTKEYWORD, “Dissolved Oxygen”) OR EXCLUDE (EXACTKEYWORD, “EBG”) OR EXCLUDE (EXACTKEYWORD, “EV”)
Search 2/Web of Science
[9 results]
TS=((“energy performance gap” OR “energy gap” OR “performance gap”) AND (“cause*” OR “verification” OR “compliance” OR “assess*” OR “solution*” OR polic* OR “clos* the gap”) AND (“systematic review” OR “systematic literature review” OR review OR “meta analysis” OR “meta-analysis”))
Refined by: PUBLICATION YEARS: (2019 OR 2012 OR 2018 OR 2011 OR 2017 OR 2010 OR 2016 OR 2015 OR 2014 OR 2013) AND DOCUMENT TYPES: (REVIEW) AND [excluding] WEB OF SCIENCE CATEGORIES: (CHEMISTRY MULTIDISCIPLINARY OR HOSPITALITY LEISURE SPORT TOURISM OR CHEMISTRY PHYSICAL OR CHEMISTRY INORGANIC NUCLEAR OR OPTICS OR CHEMISTRY ORGANIC OR CRYSTALLOGRAPHY OR POLYMER SCIENCE OR SPECTROSCOPY OR UROLOGY NEPHROLOGY)
Search 2/ProQuest
[26 results]
noft(((“energy performance gap” OR “energy gap” OR “performance gap”) AND (“cause*” OR “verification” OR “compliance” OR “assess*” OR “solution*” OR polic* OR “clos* the gap”) AND (“systematic review” OR “systematic literature review” OR review OR “meta analysis” OR “meta-analysis”)))
Scholarly Journals
Last 10 Years
NOT (mathematical analysis AND condensed matter AND optical properties AND solar cells AND organic chemistry AND photovoltaic cells AND thin films AND x-ray diffraction AND electronic structure AND graphene AND insulators AND spin-orbit interactions AND density functional theory AND superconductivity AND adsorption AND spectrum analysis AND electrons AND magnetic fields AND optoelectronics AND phase transitions AND absorption AND conduction bands AND electronics AND excitons AND fermions AND magnetism AND markets AND perovskites AND phases AND photocatalysis AND photoelectric emission AND refractivity AND semiconductors AND titanium dioxide AND valence band AND band gap AND ferromagnetism AND ground state AND heterostructures AND molybdenum disulfide AND monolayers AND morphology AND photoluminescence AND photons AND quantum wells AND substrates AND superconductors AND transition metals AND zinc oxide AND aluminum AND annealing AND benzene AND brillouin zones AND carrier density AND chemical bonds AND corrosion AND corrosion effects AND corrosion inhibitors AND dielectric properties AND doping AND emitters AND ferroelectric materials AND fourier transforms AND holes (electron deficiencies) AND impurities AND inhibition AND nanocrystals AND nanoparticles AND nickel AND phosphorene AND photoelectric effect AND endoscopy AND flux density AND gender differences AND human rights AND intubation AND lean manufacturing AND medical screening AND neural networks AND obesity AND precipitation AND rechargeable batteries AND acoustic waves AND acuity AND adaptive control AND advantaged AND age groups AND age related diseases AND students AND ability tests AND academic achievement AND academic degrees AND achievement tests AND air quality AND ambition AND amplitudes AND analogies AND anelasticity AND apl (programming language) AND aviation AND backscattering AND charter of rights-canada AND compression tests AND condensates AND cooking AND curricula)
Article OR Literature Review OR Review
English
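All three search strings combine the same three concept blocks (performance gap terms, cause/solution terms and review-type terms) with AND, adapted to each database’s field syntax. The short Python sketch below is purely illustrative and was not part of the review protocol; it shows how a Scopus-style TITLE-ABS-KEY string of this form can be assembled from the three term lists (the helper and variable names are ours).

# Illustrative only: assemble a Scopus-style TITLE-ABS-KEY query from the
# three concept blocks used in the searches above. Term lists are copied
# from the search strategy; the helper names are our own.

gap_terms = ['"energy performance gap"', '"energy gap"', '"performance gap"']
cause_solution_terms = ['"cause*"', '"verification"', '"compliance"', '"assess*"',
                        '"solution*"', 'polic*', '"clos* the gap"']
review_terms = ['"systematic review"', '"systematic literature review"', 'review',
                '"meta analysis"', '"meta-analysis"']

def or_block(terms):
    """Join a list of terms into a parenthesised OR block."""
    return "(" + " OR ".join(terms) + ")"

query = "TITLE-ABS-KEY(" + " AND ".join(
    or_block(block) for block in (gap_terms, cause_solution_terms, review_terms)
) + ")"

print(query)
# TITLE-ABS-KEY(("energy performance gap" OR ...) AND ("cause*" OR ...) AND ("systematic review" OR ...))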

Appendix B. Excluded Studies at the Full-Text Eligibility Stage

E.H. Borgstein, R. Lamberts, J.L.M. Hensen, Evaluating energy performance in non-domestic buildings: A review, Energy and Buildings 128 (2016) 734–755. Reason for exclusion: the main focus of the article is on methods to evaluate the performance of buildings; the EPG is mentioned but only briefly.
P. De Wilde, The gap between predicted and measured energy performance of buildings: A framework for investigation, Automation in Construction 41 (2014) 40–49. Reason for exclusion: the article discussion, conclusion and recommendations are based on a pilot study.
P.G. Tuohy, G.B. Murphy, Closing the gap in building performance: learning from BIM benchmark industries, Architectural Science Review 58(1) (2015) 47–56. Reason for exclusion: the article is not a literature review.

Appendix C. AMSTAR2 Checklist for Article Quality Assessment

Question (Recommendations) | Decision Rules and Comments
Q1. Are the research questions and inclusion criteria for the review clearly delineated?
1 = “Yes” = Who (Population/Subject), What (Intervention, Comparator group, Outcome), Where and When described.
0.5 = “Can’t answer/not sure/partially” = Cannot decide between “yes” and “no” based on the information provided in the paper.
0 = “No” = research question and inclusion criteria not outlined in detail.
Q2. Did the report of the review contain an explicit statement that the review methods were established prior to the conduct of the review and did the report justify any significant deviations from the protocol?
1 = “Yes” = The authors state that they had a written protocol or guide that included ALL of the following: review question(s), a search strategy, inclusion/exclusion criteria, a risk of bias assessment.
0.5 = “Can’t answer/not sure/partially” = The authors state that they had a written protocol or guide that included SOME, but not all, of the following: review question(s), a search strategy, inclusion/exclusion criteria, a risk of bias assessment.
0 = “No” = no mention of a priori design of the systematic review, as listed above.
Q3. Did the review authors explain their selection of the study designs for inclusion in the review?
1 = “Yes” = explicit justification of the study designs/types included in the review.
0.5 = “Can’t answer/not sure/partially” = Cannot decide between “yes” and “no” based on the information provided in the paper.
0 = “No” = no explanation of the selection of the study designs/types included in the review.
Q4. Did the review authors use a comprehensive literature search strategy?
1 = “Yes” = searched at least 2 databases (relevant to the research question), provided key words and/or search strategy, justified publication restrictions (e.g., language), AND searched the reference lists/bibliographies of included studies, searched trial/study registries, included/consulted content experts in the field where relevant, searched for grey literature, and conducted the search within 24 months of completion of the review.
0.5 = “Can’t answer/not sure/partially” = searched at least 2 databases (relevant to the research question), provided key words and/or a general search strategy, justified publication restrictions (e.g., language).
0 = “No” = no information on search strategy, or not fulfilling the criteria for “Yes” or “Partially”.
Q5. Did the review authors perform study selection in duplicate?
1 = “Yes” = either ONE of the following: at least two reviewers independently agreed on the selection of eligible studies and achieved consensus on which studies to include, OR two reviewers selected a sample of eligible studies and achieved good agreement (at least 80%), with the remainder selected by one reviewer.
0.5 = “Can’t answer/not sure/partially” = Cannot decide between “yes” and “no” based on the information provided in the paper.
0 = “No” = only one reviewer involved in study selection, or no description of how many reviewers participated in study selection.
Q6. Did the review authors perform data extraction in duplicate?
1 = “Yes” = either ONE of the following: at least two reviewers achieved consensus on which data to extract from included studies, OR two reviewers extracted data from a sample of eligible studies and achieved good agreement (at least 80%), with the remainder extracted by one reviewer.
0.5 = “Can’t answer/not sure/partially” = Cannot decide between “yes” and “no” based on the information provided in the paper.
0 = “No” = only one reviewer involved in data extraction, or no description of how many reviewers participated in data extraction.
Q7. Did the review authors provide a list of excluded studies and justify the exclusions?
1 = “Yes” = provided a list of all potentially relevant studies that were read in full-text form but excluded from the review, AND justified the exclusion of each potentially relevant study from the review.
0.5 = “Can’t answer/not sure/partially” = provided a list of all potentially relevant studies that were read in full-text form but excluded from the review, without justifying the exclusion of each of these studies.
0 = “No” = no list of studies excluded at the full-text stage.
Q8. Did the review authors describe the included studies in adequate detail?
1 = “Yes” = ALL of the following: Who (Population), What (Intervention, Comparator group, Outcome), Where and When described in detail.
0.5 = “Can’t answer/not sure/partially” = Who (Population), What (Intervention, Comparator group, Outcome), Where and When briefly described, or only some of these described in detail. Cannot decide between “yes” and “no” based on the information provided in the paper.
0 = “No” = no, or only partial, description of the included studies.
Q9. Did the review authors use a satisfactory technique for assessing the risk of bias (RoB) in individual studies that were included in the review?
1 = “Yes” = specifically mentions RoB assessment of individual included studies.
0.5 = “Can’t answer/not sure/partially” = Cannot decide between “yes” and “no” based on the information provided in the paper. RoB mentioned but not sufficiently assessed (e.g., if multiple sources of bias are potentially present, but not all are assessed).
0 = “No” = no mention of RoB assessment of individual included studies.
[RoB sources: confounding, selection bias, exposure bias, selective reporting of outcomes, selection of the reported result from among multiple measurements or analyses of a specified outcome.]
Q10. Did the review authors report on the sources of funding for the studies included in the review?
1 = “Yes” = must have reported on the sources of funding for individual studies included in the review. Note: stating that the reviewers looked for this information but it was not reported by study authors also qualifies.
0.5 = “Can’t answer/not sure/partially” = sources of funding mentioned for individual studies included in the review, or reported only for some of the included studies. Cannot decide between “yes” and “no” based on the information provided in the paper.
0 = “No” = no report of the sources of funding for individual studies included in the review.
Q11. If meta-analysis was performed, did the review authors use appropriate methods for statistical combination of results?
1 = “Yes” = the authors justified combining the data in a meta-analysis, AND they used an appropriate technique to combine study results and adjusted for heterogeneity if present, AND investigated the causes of any heterogeneity or adjusted for heterogeneity or confounding if present.
0.5 = “Can’t answer/not sure/partially” = requirements for “Yes” only partially fulfilled. Cannot decide between “yes” and “no” based on the information provided in the paper.
0 = “No” = no justification of the meta-analysis, or inappropriate statistical methods were used for quantitatively combining and analysing the data, or heterogeneity was not assessed.
N/A = “Not Applicable” = no meta-analysis conducted.
Q12. If meta-analysis was performed, did the review authors assess the potential impact of RoB in individual studies on the results of the meta-analysis or other evidence synthesis?
1 = “Yes” = included only low risk of bias studies, OR the authors performed analyses to investigate the possible impact of RoB on summary estimates of effect.
0.5 = “Can’t answer/not sure/partially” = Cannot decide between “yes” and “no” based on the information provided in the paper.
0 = “No” = no assessment of the potential impact of RoB.
N/A = “Not Applicable” = no meta-analysis conducted.
Q13. Did the review authors account for RoB in individual studies when interpreting/discussing the results of the review?
1 = “Yes” = included only low risk of bias studies, OR the review provided a discussion of the likely impact of RoB on the results.
0.5 = “Can’t answer/not sure/partially” = Cannot decide between “yes” and “no” based on the information provided in the paper.
0 = “No” = no discussion of the potential impact of RoB in individual studies.
Q14. Did the review authors provide a satisfactory explanation for, and discussion of, any heterogeneity observed in the results of the review?
1 = “Yes” = there was no significant heterogeneity in the results, OR, if heterogeneity was present, the authors performed an investigation of its sources and discussed the impact of this on the results of the review.
0.5 = “Can’t answer/not sure/partially” = Cannot decide between “yes” and “no” based on the information provided in the paper.
0 = “No” = no explanation or discussion of heterogeneity present in the results.
Q15. If they performed quantitative synthesis, did the review authors carry out an adequate investigation of publication bias (small study bias) and discuss its likely impact on the results of the review?
1 = “Yes” = the authors performed graphical or statistical tests for publication bias and discussed the likelihood and magnitude of impact of publication bias.
0.5 = “Can’t answer/not sure/partially” = Cannot decide between “yes” and “no” based on the information provided in the paper.
0 = “No” = the authors did not perform any tests for publication bias and did not discuss the potential impact of publication bias.
N/A = “Not Applicable” = no meta-analysis conducted.
Q16. Did the review authors report any potential sources of conflict of interest, including any funding they received for conducting the review?
1 = “Yes” = the authors reported no competing interests, OR the authors described their funding sources and how they managed potential conflicts of interest.
0.5 = “Can’t answer/not sure/partially” = Cannot decide between “yes” and “no” based on the information provided in the paper.
0 = “No” = the authors did not provide a statement on competing interests and funding sources, and how they managed potential conflicts of interest.

Appendix D. Summary of Key Findings from Each of the Reviewed Articles

Alencastro (2018) [27]
Key findings: Houses have on average 2.29 to 28.3 defects, most relating to thermal performance, such as poor installation, gaps in the building fabric and thermal bridging through structural elements. Other general faults include incorrect installation and missing items in external walls, partitions, doors and windows, and floors and roofs. Most of these are the result of damage occurring during installation, changes in or omission of materials and inefficient management during construction. These defects can result in an increase of up to 52% in total project costs.
Key recommendations: Construction companies should provide appropriate training to increase awareness of the impact of the quality of work on building thermal performance, and should utilise photographic tools to show how these defects commonly happen and how to avoid them. An energy champion should be appointed to monitor project progress and ensure ongoing compliance with relevant energy performance targets during the design, construction, handover and close-out stages. Energy performance awareness amongst clients, project teams and the workforce is needed to drive these changes.
Gram-Hanssen (2018) [26]
Key findings: Causes of the EPG originating from the building construction stage include:
- design changes due to contractors’ incorrect installation or due to the design being too complex for contractors to implement;
- lack of knowledge and skills in regard to energy efficient materials, leading to business-as-usual; and
- changes during the tendering process favouring cost reduction.
Quality control can be difficult, as the costs and benefits accrue to different actors. There is usually no single person responsible for the overall quality of the entire building to make sure it performs as specified. When there is a main contractor or system integrator amongst the contractors, it is more likely that changes will be discovered and reported back to the designers. Commissioning could be a way to correct these problems. It involves verifying performance measurements and checking for malfunctioning technologies and solutions across all phases, from design and construction to operation. However, commissioning of residential buildings is uncommon.
Key recommendations: It is recommended that emphasis is placed on post-occupancy evaluations rather than pre-construction evaluations. Project owners would have to agree on performance guarantees, including mandatory plans for how commissioning would be done, particularly in instances where the energy-consumption goals are not reached. Individuals should be appointed responsible for an integrated approach to ensure a systematic assessment of the building at the time of delivery as well as at later stages of use.
IPEEC (2019) [7]
Key findings: Usually the EPG of non-residential buildings is more significant than the EPG of residential buildings.
Key recommendations: There needs to be better management of the quality control process throughout design, construction and operation to ensure the design intent is met. Greater communication standards need to be put in place between stakeholders to ensure comprehensive design detailing is performed early, so changes can be made then. Ongoing feedback to the design team post-occupancy would also help inform the design of future buildings. Better training and education on design for sustainability is also required. Target policy areas to close the EPG are greater transparency of operational building energy performance and regulation of building operational performance, along with penalties for non-compliance.
McElroy (2019) [28]
Key findings: All energy efficient technologies reviewed in this study presented performance gaps. Some of the causes were contextual, as some of the operating conditions were different than expected. Others were due to low quality installation, as some of the technologies were oversized and unrelated to the house size.
Key recommendations: The article calls for further field trials of specific technologies, new and current. Policy recommendations include:
- defining key parameters to be analysed in evaluations;
- setting up quality standards for carrying out monitoring of low-carbon technologies once installed;
- defining key aspects to be covered by post-installation audits; and
- setting appropriate methods for evaluation, monitoring and verification; it is suggested that a detailed global standard for monitoring and verification is implemented in the residential sector.
Pay-for-performance programmes are also suggested, as they reward real savings achieved over time rather than theoretical savings. Other recommendations include ensuring that manufacturers develop installer standards for their products, providing accreditation to installers based on training, and reviewing current installation and training guidelines.
Shi (2019) [4]
Key findings: Buildings’ EPG is identified and interpreted in significantly different ways, leading to large variations. There is no correlation between the magnitude of the EPG and specific building parameters. Causes of the EPG between design and as-built include inappropriate design, malpractice, construction uncertainties and physical changes to the building between design and construction.
Key recommendations: Solutions proposed are managerial, technical and hybrid (a mix of the two). These include:
- increased communication and collaboration between all stakeholders, and in particular between design and construction teams;
- managing the building process more effectively to ensure the building is constructed as modelled, with attention to detail; and
- appointing a sustainability champion to monitor and provide direction, as well as developing guides for efficient equipment use, maintenance and commissioning.
Van Dronkelaar (2016) [9]
Key findings: It is estimated that poor commissioning can cause a gap of up to 20%. It is not well understood how much construction issues affect energy use. From a construction perspective, the EPG is caused by:
- the complexity of the design, making mistakes in construction more likely;
- low quality on-site workmanship, often affecting insulation and air-tightness;
- changes after design, either to cut costs or due to site constraints; and
- poor commissioning, where building services are not properly installed and compromise building operation from the start.
Key recommendations: Building audits and monitored energy consumption should become integral to the modelling process. Key recommendations from this study are:
- robust checking and testing during construction to ensure that the quality of construction is maintained;
- making energy data accessible for further evidence gathering on the EPG;
- penalizing buildings for high operational energy use, such as through an environmental tax; governments should relate predicted to measured performance through predictive modelling and in-use regulation, and design stage calculations and assumptions should be disclosed, as well as operational energy outcomes; and
- monitoring buildings and using the results to calibrate design models.
Zero Carbon Hub (2014) [29] (presents findings)
Zero Carbon Hub (2014) [30] (proposes recommendations)
Key findings: This review identified issues in the planning, design, procurement, construction and commissioning, and verification and testing stages of the building life cycle. Most issues identified relate to a lack of knowledge and skills, a lack of communication between stakeholders and a lack of accountability. For instance, planning stakeholders lack knowledge about the implications of early decisions on building energy performance. Designers lack practical understanding of the building site and construction processes. Procurement services do not prioritize contractors with energy efficiency skills. Consequently, building fabric and services are incorrectly constructed, installed and commissioned by contractors who do not possess adequate skills. Verification processes do not prioritize energy performance, and testing methodologies are not always followed. There is also a lack of clarity in documentation and a lack of integration between different layers of the building design (fabric and services).
Key recommendations: Recommendations were separated into priority actions for industry and government.
Industry priorities:
- develop innovative methods to demonstrate building performance;
- train and upskill industry professionals;
- develop and maintain a Construction Details Scheme (CDS) for the major fabric junctions to ensure as-built energy performance; and
- gather evidence and feedback for continuous improvement of the industry.
Government priorities:
- fund research and development into testing, measurement and assessment techniques, as well as the development of a CDS;
- ensure only qualified professionals conduct energy modelling and assessments; and
- support industry development by leading by example, requiring energy certified operatives and professionals for developments on government land.
Zou (2018) [13]
Key findings: Root causes of the building EPG are situated in the design and modelling of the building, the construction and the building operation. The EPG causes associated with the construction stage, and the responsible stakeholders, are:
- limited experience and knowledge—designer;
- inadequate understanding of building construction and energy—owner;
- changes in orders—owner;
- poor quality of equipment and materials—supplier;
- changes in materials to reduce costs—contractor;
- poor workmanship and poor construction techniques—contractor;
- failure to uncover hidden faults—contractor; and
- performance testing not completed due to time and budget constraints—contractor.
In addition, there is no accountability for building performance. Stakeholders do not communicate or collaborate due to a lack of common interest. Obstacles to collaboration include a lack of life cycle thinking and integrated delivery methods, as well as the lack of a platform to facilitate information transfer.
Key recommendations: Existing strategies for addressing the gap in the building construction stage are considered ‘soft’ measures and include:
- policies such as ‘Display Energy Certificates’ (UK), which rate buildings according to their actual energy consumption, and ‘Soft Landings’ (UK), which keeps designers and contractors involved in the building operation stage to address the EPG; and
- energy performance ratings based on actual building performance.
It is recommended that further research is conducted in the areas of life cycle thinking on the building EPG, stakeholders’ attributions and decision criteria, stakeholders’ interaction and information integrity.

References

  1. Lucon, O.; Ürge-Vorsatz, D.; Ahmed, A.Z.; Akbari, H.; Bertoldi, P.; Cabeza, L.F.; Eyre, N.; Gadgil, A.; Harvey, L.D.D.; Jiang, Y.; et al. Buildings. In Climate Change 2014: Mitigation of Climate Change. Contribution of Working Group III to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change; Edenhofer, O., Pichs-Madruga, R., Sokona, Y., Farahani, E., Kadner, S., Seyboth, K., Adler, A., Baum, I., Brunner, S., Eickemeier, P., et al., Eds.; Cambridge University Press: Cambridge, UK, 2014. [Google Scholar]
  2. European Commission. Energy Performance of Buildings Directive. Available online: https://ec.europa.eu/energy/topics/energy-efficiency/energy-efficient-buildings/energy-performance-buildings-directive_en (accessed on 29 March 2020).
  3. The California Energy Commission. 2019 Building Energy Efficiency Standards. Available online: https://www.energy.ca.gov/programs-and-topics/programs/building-energy-efficiency-standards/2019-building-energy-efficiency#:~:text=The%202019%20Building%20Energy%20Efficiency,to%2C%20residential%20and%20nonresidential%20buildings.&text=The%20California%20Energy%20Commission%20updates%20the%20standards%20every%20three%20years (accessed on 17 June 2020).
  4. Shi, X.; Si, B.; Zhao, J.; Tian, Z.; Wang, C.; Jin, X.; Zhou, X. Magnitude, causes, and solutions of the performance gap of buildings: A review. Sustainability 2019, 11, 3. [Google Scholar] [CrossRef] [Green Version]
  5. De Wilde, P. The gap between predicted and measured energy performance of buildings: A framework for investigation. Autom. Constr. 2014, 41, 40–49. [Google Scholar] [CrossRef]
  6. Enker, R.A.; Morrison, G.M. The potential contribution of building codes to climate change response policies for the built environment. Energy Effic. 2020, 13, 789–807. [Google Scholar] [CrossRef]
  7. IPEEC. Building Energy Performance Gap Issues: An International Review. Available online: https://www.energy.gov.au/sites/default/files/the_building_energy_performance_gap-an_international_review-december_2019.pdf (accessed on 1 April 2020).
  8. Burman, E. Assessing the Operational Performance of Educational Buildings against Design Expectations—A Case Study Approach. Ph.D. Thesis, University College London, London, UK, 2016. [Google Scholar]
  9. Van Dronkelaar, C.; Dowson, M.; Burman, E.; Spataru, C.; Mumovic, D. A Review of the Energy Performance Gap and Its Underlying Causes in Non-Domestic Buildings. Front. Mech. Eng. 2016, 1, 17. [Google Scholar] [CrossRef] [Green Version]
  10. Imam, S.; Coley, D.A.; Walker, I. The building performance gap: Are modellers literate? Build. Serv. Eng. Res. Technol. 2017, 38, 351–375. [Google Scholar] [CrossRef] [Green Version]
  11. DSD. National Energy Efficient Building Project; State of South Australia: Adelaide, Australia, 2014. [Google Scholar]
  12. Gupta, R.; Kapsali, M.; Dwyer, T. Evaluating the ’as-built’ performance of an eco-housing development in the UK. Build. Serv. Eng. Res. Technol. 2016, 37, 220–242. [Google Scholar] [CrossRef]
  13. Zou, P.X.W.; Xu, X.; Sanjayan, J.; Wang, J. Review of 10 years research on building energy performance gap: Life-cycle and stakeholder perspectives. Energy Build. 2018, 178, 165. [Google Scholar] [CrossRef]
  14. ASBEC. ClimateWorks, Built to Perform. An Industry Led Pathway to a Zero Carbon Ready Building Code; The Australian Sustainable Built Environment Council (ASBEC) and ClimateWorks Australia: Darlinghurst, Australia, 2018. [Google Scholar]
  15. USAid. Rapid Review vs. Systematic Review: What Are the Differences? Available online: https://www.heardproject.org/news/rapid-review-vs-systematic-review-what-are-the-differences/ (accessed on 17 June 2020).
  16. VCU Libraries. Rapid Review Protocol. Available online: https://guides.library.vcu.edu/rapidreview (accessed on 17 June 2020).
  17. WHO. Rapid Reviews to Strengthen Health Policy and Systems: A Practical Guide; Tricco, A.C., Langlois, E., Straus, S.E., Eds.; World Health Organization: Geneva, Switzerland, 2017. [Google Scholar]
  18. Brooks, S.K.; Webster, R.K.; Smith, L.E.; Woodland, L.; Wessely, S.; Greenberg, N.; Rubin, G.J. The psychological impact of quarantine and how to reduce it: Rapid review of the evidence. Lancet 2020, 395, 912–920. [Google Scholar] [CrossRef] [Green Version]
  19. Nussbaumer-Streit, B.; Mayr, V.; Dobrescu, A.I.; Chapman, A.; Persad, E.; Klerings, I.; Wagner, G.; Siebert, U.; Christof, C.; Zachariah, C.; et al. Quarantine alone or in combination with other public health measures to control COVID-19: A rapid review. Cochrane Database Syst. Rev. 2020, 4, CD013574. [Google Scholar]
  20. Lagisz, M.; Samarasinghe, G.; Nakagawa, S. Rapid Reviews for the Built Environment—Methodology and Guidelines; CRC LCL: Sydney, Australia, 2018. [Google Scholar]
  21. Graham, P.; Bok, B.; Jinlong, L.; Zwagerman, M.; Burton, C. Policy for Low Carbon (Energy Efficiency) Retrofit/Renovation of Residential Buildings: Rapid Review; CRC LCL: Sydney, Australia, 2019. [Google Scholar]
  22. Grant, M.J.; Booth, A. A typology of reviews: An analysis of 14 review types and associated methodologies. Health Inf. Libr. J. 2009, 26, 91–108. [Google Scholar] [CrossRef] [PubMed]
  23. Tricco, A.C.; Antony, J.; Straus, S.E. Systematic Reviews vs. Rapid Reviews: What’s the Difference? Ph.D. Thesis, University of Toronto, Toronto, ON, Canada, 4 February 2015. [Google Scholar]
  24. Temple University Libraries. Systematic Reviews & Other Review Types. Available online: https://guides.temple.edu/c.php?g=78618&p=4156608 (accessed on 15 July 2020).
  25. Shea, B.J.; Reeves, B.C.; Wells, G.; Thuku, M.; Hamel, C.; Moran, J.; Moher, D.; Tugwell, P.; Welch, V.; Kristjansson, E.; et al. AMSTAR 2: A critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. BMJ 2017, 358, j4008. [Google Scholar] [CrossRef] [Green Version]
  26. Gram-Hanssen, K.; Georg, S.; Christiansen, E.; Heiselberg, P. What next for energy-related building regulations?: The occupancy phase. Build. Res. Inf. 2018, 46, 790–803. [Google Scholar] [CrossRef]
  27. Alencastro, J.; Fuertes, A.; de Wilde, P. The relationship between quality defects and the thermal performance of buildings. Renew. Sustain. Energy Rev. 2018, 81, 883–894. [Google Scholar]
  28. McElroy, D.J.; Rosenow, J. Policy implications for the performance gap of low-carbon building technologies. Build. Res. Inf. 2019, 47, 611–623. [Google Scholar] [CrossRef]
  29. Zero Carbon Hub. Closing the Gap between Design and As-Built Performance, Evidence Review Report; Zero Carbon Hub: London, UK, 2014; Available online: http://www.zerocarbonhub.org/sites/default/files/resources/reports/Closing_the_Gap_Between_Design_and_As-Built_Performance-Evidence_Review_Report_0.pdf (accessed on 1 April 2020).
  30. Zero Carbon Hub. Closing the Gap between Design and As-Built Performance, End of Term Report; Zero Carbon Hub: London, UK, 2014; Available online: http://www.zerocarbonhub.org/resources/reports/closing-gap-between-designed-and-built-performance-end-term-report (accessed on 1 April 2020).
Figure 1. PRISMA flow diagram of the search and screening process (adapted from Moher et al. [23]).
Figure 2. Causes for the discrepancy between as-designed and as-built and recommendations to address them from the articles reviewed.
Table 1. Responses to the quality assessment questions from the AMSTAR2 checklist (Appendix C). Green (or 1) indicates that the answer to a question is ‘yes’, red (or 0) that the answer is ‘no’, and yellow (or 0.5) that the response is ‘unsure’. Questions 11, 12 and 15 were not applicable to these articles. Articles presenting more green fields are of higher quality and present a lower risk of bias. The two Zero Carbon Hub reports were merged into one for the purpose of this analysis, as one report is the continuation of the other.
First Author (Year) | Q1 | Q2 | Q3 | Q4 | Q5 | Q6 | Q7 | Q8 | Q9 | Q10 | Q13 | Q14 | Q16 | Total Score
Alencastro (2018) | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 4
Gram-Hanssen (2018) | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 3.5
IPEEC (2019) | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 3.5
McElroy (2019) | 1 | 0.5 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 6.5
Shi (2019) | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 4
Van Dronkelaar (2016) | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 3.5
Zero Carbon Hub (2014) | 0 | 0.5 | 0.5 | 0 | 0 | 0 | 0 | 0.5 | 0.5 | 0 | 0 | 1 | 0 | 3
Zou (2018) | 1 | 0 | 1 | 0.5 | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | 1 | 1 | 5
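The Total Score column in Table 1 is simply the sum of the 1/0.5/0 ratings over the 13 applicable AMSTAR2 questions (Questions 11, 12 and 15 were not applicable and are excluded). The Python sketch below is purely illustrative rather than a tool used in this review; it reproduces the total for one article using the ratings reported in Table 1 (the variable names are ours).

# Illustrative AMSTAR2-style scoring: sum the 13 applicable question ratings
# (Q11, Q12 and Q15 were not applicable and are omitted, as in Table 1).
# Ratings below are the McElroy (2019) row of Table 1; names are our own.

APPLICABLE_QUESTIONS = ["Q1", "Q2", "Q3", "Q4", "Q5", "Q6", "Q7",
                        "Q8", "Q9", "Q10", "Q13", "Q14", "Q16"]

mcelroy_2019 = {"Q1": 1, "Q2": 0.5, "Q3": 1, "Q4": 0, "Q5": 0, "Q6": 0, "Q7": 0,
                "Q8": 1, "Q9": 1, "Q10": 0, "Q13": 0, "Q14": 1, "Q16": 1}

def total_score(ratings):
    """Sum the 1 / 0.5 / 0 ratings over the applicable AMSTAR2 questions."""
    return sum(ratings[q] for q in APPLICABLE_QUESTIONS)

print(total_score(mcelroy_2019))  # 6.5, matching the Total Score column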
Table 2. Study characteristics from the articles reviewed.
Alencastro (2018) [27]
Title: The relationship between quality defects and the thermal performance of buildings
Study scope: Identification of quality defects during building construction causing the EPG. Review of causes and impacts on energy performance. The article also identifies gaps in research.
Theme and building type: Construction; residential and non-residential
Location: Researchers based in the UK. Articles reviewed were from Europe, the UK, Australia, China, Malaysia, Singapore, Canada, Iran, Nigeria and the USA.
Review type: Narrative literature review of academic articles
Number of articles or case studies included: 76 articles
Study funding: Brazilian Ministry of Science, Technology and Innovation through the Science without Borders research programme
Conflict of interests: None declared

Gram-Hanssen (2018) [26]
Title: What next for energy-related building regulations?: the occupancy phase
Study scope: Review of Danish building regulations and how they affect different stages of the building life cycle (technologies, design, construction and operation). The article suggests ways of redesigning the Danish building regulations.
Theme and building type: All stages of the building life cycle; residential
Location: Researchers based in Denmark. Article locations not specified.
Review type: Narrative literature review of academic articles and grey literature
Number of articles or case studies included: Not stated
Study funding: Innovationsfonden
Conflict of interests: None declared

IPEEC (2019) [7]
Title: Building Energy Performance Gap Issues: An International Review
Study scope: Review of the EPG in buildings, existing modelling systems and their use in demonstrating compliance with building regulations. The article proposes areas of opportunity to address the EPG.
Theme and building type: All stages of the building life cycle; residential and non-residential
Location: Researchers based in France. Articles reviewed were from the UK, Australia and Canada.
Review type: Narrative literature review of academic articles and grey literature
Number of articles or case studies included: 7 articles
Study funding: Energy Security and Efficiency Division of the Australian Department of the Environment and Energy
Conflict of interests: None declared

McElroy (2019) [28]
Title: Policy implications for the performance gap of low-carbon building technologies
Study scope: Review of the grey literature on the EPG of specific building technologies. The article suggests policy steps to address the issue.
Theme and building type: Installation and commissioning of building technologies; residential
Location: Researchers based in the UK and Australia. Case studies reviewed are from the UK.
Review type: Review of unpublished case studies
Number of articles or case studies included: 6 case studies
Study funding: Research Council UK
Conflict of interests: None declared

Shi (2019) [4]
Title: Magnitude, causes, and solutions of the performance gap of buildings: A review
Study scope: Review of the EPG, including its definition, magnitude, techniques to measure/determine the EPG, causes and possible solutions.
Theme and building type: All stages of the building life cycle; residential and non-residential
Location: Researchers based in China. Articles reviewed were from Cyprus, Portugal, Belgium, Canada, the UK, Italy, Spain, the USA, Germany, Denmark and Australia.
Review type: Systematic literature review
Number of articles or case studies included: 22 articles
Study funding: Ministry of Science and Technology of China and the Scientific Research Foundation of the Graduate School of Southeast University
Conflict of interests: None declared

Van Dronkelaar (2016) [9]
Title: A review of the energy performance gap and its underlying causes in non-domestic buildings
Study scope: Impact of EPG causes on energy performance. The article focuses on non-residential buildings.
Theme and building type: All stages of the building life cycle; non-residential
Location: Researchers based in the UK. Case study locations: the UK, Belgium, Australia, the USA, Austria and Canada.
Review type: Narrative literature review
Number of articles or case studies included: 62 case studies
Study funding: Engineering and Physical Sciences Research Council (EPSRC) and BuroHappold Engineering
Conflict of interests: None declared

Zero Carbon Hub (2014) [29]
Title: Closing the Gap Between Design & As-Built Performance: Evidence Review Report
Study scope: The report discusses the causes of the gap between design and as-built building performance and reveals the main priority areas to be addressed.
Theme and building type: Planning, design, construction and commissioning; residential
Location: NGO based in the UK. Article locations not specified.
Review type: Narrative literature review and survey
Number of articles or case studies included: 100 reports and academic articles (45% research) plus a survey of 150 assessors
Study funding: No funding acknowledged
Conflict of interests: None declared

Zero Carbon Hub (2014) [30]
Title: Closing the Gap Between Design & As-Built Performance: End of Term Report
Study scope: The report discusses strategic steps for industry and government to address the gaps identified in the previous report.
Theme and building type: Planning, design, construction and commissioning; residential
Location: NGO based in the UK. Article locations not specified.
Review type: Narrative literature review and survey
Number of articles or case studies included: 100 reports and academic articles (45% research) plus a survey of 150 assessors
Study funding: No funding acknowledged
Conflict of interests: None declared

Zou (2018) [13]
Title: Review of 10 years research on building energy performance gap: Life-cycle and stakeholder perspectives
Study scope: Review of academic articles on the EPG. The article analyses themes studied in previous research; reviews the causes of the gaps and the actors involved in each step; reviews solutions currently proposed; and discusses further areas of research.
Theme and building type: All stages of the building life cycle; residential and non-residential
Location: Researchers based in Australia and China. Article locations not specified.
Review type: Systematic literature review
Number of articles or case studies included: 227 articles
Study funding: Australian Research Council (ARC) Research Hub and the National Natural Science Foundation of China
Conflict of interests: None declared
