Article

Towards Sustainable Energy: Predictive Models for Space Heating Consumption at the European Central Bank

NOVA Information Management School, Universidade NOVA de Lisboa, 1070-312 Lisboa, Portugal
* Author to whom correspondence should be addressed.
Environments 2025, 12(4), 131; https://doi.org/10.3390/environments12040131
Submission received: 19 February 2025 / Revised: 8 April 2025 / Accepted: 18 April 2025 / Published: 21 April 2025

Abstract

Space heating consumption prediction is critical for energy management and efficiency, directly impacting sustainability and efforts to reduce greenhouse gas emissions. Accurate models enable better demand forecasting, promote the use of green energy, and support decarbonization goals. However, existing models often lack precision due to limited feature sets, suboptimal algorithm choices, and limited access to weather data, which reduces generalizability. This study addresses these gaps by evaluating various Machine Learning and Deep Learning models, including K-Nearest Neighbors, Support Vector Regression, Decision Trees, Linear Regression, XGBoost, Random Forest, Gradient Boosting, AdaBoost, Long Short-Term Memory, and Gated Recurrent Units. We utilized space heating consumption data from the European Central Bank Headquarters office as a case study. We employed a methodology that involved splitting the features into three categories based on the correlation and evaluating model performance using Mean Squared Error, Mean Absolute Error, Root Mean Squared Error, and R-squared metrics. Results indicate that XGBoost consistently outperformed other models, particularly when utilizing all available features, achieving an R2 value of 0.966 using the weather data from the building weather station. This model’s superior performance underscores the importance of comprehensive feature sets for accurate predictions. The significance of this study lies in its contribution to sustainable energy management practices. By improving the accuracy of space heating consumption forecasts, our approach supports the efficient use of green energy resources, aiding in the global efforts towards decarbonization and reducing carbon footprints in urban environments.

1. Introduction

The study of building energy demand has gained significant importance due to the growing focus on energy sustainability, especially following the implementation of the European Energy Performance of Buildings (EPB) Directive. In Europe, buildings are responsible for 40% of total energy consumption and 36% of total CO2 emissions [1]. Accurately predicting building energy consumption is crucial for effective energy management, as it helps identify abnormal energy usage and diagnose potential causes, provided that sufficient historical data are available [2]. However, conventional energy prediction methods often fall short due to their reliance on rigid assumptions and limited adaptability to dynamic energy consumption patterns.
Recently, there has been a shift from merely calculating energy consumption to analyzing the actual energy use of buildings [3,4]. This shift is driven by the complexity of building energy systems and behavior, which non-calibrated models fail to accurately predict, thus requiring real-time data analysis of energy use. Traditionally, estimating building energy use involves applying a model with known system structures, properties, and external variables (forward approach). These engineering methods utilize physical principles to calculate thermal dynamics and energy behavior at the building or sub-component level [5]. Despite their theoretical accuracy, these physics-based models often struggle with scalability and require extensive domain expertise. Furthermore, their reliance on detailed building specifications, which may not always be available, limits their practical applicability in large-scale or heterogeneous building environments.
Various software tools, such as DOE-2 (e.g., version 2.2) [6], EnergyPlus (e.g., version 9.6.0) [7], and TRNSYS (e.g., version 18.02) [8], have been developed for this purpose. However, these tools require detailed knowledge of numerous building parameters and behaviors, which are often unavailable. Consequently, simplified methods for predicting building energy use have been developed. For instance, the steady-state method using degree days was presented in [9]. Yao and Steemers [10] also introduced a simple method for formulating load profiles for U.K. domestic buildings using a thermal dynamic model to predict daily energy demand profiles for appliances, domestic hot water, and space heating. While these methods provide useful approximations, they often lack the flexibility to capture non-linear dependencies between variables and fail to adapt to evolving building occupancy patterns or climatic variations.
Organizational measures for energy efficiency encompass a variety of strategies aimed at reducing consumption while maximizing resource utilization [11,12]. Such methods include the installation of energy-efficient lighting systems [13], improving HVAC for better performance [14], and innovative building technologies [15]. Although these measures significantly contribute to energy conservation, their effectiveness is often constrained by high initial costs, resistance to change, and the lack of real-time monitoring mechanisms. The absence of adaptive strategies also limits their ability to respond to fluctuating energy demands.
In recent years, the development of energy management, especially in space heating consumption (SHC), has been transformed by the introduction of Artificial Intelligence (AI) and Machine Learning (ML) [16]. AI refers to the imitation of human intelligence processes by computer systems, including learning, reasoning, and self-correction, while ML is a subset of AI that deals with developing algorithms that allow computers to learn, make predictions, or take actions based on data [17]. These technologies, including models such as neural networks [18], decision trees (DTs) [19], and support vector machines [20], are developed and used to optimize energy utilization, predict demand, consumption patterns, and costs, and detect inefficiencies in various energy systems. For instance, neural networks can be trained to examine complicated energy data and give insights into energy usage patterns, which helps with making informed decisions regarding energy optimization strategies [21]. However, while ML-based approaches provide substantial improvements in predictive accuracy, their performance is often model-dependent and sensitive to data quality [22]. Many studies fail to incorporate diverse data sources, leading to biased predictions that do not generalize well across different buildings or climatic conditions. Furthermore, the black-box nature of some ML models raises concerns regarding interpretability, making it difficult for facility managers to trust and act on predictions. With the potential for real-time monitoring and adaptive control, Internet of Things (IoT) devices could help make dynamic adjustments that optimize energy consumption according to the situation. Similarly, predictive maintenance models can detect equipment failures and operational deterioration, enabling proactive interventions before interruptions occur and energy resources are wasted [14].
While this study presents a novel approach by integrating multiple ML models and localized weather data for SHC prediction, it builds upon a growing body of research exploring data-driven energy forecasting methods. Numerous studies have leveraged ML and DL models for building energy demand estimation, yet many focus on single techniques or limited feature sets, restricting their applicability across different climatic and operational contexts. A comprehensive review of prior works is necessary to contextualize our study, highlight gaps in existing methodologies, and reinforce the need for a comparative ML framework in SHC forecasting.
With new developments in ML, it has been possible to achieve considerable progress in the past few years [23]. For instance, Xue et al. [24] proposed an ML-based framework that applies to multi-step-ahead district heating system load forecasting by testing SVR, deep neural network, and extreme gradient boosting (XGBoost) models. Furthermore, Li & Yao [25] developed a system for the prediction of building heating as well as cooling loads that includes occupant behavior as a predictor variable and also considers five ML models. Additionally, Jovanović, Sretenović, and Živković [26] investigate the prediction of heating energy consumption for a university campus, employing various artificial neural network architectures. However, challenges remain in refining predictive approaches due to the complex relationship between input variables and SHC, particularly in diverse campus settings.
On the other hand, Yuan et al. [27] introduce a novel sample data selection method (SDSM) to enhance the prediction accuracy of Back Propagation Neural Network (BPNN) and Multi-Layer Perceptron (MLP) models for heating energy consumption, demonstrating significant reductions in training and prediction errors for BPNN models. Moreover, Potočnik, Škerl, and Govekar [28] present an ML-based approach for short-term heat demand forecasting in district heating systems, highlighting the superiority of Gaussian Process Regression (GPR) in achieving accurate forecasts for the most prominent Slovenian DH system. Jang et al. [29] focus on enhancing the prediction accuracy of building heat consumption using LSTM models, showing improved performance when operation pattern data from non-residential buildings are incorporated. However, challenges remain, such as the limited applicability of SDSM to MLP models and the need for more comprehensive data to enhance model accuracy in diverse building scenarios.
ML models have been applied to predict building heat loads, optimize energy consumption, and enhance sustainability. Dalipi et al. [30] introduce a supervised ML model for predicting heat load in a district heating system, evaluating SVR, Partial Least Square, and RF algorithms. This study focuses on different algorithms’ performance in a district heating context, highlighting their varying predictive capabilities. In another approach, Moradzadeh et al. [31] propose a methodology for forecasting heating and cooling loads in residential buildings using MLP and SVR techniques. This study explicitly targets residential buildings and emphasizes the predictive accuracy of these models in that context. Abdelkader et al. [32] address the need for energy-efficient buildings by comparing various ML models, finding that the radial basis neural network performs better. However, they identify limitations such as the focus on specific meteorological parameters and reliance on simulated data, which may impact the real-world applicability of these models.
In addressing the imperative for energy efficiency across diverse domains, Shen et al. [33] emphasize optimizing energy usage in greenhouses to reduce production costs, employing mathematical modeling and algorithmic optimization. Similarly, Moradzadeh et al. [34] contribute to advancing accurate building energy consumption prediction, focusing on residential buildings’ cooling and heating loads. Their novel hybrid model, Gaussian SVR, integrates the Group Method of Data Handling (GMDH) and SVR techniques, presenting promising results, albeit with complexities and applicability concerns. Furthermore, while Shen et al. achieve promising outcomes, challenges persist due to the intricacies of greenhouse energy exchange and seasonal variations, suggesting further research to expand the model’s scope to address summer cooling and optimize temperature settings throughout the week.
Building energy consumption prediction, particularly for heating loads, has been explored through various ML and optimization techniques such as SVR, neural networks, and hybrid models as shown in Table 1. These studies have demonstrated promising results despite the absence of certain critical variables, indicating the potential for further advancements in energy forecasting methodologies. However, the exclusion of essential parameters suggests the need for more comprehensive models that incorporate a broader range of influential factors. Addressing these gaps, this study aims to enhance the precision and generalizability of SHC predictions at the European Central Bank headquarters by evaluating and comparing multiple ML models using data from multiple weather stations. The goal is to improve demand forecasting accuracy, support sustainable energy management, and contribute to global decarbonization efforts.
The contributions of this study are as follows:
  • We conducted a thorough comparison of various ML and DL models, including K-Nearest Neighbors (KNN), Support Vector Regression (SVR), Decision Tree (DT), Linear Regression (LR), XGBoost, Random Forest (RF), Gradient Boosting (GB), AdaBoost, Long Short-Term Memory (LSTM), and Gated Recurrent Unit (GRU), to determine the most accurate model for SHC prediction.
  • We incorporated data from four distinct weather stations, improving the generalizability of the findings and ensuring that the models were evaluated across diverse climatic conditions.
  • We utilized two comprehensive feature sets (Feature Set 1 and Feature Set 2) derived from detailed weather data, enhancing the robustness and depth of the analysis.
Furthermore, this study underscores the potential for organizations to develop in-house energy management solutions in compliance with the EPB European Directive, promoting energy efficiency and sustainability.

2. Dataset

The dataset includes operational and environmental aspects of SHC consumption at the European Central Bank (ECB) Headquarters in Frankfurt. This state-of-the-art building has multiple sensor networks as part of an advanced Building Control System, enabling granular analysis.
Data from the district heating network supplying the building was collected. It was complemented with data from several weather stations located around and on top of the building, selected based on proximity. This comprehensive dataset offers a rich resource for understanding the intricate interplay between environmental variables and heating demands.
These data are used for historical analysis to understand building heating demands and to make high-level estimations for the future. However, adopting ML techniques would significantly enhance the accuracy and efficiency of these forecasts: ML can analyze complex patterns and relationships within the data that traditional methods might overlook, leading to more precise and reliable predictions and providing a solid foundation for advanced predictive modeling.

2.1. Heating Consumption Data

SHC data give an overview of energy consumption within a building through a set of important parameters, including the SHC value, volume, inflow water temperature, and return water temperature. These data were obtained from a district heating network and represent the actual heating demand required to heat the entire, actively used building area. The heating load in the studied building is primarily used for space heating and hot water supply. The contribution of each component varies: space heating is more dependent on weather conditions such as outdoor temperature and humidity, while hot water supply exhibits a more stable demand pattern influenced by occupancy schedules. The district heating system acts as the baseload supplier, providing the building with a stable and diversified heat supply.
These data were collected by a smart meter fitted in the premises' technical areas. This device, outfitted with different sensors, detects and accurately documents heating usage. The data monitor is situated at the interface between the district heating network provider and the office building's heating system and is an integral part of the overall data acquisition process.
The Heating Degree Day (HDD) is a metric for estimating the energy demand required to heat a building. It represents the number of degrees by which a day's average temperature falls below a specific base temperature, the threshold below which buildings require heating [35]. In this case, a 15 °C threshold has been selected based on expert knowledge, ensuring accuracy and relevance in measuring the building's heating requirements. It is important to note that different climatic regions may employ different base temperatures to account for local variations in heating needs. Heating Degree Hours (HDH) are used in this study because they provide a more granular and precise measure of heating demand than HDD, particularly when evaluating hourly temperature data. The formula for calculating HDH is as follows:
$$\mathrm{HDH}(h) = \begin{cases} \dfrac{15 - T_{out}(h)}{24}, & \text{if } T_{out}(h) < 15 \\ 0, & \text{if } T_{out}(h) \geq 15 \end{cases}$$
where $T_{out}(h)$ is the outside temperature in hour $h$ and $\mathrm{HDH}(h)$ is the corresponding heating degree hours.
By using HDH, we can capture short-term temperature variations that influence heating needs, leading to more accurate predictions of heating energy consumption.
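To make the HDH computation concrete, the following minimal Python sketch (the function and example values are our own, not taken from the paper) derives hourly heating degree hours from an hourly outdoor temperature series using the 15 °C base temperature described above:

```python
import pandas as pd

def heating_degree_hours(t_out: pd.Series, base: float = 15.0) -> pd.Series:
    """Hourly HDH: (base - T_out(h)) / 24 when T_out(h) < base, else 0."""
    return ((base - t_out) / 24.0).clip(lower=0.0)

# Illustrative usage with four hourly readings:
temps = pd.Series([3.2, 14.9, 15.0, 18.4])
print(heating_degree_hours(temps))  # positive for the first two hours, zero after
```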

2.2. Weather Variables from the Building Weather Station (BWS)

The weather variable dataset consists of various meteorological parameters for evaluating atmospheric conditions, including temperature, humidity, and wind speed. These factors reflect the weather dynamics that influence building operations and energy usage. Collected from a weather station atop the ECB skyscraper, the dataset provides localized, up-to-date weather information for the building's immediate surroundings. Positioning the weather station at such an elevation ensures that the gathered data reflect the building's immediate microclimate.
Connection with the Building Automation System (BAS) enhances the efficiency and availability of the data. The weather station links smoothly with the BAS, simplifying data transmission and storage. With this integration in place, the BAS control rooms provide building managers and operators with real-time monitoring of weather conditions, enabling informed decisions based on up-to-date information. Additionally, the BAS's historical data analysis functionality allows retrospective analysis of weather trends and patterns: by archiving weather data over time, the BAS enables in-depth analysis and helps key actors find connections between weather variables and building performance metrics.

2.3. Local Weather Stations

The local weather data comprise a comprehensive archive of meteorological information collected from three local weather stations: Frankfurt Airport (station 1420), Frankfurt am Main–Westend (station 1424), and Offenbach Weather Park (station 7341). These stations are located at varying distances from the main study building, each allowing the study of localized weather conditions that may influence operations and energy management. This dataset is sourced from the Deutscher Wetterdienst (DWD) and its Climate Data Center, Germany's national meteorological service and an authoritative source of historical weather data. Using weather stations covering diverse areas, the DWD obtains a broad picture of weather phenomena and trends. The data acquisition process involves careful extraction and compilation: researchers access historical weather data from the selected stations through the DWD Climate Data Center on the DWD website.

2.4. Trends

Figure 1 illustrates the total heating degree hours (HDH15) by season across the four weather stations: the BWS, 7341, 1420, and 1424. The BWS records the highest total heating degree hours during the winter season, surpassing 7000, while station 1424 exhibits the lowest total, at around 1420 heating degree hours. Overall, as the graph indicates, the total heating degree hours follow a consistent pattern across all stations, with the winter season consistently registering the highest values and the summer season the lowest.
This trend can be attributed to the fundamental principle that colder winter temperatures necessitate increased heating to maintain indoor comfort levels. As a result, winter requires more SHC than other seasons and, hence, registers higher heating degree hours. Conversely, in summer, less heating is required due to higher outdoor temperatures, reducing the number of heating degree hours.
Examining variations among the weather stations, the BWS records the highest total heating degree hours during winter, indicating that it experiences colder temperatures or higher heating requirements than the other stations in that season. Local weather station 1424, on the other hand, frequently shows the lowest total heating degree hours, which may be due to milder temperatures or lower heating requirements.

2.5. Feature Selection

This study considers Feature Set 1 and Feature Set 2, each offering unique insights into the factors influencing SHC and building energy efficiency. Feature Set 1 covers operational features, such as those relating to the heating system, that enable the analysis and improvement of the building's performance. Measurements such as 'Heating water volume m³', 'Return temperature °C', and 'Flow temperature °C' are highly informative: they describe the operating state of the heating infrastructure and allow accurate monitoring and control. The selection of these features is based on their direct relevance to heating system performance, as supported by prior studies on energy consumption modeling in buildings. The set is further enriched with indicators such as humidity, dew point, air temperature, vapor pressure, absolute humidity, visibility, relative humidity, sunshine duration, month, precipitation (yes/no), air pressure, wind speed, wind direction, and day.
Feature Set 2 covers environmental factors relating to weather, time, and ambient conditions, which are vital to understanding the intricate interactions affecting SHC inside the building. This representation excludes direct measurements ('Heating water volume m³', 'Return temperature °C', and 'Flow temperature °C') and instead aims to capture the broader context of heating demand. It incorporates meteorological variables (humidity, temperature, dew point, air temperature, vapor pressure, absolute humidity, visibility, and wind speed) and temporal indicators (season, month, weekday, day, and hour), providing a complete framework for the external factors that affect heating energy demand. These features are selected based on their well-established impact on heating demand, as demonstrated in the existing literature on building energy modeling. Moreover, the smart meters' data transmission makes it possible to include features such as precipitation (yes/no), air pressure, wind direction, and year, diversifying the study by allowing the examination of long-term trends and seasonal variations in heating demand.

2.6. Feature Selection Methodology

These features are selected for their ability to capture both direct and indirect influences on SHC patterns within the building. Feature Set 1 comprises three dimensions, 'Heating water volume m³', 'Return temperature °C', and 'Flow temperature °C', as these parameters determine the operation of the building's heating system. Feature Set 2, in turn, contains a broad collection of meteorological, temporal, and environmental variables such as humidity levels, air temperature, and precipitation. These features are selected for their indirect but significant impact on heating demand, reflecting the ambient conditions and external factors that influence indoor temperature regulation and energy usage.

Feature Divisions

In this study, features were categorized into three groups: 3 main features, 7 main features, and all features, based on correlation analysis and model interpretability. This selection ensures better generalization, computational efficiency, and a more robust training process by focusing on highly correlated attributes to simplify model complexity. It is pertinent to mention that this study did not use Principal Component Analysis (PCA) or Recursive Feature Elimination (RFE) for feature selection, as our primary objective was to analyze the impact of high-dimensional feature sets on heating consumption prediction. Instead of reducing dimensionality through PCA, which transforms features into principal components, or using RFE, which iteratively removes less significant features, we relied on correlation-based feature selection. This approach allowed us to retain interpretable variables and assess their individual and combined influence on model performance.
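As an illustration of this correlation-based division, a minimal sketch is shown below; it assumes a pandas DataFrame whose target column is named 'Heating', as in our dataset, and ranks the remaining numeric features by absolute Pearson correlation:

```python
import pandas as pd

def feature_divisions(df: pd.DataFrame, target: str = "Heating") -> dict:
    """Rank numeric features by |Pearson correlation| with the target and
    return the three divisions used in this study: top 3, top 7, and all."""
    corr = (
        df.corr(numeric_only=True)[target]
        .drop(labels=[target])      # exclude the target itself
        .abs()
        .sort_values(ascending=False)
    )
    ranked = list(corr.index)
    return {"3_features": ranked[:3], "7_features": ranked[:7], "all_features": ranked}
```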
The correlations between features and the target variable, Heating, vary across different weather stations due to each location’s unique environmental conditions and building characteristics. For instance, in Feature Set 1, the correlation between ‘Heating water volume m3’ and Heating may be higher for one weather station than others, reflecting the specific heating system dynamics in that building. Similarly, in Feature Set 2, the correlation between ‘humidity temperature’ and Heating may differ among weather stations, indicating variations in the influence of meteorological factors on heating demand.
Table 2, which presents the correlation values for Feature Set 1 (operational features related to the heating system, such as 'Heating water volume m³', 'Return temperature °C', and 'Flow temperature °C') across all four weather stations, reveals notable variations in the relationships between features and the target variable, Heating. For instance, 'Water volume m³' shows the strongest correlations across all stations, confirming it as the main factor determining energy consumption for office space heating. In contrast, attributes like 'Hour' and 'Flow temperature °C' generally show low correlations, implying a weaker association with heating demand. The correlations also differ markedly for some features among weather stations: at the BWS, HDH15 emerges as the second-strongest correlation, while at the 1420 weather station, 'humidity temperature' occupies that place.
Similarly, the correlations in Table 3 for Feature Set 2 (environmental and temporal factors such as humidity, temperature, dew point, wind speed, and time indicators such as season, month, and hour) across all four weather stations show distinct relations between attributes and the target variable, Heating. HDH15 exhibits robust correlations at all stations, highlighting its effect on how heating demand is distributed, whereas the 'Hour' and 'Day' variables are the most weakly correlated within the entire group. The associations among the parameters also differ slightly across weather stations: for example, 'humidity temperature' yields the highest correlation with SHC at the 1420 station, but 'BWS global radiation' shows the highest correlation at the BWS. These variations indicate that such station-specific differences should be considered when forecasting heating usage and developing models for it.

3. Methodology

The methodology involves collecting weather data from multiple weather stations, which is then stored in a centralized database for processing. The data undergoes feature extraction, where it is categorized into technical (Feature Set 1) and non-technical (Feature Set 2) features. Feature selection is then performed to refine these sets for analysis. The selected features are further divided into three categories: a subset of 3 features, a subset of 7 features, and the entire feature set. These feature sets are subsequently used to model heating demand, with the performance of the models evaluated using Mean Squared Error (MSE), Mean Absolute Error (MAE), Root Mean Squared Error (RMSE), and R-Squared metrics to determine accuracy and reliability. The proposed methodology is shown in Figure 2.

3.1. Preprocessing

The preprocessing starts by separating numeric columns, excluding 'Year', 'Month', and 'Hour', from non-numeric ones. Missing values are hierarchically imputed using group means, prioritized by 'Year', 'Month', and 'Hour'. The remaining missing values are then filled with group means based on 'Year' and 'Month', followed by 'Year' and 'Season'. Finally, any remaining missing values are imputed with the annual average. Categorical variables, including 'Season' and 'Weekday', are transformed into numerical format through label encoding.
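A condensed sketch of this imputation and encoding chain is given below; it assumes the column names described above ('Year', 'Month', 'Hour', 'Season', 'Weekday') and is illustrative rather than the exact production code:

```python
import pandas as pd
from sklearn.preprocessing import LabelEncoder

def preprocess(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    numeric = list(df.select_dtypes("number").columns.difference(["Year", "Month", "Hour"]))
    # Hierarchical group-mean imputation, falling back to coarser groupings.
    for keys in (["Year", "Month", "Hour"], ["Year", "Month"], ["Year", "Season"]):
        df[numeric] = df[numeric].fillna(df.groupby(keys)[numeric].transform("mean"))
    # Any remaining gaps are filled with the annual average.
    df[numeric] = df[numeric].fillna(df.groupby("Year")[numeric].transform("mean"))
    # Label-encode the categorical variables.
    for col in ("Season", "Weekday"):
        df[col] = LabelEncoder().fit_transform(df[col].astype(str))
    return df
```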

3.2. Modelling

The study assesses a range of models, including KNN, SVR, DT, Linear Regression, XGBoost, RF, GB, AdaBoost, LSTM, and GRU, to establish the effect of localized weather data on the accuracy of predictive models for SHC. Hyperparameter tuning was conducted using grid search with cross-validation to ensure optimal performance of each model. Variations in hyperparameters such as the learning rate and maximum tree depth significantly affect the predictive accuracy of ensemble models: a lower learning rate allows for more gradual learning, reducing the risk of overfitting but requiring longer training times, whereas a higher learning rate speeds up convergence but may lead to suboptimal solutions. Similarly, increasing the maximum depth enhances the model's ability to capture complex patterns but also increases the risk of overfitting. Details of these models are listed in Table 4.
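The sketch below illustrates this tuning procedure for XGBoost; the grid values and the synthetic data are placeholders of our own, not the exact grids used in the study:

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from xgboost import XGBRegressor

# Placeholder data standing in for the preprocessed features and SHC target.
rng = np.random.default_rng(42)
X_train = rng.normal(size=(500, 7))
y_train = X_train @ rng.normal(size=7) + rng.normal(scale=0.1, size=500)

# Illustrative grid over the hyperparameters discussed above.
param_grid = {"learning_rate": [0.01, 0.05, 0.1], "max_depth": [3, 5, 7]}
search = GridSearchCV(
    XGBRegressor(objective="reg:squarederror", n_estimators=300, random_state=42),
    param_grid,
    cv=5,
    scoring="neg_mean_squared_error",
    n_jobs=-1,
)
search.fit(X_train, y_train)
print(search.best_params_)
```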
The chosen building for this study, the European Central Bank Headquarters, was selected due to its availability of high-quality, detailed weather and energy consumption data. This building also represents a complex urban structure with diverse heating requirements, making it a suitable case study for testing the robustness of the models. Experiments on other buildings were not carried out due to the unavailability of similarly detailed datasets that include localized weather conditions and operational parameters. However, the methodology is designed to be generalizable and can be applied to other buildings with appropriate data availability.

3.3. Evaluation Metrics

In this research, various evaluation metrics are used to assess the developed models for predicting SHC: Mean Squared Error (MSE), Mean Absolute Error (MAE), Root Mean Squared Error (RMSE), and R-squared (R²).
Mean Squared Error (MSE): MSE measures the average of the squared errors between the actual ($y_i$) and predicted ($\hat{y}_i$) values [46]. In this study, it quantifies the average squared difference between the observed and predicted SHC values, providing insight into the overall accuracy of the predictive model. MSE is particularly important because it penalizes larger errors more heavily, making it a critical metric when large prediction deviations are undesirable.
$$\mathrm{MSE} = \frac{1}{n} \sum_{i=1}^{n} \left( y_i - \hat{y}_i \right)^2$$
Mean Absolute Error (MAE): MAE calculates the average of the absolute differences between the actual ($y_i$) and predicted ($\hat{y}_i$) values, measuring the average magnitude of prediction errors without considering their direction [47]. In the context of this study, MAE evaluates the average magnitude of errors in predicting SHC, offering a straightforward measure of model performance. Unlike MSE, MAE treats all errors equally, making it more interpretable and suitable for understanding the typical prediction error.
$$\mathrm{MAE} = \frac{1}{n} \sum_{i=1}^{n} \left| y_i - \hat{y}_i \right|$$
Root Mean Squared Error (RMSE): RMSE is the square root of the average of the squared differences between the actual ($y_i$) and predicted ($\hat{y}_i$) values [48]. It measures the standard deviation of the prediction errors and is interpretable in the same units as the target variable. In this study, RMSE assesses the typical error magnitude of the predictive model in predicting SHC. RMSE is crucial when the scale of prediction errors needs to be expressed in the same unit as the target variable, making it easier to contextualize the error magnitude. It is also more sensitive to outliers than MAE, which may be advantageous in specific scenarios.
$$\mathrm{RMSE} = \sqrt{\frac{1}{n} \sum_{i=1}^{n} \left( y_i - \hat{y}_i \right)^2}$$
R-Squared (R²): R², also known as the coefficient of determination, measures the proportion of the variance in the dependent variable ($y$) that is explained by the independent variable(s) [49], where $\bar{y}$ denotes the mean of the observed values. It ranges from 0 to 1, where 1 indicates that the model perfectly predicts the dependent variable from the independent variables, and 0 indicates that the model explains none of the variability in the dependent variable. In this study, R² assesses the goodness of fit of the predictive model to the observed SHC data, indicating how well the model captures the variability in the target variable based on the features used for prediction. R² is particularly important for evaluating the model's explanatory power and understanding how well it generalizes to unseen data, providing an intuitive measure of effectiveness by comparing explained variance to total variance.
$$R^2 = 1 - \frac{\frac{1}{n} \sum_{i=1}^{n} \left( y_i - \hat{y}_i \right)^2}{\frac{1}{n} \sum_{i=1}^{n} \left( y_i - \bar{y} \right)^2}$$
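The four metrics can be computed with standard scikit-learn utilities, as in the following minimal sketch (the example arrays are placeholders):

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

def evaluate(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    """Compute the evaluation metrics used in this study."""
    mse = mean_squared_error(y_true, y_pred)
    return {
        "MSE": mse,
        "MAE": mean_absolute_error(y_true, y_pred),
        "RMSE": float(np.sqrt(mse)),
        "R2": r2_score(y_true, y_pred),
    }

print(evaluate(np.array([1.0, 2.0, 3.0]), np.array([1.1, 1.9, 3.2])))
```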
Loss Function: The loss function was constructed using standard regression metrics, including Mean Squared Error (MSE) as the primary optimization criterion. MSE was chosen due to its sensitivity to large errors, ensuring that the model minimizes significant deviations in predictions. Additionally, we monitored Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE) to assess model performance comprehensively. These loss functions were optimized using the gradient boosting framework of XGBoost, which iteratively reduces errors by adjusting model weights based on residuals from previous iterations.

3.4. Experimental Setup

Experiments are conducted using common Python libraries for data preprocessing, model development, and evaluation. Pandas is utilized for data manipulation and analysis, while NumPy supports numerical computing and array operations. Scikit-learn provides a range of algorithms for ML tasks, including model training, evaluation, and preprocessing. DL models are constructed using TensorFlow and PyTorch, which offer high-level APIs for building neural networks and optimizing performance. The matplotlib library facilitates data visualization to track dataset characteristics and model performance. Random seed values are set to ensure reproducibility across experiments. The dataset is divided into training and testing sets with an 80–20 split, where 80% of the data are used for training and the remaining 20% for testing. In addition, K-fold cross-validation with K = 5 is applied during the training phase to minimize the risk of overfitting and verify the models' generalization capacity.
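A minimal sketch of this setup is shown below; the synthetic data and the RandomForest estimator are placeholders used only to keep the example self-contained:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold, cross_val_score, train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))                      # placeholder feature matrix
y = 2 * X[:, 0] + rng.normal(scale=0.5, size=1000)   # placeholder target

# 80-20 split with a fixed seed for reproducibility.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# 5-fold cross-validation on the training portion to check generalization.
cv = KFold(n_splits=5, shuffle=True, random_state=42)
scores = cross_val_score(RandomForestRegressor(random_state=42), X_train, y_train, cv=cv, scoring="r2")
print(f"mean R2 = {scores.mean():.3f} +/- {scores.std():.3f}")
```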
Several experiments are conducted based on the systematic division of the feature sets. The details of these feature sets are shown in Table 5.

4. Results

We tested the performance of the different predictive models built from various feature sets. Each model was evaluated considering MSE, MAE, RMSE, and R2 to assess its predictive accuracy.

4.1. Feature Set 1

Feature Set 1 consists of a comprehensive array of operational and environmental features crucial for understanding SHC patterns. Various combinations of this set are used for model evaluation, with feature divisions based on correlations with the target variable. The main aim is to identify the most influential features and assess their impact on model performance in predicting SHC across different weather stations.
Across the various weather stations, the XGBoost ensemble learning model consistently outperforms the other ML and DL models in predicting SHC, as shown in Table 6. For weather station 1420, XGBoost excels with an MSE of 2.5, MAE of 8.5, RMSE of 15.7, and an R² value of 97.6 in Feature Set 1 with only 3 features. These results indicate that even with a minimal feature set, XGBoost effectively captures the underlying data patterns, making it a suitable choice for scenarios with limited data availability. It maintains its lead with excellent metric values in the 7-feature and all-feature sets, underscoring its accuracy and the strength of ensemble learning techniques; adding more features enhances the model's ability to generalize, further improving prediction performance. This trend continues for weather station 1424, where XGBoost achieves the lowest MSE of 2.5175, MAE of 8.4946, RMSE of 15.8668, and an R² value of 97.5322 in the 3-feature set. The model's ability to perform well with different feature subsets demonstrates its adaptability and robustness under various climatic conditions, and its consistent performance across the feature sets at this station further highlights its ability to capture complex data relationships.
The robustness of XGBoost is also evident at weather stations 7341 and the BWS, as shown in Table 7. For station 7341, XGBoost achieves an MSE of 2.906, MAE of 8.875, RMSE of 17.0471, and an R² value of 97.1514 in the 3-feature set. These findings reinforce the effectiveness of gradient boosting in refining predictions by iteratively reducing errors. It also achieves lower MSE, MAE, and RMSE values and higher R² values in the larger feature sets. Similarly, at the BWS, XGBoost demonstrates superior accuracy with an MSE of 2.1923, MAE of 8.0211, RMSE of 14.8064, and an R² value of 97.851 in the 3-feature set. These results indicate that XGBoost effectively learns from different environmental conditions, making it a viable option for diverse geographical locations. Its consistent top performance across all feature sets at this station highlights its effectiveness in predicting SHC based on local weather data. Such reliability across multiple weather stations confirms that XGBoost is a robust and scalable model for SHC prediction, offering practical applications for real-world energy forecasting scenarios.

Discussion Feature Set 1

The findings from Feature Set 1 indicate that the XGBoost model outperforms all other models across the 3 Features, 7 Features, and All Features divisions, achieving the best MSE, MAE, RMSE, and R² values. This consistent performance underscores the strength of ensemble learning techniques, particularly XGBoost, in effectively capturing the nonlinear relationships between features and SHC. Unlike the other models, XGBoost leverages its gradient boosting framework to iteratively refine weak learners, reducing bias and variance effectively; gradient-boosting algorithms also handle complex interactions and outliers well, contributing to its superior predictive accuracy. XGBoost is consistently better than KNN, SVR, DT, Linear Regression, RF, GB, and AdaBoost, as well as the Deep Learning models LSTM and GRU, which exhibit lower predictive performance in this study. The lower performance of LSTM and GRU may be attributed to the relatively small dataset size and the absence of sequential dependencies in the feature set, which limit the advantages of recurrent architectures. Irrespective of the dataset type and weather station, XGBoost demonstrates superior accuracy.
When comparing results across weather stations, the BWS yields the smallest MSE, MAE, and RMSE in all models and feature divisions, supporting the better predictive accuracy obtained with this station. The superior performance at the BWS suggests that its localized weather conditions and building characteristics are more representative of the factors influencing SHC at the ECB. This finding emphasizes the importance of selecting relevant weather stations for predictive modeling and highlights the potential for station-specific optimization in future applications.
Regarding the division of features, the All Features set most consistently produces the lowest MSE, MAE, and RMSE for each weather station and model, implying better predictive ability than the 3 Features and 7 Features sets. This superior performance is attributed to XGBoost's ability to handle high-dimensional data effectively and to select the most relevant features through its built-in feature importance mechanism. Including a diverse combination of operational and environmental features, along with behavioral patterns, enhances the model's ability to generalize and capture the key predictors of SHC. This result justifies using comprehensive feature sets to improve the reliability and accuracy of predictions.

4.2. Feature Set 2

In this section, we present the results of Feature Set 2, where different combinations of features were evaluated to investigate their impact on predictive model performance. Like Feature Set 1, Feature Set 2 was also constructed based on correlations among weather parameters and their potential influence on SHC.
XGBoost consistently emerges as the top performer in predictive accuracy across feature divisions, as shown in Table 8. In the 3-Features division, it achieves the lowest MSE of 28.7, MAE of 35.7, and RMSE of 53.6, with an R² value of 71.8, suggesting that even a limited subset of features contributes significantly to accurate predictions and demonstrating the model's efficiency in handling high-impact variables. In the 7-Features and All Features divisions, XGBoost maintains its lead with the lowest MSE, MAE, and RMSE values and the highest R² values; adding more features enhances performance but does not alter XGBoost's ranking as the best performer. Its gradient-boosting mechanism captures complex, non-linear dependencies among features, explaining its superior performance across all datasets and confirming its suitability for SHC forecasting at weather station 1420. For weather station 1424, XGBoost is likewise the top-performing model, as shown in Table 8. The strong correlation between predicted and actual values suggests that the model learns efficiently from historical data, making it a practical choice for real-world deployment and underscoring its suitability for forecasting SHC at this station.
Similarly, for weather station 7341, XGBoost consistently demonstrates superior performance across the feature divisions within Feature Set 2, as shown in Table 9. In the 3-Features division, it achieves an MSE of 37.1, MAE of 41.8, RMSE of 60.9, and an R² value of 63.6; while the R² value is slightly lower in this case, it still indicates a strong relationship between predicted and actual SHC values, confirming the model's dependability. In the 7-Features and All Features divisions, XGBoost again excels with low MSE, MAE, and RMSE and high R² values, emphasizing the importance of a well-balanced feature set for predictive accuracy. Likewise, XGBoost emerges as the top-performing model for the BWS, as shown in Table 9: in the 3-Features division it achieves an MSE of 27.6, MAE of 35.3, RMSE of 52.5, and an R² value of 73, showing that even with fewer features it maintains high accuracy and remains computationally efficient, while in the 7-Features and All Features divisions it keeps its lead with notably low MSE, MAE, and RMSE values and high R² values. These findings indicate that XGBoost's ensemble learning mechanism is highly effective in capturing SHC patterns, making it an optimal choice for energy demand forecasting under varying conditions.

Discussion Feature Set 2

In Feature Set 2, XGBoost is the highest-scoring model across all weather stations when using meteorological features such as temperature, humidity, and pressure. This superior performance is primarily due to XGBoost's ability to handle missing data effectively, optimize decision trees through gradient boosting, and assign different weights to features, thereby improving robustness. Unlike traditional machine learning models that rely on a fixed set of assumptions, XGBoost adapts dynamically to patterns in the data, capturing complex dependencies and nonlinear relationships more effectively. In comparison, deep learning models such as LSTM and GRU exhibit lower predictive performance in this study; their weaker results may be attributed to the limited dataset size and the absence of sequential dependencies in the features, which reduce the effectiveness of recurrent architectures for this task. We therefore conclude that XGBoost makes full use of the information in Feature Set 2, yielding highly accurate SHC predictions across the different weather stations.
The best-performing weather station varies depending on the feature division. However, the BWS tends to exhibit the best results across different feature divisions. This indicates that the BWS has more representative or informative data for predicting SHC than other weather stations in the dataset.
Across the feature divisions, XGBoost consistently performs exceptionally well. Notably, the 'All Features' division is the most effective, indicating that the comprehensive set of features in Feature Set 2 yields more precise SHC predictions than the subsets of 3 or 7 features. Training a model on all available features likely enriches the data representation, providing a deeper understanding and more accurate modeling of complex heating behaviors.
To minimize the risk of overfitting, cross-validation techniques were rigorously applied during model training. We employed k-fold cross-validation to ensure that models generalized well to unseen data and to prevent excessive reliance on any particular subset of the dataset. Furthermore, XGBoost’s built-in regularization mechanisms, such as L1 and L2 penalties, help control model complexity, reducing overfitting risks while maintaining high predictive accuracy. The robustness of XGBoost across different feature sets and weather stations further supports the model’s ability to generalize well beyond the training data.
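For illustration, the sketch below shows how such L1/L2 penalties are exposed in XGBoost's Python API; the specific values are assumptions, not the tuned settings from this study:

```python
from xgboost import XGBRegressor

# Illustrative configuration with explicit regularization terms.
model = XGBRegressor(
    objective="reg:squarederror",
    n_estimators=500,
    learning_rate=0.05,
    max_depth=5,
    reg_alpha=0.1,    # L1 penalty on leaf weights
    reg_lambda=1.0,   # L2 penalty on leaf weights
    subsample=0.8,    # row subsampling adds further regularization
    random_state=42,
)
```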

4.3. Comparison

Different studies have explored various machine learning techniques for heat load prediction, each utilizing distinct feature sets and methodologies to enhance forecasting accuracy. The MLR-ANN model [50] incorporates historical and current outdoor temperature, wind speed, solar radiation, and seasonal information, achieving a strong RMSE of 82 with an impressive R² of 98.2. However, the absence of reported MSE and MAE values limits a comprehensive evaluation of its performance. Similarly, the Bi-LSTM model [51] leverages both past and future weather information, demonstrating reasonable accuracy with an MAE of 14 and an RMSE of 19. Despite these promising results, the lack of an R² value makes it difficult to assess its overall fit compared to other models.
The Parallel Convolutional Neural Network–Long Short-Term Memory Attention (PCLA) model [52] enhances heat load forecasting by integrating spatial and temporal features using district heater-related variables, weather forecasts, and time factors. This approach results in a lower MSE of 66.2, an MAE of 57.1, and an R² of 94.2, demonstrating strong predictive capabilities. However, this study’s XGBoost model surpasses all previous techniques, achieving the lowest MSE (3), MAE (1.8), and RMSE (5.4), along with the highest R² (99.7). The comparison of this study with existing studies is shown in Table 10.

4.4. Deployment Considerations and Policy Implications

For real-world applications, the deployment of predictive models like XGBoost for SHC forecasting can be integrated into Building Management Systems (BMS) to enhance energy efficiency. The models can be deployed as part of an automated control system that continuously monitors operational and environmental variables, optimizing heating consumption in response to real-time data. Deployment can be achieved through cloud-based solutions or on-premises implementations, where the trained models are embedded in BMS software to provide real-time predictive insights. Additionally, an API-based integration could facilitate seamless communication between the predictive model and existing building automation platforms, ensuring adaptability to different infrastructure settings.
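As a purely hypothetical illustration of such an API-based integration (the endpoint, payload, and file names are our own assumptions), a trained model could be exposed to a BMS as a small web service:

```python
# Hypothetical serving sketch; not part of the study's implementation.
import joblib
import pandas as pd
from fastapi import FastAPI

app = FastAPI()
model = joblib.load("xgboost_shc_model.joblib")  # assumed path to a trained model

@app.post("/predict")
def predict(features: dict) -> dict:
    """Accept one row of operational/weather readings from the BMS as JSON
    and return the predicted space heating consumption."""
    X = pd.DataFrame([features])
    return {"predicted_shc": float(model.predict(X)[0])}
```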
This study serves as a framework for EU institutions to enhance their understanding of building heating demands and develop more efficient, data-driven energy management strategies. Leveraging ML-based heating predictions enables policymakers to create adaptive and resource-efficient in-house solutions that align with EU energy regulations, including the Energy Performance of Buildings Directive (EPBD) and the EU Green Deal’s carbon neutrality goals. Integrating predictive modeling into energy policies facilitates real-time monitoring, proactive energy adjustments, and optimized resource utilization, leading to reduced carbon footprints and the promotion of sustainable energy practices.

5. Limitations of the Study and Models

Despite the strong predictive performance of XGBoost and other models evaluated in this study, several limitations must be acknowledged. First, the generalizability of the models is constrained by the specific dataset used, which is derived from the European Central Bank Headquarters. While the models demonstrated high accuracy within this setting, their effectiveness in other buildings with different structural characteristics, occupancy patterns, and heating systems remains uncertain. Second, although weather data from multiple stations were considered, variations in local microclimates and unaccounted environmental factors may impact the accuracy of predictions. Additionally, while XGBoost outperformed other models, its computational complexity and higher training time compared to simpler models such as Decision Trees or Linear Regression could be a limiting factor in real-time applications or scenarios with limited computational resources. Moreover, deep learning models like LSTM and GRU, despite their ability to capture temporal dependencies, exhibited lower performance due to limited historical data, indicating the need for larger datasets to leverage their full potential. Finally, the study primarily focused on supervised learning approaches, leaving room for future research to explore hybrid models incorporating unsupervised learning techniques for feature extraction or reinforcement learning for adaptive energy management strategies. Addressing these limitations through expanded datasets, feature engineering, and model optimization can further enhance the robustness and applicability of SHC prediction models.

6. Conclusions

This study employed various combinations of feature sets to investigate their impact on the accuracy of predictive models for SHC across different weather stations. Through comprehensive evaluations, we identified XGBoost as the top-performing model across all feature divisions and weather stations. XGBoost demonstrated superior predictive accuracy, effectively utilizing the comprehensive information in both feature sets to enhance prediction performance. For instance, at the BWS, XGBoost achieved an MSE of 2.1923, MAE of 8.0211, RMSE of 14.8064, and an R² value of 97.851 in the 3-feature set, showcasing its robustness. Similarly, at weather station 1420, XGBoost excelled with an MSE of 2.5, MAE of 8.5, RMSE of 15.7, and an R² value of 97.6. Our analysis revealed that the BWS generally exhibited the best results, indicating its potential for providing more representative data for SHC prediction. Moreover, the All Features division consistently outperformed the 3- and 7-feature subsets, emphasizing the importance of utilizing comprehensive feature sets to capture intricate relationships within the data: across all weather stations, the All Features division resulted in lower error metrics, with XGBoost achieving its highest accuracy levels.
This study focuses on a single building due to data availability, but the methodology is designed to be generalizable to other office buildings with similar heating demand characteristics. The selected building, consisting of two interconnected skyscrapers, presents a unique case with complex heating dynamics influenced by its architectural design and urban environment. The proposed approach is highly scalable, as it can be implemented in other buildings with access to granular heating consumption data from Building Management Systems (BMS). These systems collect detailed operational data, which, when combined with weather data obtained from an open-source website, enables easy adaptation to various buildings and locations. However, the accuracy of the model relies heavily on the quality and granularity of the data, highlighting the importance of robust data management practices for effective implementation and scalability across different settings.
In future work, we will delve deeper into feature engineering to identify additional variables that could enhance predictive performance. Exploring advanced ensemble techniques or DL architectures explicitly tailored for the SHC prediction task could also be beneficial. Additionally, an analysis of computational efficiency, including training times and model complexity trade-offs, will be conducted to assess real-world implementation feasibility. Furthermore, incorporating external factors such as building characteristics, occupancy patterns, or socioeconomic factors could improve the robustness and generalizability of the predictive models. An extended evaluation of model interpretability using SHAP (Shapley Additive Explanations) values will also be explored to identify the most influential predictors, ensuring better transparency and trust in the predictions. Overall, continued research in this domain holds the potential to refine predictive models and contribute to more efficient energy management strategies in office building settings.
Moreover, this study’s findings can extend to residential buildings by adjusting feature selection and model training to account for occupancy patterns, insulation levels, and heating system variations. Residential heating consumption is more dynamic due to diverse user behaviors and seasonal changes. XGBoost’s ability to capture complex relationships between weather, operations, and heating suggests its potential for residential heating forecasts. Integrating smart meters and IoT-based monitoring can further enhance energy efficiency, helping homeowners and policymakers reduce costs and carbon footprints. Future work can validate the model’s adaptability using diverse residential datasets.
This study underscores organizations' ability to develop in-house energy management solutions, enabling them to meet their energy responsibilities autonomously. By aligning with the European EPB Directive's standards for the energy performance of buildings, our approach bridges a critical gap in research, highlighting the practical application of advanced predictive models in enhancing energy efficiency. The insights gained from this study can guide organizations in leveraging their resources to meet stringent energy efficiency measures, ultimately contributing to sustainable energy management and compliance with European standards.

Author Contributions

Conceptualization, F.A.; methodology, F.A.; software, F.A.; validation, M.C. and N.C.-R.; formal analysis, F.A.; investigation, F.A.; resources, F.A.; data curation, F.A.; writing—original draft preparation, F.A.; writing—review and editing, M.C. and N.C.-R.; visualization, F.A.; supervision, M.C. and N.C.-R.; project administration, F.A.; funding acquisition, M.C. and N.C.-R. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by national funds through FCT (Fundação para a Ciência e a Tecnologia) under project UIDB/04152/2020 (DOI: 10.54499/UIDB/04152/2020), Centro de Investigação em Gestão de Informação (MagIC)/NOVA IMS.

Data Availability Statement

Restrictions apply to the availability of these data. The data were obtained from the European Central Bank and are not publicly available. Access to these data is subject to the ECB’s approval. Interested researchers should contact the corresponding author, who will facilitate the request process with the ECB.

Conflicts of Interest

The authors have no competing interests to declare that are relevant to the content of this article.

References

1. EU Council. Directive 2010/31/EU of the European Parliament and of the Council of 19 May 2010 on the Energy Performance of Buildings (Recast); European Commission: Brussels, Belgium, 2010.
2. Mirjalili, M.A.; Aslani, A.; Zahedi, R.; Soleimani, M. A comparative study of machine learning and deep learning methods for energy balance prediction in a hybrid building-renewable energy system. Sustain. Energy Res. 2023, 10, 8.
3. Marín-García, D.; Bienvenido, D.H.; Nieto-Julián, E.; Campos, J.J.M.; Farinha, M.J.O.; Farinha, F. Analysis of the Regulations That Affect Energy Efficiency with Respect to Consumption of HVAC System for Residential Buildings in Southern Spain and Portugal; Springer: Cham, Switzerland, 2019.
4. Bienvenido-Huertas, D.; Sánchez-García, D.; Marín-García, D.; Rubio-Bellido, C. Analysing energy poverty in warm climate zones in Spain through artificial intelligence. J. Build. Eng. 2023, 68, 106116.
5. Zhao, H.X.; Magoulès, F. A review on the prediction of building energy consumption. Renew. Sustain. Energy Rev. 2012, 16, 3586–3592.
6. DOE. DOE-2. Available online: https://www.doe2.com/ (accessed on 22 May 2024).
7. EnergyPlus. Available online: https://energyplus.net/ (accessed on 22 May 2024).
8. TRNSYS. Transient System Simulation Tool. Available online: https://www.trnsys.com/ (accessed on 22 May 2024).
9. Al-Homoud, M.S. Computer-aided building energy analysis techniques. Build. Environ. 2001, 36, 421–433.
10. Yao, R.; Steemers, K. A method of formulating energy load profile for domestic buildings in the UK. Energy Build. 2005, 37, 663–671.
11. Shi, C.; Zheng, J.; Wang, Y.; Gan, C.; Zhang, L.; Sheldon, B.W. Machine Learning-Driven Scattering Efficiency Prediction in Passive Daytime Radiative Cooling. Atmosphere 2025, 16, 95.
12. Nuthakki, S.; Kulkarni, C.S.; Kathiriya, S.; Nuthakki, Y. Artificial Intelligence Applications in Natural Gas Industry: A Literature Review. Int. J. Eng. Adv. Technol. 2024, 13, 64–70.
13. Muhamad, W.N.W.; Zain, M.Y.M.; Wahab, N.; Aziz, N.H.A.; Kadir, R.A. Energy Efficient Lighting System Design for Building. In Proceedings of the ISMS 2010—UKSim/AMSS 2010 International Conference on Intelligent Systems, Modelling and Simulation, Liverpool, UK, 27–29 January 2010; pp. 282–286.
14. Yang, Y.; Hu, G.; Spanos, C.J. Stochastic Optimal Control of HVAC System for Energy-Efficient Buildings. IEEE Trans. Control Syst. Technol. 2022, 30, 376–383.
15. Rocha, P.; Siddiqui, A.; Stadler, M. Improving energy efficiency via smart building energy management systems: A comparison with policy measures. Energy Build. 2015, 88, 203–213.
16. Budler, L.C.; Gosak, L.; Stiglic, G. Review of artificial intelligence-based question-answering systems in healthcare. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 2023, 13, e1487.
17. Hartmann, T.; Moawad, A.; Schockaert, C.; Fouquet, F.; Le Traon, Y. Meta-Modelling Meta-Learning. In Proceedings of the 2019 ACM/IEEE 22nd International Conference on Model Driven Engineering Languages and Systems (MODELS), Munich, Germany, 15–20 September 2019; pp. 300–305.
18. Mostafa, N.; Ramadan, H.S.M.; Elfarouk, O. Renewable energy management in smart grids by using big data analytics and machine learning. Mach. Learn. Appl. 2022, 9, 100363.
19. Liu, X.; Ding, Y.; Tang, H.; Xiao, F. A data mining-based framework for the identification of daily electricity usage patterns and anomaly detection in building electricity consumption data. Energy Build. 2021, 231, 110601.
20. Almuhaini, S.H.; Sultana, N. Forecasting Long-Term Electricity Consumption in Saudi Arabia Based on Statistical and Machine Learning Algorithms to Enhance Electric Power Supply Management. Energies 2023, 16, 2035.
21. Behrang, M.A.; Assareh, E.; Assari, M.R.; Ghanbarzadeh, A. Using bees algorithm and artificial neural network to forecast world carbon dioxide emission. Energy Sources Part A Recovery Util. Environ. Eff. 2011, 33, 1747–1759.
22. Nuthakki, S. Conversational AI and LLM's Current and Future Impacts in Improving and Scaling Health Services. Int. J. Comput. Eng. Technol. 2023, 14, 149–155.
23. Cui, X.; Zhu, J.; Jia, L.; Wang, J.; Wu, Y. A novel heat load prediction model of district heating system based on hybrid whale optimization algorithm (WOA) and CNN-LSTM with attention mechanism. Energy 2024, 312, 133536.
24. Xue, P.; Jiang, Y.; Zhou, Z.; Chen, X.; Fang, X.; Liu, J. Multi-step ahead forecasting of heat load in district heating systems using machine learning algorithms. Energy 2019, 188, 116085.
25. Li, X.; Yao, R. A machine-learning-based approach to predict residential annual space heating and cooling loads considering occupant behaviour. Energy 2020, 212, 118676.
26. Jovanović, R.; Sretenović, A.A.; Živković, B.D. Ensemble of various neural networks for prediction of heating energy consumption. Energy Build. 2015, 94, 189–199.
27. Yuan, T.; Zhu, N.; Shi, Y.; Chang, C.; Yang, K.; Ding, Y. Sample data selection method for improving the prediction accuracy of the heating energy consumption. Energy Build. 2018, 158, 234–243.
28. Potočnik, P.; Škerl, P.; Govekar, E. Machine-learning-based multi-step heat demand forecasting in a district heating system. Energy Build. 2021, 233, 110673.
29. Jang, J.; Han, J.; Leigh, S.B. Prediction of heating energy consumption with operation pattern variables for non-residential buildings using LSTM networks. Energy Build. 2022, 255, 111647.
30. Dalipi, F.; Yildirim Yayilgan, S.; Gebremedhin, A. Data-Driven Machine-Learning Model in District Heating System for Heat Load Prediction: A Comparison Study. Appl. Comput. Intell. Soft Comput. 2016, 2016, 3403150.
31. Moradzadeh, A.; Mansour-Saatloo, A.; Mohammadi-Ivatloo, B.; Anvari-Moghaddam, A. Performance evaluation of two machine learning techniques in heating and cooling loads forecasting of residential buildings. Appl. Sci. 2020, 10, 3829.
32. Abdelkader, E.M.; Al-Sakkaf, A.; Ahmed, R. A comprehensive comparative analysis of machine learning models for predicting heating and cooling loads. Decis. Sci. Lett. 2020, 9, 409–420.
33. Shen, Y.; Wei, R.; Xu, L. Energy consumption prediction of a greenhouse and optimization of daily average temperature. Energies 2018, 11, 65.
34. Moradzadeh, A.; Mohammadi-Ivatloo, B.; Abapour, M.; Anvari-Moghaddam, A.; Roy, S.S. Heating and Cooling Loads Forecasting for Residential Buildings Based on Hybrid Machine Learning Applications: A Comprehensive Review and Comparative Analysis. IEEE Access 2022, 10, 2196–2215.
35. Meng, Q.; Xi, Y.; Zhang, X.; Mourshed, M.; Hui, Y. Evaluating multiple parameters dependency of base temperature for heating degree-days in building energy prediction. Build. Simul. 2021, 14, 969–985.
36. Guo, G.; Wang, H.; Bell, D.; Bi, Y.; Greer, K. KNN model-based approach in classification. In Proceedings of On the Move to Meaningful Internet Systems 2003: CoopIS, DOA, and ODBASE, OTM Confederated International Conferences, Catania, Sicily, Italy, 3–7 November 2003; pp. 986–996.
37. Zhang, F.; O'Donnell, L.J. Support Vector Regression; Academic Press: Cambridge, MA, USA, 2020.
38. Myles, A.J.; Feudale, R.N.; Liu, Y.; Woody, N.A.; Brown, S.D. An introduction to decision tree modeling. J. Chemom. 2004, 18, 275–285.
39. Su, X.; Yan, X.; Tsai, C.L. Linear regression. Wiley Interdiscip. Rev. Comput. Stat. 2012, 4, 275–294.
40. Chen, T.; Guestrin, C. XGBoost: A scalable tree boosting system. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, 13–17 August 2016; pp. 785–794.
41. Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32.
42. Natekin, A.; Knoll, A. Gradient boosting machines, a tutorial. Front. Neurorobot. 2013, 7, 21.
43. Solomatine, D.P.; Shrestha, D.L. AdaBoost.RT: A boosting algorithm for regression problems. In Proceedings of the IEEE International Joint Conference on Neural Networks, Budapest, Hungary, 25–29 July 2004; Volume 2, pp. 1163–1168.
44. Gers, F.A.; Schmidhuber, J.; Cummins, F. Learning to forget: Continual prediction with LSTM. Neural Comput. 2000, 12, 2451–2471.
45. Yao, K.; Cohn, T.; Vylomova, K.; Duh, K.; Dyer, C. Depth-Gated Recurrent Neural Networks. arXiv 2015, arXiv:1508.03790.
46. Alaraj, M.; Kumar, A.; Alsaidan, I.; Rizwan, M.; Jamil, M. Energy Production Forecasting from Solar Photovoltaic Plants Based on Meteorological Parameters for Qassim Region, Saudi Arabia. IEEE Access 2021, 9, 83241–83251.
47. Dadhich, M.; Pahwa, M.S.; Jain, V.; Doshi, R. Predictive Models for Stock Market Index Using Stochastic Time Series ARIMA Modeling in Emerging Economy. In Advances in Mechanical Engineering; Springer: Berlin/Heidelberg, Germany, 2021; pp. 281–290.
48. Shrivastava, S.; Bal, P.K.; Ashrit, R.; Sharma, K.; Lodh, A.; Mitra, A.K. Performance of NCUM global weather modeling system in predicting the extreme rainfall events over the central India during the Indian summer monsoon 2016. Model. Earth Syst. Environ. 2017, 3, 1409–1419.
49. Brahimi, T. Using artificial intelligence to predict wind speed for energy application in Saudi Arabia. Energies 2019, 12, 4669.
50. Hua, P.; Wang, H.; Xie, Z.; Lahdelma, R. District heating load patterns and short-term forecasting for buildings and city level. Energy 2024, 289, 129866.
51. Cui, M. District heating load prediction algorithm based on bidirectional long short-term memory network model. Energy 2022, 254, 124283.
52. Chung, W.H.; Gu, Y.H.; Yoo, S.J. District heater load forecasting based on machine learning and parallel CNN-LSTM attention. Energy 2022, 246, 123350.
Figure 1. Trends of heating hours across weather stations in all seasons.

Figure 2. Proposed methodology.
Table 1. Overview of existing studies.

| Ref. | Year | Region | Features | Technique(s) | Model(s) | Limitation | This Study |
|---|---|---|---|---|---|---|---|
| [24] | 2019 | China | Heat load, specific heat capacity of water, total mass flow rate of District Heating System, supply and return temperature at substations | Machine Learning, Deep Learning, Ensemble Learning | Support Vector Regression, Deep Neural Network, Extreme Gradient Boosting | Further exploration is needed for application in model predictive control and DHS operation optimization. | XGBoost also performed well in our study, but its effectiveness varies based on data complexity. |
| [25] | 2020 | China | Exterior wall and window U-value, exterior window, inflation rate, % of time occupants stay at home, infiltration rate | Machine Learning, Deep Learning | SVT, ANN | Suggested sample sizes for training and validation sets may not be universally applicable. | This study addresses feature variability by integrating dynamic environmental factors. |
| [26] | 2015 | Norway | Mean daily outside temperature, daily wind speed, total daily solar radiation, minimum daily temperature, maximum daily temperature, relative humidity, day of the week, month of the year, and SHC of the previous day | Deep Learning | FFNN, RBFN, ANFIS | Complexity of the relationship between input variables and SHC. | This study used tree-based models for better interpretability of complex interactions. |
| [27] | 2018 | China | Meteorological parameters | Deep Learning | BPNN, MLR | Exclusion of significant factors like solar radiation due to limited practical engineering data. | GPR's computational complexity limits its scalability; our approach prioritizes efficiency. |
| [28] | 2021 | Slovenia | Past aggregated statistics, currently available data sampled in 3 h, weather forecast, time, and seasonal cycle information | Machine Learning | GPR | Need for systematic analysis to fine-tune model parameters for specific factors of different DH systems. | Ensemble models provide an alternative that balances accuracy and explainability. |
| [29] | 2022 | South Korea | Building environmental data, outdoor environmental data, patterns of energy consumption | Deep Learning | LSTM | Need for more comprehensive data beyond essential sensor readings for improved model performance. | This study prioritizes real-time data over simulated inputs. |
| [30] | 2016 | Norway | Time of day, forward temperature, return temperature, flow rate, and heat load | Machine Learning, Ensemble Learning | SVR, RF, PLS | Focus on specific meteorological parameters, potentially overlooking other influential factors like wind speed and humidity. | This study goes beyond seasonal variations by considering long-term operational trends. |
| [31] | 2020 | Greece | Relative compactness, surface area, wall area, roof area, overall height, heating area | Machine Learning, Deep Learning | SVR, MLR | Reliance on simulated data may limit real-world applicability. | |
| [32] | 2020 | – | Relative compactness, surface area, wall area, roof area, overall height, orientation, glazing area, and glazing area distribution | Deep Learning | ANN, GRNN, RBNN | Reliance on simulated data and limited consideration of real-world factors may impact applicability. | |
| [33] | 2018 | China | Actuator status data, environmental data, greenhouse real-time heating power | Optimization Algorithms | PSO, GA | Limited applicability due to seasonal variations and complexity of energy exchange in greenhouses. | |
| [34] | 2022 | Greece | Relative compactness, surface area, wall area, roof area, overall height, heating area | Machine Learning, Deep Learning | Hybrid model | Complexity and applicability to real-world data beyond energy consumption prediction. | |
Table 2. Correlation of Feature Set 1 (Operational + Environmental), per weather station.

| Feature (7341) | Correlation | Feature (1420) | Correlation | Feature (BWS) | Correlation | Feature (1424) | Correlation |
|---|---|---|---|---|---|---|---|
| Heating water volume m³ | 95.8874 | Heating water volume m³ | 95.8874 | Heating water volume m³ | 95.8874 | Heating water volume m³ | 95.8874 |
| Heating Degree Hours (HDH15) | 77.1261 | humidity temperature | 78.8903 | Heating Degree Hours (HDH15) | 85.2723 | Heating Degree Hours (HDH15) | 80.6293 |
| humidity temperature | 74.1434 | Heating Degree Hours (HDH15) | 77.9527 | air temperature | 79.5549 | wet bulb | 78.6527 |
| air temperature | 72.5457 | dew point | 75.4706 | relative humidity | 39.2373 | air temperature | 75.2487 |
| dew point | 70.2496 | air temperature | 73.2 | Return temperature °C | 35.5769 | dew point | 71.782 |
| vapor pressure | 66.3501 | vapor pressure | 68.0134 | Season | 33.9704 | vapor pressure | 68.0187 |
| absolute humidity | 65.7995 | absolute humidity | 66.9944 | global radiation | 21.1192 | absolute humidity | 67.1705 |
| Return temperature °C | 35.5769 | visibility | 43.3005 | Brightness highest value | 20.1686 | Return temperature °C | 35.5769 |
| Season | 33.9704 | Return temperature °C | 35.5769 | Month | 19.4003 | Season | 33.9704 |
| visibility | 30.4258 | Season | 33.9704 | Year | 8.9028 | relative humidity | 28.3693 |
| relative humidity | 22.5466 | relative humidity | 26.9666 | wind speed | 8.8434 | Month | 19.4003 |
| Month | 19.4003 | sunshine duration | 25.0405 | amount precipitation | 3.8948 | Year | 8.9028 |
| wind speed | 15.1184 | Month | 19.4003 | air pressure | 3.5429 | precipitation yes/no | 7.7676 |
| precipitation yes/no | 12.1546 | precipitation yes/no | 15.133 | Weekday | 2.7477 | Weekday | 2.7477 |
| coverage clouds | 10.2735 | air pressure | 10.9283 | Day | 2.6939 | Day | 2.6939 |
| air pressure | 9.9306 | Year | 8.9028 | Flow temperature °C | 1.9519 | air pressure | 2.3972 |
| Year | 8.9028 | wind speed | 5.6757 | Hour | 1.4639 | Flow temperature °C | 1.9519 |
| highest wind peak | 8.7917 | wind direction | 3.5276 | precipitation yes/no | 1.3629 | Hour | 1.4639 |
| Weekday | 2.7477 | Weekday | 2.7477 | wind direction | 1.1482 | precipitation | 0.4398 |
| Day | 2.6939 | Day | 2.6939 | precipitation | 0.9242 | | |
| Flow temperature °C | 1.9519 | Flow temperature °C | 1.9519 | | | | |
| Hour | 1.4639 | Hour | 1.4639 | | | | |
| precipitation height | 0.9622 | precipitation | 0.925 | | | | |
| wind direction | 0.9327 | | | | | | |
Table 3. Correlation of Feature Set 2, per weather station.

| Feature (7341) | Correlation | Feature (1420) | Correlation | Feature (BWS) | Correlation | Feature (1424) | Correlation |
|---|---|---|---|---|---|---|---|
| Heating Degree Hours (HDH15) | 77.1261 | humidity temperature | 78.8903 | Heating Degree Hours (HDH15) | 85.2723 | Heating Degree Hours (HDH15) | 80.6293 |
| humidity temperature | 74.1434 | Heating Degree Hours (HDH15) | 77.9527 | air temperature | 79.5549 | wet bulb | 78.6527 |
| air temperature | 72.5457 | dew point | 75.4706 | relative humidity | 39.2373 | air temperature | 75.2487 |
| dew point | 70.2496 | air temperature | 73.2 | Season | 33.9704 | dew point | 71.782 |
| vapor pressure | 66.3501 | vapor pressure | 68.0134 | global radiation | 21.1192 | vapor pressure | 68.0187 |
| absolute humidity | 65.7995 | absolute humidity | 66.9944 | Brightness highest value | 20.1686 | absolute humidity | 67.1705 |
| Season | 33.9704 | visibility | 43.3005 | Month | 19.4003 | Season | 33.9704 |
| visibility | 30.4258 | Season | 33.9704 | Year | 8.9028 | relative humidity | 28.3693 |
| relative humidity | 22.5466 | relative humidity | 26.9666 | wind speed | 8.8434 | Month | 19.4003 |
| Month | 19.4003 | sunshine duration | 25.0405 | amount precipitation | 3.8948 | Year | 8.9028 |
| wind speed | 15.1184 | Month | 19.4003 | air pressure | 3.5429 | precipitation yes/no | 7.7676 |
| precipitation yes/no | 12.1546 | precipitation yes/no | 15.133 | Weekday | 2.7477 | Weekday | 2.7477 |
| coverage clouds | 10.2735 | air pressure | 10.9283 | Day | 2.6939 | Day | 2.6939 |
| air pressure | 9.9306 | Year | 8.9028 | Hour | 1.4639 | air pressure | 2.3972 |
| Year | 8.9028 | wind speed | 5.6757 | precipitation yes/no | 1.3629 | Hour | 1.4639 |
| highest wind peak | 8.7917 | wind direction | 3.5276 | wind direction | 1.1482 | precipitation | 0.4398 |
| Weekday | 2.7477 | Weekday | 2.7477 | precipitation | 0.9242 | | |
| Day | 2.6939 | Day | 2.6939 | | | | |
| Hour | 1.4639 | Hour | 1.4639 | | | | |
| precipitation height | 0.9622 | precipitation | 0.925 | | | | |
| wind direction | 0.9327 | | | | | | |
Table 4. Summary of models.

| Method | Description | Advantages | Disadvantages | Peculiarities |
|---|---|---|---|---|
| Machine Learning | | | | |
| KNN [36] | Predicts based on the 'k' most similar data points | Simple and intuitive | Computationally intensive for large datasets | Uses local weather data; sensitive to the choice of k |
| SVR [37] | Finds the optimal hyperplane to minimize prediction errors | Handles non-linear relationships well | Sensitive to parameter selection and kernel choice | Suitable for non-linear effects between inputs and target |
| DT [38] | Recursively divides input space into smaller regions | Easy to interpret; captures non-linear relationships | Prone to overfitting | Provides insight into feature importance |
| Linear Regression [39] | Fits a linear equation to minimize differences between observed and predicted values | Easy to use, interpret, and implement | Limited to linear relationships | Simple and fast; serves as a baseline model |
| Ensemble Learning | | | | |
| XGBoost [40] | Ensemble method combining multiple decision trees | Handles complex non-linear relationships well; robust | Can be computationally expensive | Effective for large datasets; includes regularization |
| RF [41] | Combines predictions from multiple decision trees | Reduces overfitting; handles complex relationships | Less interpretable than single decision trees | Leverages diverse perspectives of individual trees |
| GB [42] | Iteratively trains decision trees to correct previous errors | High predictive accuracy | Prone to overfitting if not finely tuned | Optimizes a loss function; each tree corrects the residuals of the previous |
| AdaBoost [43] | Sequentially trains weak learners, focusing on misclassified instances | Improves model performance iteratively | Sensitive to noisy data | Weights instances based on difficulty to predict correctly |
| Deep Learning | | | | |
| LSTM [44] | Recurrent neural network designed to handle long sequences and temporal dependencies | Suitable for sequential data; handles vanishing gradient problem | Requires significant computational resources | Includes 64 units in the LSTM layer; uses early stopping to prevent overfitting |
| GRU [45] | RNN variant that regulates information flow for modeling temporal dependencies | Simplified architecture compared to LSTM; less computationally intensive | Can still be computationally expensive | Uses 64 units in the GRU layer; early stopping is included to optimize performance |
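To make the deep learning rows concrete, the sketch below mirrors the peculiarities listed for LSTM in Table 4 (64 units, early stopping). The window length, batch size, and patience value are illustrative assumptions, not the study's exact configuration.

```python
import tensorflow as tf

TIMESTEPS, N_FEATURES = 24, 10  # assumed window of 24 hourly steps

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(TIMESTEPS, N_FEATURES)),
    tf.keras.layers.LSTM(64),   # 64 units, as listed in Table 4
    tf.keras.layers.Dense(1),   # single-value SHC regression output
])
model.compile(optimizer="adam", loss="mse")

# Early stopping halts training once validation loss stops improving
# and restores the best weights, as described in the table.
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=10, restore_best_weights=True
)
# model.fit(X_train_seq, y_train_seq, validation_split=0.2,
#           epochs=200, batch_size=32, callbacks=[early_stop])
```

Swapping tf.keras.layers.LSTM(64) for tf.keras.layers.GRU(64) yields the GRU variant from the same table.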
Table 5. Feature divisions for experiments.

| Station | Features | Feature Set 1 | Feature Set 2 |
|---|---|---|---|
| 1420 | 3 Features | Heating water volume m³, Heating Degree Hours (HDH15), humidity temperature | Heating Degree Hours (HDH15), dew point, humidity temperature |
| 1420 | 7 Features | Heating water volume m³, air temperature, Heating Degree Hours (HDH15), dew point, absolute humidity, vapor pressure, humidity temperature | air temperature, Heating Degree Hours (HDH15), dew point, absolute humidity, vapor pressure, humidity temperature, visibility |
| 1420 | All | All the features included in Table 2 | All the features included in Table 3 |
| 1424 | 3 Features | Heating water volume m³, Heating Degree Hours (HDH15), wet bulb | air temperature, Heating Degree Hours (HDH15), wet bulb |
| 1424 | 7 Features | Heating water volume m³, air temperature, Heating Degree Hours (HDH15), absolute humidity, vapor pressure, dew point, wet bulb | Season, air temperature, Heating Degree Hours (HDH15), absolute humidity, vapor pressure, dew point, wet bulb |
| 1424 | All | All the features included in Table 2 | All the features included in Table 3 |
| 7341 | 3 Features | Heating water volume m³, Heating Degree Hours (HDH15), humidity temperature | air temperature, Heating Degree Hours (HDH15), humidity temperature |
| 7341 | 7 Features | Heating water volume m³, air temperature, Heating Degree Hours (HDH15), absolute humidity, vapor pressure, dew point, wet bulb | Season, air temperature, Heating Degree Hours (HDH15), absolute humidity, vapor pressure, humidity temperature, dew point |
| 7341 | All | All the features included in Table 2 | All the features included in Table 3 |
| BWS | 3 Features | Heating water volume m³, air temperature, Heating Degree Hours (HDH15) | air temperature, relative humidity, Heating Degree Hours (HDH15) |
| BWS | 7 Features | Season, Return temperature °C, Heating water volume m³, air temperature, relative humidity, global radiation, Heating Degree Hours (HDH15) | Month, Season, air temperature, relative humidity, radiation, Brightness highest value, Heating Degree Hours (HDH15) |
| BWS | All | All the features included in Table 2 | All the features included in Table 3 |
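The divisions above follow directly from the correlation rankings in Tables 2 and 3. A minimal sketch of that selection step, assuming the merged dataset from earlier and a hypothetical heating_consumption target column, might look like this:

```python
import pandas as pd

def top_k_features(df: pd.DataFrame, target: str, k: int) -> list[str]:
    """Rank features by absolute correlation with the target and keep k."""
    corr = df.corr(numeric_only=True)[target].abs().drop(target)
    return corr.sort_values(ascending=False).head(k).index.tolist()

# e.g., reproduce the 3- and 7-feature divisions for one station:
# three = top_k_features(dataset, "heating_consumption", 3)
# seven = top_k_features(dataset, "heating_consumption", 7)
```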
Table 6. Evaluation of models on the 1420 and 1424 weather station datasets using Feature Set 1 (3F = 3 Features, 7F = 7 Features, All = All Features).

Station 1420

| Model | MSE (3F) | MAE (3F) | RMSE (3F) | R² (3F) | MSE (7F) | MAE (7F) | RMSE (7F) | R² (7F) | MSE (All) | MAE (All) | RMSE (All) | R² (All) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Machine Learning | | | | | | | | | | | | |
| KNN | 3.06 | 9.22 | 17.49 | 97 | 3.08 | 9.47 | 17.5 | 96.9 | 3.06 | 9.2 | 17.4 | 97 |
| SVR | 3.67 | 11 | 19.17 | 96.4 | 4.08 | 11.02 | 20.1 | 96 | 3.67 | 11 | 19.1 | 96.4 |
| DT | 4.86 | 11.42 | 22.05 | 95.24 | 4.8 | 11.26 | 21.9 | 95 | 4.88 | 11.4 | 22.1 | 95.2 |
| Linear Regression | 6.81 | 14.37 | 26.1 | 93.32 | 6.67 | 13.89 | 25.8 | 93.4 | 6.81 | 14.3 | 26.1 | 93.3 |
| Ensemble Learning | | | | | | | | | | | | |
| XGBoost | 2.5 | 8.5 | 15.7 | 97.6 | 2.4 | 8.3 | 15.5 | 97.7 | 2.4 | 8.4 | 15.7 | 97.5 |
| RF | 2.8 | 8.9 | 16.6 | 97.3 | 2.5 | 8.5 | 15.9 | 97.5 | 2.7 | 8.8 | 16.6 | 97.2 |
| GB | 2.8 | 9.5 | 16.8 | 97.2 | 2.9 | 9.7 | 17 | 97.2 | 2.8 | 9.5 | 16.8 | 97.2 |
| AdaBoost | 7.8 | 20.8 | 27.9 | 92.4 | 8.7 | 22.3 | 29.5 | 91.5 | 9.1 | 22.8 | 30.1 | 91 |
| Deep Learning | | | | | | | | | | | | |
| LSTM | 3.2 | 9.5 | 17.9 | 96.9 | 3.2 | 10 | 18 | 96.7 | 9.3 | 20.1 | 30.6 | 90.7 |
| GRU | 3.3 | 10 | 18.3 | 96.7 | 3096.117.497 | | | | 8.1 | 18.5 | 28.5 | 92 |

Station 1424

| Model | MSE (3F) | MAE (3F) | RMSE (3F) | R² (3F) | MSE (7F) | MAE (7F) | RMSE (7F) | R² (7F) | MSE (All) | MAE (All) | RMSE (All) | R² (All) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Machine Learning | | | | | | | | | | | | |
| KNN | 3.1 | 9.1 | 17.5 | 97 | 3 | 9.3 | 17.3 | 97.1 | 0.7 | 3 | 8.3 | 99.3 |
| SVR | 3.5 | 10.6 | 18.6 | 96.6 | 3.8 | 10.8 | 19.5 | 96.3 | 1.6 | 6.6 | 12.7 | 98.4 |
| DT | 4.3 | 11.2 | 20.7 | 95.8 | 4.2 | 11 | 20.4 | 95.9 | 2.4 | 9.6 | 15.6 | 97.6 |
| Linear Regression | 6.5 | 13.9 | 25.6 | 93.6 | 6.5 | 13.7 | 25.6 | 93.6 | 5.9 | 13.3 | 24.2 | 94.3 |
| Ensemble Learning | | | | | | | | | | | | |
| XGBoost | 2.5 | 8.5 | 15.9 | 97.5 | 2.4 | 8.4 | 15.4 | 97.7 | 0.4 | 1.9 | 6.3 | 99.6 |
| RF | 2.6 | 8.8 | 16.2 | 97.4 | 2.5 | 8.6 | 15.8 | 97.5 | 0.4 | 2.6 | 6.4 | 99.6 |
| GB | 2.8 | 9.6 | 16.8 | 97.2 | 2.8 | 9.6 | 16.8 | 97.2 | 1.1 | 5.2 | 10.3 | 99 |
| AdaBoost | 11.3 | 26.8 | 33.6 | 88.9 | 11.3 | 27.6 | 33.6 | 89 | 10.2 | 27.9 | 31.9 | 89.6 |
| Deep Learning | | | | | | | | | | | | |
| LSTM | 3.2 | 9.5 | 17.9 | 96.9 | 3.2 | 10 | 18 | 96.7 | 9.3 | 20.2 | 30.6 | 90.8 |
| GRU | 3.3 | 10 | 18.3 | 96.7 | 3096.117.497 | | | | 7.3 | 17.7 | 27 | 92 |
Table 7. Evaluation of models on the 7341 and BWS weather station datasets using Feature Set 1 (3F = 3 Features, 7F = 7 Features, All = All Features).

Station 7341

| Model | MSE (3F) | MAE (3F) | RMSE (3F) | R² (3F) | MSE (7F) | MAE (7F) | RMSE (7F) | R² (7F) | MSE (All) | MAE (All) | RMSE (All) | R² (All) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Machine Learning | | | | | | | | | | | | |
| KNN | 2.5 | 8.6 | 16 | 97.5 | 1.7 | 7.3 | 12.9 | 98.4 | 0.7 | 3.1 | 8.2 | 99.3 |
| SVR | 2.8 | 10 | 16.7 | 97.3 | 1.7 | 7 | 13.2 | 98.3 | 1.4 | 6.3 | 11.7 | 98.7 |
| DT | 3.6 | 10.4 | 18.8 | 96.5 | 2.1 | 6.7 | 14.4 | 98 | 3.1 | 10.9 | 17.7 | 96.9 |
| Linear Regression | 6.3 | 14.1 | 25 | 93.9 | 5.8 | 14 | 24 | 94.3 | 5.6 | 13.9 | 23.8 | 94.5 |
| Ensemble Learning | | | | | | | | | | | | |
| XGBoost | 2.2 | 8 | 14.8 | 97.9 | 1.1 | 4.9 | 10.6 | 98.9 | 0.3 | 1.8 | 5.4 | 99.7 |
| RF | 2.3 | 8.8 | 15.1 | 97.8 | 1.2 | 5.3 | 10.8 | 98.9 | 0.4 | 2.5 | 6.1 | 99.6 |
| GB | 2.5 | 8.8 | 15.9 | 97.5 | 1.7 | 7.2 | 13.1 | 98.3 | 1 | 5.1 | 10 | 99 |
| AdaBoost | 7.7 | 20.9 | 27.7 | 92.5 | 9 | 25.4 | 30.1 | 91.1 | 10.4 | 28.1 | 32.3 | 89.8 |
| Deep Learning | | | | | | | | | | | | |
| LSTM | 3.2 | 9.5 | 17.9 | 96.9 | 3.2 | 10 | 18 | 96.7 | 8.8 | 20 | 29.8 | 91.2 |
| GRU | 3.3 | 10 | 18.3 | 96.7 | 3096.117.497 | | | | 8 | 19.2 | 28.3 | 92.1 |

Station BWS

| Model | MSE (3F) | MAE (3F) | RMSE (3F) | R² (3F) | MSE (7F) | MAE (7F) | RMSE (7F) | R² (7F) | MSE (All) | MAE (All) | RMSE (All) | R² (All) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Machine Learning | | | | | | | | | | | | |
| KNN | 3.3 | 9.5 | 18.1 | 96.8 | 3 | 9.3 | 17.3 | 97.1 | 0.7 | 3 | 8.1 | 99.4 |
| SVR | 4.1 | 11.2 | 20.2 | 96 | 3.8 | 10.8 | 19.5 | 96.3 | 1.6 | 6.6 | 12.7 | 98.4 |
| DT | 5.2 | 11.9 | 22.8 | 94.9 | 4.2 | 11 | 20.4 | 95.9 | 2.4 | 9.6 | 15.6 | 97.6 |
| Linear Regression | 3.3 | 9.5 | 18.1 | 96.8 | 3 | 9.3 | 17.3 | 97.1 | 0.7 | 3 | 8.1 | 99.4 |
| Ensemble Learning | | | | | | | | | | | | |
| XGBoost | 2.9 | 8.9 | 17 | 97.2 | 2.4 | 8.4 | 15.4 | 97.7 | 0.4 | 1.9 | 6.3 | 99.6 |
| RF | 3.1 | 9.9 | 17.6 | 97 | 2.5 | 8.6 | 15.8 | 97.5 | 0.4 | 2.6 | 6.4 | 99.6 |
| GB | 3.1 | 9.2 | 17.7 | 96.9 | 2.8 | 9.6 | 16.8 | 97.2 | 1.1 | 5.2 | 10.3 | 99 |
| AdaBoost | 9.1 | 22.2 | 30.2 | 91.1 | 9.9 | 23.8 | 31.4 | 90.3 | 10.2 | 27.1 | 32 | 89.9 |
| Deep Learning | | | | | | | | | | | | |
| LSTM | 3.2 | 9.5 | 17.9 | 96.9 | 3.2 | 10 | 18 | 96.7 | 9.9 | 21 | 31.5 | 90.2 |
| GRU | 3.3 | 10 | 18.3 | 96.7 | 3096.117.497 | | | | 7.4 | 18.1 | 27.3 | 92.6 |
Table 8. Evaluation of models on the 1420 and 1424 weather station datasets using Feature Set 2 (3F = 3 Features, 7F = 7 Features, All = All Features).

Station 1420

| Model | MSE (3F) | MAE (3F) | RMSE (3F) | R² (3F) | MSE (7F) | MAE (7F) | RMSE (7F) | R² (7F) | MSE (All) | MAE (All) | RMSE (All) | R² (All) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Machine Learning | | | | | | | | | | | | |
| KNN | 33.2 | 37.6 | 57.6 | 67.4 | 29.9 | 35.2 | 54.7 | 70.7 | 13.3 | 21.3 | 36.5 | 86.9 |
| SVR | 31.9 | 38.6 | 56.5 | 68.7 | 29.2 | 36 | 54 | 71.4 | 11.7 | 21.7 | 34.3 | 88.5 |
| DT | 40.4 | 40.9 | 63.6 | 60.4 | 49.4 | 43.4 | 70.3 | 51.6 | 10.2 | 18.4 | 32 | 90 |
| Linear Regression | 36.2 | 44.7 | 60.1 | 64.5 | 33.8 | 41.9 | 58.1 | 66.9 | 29 | 39.7 | 53.9 | 71.6 |
| Ensemble Learning | | | | | | | | | | | | |
| XGBoost | 28.7 | 35.7 | 53.6 | 71.8 | 26 | 33.5 | 51 | 74.5 | 4.6 | 14.1 | 21.4 | 95.5 |
| RF | 34.4 | 38 | 58.7 | 66.3 | 28.8 | 34.5 | 53.7 | 71.7 | 4.7 | 13.2 | 21.8 | 95.3 |
| GB | 30.3 | 37.9 | 55.1 | 70.3 | 27.7 | 36 | 52.6 | 72.9 | 9.1 | 20.6 | 30.1 | 91.1 |
| AdaBoost | 34.8 | 41.6 | 59 | 65.9 | 35.6 | 44.5 | 59.7 | 65.1 | 38.3 | 54.2 | 61.9 | 62.4 |
| Deep Learning | | | | | | | | | | | | |
| LSTM | 31 | 38.1 | 55 | 69.5 | 29.4 | 37.2 | 54.2 | 71.1 | 9.3 | 20.1 | 30.6 | 90.7 |
| GRU | 30.7 | 37.8 | 55 | 69.5 | 27.7 | 35.9 | 52.6 | 72.8 | 8.1 | 18.5 | 28.5 | 92 |

Station 1424

| Model | MSE (3F) | MAE (3F) | RMSE (3F) | R² (3F) | MSE (7F) | MAE (7F) | RMSE (7F) | R² (7F) | MSE (All) | MAE (All) | RMSE (All) | R² (All) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Machine Learning | | | | | | | | | | | | |
| KNN | 39 | 42.3 | 62.4 | 61.8 | 35.2 | 39.6 | 59.4 | 65.5 | 9.9 | 18.2 | 31.5 | 90.3 |
| SVR | 34.2 | 40.1 | 58.5 | 66.4 | 32.2 | 37.8 | 56.7 | 68.5 | 12.6 | 22.5 | 35.5 | 87.7 |
| DT | 44.5 | 44.6 | 66.7 | 56.4 | 52.5 | 46.6 | 72.5 | 48.5 | 9 | 16.9 | 29.9 | 91.2 |
| Linear Regression | 34.6 | 42.6 | 58.8 | 66.1 | 33.3 | 42.1 | 57.7 | 67.4 | 31.4 | 40.9 | 56 | 69.2 |
| Ensemble Learning | | | | | | | | | | | | |
| XGBoost | 34.2 | 40.4 | 58.5 | 66.5 | 31.5 | 38.1 | 56.1 | 69.1 | 4.4 | 13.4 | 20.9 | 95.7 |
| RF | 39.8 | 42.7 | 63 | 61 | 38.2 | 40.7 | 61.8 | 62.6 | 4.3 | 12.4 | 20.7 | 95.8 |
| GB | 33.5 | 40.4 | 57.9 | 67.2 | 30.8 | 38.6 | 55.5 | 69.8 | 8.4 | 19.5 | 29 | 91.8 |
| AdaBoost | 36.4 | 43.8 | 60.3 | 64.3 | 35.3 | 43.9 | 59.4 | 65.4 | 31.8 | 47.7 | 56.4 | 68.8 |
| Deep Learning | | | | | | | | | | | | |
| LSTM | 33 | 40.4 | 58 | 67 | 31.2 | 38.2 | 55.9 | 69.3 | 9.36 | 20.2 | 30.6 | 90.8 |
| GRU | 33.5 | 40.4 | 57.9 | 67 | 31.1 | 38.7 | 55.7 | 69.4 | 7.3 | 17.7 | 27.1 | 92.7 |
Table 9. Evaluation of models on the 7341 and BWS datasets using Feature Set 2 (3F = 3 Features, 7F = 7 Features, All = All Features).

Station 7341

| Model | MSE (3F) | MAE (3F) | RMSE (3F) | R² (3F) | MSE (7F) | MAE (7F) | RMSE (7F) | R² (7F) | MSE (All) | MAE (All) | RMSE (All) | R² (All) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Machine Learning | | | | | | | | | | | | |
| KNN | 42.5 | 44 | 65.2 | 58.4 | 35.5 | 39.7 | 59.6 | 65.2 | 15.4 | 23.4 | 39.2 | 84.9 |
| SVR | 38 | 41.8 | 61.6 | 62.7 | 32.8 | 38.1 | 57.3 | 67.8 | 12.4 | 22.4 | 35.1 | 87.9 |
| DT | 48.6 | 46.3 | 69.7 | 52.3 | 52.4 | 46.3 | 72.4 | 48.6 | 11.4 | 19.1 | 33.7 | 88.8 |
| Linear Regression | 40.3 | 45.7 | 63.5 | 60.5 | 36.8 | 44.8 | 60.6 | 64 | 31.7 | 41.5 | 56.3 | 68.9 |
| Ensemble Learning | | | | | | | | | | | | |
| XGBoost | 37.1 | 41.8 | 60.9 | 63.6 | 31.9 | 38.1 | 56.5 | 68.7 | 4.7 | 14 | 21.7 | 95.4 |
| RF | 42.6 | 44 | 65.3 | 58.2 | 37.7 | 40.4 | 61.4 | 63 | 5 | 13.6 | 22.4 | 95.1 |
| GB | 36.8 | 42.4 | 60.7 | 63.9 | 32.1 | 39.4 | 56.7 | 68.5 | 9.2 | 20.7 | 30.3 | 91 |
| AdaBoost | 40.4 | 45.9 | 63.6 | 60.4 | 39.3 | 47.4 | 62.7 | 61.5 | 43.4 | 59.3 | 65.9 | 57.4 |
| Deep Learning | | | | | | | | | | | | |
| LSTM | 37.8 | 42.9 | 61.5 | 62.8 | 32.4 | 39.1 | 56.9 | 68.2 | 9.9 | 21 | 31.5 | 90.2 |
| GRU | 38.1 | 43.7 | 61.8 | 62.5 | 32.1 | 39.2 | 56.6 | 68.5 | 7.4 | 18.1 | 27.3 | 92.6 |

Station BWS

| Model | MSE (3F) | MAE (3F) | RMSE (3F) | R² (3F) | MSE (7F) | MAE (7F) | RMSE (7F) | R² (7F) | MSE (All) | MAE (All) | RMSE (All) | R² (All) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Machine Learning | | | | | | | | | | | | |
| KNN | 31.1 | 36.9 | 55.7 | 69.5 | 22.4 | 29.4 | 47.3 | 78.1 | 11.2 | 19.6 | 33.5 | 89 |
| SVR | 27.3 | 35.4 | 52.3 | 73.2 | 22.2 | 29.7 | 47.2 | 78.2 | 11.1 | 20.7 | 33.3 | 89.1 |
| DT | 45.3 | 43.3 | 67.3 | 55.6 | 30.5 | 32 | 55.2 | 70.1 | 7.1 | 15 | 26.6 | 93.1 |
| Linear Regression | 27.7 | 36.7 | 52.6 | 72.8 | 26.7 | 36.8 | 51.7 | 73.8 | 25.3 | 36.1 | 50.3 | 75.2 |
| Ensemble Learning | | | | | | | | | | | | |
| XGBoost | 27.6 | 35.3 | 52.5 | 73 | 18 | 26.9 | 42.5 | 82.3 | 3.6 | 12.1 | 19 | 96.5 |
| RF | 32.4 | 37.6 | 57 | 68.2 | 17.4 | 25.5 | 41.7 | 83 | 3.5 | 11 | 18.6 | 96.6 |
| GB | 26.8 | 35.5 | 51.8 | 73.7 | 19.2 | 28.5 | 43.8 | 81.2 | 7.5 | 18.4 | 27.4 | 92.6 |
| AdaBoost | 28.1 | 37.7 | 53 | 72.4 | 26.1 | 38.1 | 51.1 | 74.4 | 30.7 | 47.8 | 55.4 | 69.9 |
| Deep Learning | | | | | | | | | | | | |
| LSTM | 27.2 | 35.7 | 52.2 | 73.2 | 23.6 | 32.3 | 48.6 | 76.7 | 8.89 | 20 | 29.8 | 91.2 |
| GRU | 27.8 | 36.4 | 52.8 | 72.6 | 21.3 | 30.3 | 46.2 | 79 | 8.05 | 19.2 | 28.3 | 92.1 |
Table 10. Comparison of this study with existing studies.

| Ref. | Technique Name | Features | MSE | MAE | RMSE | R² |
|---|---|---|---|---|---|---|
| Hua et al. [50] | MLR-ANN | Historical and current outdoor temperature, wind speed, solar radiation, and season | – | – | 82 | 98.2 |
| Cui [51] | Bi-LSTM | Past and future weather information | – | 14 | 19 | – |
| Chung et al. [52] | Parallel Convolutional Neural Network–Long Short-Term Memory Attention (PCLA) | District heater-related variables, heat load-derived variables, weather forecasts, time factors | 66.2 | 57.1 | – | 94.2 |
| This study | XGBoost | Operational and environmental features | 0.3 | 1.8 | 5.4 | 99.7 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
