Proceeding Paper

Prediction of the Characteristics of Concrete Containing Crushed Brick Aggregate †

by Marijana Hadzima-Nyarko 1,*, Miljan Kovačević 2, Ivanka Netinger Grubeša 3 and Silva Lozančić 1
1 Faculty of Civil Engineering and Architecture Osijek, Josip Juraj Strossmayer University of Osijek, Vladimira Preloga 3, 31000 Osijek, Croatia
2 Faculty of Technical Sciences, University of Pristina, Knjaza Milosa 7, 38220 Kosovska Mitrovica, Serbia
3 Department of Construction, University North, 104. Brigade 3, 42000 Varaždin, Croatia
* Author to whom correspondence should be addressed.
† Presented at the 10th International Conference on Time Series and Forecasting, Gran Canaria, Spain, 15–18 July 2024.
Eng. Proc. 2024, 68(1), 24; https://doi.org/10.3390/engproc2024068024
Published: 8 July 2024

Abstract

The construction industry faces the challenge of conserving natural resources while maintaining environmental sustainability. This study investigates the feasibility of using recycled materials, particularly crushed clay bricks, as replacements for conventional aggregates in concrete. The research aims to optimize the performance of both single regression tree models and ensembles of regression trees in predicting concrete properties. The study focuses on optimizing key parameters such as the minimum leaf size in the models. By testing various minimum leaf sizes and ensemble methods such as Random Forest and TreeBagger, the study evaluates metrics including Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE), and the coefficient of determination (R²). The analysis indicates that the most influential factors on concrete characteristics are the concrete's age, the amount of superplasticizer used, and the size of crushed brick particles exceeding 4 mm. Additionally, the water-to-cement ratio significantly impacts the predictions. The regression tree model showed optimal performance with a minimum leaf size of 1, achieving an RMSE of 4.00, an MAE of 2.95, a MAPE of 0.10, and an R² of 0.96.

1. Introduction

The efficient utilization of limited natural resources is vital for environmental preservation. Since aggregates comprise approximately 60 to 75 percent of a concrete’s total volume, reducing natural aggregate consumption will significantly benefit the environment [1].
Earthquakes can produce substantial amounts of construction debris that require proper disposal. This issue can be mitigated by accurately predicting the risk and extent of earthquake damage, followed by the timely and appropriate renovation of susceptible structures. In addition, the EU stipulates that a minimum of 70% of non-hazardous construction waste must be recovered through recycling, preparation for reuse, and other material recovery procedures, a target that is still far from being met.
Therefore, using recycled materials instead of natural aggregates in concrete is important due to their environmentally sustainable nature. Including crushed clay bricks and clay roof tiles as substitutes for traditional aggregates is particularly noteworthy. This approach addresses waste management concerns and contributes to conserving natural resources. Consequently, concrete with recycled brick emerges as a promising and eco-friendly material deserving thorough exploration to uncover its unique properties.
The feasibility of using crushed brick as an aggregate hinges on the material's condition, ensuring it meets the essential aggregate properties and the specific requirements for the final concrete product or the structure in which it will be incorporated. The properties of crushed brick aggregate that differ from those of aggregates from natural deposits are its density, water absorption, and level of contamination.
Numerous countries have recently published insightful studies on the use of recycled brick in concrete. Nevertheless, the majority of the research has concentrated on using recycled brick as a coarse aggregate, with relatively few studies investigating its application as a fine aggregate or in combinations of both types of aggregates. Generally, recycled brick is lighter than traditional aggregates, resulting in a notable reduction in the structure's self-weight when incorporated into concrete. This reduction in self-weight offers benefits in terms of transportation and cost-effectiveness. Several studies indicate that substituting 15–25% of coarse aggregate with recycled brick, or 50% of fine aggregate with recycled brick, leads to concrete with mechanical properties nearly equivalent to those of conventional concrete [1,2,3]. Additionally, recycled brick can enhance concrete's fire resistance due to its favorable electrical conductivity and thermal expansion properties. Furthermore, the strength of concrete containing recycled brick remains unaffected by the strength of the recycled brick itself. Research conducted by Khalaf and DeVenny [4], as well as Rekha and Malasani [5], has shown that concrete made with recycled brick aggregates can have heat resistance properties that are comparable to or even better than those of concrete made with natural aggregates at elevated temperatures. With numerous advantages, recycled brick concrete holds significant potential in the construction industry. One crucial mechanical property is its compressive strength, highlighting the need for further research.
The research by Kim et al. [6] explored the use of machine learning integrated with micromechanics to predict the mechanical behavior of concrete with crushed clay brick aggregates. Through experimental studies evaluating different mix ratios and aggregate replacement rates, it was found that concrete with crushed clay brick typically exhibits lower compressive strength. Their innovative approach combines experimental data with micromechanical models to enhance prediction accuracy, offering insights that could lead to more sustainable construction practices by optimizing recycled materials.
Lang et al. [7] employed artificial intelligence techniques, such as artificial neural networks and multigene genetic programming, to create precise models for predicting the compressive strength and elastic modulus of recycled brick aggregate concrete (RBAC). Their research highlighted that the mechanical properties of RBAC are mainly influenced by the cement paste’s standard strength, the water-to-cement ratio, the sand-to-aggregate mass ratio, the replacement ratio of the recycled brick aggregates, and the mass-weighted water absorption ratio of coarse aggregates. These AI models effectively captured the trends of these variables, providing dependable predictive outcomes. The study achieved accuracy with an RMSE of 2.797 and a MAPE of 0.122.
Thi Mai et al. [8] proposed an effective method for predicting the compressive strength (CS) of recycled brick aggregate concrete (RBAC) using ensemble machine learning (ML) models, including Gradient Boosting, Light Gradient Boosting, AdaBoost, Extreme Gradient Boosting, Stacking, and Voting. Their analysis found that the Stacking model performed the best and most consistently in predicting the compressive strength of recycled brick concrete (RBC). During validation, the Stacking model achieved an average R2 score of 0.88, with RMSE, MAE, and MAPE values of 3.92 MPa, 2.80 MPa, and 0.13, respectively. The model’s robustness was confirmed through stable prediction metrics across 20 simulations. In the testing phase, the Stacking model maintained high reliability, showing R2 scores up to 0.95, an RMSE as low as 2.70 MPa, and an MAE at 2.09 MPa, demonstrating its practical applicability.
This research explores the potential of RBAC in construction applications, assessing its mechanical properties and structural viability. The study employs sophisticated artificial intelligence techniques, focusing on tree-based methods, to create predictive models for the compressive strength of RBAC. These models aim to support the integration of RBAC in engineering applications, promoting both dependability and eco-friendliness. The investigation into RBAC not only supports waste reduction but also advances the utilization of recycled materials in load-bearing applications, aligning with global sustainability objectives.

2. Tree-Based Methods

2.1. Regression Trees

Regression trees, a subset of decision trees, are used for forecasting continuous numerical outcomes. In regression trees, each leaf represents a specific numerical value. Binary recursive splitting, implemented in regression trees, is a process where nodes are split into two child nodes repeatedly, forming a binary tree (Figure 1).
Let the dataset consist of p input variables and one output variable for each of N observations, i.e., $(x_i, y_i)$, where $i = 1, 2, \ldots, N$ and $x_i = (x_{i1}, x_{i2}, \ldots, x_{ip})$. Within the tree-growing algorithm, it is necessary to determine the variables on which the splits are made (splitting variables) and the values at which they are made (split points).
At each node, the dataset is split based on a feature value (input variable) that optimizes a specific criterion. Building regression trees entails identifying the optimal variables and split points to effectively divide the input space into regions. This process involves minimizing a specific mathematical expression (Equation (1)) across all input variables to reduce the sum of the squared differences between the observed and predicted values within these regions [10,11,12,13,14].
$$\min_{j,s}\left[\min_{c_1}\sum_{x_i \in R_1(j,s)} (y_i - c_1)^2 + \min_{c_2}\sum_{x_i \in R_2(j,s)} (y_i - c_2)^2\right] \quad (1)$$
Suppose that the input space is partitioned into M regions $R_1, R_2, \ldots, R_M$.
In this context, the output model assigns a constant value c m to each region, which is represented by the following form (Equation (2)):
$$f(x) = \sum_{m=1}^{M} c_m \, I(x \in R_m) \quad (2)$$
Here, $\hat{c}_m$ represents the mean value of $y_i$ over the region $R_m$ (Equation (3)):
$$\hat{c}_m = \operatorname{average}(y_i \mid x_i \in R_m) \quad (3)$$
After determining the split points, the tree construction process progressively partitions the regions by optimizing at each step, focusing on immediate benefits (greedy approach).
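To make the criterion in Equation (1) concrete, the following MATLAB sketch performs an exhaustive search over splitting variables and split points. It is an illustration only, under assumed names (bestSplitSearch, X, y), not code from the study.

```matlab
% Minimal sketch: exhaustive search for the splitting variable j and split
% point s that minimize Equation (1). X is an N-by-p matrix of inputs and
% y an N-by-1 vector of outputs (hypothetical names).
function [bestVar, bestSplit] = bestSplitSearch(X, y)
    [~, p] = size(X);
    bestErr = inf; bestVar = NaN; bestSplit = NaN;
    for j = 1:p
        candidates = unique(X(:, j));              % candidate split points s
        for s = candidates'
            left  = X(:, j) <= s;                  % region R1(j, s)
            right = ~left;                         % region R2(j, s)
            if ~any(left) || ~any(right), continue; end
            c1 = mean(y(left));  c2 = mean(y(right));   % optimal constants
            err = sum((y(left) - c1).^2) + sum((y(right) - c2).^2);
            if err < bestErr
                bestErr = err; bestVar = j; bestSplit = s;
            end
        end
    end
end
```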

2.2. Bagging and Random Forest

An individual regression tree model may perform well on the training set but might not generalize effectively to the test dataset. Bootstrap aggregation (Bagging) is a technique that can help address this issue.
Given a set of n independent observations $Z_1, Z_2, \ldots, Z_n$, each with a variance of $\sigma^2$, the variance of the mean $\bar{Z}$ of the observations is $\sigma^2/n$. Bootstrap aggregation, or Bagging, requires creating multiple training datasets to lower variance by averaging their outcomes. This is achieved through the bootstrap sampling technique, where repeated samples are drawn with replacement from the original training set. Bagging leverages this method to improve model generalization [10,11,12,13,14].
In this way, B different bootstrap training sets can be generated, and training a model on each of them yields B different regression tree models.
When a model is trained on each bootstrap sample, it produces a prediction function $\hat{f}^{*b}(x)$ at point $x$. By averaging the predictions from all B models, the aggregate prediction function can be expressed as follows (Equation (4)):
$$\hat{f}_{\mathrm{bag}}(x) = \frac{1}{B}\sum_{b=1}^{B} \hat{f}^{*b}(x) \quad (4)$$
Typically, the ensemble consists of several hundred to several thousand models [10].
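A minimal MATLAB sketch of this procedure (Equation (4)), assuming training data (Xtrain, ytrain) and test inputs Xtest are already available; it is illustrative only and not the authors' implementation.

```matlab
% Bagging of regression trees: grow B trees on bootstrap samples and
% average their predictions (assumed variable names).
B = 500;                                  % number of bootstrapped trees
N = size(Xtrain, 1);
preds = zeros(size(Xtest, 1), B);
for b = 1:B
    idx = randsample(N, N, true);         % bootstrap sample with replacement
    tree = fitrtree(Xtrain(idx, :), ytrain(idx));   % tree grown on sample b
    preds(:, b) = predict(tree, Xtest);
end
yhatBag = mean(preds, 2);                 % averaged prediction f_bag(x)
```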
The Random Forest method [10,11,12,13,14] is quite similar to the Bagging method; it is based on regression trees and the segmentation of the input space into simpler regions using binary recursive splitting. When growing each tree, the same greedy algorithm and the same procedure for determining the splitting variables and split points are used.
The Random Forest method distinguishes itself by not utilizing all variables during model generation. Instead, it selects a random subset of variables for each split, resulting in decorrelated regression trees. This approach reduces variance when the ensemble’s results are averaged, leading to a more robust mean prediction.
The formation of different models is based on the bootstrap method, which generates a large number of training sets for building regression tree models. When constructing regression trees (Figure 2), only a random subset of variables is considered at each split, rather than the entire set of available variables; splits are performed based solely on these chosen variables.
If the complete set of input variables were used, the resulting model would be the same as the one generated by the Bagging method.

3. Case Study

A model for forecasting the compressive strength (CS) of concrete over time was developed using a database of 264 concrete samples collected from the literature [2,3,16,17,18,19,20].
The compressive strength CS was analyzed as the output variable as a function of the following input variables: water–cement factor (WC), cement (C), sand (S), fine aggregate (FA), coarse aggregate (CA), superplasticizer (SP), crushed brick aggregate with grains smaller than 4 mm (CB1), and crushed brick aggregate with grains larger than 4 mm (CB2). The total database is divided into approximately 70% of the data (Table 1) for model training and approximately 30% (Table 2) for model testing.
The division into training and testing sets was performed so that both subsets have statistical characteristics that are as similar as possible. Before the final adoption of a model, its generalization properties were examined. Different accuracy measures were used to evaluate different aspects of model accuracy; the quality of the models was assessed using the statistical criteria RMSE, MAE, MAPE, and R.
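As an illustration of the evaluation setup, the sketch below shows a random 70/30 split and the four accuracy measures; the variable names and the purely random split are assumptions, since the study's split was additionally balanced for similar statistics.

```matlab
% Hypothetical 70/30 split of the 264 samples (illustrative only; the
% study's split was selected so both subsets have similar statistics).
n = size(X, 1);                     % X: inputs, y: compressive strength
idx = randperm(n);
nTrain = round(0.7 * n);
Xtrain = X(idx(1:nTrain), :);      ytrain = y(idx(1:nTrain));
Xtest  = X(idx(nTrain+1:end), :);  ytest  = y(idx(nTrain+1:end));

% Accuracy measures used in the study (MAPE is reported as a fraction,
% i.e., MAPE/100, in Tables 3 and 4).
evalMetrics = @(yObs, yPred) [sqrt(mean((yObs - yPred).^2)), ...    % RMSE
                              mean(abs(yObs - yPred)), ...          % MAE
                              mean(abs((yObs - yPred) ./ yObs)), ...% MAPE
                              corr(yObs, yPred)];                   % R
```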

4. Discussion

The MATLAB function fitrtree is used to fit a regression tree to the data. Its default parameters include the squared-error split criterion for node splitting, a minimum of one observation per leaf node, and no limit on the number of splits (tree depth), allowing the tree to grow until all leaves are pure or contain fewer than the minimum number of observations. In this research, the minimum number of data points assigned to a terminal leaf ranges from 1 to 10. The accuracy of the regression tree models is given in Table 3.
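A sketch of this leaf-size sweep, reusing the hypothetical evalMetrics helper and data split from the previous sketch; it is not the exact script used by the authors.

```matlab
% Grow one regression tree per candidate minimum leaf size (1-10) and
% evaluate it on the test set; rows correspond to the format of Table 3.
results = zeros(10, 4);
for leafSize = 1:10
    mdl  = fitrtree(Xtrain, ytrain, 'MinLeafSize', leafSize);
    yhat = predict(mdl, Xtest);
    results(leafSize, :) = evalMetrics(ytest, yhat);   % [RMSE MAE MAPE R]
end
```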
The process of creating the regression tree model was performed algorithmically through the following steps:
  • Initiation: The algorithm starts at the root node with the entire dataset.
  • Calculating squared differences: For each potential split, determine the sum of squared differences between the observed and predicted values as specified in Equation (1).
  • Select the best split: Choose the split that results in the lowest squared error values, partitioning the data into two child nodes.
  • Recursive splitting: Repeat steps 2 and 3 for each child node, continuing the process of splitting until stopping criteria are reached (e.g., when a node has less than a predetermined number of samples).
  • Creation of the final model: After reaching the stopping criteria, the algorithm finalizes the structure of the decision tree. Each terminal node represents a decision or prediction based on the mean values within that node.
The structure of the optimal regression tree model from Table 3 is presented in Appendix A (Figure A1: Optimal regression tree model).
The Bagging (TreeBagger) and Random Forest ensembles were built using the following algorithm:
  1. For b = 1, 2, …, B, where B = 500 is the number of generated regression trees:
    • A bootstrap sample of size N, equal to the size of the original training set, was generated from the training data.
    • A regression tree $T_b$ was grown on this bootstrap sample by recursively repeating the following steps for each node of the tree:
      (a) m variables were chosen at random from the p available variables (with the TreeBagger algorithm, the number of randomly chosen variables equals the total number of variables, while with the Random Forest model it is smaller than the total number of variables);
      (b) the best splitting variable and the split point among the m selected variables were found;
      (c) the node was split into a left and a right part.
  2. A set of trees $\{T_b\}_{1}^{B}$ is formed.
  3. When making a regression prediction, the outputs of the generated trees are averaged:
$$\hat{f}_{\mathrm{RF}}(x) = \frac{1}{B}\sum_{b=1}^{B} T_b(x)$$
When constructing tree submodels for ensembles using the TreeBagger and Random Forest algorithms, the minimum number of data points per terminal leaf varied from 1 to 10 in increments of 1.
In the Random Forest ensemble model, the size of the random subset of input variables considered at each split was varied from 1 to the total number of variables, which in this research is 9. All forecast models were evaluated with respect to the defined accuracy criteria of RMSE, MAE, MAPE, and R (Figure 3).
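The grid search described above could be sketched as follows; variable names are assumptions. Setting NumPredictorsToSample to the full number of predictors corresponds to the TreeBagger (Bagging) variant, while smaller values yield a Random Forest.

```matlab
% Ensembles of 500 trees for every combination of minimum leaf size (1-10)
% and number of randomly sampled predictors per split.
nPredictors = size(Xtrain, 2);
rmse = zeros(nPredictors, 10);
for m = 1:nPredictors
    for leafSize = 1:10
        mdl = TreeBagger(500, Xtrain, ytrain, ...
                         'Method', 'regression', ...
                         'MinLeafSize', leafSize, ...
                         'NumPredictorsToSample', m);
        yhat = predict(mdl, Xtest);
        rmse(m, leafSize) = sqrt(mean((ytest - yhat).^2));  % one criterion
    end
end
```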
Regarding the defined accuracy criteria, in terms of RMSE and MAE the Random Forest model with the following hyperparameters proved to be optimal: 500 trees in the ensemble, 8 variables considered for splitting, and a minimum of 1 data point per terminal leaf (min leaf size). For the MAPE and R criteria, the optimal TreeBagger model had the following hyperparameters: 500 trees in the ensemble, all variables considered for splitting, and a minimum of 1 data point per terminal leaf.
In this research, the importance of predictors in a model is assessed by observing the change in the out-of-bag (OOB) prediction error when the values of each predictor are randomly shuffled (Figure 4). A significant increase in OOB error after permutation indicates a high importance of that predictor. This process is repeated for all predictors to rank their influence on the model’s predictive performance.
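A sketch of this permutation-based importance computation; OOBPermutedPredictorDeltaError is the TreeBagger property holding the increase in out-of-bag error after permuting each predictor, and the variable names are assumptions.

```matlab
% Train an ensemble with out-of-bag importance enabled and inspect the
% increase in OOB error when each predictor is randomly permuted.
mdl = TreeBagger(500, Xtrain, ytrain, 'Method', 'regression', ...
                 'OOBPredictorImportance', 'on');
importance = mdl.OOBPermutedPredictorDeltaError;   % one value per predictor
bar(importance);
xlabel('Predictor'); ylabel('Increase in OOB error');
```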
The optimal values for all models according to the defined criteria are shown in Table 4.

5. Conclusions

This study demonstrates the potential of machine learning methods in promoting sustainable concrete production through the use of crushed brick aggregate. By applying sophisticated tree-based models like regression trees, Random Forest, and TreeBagger, the research successfully predicted the compressive strength of concrete made with recycled brick aggregates.
Among the models tested, the regression tree model emerged as the most precise, with an impressive Root Mean Squared Error (RMSE) of 4.00, Mean Absolute Error (MAE) of 2.95, and Mean Absolute Percentage Error (MAPE) of 0.10, along with a coefficient of determination (R2) of 0.96. These results not only confirm the model’s accuracy, but also demonstrate its practical reliability.
The importance of specific features in predicting concrete characteristics was also a critical part of our analysis. Factors such as the age of the concrete, the addition of superplasticizer, and the size of the crushed bricks proved pivotal. The model’s ability to accurately weigh these variables significantly enhanced its predictive performance.
Using machine learning models in this context supports a more targeted and efficient approach to designing sustainable concrete mixes. The precision of these models offers a promising avenue for future research focusing on optimizing the mix designs for even better environmental and mechanical outcomes.
This research not only advances our understanding of sustainable materials, but also paves the way for more scientifically informed decisions in the construction industry, promoting the broader adoption of recycled materials.

Author Contributions

Conceptualization, M.K. and M.H.-N.; methodology, M.K.; software, M.K.; validation, M.K., M.H.-N., S.L. and I.N.G.; formal analysis, M.K., S.L. and I.N.G.; investigation, M.H.-N.; writing—original draft preparation, M.K. and M.H.-N.; writing—review and editing, M.K., M.H.-N., S.L. and I.N.G.; visualization, M.K.; supervision, M.K. and M.H.-N.; project administration, M.H.-N. and M.K.; funding acquisition, M.H.-N. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the authors.

Acknowledgments

The results presented in this scientific paper have been obtained through the research activities within the projects: 2023-1-HR01-KA220-HED-000165929 “Intelligent Methods for Structures, Elements and Materials” [https://im4stem.eu/en/home/] co-funded by the European Union under the program Erasmus+ KA220-HED—Cooperation partnerships in higher education, and “Sustainable Building Composites” of University North, Croatia.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Figure A1. Optimal regression tree model.

References

  1. Bektas, F.; Wang, K.; Ceylan, H. Effect of crushed brick aggregate on mortar durability. Constr. Build. Mater. 2009, 23, 1909–1914.
  2. Cachim, P.B. Mechanical properties of brick aggregate concrete. Constr. Build. Mater. 2009, 23, 1292–1297.
  3. Khatib, J.M. Properties of concrete incorporating fine recycled aggregate. Cem. Concr. Res. 2005, 35, 763–769.
  4. Khalaf, F.M.; DeVenny, A.S. Recycling of demolished masonry rubble as coarse aggregate in concrete: Review. J. Mater. Civ. Eng. 2004, 16, 331–340.
  5. Rekha, K.; Malasani, P. Response of Recycled Brick Aggregate Concrete to High Temperatures. Int. J. Innov. Technol. Explor. Eng. (IJITEE) 2019, 8, 224–227.
  6. Kim, H.K.; Lim, Y.; Tafesse, M.; Kim, G.M.; Yang, B. Micromechanics-integrated machine learning approaches to predict the mechanical behaviors of concrete containing crushed clay brick aggregates. Constr. Build. Mater. 2022, 317, 125840.
  7. Lang, L.; Xu, J.; Yuan, J.; Yong, Y. Compressive strength and elastic modulus of RBAC: An analysis of existing data and an artificial intelligence based prediction. Case Stud. Constr. Mater. 2023, 18, e02184.
  8. Thi Mai, H.V.; Nguyen, M.H.; Trinh, S.H.; Ly, H.B. Toward improved prediction of recycled brick aggregate concrete compressive strength by designing ensemble machine learning models. Constr. Build. Mater. 2023, 369, 130613.
  9. Kovačević, M.; Lozančić, S.; Nyarko, E.K.; Hadzima-Nyarko, M. Modeling of Compressive Strength of Self-Compacting Rubberized Concrete Using Machine Learning. Materials 2021, 14, 4346.
  10. Hastie, T.; Tibshirani, R.; Friedman, J. The Elements of Statistical Learning; Springer: Berlin/Heidelberg, Germany, 2009.
  11. Breiman, L.; Friedman, J.H.; Olshen, R.A.; Stone, C.J. Classification and Regression Trees, 1st ed.; Chapman and Hall/CRC: Boca Raton, FL, USA, 1984.
  12. Breiman, L. Bagging Predictors. Mach. Learn. 1996, 24, 123–140.
  13. Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32.
  14. Kovačević, M.; Antoniou, F. Machine-Learning-Based Consumption Estimation of Prestressed Steel for Prestressed Concrete Bridge Construction. Buildings 2023, 13, 1187.
  15. Kovačević, M.; Ivanišević, N.; Petronijević, P.; Despotović, V. Construction cost estimation of reinforced and prestressed concrete bridges using machine learning. Građevinar 2021, 73, 727.
  16. Ge, P.; Huang, W.; Zhang, J.; Quan, W.; Guo, Y. Microstructural analysis of recycled brick aggregate concrete modified by silane. Struct. Concr. 2022, 23, 2352–2364.
  17. Janković, K.; Bojović, D.; Nikolić, D.; Lončar, L.; Romakov, Z. Frost resistance of concrete with crushed brick as aggregate. Facta Univ. Ser. Archit. Civ. Eng. 2010, 8, 155–162.
  18. Cavalline, T.L.; Weggel, D.C. Recycled brick masonry aggregate concrete. Struct. Surv. 2013, 31, 160–180.
  19. Dang, J.; Zhao, J. Influence of waste clay bricks as fine aggregate on the mechanical and microstructural properties of concrete. Constr. Build. Mater. 2019, 228, 116757.
  20. Ahmad, S.I.; Hossain, M.A. Water Permeability Characteristics of Normal Strength Concrete Made from Crushed Clay Bricks as Coarse Aggregate. Adv. Mater. Sci. Eng. 2017, 7279138, 1–9.
Figure 1. Segmentation of the input space into distinct regions and the corresponding 3D regression surface represented within the framework of a regression tree [9].
Figure 2. Creating regression tree ensembles using the Bagging algorithm [15].
Figure 3. Assessment of the accuracy of RF and TB models based on the number of randomly selected splitting variables and minimum leaf size: (a) RMSE, (b) MAE, (c) MAPE, (d) R.
Figure 4. Importance of predictors (input variables’ importance).
Table 1. Statistical evaluation of input and output parameters for the training set.
| Variable | Min | Max | Average | Std. | Mode | Count |
|---|---|---|---|---|---|---|
| WC | 0.30 | 1.05 | 0.76 | 0.18 | 0.60 | 185 |
| C [kg/m³] | 300.00 | 514.00 | 414.99 | 63.29 | 400.00 | 185 |
| S [kg/m³] | 0.00 | 847.40 | 280.60 | 313.42 | 0.00 | 185 |
| FA [kg/m³] | 0.00 | 960.00 | 262.71 | 344.01 | 0.00 | 185 |
| CA [kg/m³] | 0.00 | 1309.00 | 718.73 | 488.00 | 0.00 | 185 |
| SP [kg/m³] | 0.00 | 10.00 | 2.48 | 2.99 | 0.00 | 185 |
| CB1 [kg/m³] | 0.00 | 660.00 | 163.13 | 211.23 | 0.00 | 185 |
| CB2 [kg/m³] | 0.00 | 1000.00 | 240.73 | 318.37 | 0.00 | 185 |
| CS [MPa] | 3.00 | 90.00 | 29.52 | 27.05 | 28.00 | 185 |
Table 2. Statistical evaluation of input and output parameters for the testing set.
| Variable | Min | Max | Average | Std. | Mode | Count |
|---|---|---|---|---|---|---|
| WC | 0.30 | 0.64 | 0.48 | 0.10 | 0.60 | 79 |
| C [kg/m³] | 300.00 | 514.00 | 416.81 | 68.66 | 400.00 | 79 |
| S [kg/m³] | 0.00 | 847.40 | 312.08 | 316.09 | 0.00 | 79 |
| FA [kg/m³] | 0.00 | 960.00 | 250.67 | 333.30 | 0.00 | 79 |
| CA [kg/m³] | 0.00 | 1287.78 | 655.45 | 512.68 | 0.00 | 79 |
| SP [kg/m³] | 0.00 | 10.00 | 2.67 | 2.92 | 0.00 | 79 |
| CB1 [kg/m³] | 0.00 | 660.00 | 135.63 | 192.89 | 0.00 | 79 |
| CB2 [kg/m³] | 0.00 | 1013.00 | 296.29 | 372.11 | 0.00 | 79 |
| CS [MPa] | 3.00 | 90.00 | 25.77 | 21.79 | 28.00 | 79 |
Table 3. Accuracy of obtained regression tree models for compressive strength prediction.
| Min Leaf Size | RMSE | MAE | MAPE/100 | R |
|---|---|---|---|---|
| 1 | 4.0018 | 2.9484 | 0.0994 | 0.9643 |
| 2 | 4.0836 | 3.0310 | 0.1031 | 0.9630 |
| 3 | 4.1220 | 3.0524 | 0.1056 | 0.9613 |
| 4 | 4.5594 | 3.1372 | 0.1119 | 0.9525 |
| 5 | 4.6277 | 3.2788 | 0.1194 | 0.9519 |
| 6 | 5.6686 | 3.7764 | 0.1458 | 0.92876 |
| 7 | 6.6324 | 4.1968 | 0.1619 | 0.8960 |
| 8 | 7.3789 | 5.2818 | 0.1874 | 0.8716 |
| 9 | 7.4962 | 5.3644 | 0.1929 | 0.8672 |
| 10 | 6.9874 | 5.0939 | 5.0939 | 0.8841 |
Table 4. Comparative evaluation of the results from various tree-based models for predicting compressive strength (CS).
| Model | RMSE | MAE | MAPE/100 | R |
|---|---|---|---|---|
| Regression tree | 4.0018 | 2.9484 | 0.0994 | 0.9643 |
| TreeBagger | 4.1820 | 2.8387 | 0.1022 | 0.9618 |
| Random Forest | 4.2280 | 2.8483 | 0.1020 | 0.9608 |
