Fang GH, Lin ZM, Xie CZ, Han QZ, Hong MY, Zhao XY. Optimized Machine Learning Model for Predicting Compressive Strength of Alkali-Activated Concrete Through Multi-Faceted Comparative Analysis. Materials (Basel, Switzerland) 2024;17:5086. [PMID: 39459790] [PMCID: PMC11509232] [DOI: 10.3390/ma17205086]
[Received: 08/04/2024] [Revised: 09/19/2024] [Accepted: 10/16/2024] [Indexed: 10/28/2024]
Abstract
Alkali-activated concrete (AAC), produced from industrial by-products such as fly ash and slag, offers a promising alternative to traditional Portland cement concrete by significantly reducing carbon emissions. However, the inherent variability in AAC formulations makes it difficult to accurately predict compressive strength using conventional approaches. To address this, we leverage machine learning (ML) techniques, which enable more precise strength predictions based on a combination of material properties and mix design parameters. In this study, we curated an extensive dataset comprising 1756 unique AAC mixtures to support robust ML-based modeling. Four distinct input variable schemes were devised to identify the optimal predictor set, and a comparative analysis was performed to evaluate their effectiveness. We then investigated the performance of several popular ML algorithms, including random forest (RF), adaptive boosting (AdaBoost), gradient boosting regression trees (GBRTs), and extreme gradient boosting (XGBoost). Among these, the XGBoost model consistently outperformed its counterparts. To further enhance its predictive accuracy, we applied four state-of-the-art optimization techniques: the Gray Wolf Optimizer (GWO), the Whale Optimization Algorithm (WOA), beetle antennae search (BAS), and Bayesian optimization (BO). The optimized XGBoost model delivered superior performance, achieving a coefficient of determination (R²) of 0.99 on the training set and 0.94 across the entire dataset. Finally, we employed SHapley Additive exPlanations (SHAP) to make the optimized model interpretable, enabling deeper insights into the complex relationships governing AAC formulations. Through the lens of ML, we highlight the benefits of a multi-faceted, synergistic approach to AAC strength prediction, which combines careful input parameter selection, optimal hyperparameter tuning, and enhanced model interpretability.
This integrated strategy improves both the robustness and scalability of the model, offering a clear and reliable prediction of AAC performance.