Data-Driven Prediction of Metal Fatigue Life
DOI: https://doi.org/10.54691/20jg0s64

Keywords: Additive Manufacturing; Fatigue Life Prediction; Hybrid Leader-Based Optimization; Deep Learning; GRU and Transformer Models.

Abstract
With the rapid development of additive manufacturing (AM) technology, its applications in the aerospace, medical, and automotive industries are becoming increasingly widespread. This paper investigates fatigue life prediction for additively manufactured metallic materials and proposes a parallel deep learning architecture that combines Gated Recurrent Unit (GRU) and Transformer models, with hyperparameters tuned by Hybrid Leader-Based Optimization (HLBO) to improve prediction accuracy. The model passes the input through the GRU and Transformer branches in parallel and fuses their outputs, leveraging the respective strengths of the two branches to learn features from both local and global perspectives and thereby improve fatigue life prediction. The paper first reviews the theoretical background of the GRU and Transformer, then presents the proposed parallel architecture in detail. The core principles of the HLBO algorithm are introduced, along with its application to hyperparameter optimization. Experimental results show that the proposed parallel deep learning model significantly outperforms traditional single deep learning models in prediction accuracy and generalization ability. The HLBO algorithm is further evaluated on benchmark test functions such as the Ackley and Rosenbrock functions to validate its global and local search capabilities; the results demonstrate that HLBO handles high-dimensional, multi-modal optimization problems well and effectively enhances the predictive capability of the model. Finally, the model is trained and validated on the FatigueData-AM2022 dataset. The results show that the proposed model adapts well to different additively manufactured metallic materials and predicts their fatigue life with high precision, providing an effective tool for the fatigue life assessment of additively manufactured components.
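To make the described architecture concrete, the following is a minimal sketch (not the authors' code) of a parallel GRU + Transformer regressor in PyTorch; the layer sizes, feature count, pooling choices, and class name are illustrative assumptions rather than the configuration used in the paper.

import torch
import torch.nn as nn

class ParallelGRUTransformer(nn.Module):
    """Hypothetical parallel GRU + Transformer branches fused for fatigue life regression."""
    def __init__(self, n_features, hidden=64, n_heads=4, n_layers=2):
        super().__init__()
        # GRU branch: learns local, sequential patterns in the input features
        self.gru = nn.GRU(n_features, hidden, num_layers=n_layers, batch_first=True)
        # Transformer branch: self-attention captures global dependencies
        self.proj = nn.Linear(n_features, hidden)
        enc_layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=n_heads, batch_first=True)
        self.transformer = nn.TransformerEncoder(enc_layer, num_layers=n_layers)
        # Fusion head: concatenate both branch outputs and regress (log) fatigue life
        self.head = nn.Sequential(nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, x):                      # x: (batch, seq_len, n_features)
        gru_out, _ = self.gru(x)               # (batch, seq_len, hidden)
        trans_out = self.transformer(self.proj(x))
        # last GRU step (local view) + mean-pooled Transformer output (global view)
        fused = torch.cat([gru_out[:, -1, :], trans_out.mean(dim=1)], dim=-1)
        return self.head(fused)

The Ackley and Rosenbrock functions mentioned for evaluating HLBO are standard optimization benchmarks; a NumPy rendering of their usual forms is given below for reference, with the HLBO algorithm itself omitted since the abstract only describes its evaluation.

import numpy as np

def ackley(x):
    # Multi-modal benchmark; global minimum 0 at the origin
    x = np.asarray(x, dtype=float)
    n = x.size
    return (-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / n))
            - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / n) + 20.0 + np.e)

def rosenbrock(x):
    # Narrow curved valley; global minimum 0 at x = (1, ..., 1)
    x = np.asarray(x, dtype=float)
    return np.sum(100.0 * (x[1:] - x[:-1]**2)**2 + (1.0 - x[:-1])**2)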
References
[1] Abroug F, Monnier A, Arnaud L, Balcaen Y, Dalverny O. High cycle fatigue strength of additively manufactured AISI 316L Stainless Steel parts joined by laser welding. Eng Fract Mech 2022;275:108865.
[2] Yue P, Ma J, Dai CP, Zhang JF, Du W. Probabilistic framework for reliability analysis of gas turbine blades under combined loading conditions. Structures 2023;55:1437–46.
[3] Wu H, Sun C, Xu W, Chen X, Wu X. A novel evaluation method for high cycle and very high cycle fatigue strength. Eng Fract Mech 2023;109482.
[4] Li H, Huang CG, Guedes SC. A real-time inspection and opportunistic maintenance strategies for floating offshore wind turbines. Ocean Eng 2022;256:111433.
[5] Xu S, Zhu SP, Hao YZ, Liao D, Qian G. A new critical plane-energy model for multiaxial fatigue life prediction of turbine disc alloys. Eng Fail Anal 2018;93:55–63.
[6] Zhang W, Wu C, Zhong H, Li Y, Wang L. Prediction of undrained shear strength using extreme gradient boosting and random forest based on Bayesian optimization. Geosci Front 2021;12:469–77.
[7] Agrawal A, Deshpande PD, Cecen A, Basavarsu GP, Choudhary AN, Kalidindi SR. Exploration of data science techniques to predict fatigue strength of steel from composition and processing parameters. Integr Mater Manuf Innov 2014;3:90–108.
[8] Sameen MI, Pradhan B, Lee S. Self-Learning Random Forests Model for Mapping Groundwater Yield in Data-Scarce Areas. Nat Resour Res 2019;28:757–75.
[9] Dehghani M, Trojovský P. Hybrid leader based optimization: a new stochastic optimization algorithm for solving optimization applications. Sci Rep 2022;12:5549.
[10] Zhang Z, Xu Z. Fatigue database of additively manufactured alloys. Sci Data 2023;10:249.
License
Copyright (c) 2025 Frontiers in Sustainable Development

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.






