Li, Yixin, Hu, Fu, Liu, Ying (ORCID: https://orcid.org/0000-0001-9319-5940), Ryan, Michael (ORCID: https://orcid.org/0000-0002-8104-0121) and Wang, Ray 2023. A hybrid model compression approach via knowledge distillation for predicting energy consumption in additive manufacturing. International Journal of Production Research 61 (13), pp. 4525-4547. 10.1080/00207543.2022.2160501
PDF - Accepted Post-Print Version (605kB)
Abstract
Recently, energy consumption in additive manufacturing (AM) has received increased attention. By extracting hidden information or highly representative features from energy-relevant data, knowledge distillation (KD) reduces predictive model complexity and computational load. However, because the teacher and student models are largely predetermined and fixed, the conventional distillation process restricts how knowledge can be transferred from one model to the other. To reduce computational costs while maintaining acceptable performance, a teacher assistant (TA) was added to the teacher-student architecture. Firstly, a teacher ensemble combining three baseline models was constructed to enhance accuracy. In the second step, a teacher assistant (TA) was introduced to bridge the capacity gap between the ensemble and the simplified model. As a result, the complexity of the student model was reduced. Using geometry-based features derived from layer-wise image data, a KD-based predictive model was developed, and its feasibility and effectiveness were evaluated against two independently trained student models. In comparison with the independently trained student models, the proposed method achieved the lowest RMSE and MAE, as well as the shortest training time.
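To make the three-step pipeline in the abstract concrete (teacher ensemble, teacher assistant, compact student), the following is a minimal PyTorch sketch of teacher-assistant distillation for a regression target such as energy consumption. Everything here is an illustrative assumption rather than the paper's configuration: the MLP widths, the loss weighting `alpha`, the synthetic data standing in for geometry-based features, and the helper names `mlp`, `fit`, and `distill` are all hypothetical.

```python
# Minimal sketch of teacher-assistant knowledge distillation for regression.
# Assumes PyTorch; all sizes and hyperparameters are illustrative.
import torch
import torch.nn as nn

def mlp(in_dim, hidden, out_dim=1):
    """Small fully connected regressor; width stands in for model capacity."""
    return nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                         nn.Linear(hidden, out_dim))

class TeacherEnsemble(nn.Module):
    """Step 1: average the predictions of three baseline models."""
    def __init__(self, in_dim):
        super().__init__()
        self.models = nn.ModuleList([mlp(in_dim, h) for h in (128, 96, 64)])

    def forward(self, x):
        return torch.stack([m(x) for m in self.models]).mean(dim=0)

def fit(model, loader, epochs=10, lr=1e-3):
    """Plain supervised training for the baseline teachers."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    mse = nn.MSELoss()
    for _ in range(epochs):
        for x, y in loader:
            loss = mse(model(x), y)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model

def distill(student, teacher, loader, alpha=0.5, epochs=10, lr=1e-3):
    """Train `student` on a mix of ground-truth MSE and MSE toward the
    frozen teacher's predictions -- the distillation signal."""
    opt = torch.optim.Adam(student.parameters(), lr=lr)
    mse = nn.MSELoss()
    teacher.eval()
    for _ in range(epochs):
        for x, y in loader:
            with torch.no_grad():
                t_pred = teacher(x)          # teacher output, no gradient
            s_pred = student(x)
            loss = alpha * mse(s_pred, y) + (1 - alpha) * mse(s_pred, t_pred)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return student

if __name__ == "__main__":
    # Synthetic features standing in for the geometry-based features.
    in_dim = 16
    x = torch.randn(256, in_dim)
    y = x.sum(dim=1, keepdim=True) + 0.1 * torch.randn(256, 1)
    loader = torch.utils.data.DataLoader(
        torch.utils.data.TensorDataset(x, y), batch_size=32)

    ensemble = TeacherEnsemble(in_dim)
    for m in ensemble.models:
        fit(m, loader)                               # Step 1: train baselines
    ta = distill(mlp(in_dim, 48), ensemble, loader)  # Step 2: TA bridges the gap
    student = distill(mlp(in_dim, 16), ta, loader)   # Step 3: compact student
```

Distilling in two hops (ensemble to TA, then TA to student) rather than one is the point of the teacher assistant: the intermediate-capacity model narrows the gap the small student must close.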
| Item Type | Article |
|---|---|
| Date Type | Publication |
| Status | Published |
| Schools | Engineering |
| Publisher | Taylor & Francis |
| ISSN | 1366-588X |
| Date of First Compliant Deposit | 20 December 2022 |
| Date of Acceptance | 15 November 2022 |
| Last Modified | 11 Nov 2024 15:15 |
| URI | https://orca.cardiff.ac.uk/id/eprint/155041 |