Article

Predicting Thermal Resistance of Packaging Design by Machine Learning Models

1 Interdisciplinary Program of Education, National Chi Nan University, Nantou 54561, Taiwan
2 Siliconware Precision Industries Co., Ltd., No. 123, Sec. 3, Dafeng Rd., Dafeng Vil., Tanzi Dist., Taichung City 42749, Taiwan
3 Department of Information Management, National Chi Nan University, Nantou 54561, Taiwan
4 PhD Program in Strategy and Development of Emerging Industries, National Chi Nan University, Nantou 54561, Taiwan
* Author to whom correspondence should be addressed.
Micromachines 2025, 16(3), 350; https://doi.org/10.3390/mi16030350
Submission received: 25 February 2025 / Revised: 14 March 2025 / Accepted: 18 March 2025 / Published: 19 March 2025

Abstract

Thermal analysis is an indispensable aspect of semiconductor packaging. Excessive operating temperatures in integrated circuit (IC) packages can degrade component performance and even cause failure. Therefore, thermal resistance and thermal characteristics are critical to the performance and reliability of electronic components. Machine learning modeling offers an effective way to predict the thermal performance of IC packages. In this study, data from finite element analysis (FEA) are utilized by machine learning models to predict thermal resistance during package testing. For two package types, namely the Quad Flat No-lead (QFN) and the Thin Fine-pitch Ball Grid Array (TFBGA), data derived from finite element analysis are employed to predict thermal resistance. The thermal resistance values include θJA, θJB, θJC, ΨJT, and ΨJB. Five machine learning models, namely the light gradient boosting machine (LGBM), random forest (RF), XGBoost (XGB), support vector regression (SVR), and multilayer perceptron regression (MLP), are applied as forecasting models in this study. Numerical results indicate that the XGBoost model outperforms the other models in terms of forecasting accuracy in almost all cases. Furthermore, the forecasting accuracy achieved by the XGBoost model is highly satisfactory. In conclusion, the XGBoost model shows significant promise as a reliable tool for predicting thermal resistance in packaging design. The application of machine learning techniques for forecasting these parameters could enhance the efficiency and reliability of IC packaging designs.

1. Introduction

When electronic equipment operates, the temperature of its materials rises due to the consumption of electrical energy. Additionally, differences in the thermal expansion coefficients and shrinkage properties of the materials cause thermal stress in the joined parts. This heat energy increases the temperature of components, including semiconductor products. Elevated temperatures can compromise the functionality, reliability, and safety of electronic equipment [1,2]. Therefore, it is crucial to ensure that electronic components are maintained at a stable temperature under any environmental conditions. Although heat dissipation performance is critical in semiconductor packaging, the current trend in IC design favors increasingly smaller package sizes. As a result, thermal analysis has become an essential aspect of testing. Designers must accurately forecast the heat generated by semiconductor packaging during system applications, assess the heat dissipation performance of packaging materials and structures, and account for these factors in the packaging design process.
Current thermal simulation methods for electronic products generally use finite element analysis to model airflow, temperature distribution, and heat transfer in IC packages, PCBs, electronic components, housings, and power electronic devices to provide accurate results [3]. However, these simulations may rely on assumptions that are inconsistent with realistic conditions. Additionally, the simulation method requires high-performance hardware, which increases the complexity of the simulation and, as a result, significantly raises the calculation time [4].
Due to the complexity of packaging design patterns, combining simulation experiments with machine learning can provide predictive capability from a minimal training dataset. This approach is gradually being applied to solve thermal analysis challenges in packaging testing [5]. For example, Wang and Vafai [6] used the K-fold cross-validation algorithm and the support vector regression (SVR) algorithm to predict changes in the hotspot temperature of 3D wafers when modeling and cooling parameters are varied. Experimental results showed that the prediction deviation was very small. Chen et al. [4] used different machine learning models to predict and analyze thermal resistance in packaged products, with results indicating that the artificial neural network model performed the best. Park et al. [7] employed three machine learning models to analyze the relationship between thermal flux and mechanical flux in semiconductor packages, verifying the results through finite element analysis of equivalent properties. Stoyanov et al. [8] used two different surrogate model structures, namely multiquadric functions and neural networks, to predict thermal fatigue damage caused by temperature cycle loads in insulated gate bipolar transistor power electronic module bonding wires. The results confirmed that the proposed method and modeling technique match finite element accuracy and effectively map highly non-linear spatial distributions of failure-related parameters. Kim and Moon [9] primarily used a deep neural network (DNN) model to estimate the effective thermal conductivity of flat heat pipes with various shapes and working conditions. Numerical results demonstrated that the DNN can accurately estimate effective thermal conductivity for different diffusion thermal resistances. Lai et al. [10] employed the Long Short-Term Memory (LSTM) method to efficiently predict reflow soldering temperature distributions, significantly reducing computation time.
Wang et al. [11] used convolutional neural networks (CNNs) with Bayesian optimization to replace computational fluid dynamic simulations, accelerating the thermal optimization of multichip modules. Their technique improved computational efficiency, reduced chip junction temperatures, and enhanced manufacturing efficiency (Table 1).
This study aimed to use finite element simulation data and machine learning models to forecast the thermal resistance values and thermal characteristics of two IC package types, QFN and TFBGA. The rest of this study is organized as follows: Section 2 introduces the thermal resistance of the IC package. The proposed framework of forecasting thermal resistance is illustrated in Section 3. Numerical results are presented in Section 4. Conclusions are addressed in Section 5.

2. Thermal Resistance of IC Package

The heat dissipation of semiconductor components occurs through two primary pathways: convection from the top of the package to the surrounding air and conduction from the package pins to the circuit board. The thermal resistance of an IC package is a key metric that measures the package’s ability to conduct heat generated by the die to the circuit board or the surrounding environment. By specifying the temperatures at two different points, thermal resistance quantifies the amount of heat flow between these two points. For thermal analysis of semiconductor packages, defining key temperature points of the package is essential. Figure 1 illustrates temperature points including Ambient Temperature (Ta), Junction Temperature (Tj), Case Temperature (Tc), and Board Temperature (Tb).
The thermal resistance of an IC package can be used to calculate the junction temperature of the IC based on power consumption and a given reference temperature. The thermal resistance θjX of heat flow from the junction point j to a specific temperature point x can be expressed as Equation (1).
θjX = (Tj − Tx) / P(1)
where Tj is the junction temperature, Tx is the temperature at the specified location x, and P is the total dissipated power.
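Equation (1) can be evaluated directly. The short sketch below illustrates the calculation; the temperature and power values used in the example are hypothetical, not taken from this study's dataset:

```python
def theta_jx(t_junction: float, t_reference: float, power: float) -> float:
    """Thermal resistance (°C/W) from the junction to a reference point x,
    per Equation (1): theta_jX = (Tj - Tx) / P."""
    if power <= 0:
        raise ValueError("dissipated power must be positive")
    return (t_junction - t_reference) / power

# Example: Tj = 95 °C, ambient Ta = 25 °C, P = 2 W  ->  theta_JA = 35 °C/W
print(theta_jx(95.0, 25.0, 2.0))
```

The same function covers θJA, θJB, and θJC simply by supplying the ambient, board, or case temperature as the reference point.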
The thermal characteristics of a typical IC package are shown in Table 2. Thermal resistance is an index indicating the temperature rise of the semiconductor product relative to the ambient temperature when the die generates one watt of heat. The measurement of thermal resistance is defined by the Joint Electron Device Engineering Council (JEDEC). The thermal resistance θJA is defined for the IC package mounted on the test board and located in the surrounding environment. Since test environments and board designs vary significantly across applications, θJA provides a thermal performance ranking of a package rather than simulating a specific end-use application. θJA is the thermal resistance of multiple heat dissipation paths that transfer heat through conduction, convection, radiation, etc. θJB refers to the thermal resistance from the junction to the circuit board, including the thermal resistance from the junction of the IC to the reference point on the bottom of the package, and the thermal resistance through the circuit board at the bottom of the package. θJC represents the thermal resistance of the heat transfer path from the junction to a specific point on the top surface of the package case through specific conduction methods. The measurement of ΨJB differs from the direct single-path measurement of thermal resistance θJB. The measurement of ΨJB is based on multiple thermal paths and represents the thermal characteristic parameter from the junction to the circuit board. ΨJT is a characteristic parameter that measures the temperature change between the junction temperature and the package top temperature.
There are various types of IC packages with different thermal conductivity characteristics. Using surface mount technology to connect PCBs, the QFN has become one of the most commonly used packaging types due to advantages in performance and cost. As the power demands of products continue to increase, the heat generated by the package also rises accordingly. The TFBGA is another common packaging type, characterized by a grid-like arrangement of pins on the surface. Package types of QFN and TFBGA generally provide high heat dissipation capability. Compared to the QFN, the TFBGA has the advantage of supporting more pins. Depending on the type and size of the packaging structure, internal temperatures can be significantly affected by varying thermal conduction interactions [12]. In packaging testing, numerical simulation and experimental methods are the most commonly used approaches to determine wafer packaging temperatures. However, both simulations and experiments for analyzing the thermal properties of packages are time- and resource-intensive [13].

3. Machine Learning in Predicting Thermal Resistance (MLPTR) Model

3.1. Machine Learning Models for Regression

Five machine learning models, namely support vector regression, multilayer perceptron, XGBoost, the light gradient boosting machine, and random forest, are employed to forecast thermal resistance in this study. LGBM is an ensemble learning model composed of multiple decision trees. Each tree is trained sequentially on the residual errors of the previous trees to approximate the true values iteratively. The core mechanism of LGBM is its leaf-wise growth strategy, which selects the split that maximizes information gain at each step, leading to reduced error [14]. Additionally, LGBM employs histogram-based binning to convert continuous features into discrete intervals, thereby accelerating calculations. The objective of LGBM is to minimize the loss function. Random forest enhances prediction accuracy and model stability by aggregating the results of multiple decision trees. It demonstrates a strong capability to handle non-linear relationships, noise, and overfitting in data [15,16]. During the construction of each decision tree, the random forest draws random subsamples from the original training data to mitigate overfitting. For each sample drawn from the training set, the random forest builds a decision tree and randomly selects a subset of features at each node, thereby reducing model bias. Once training is complete, the random forest combines all prediction results into a final prediction value. XGBoost is based on gradient-boosted decision trees, which improve prediction results by gradually adding new trees [17]. At each iteration, the model is adjusted based on the prediction results of the previous model: a new tree is added to fit the difference between the predicted and actual values of the previous iteration. After the update, the new model serves as the basis for the next round of learning.
The goal of prediction is to make the predicted value close to the true value while maximizing the model's generalization capability. This constitutes an optimization problem: the loss function in the objective measures the deviation between predicted and target values, and the model's fit to the samples improves through continuous learning. XGBoost effectively manages overfitting through regularization, supports distributed training, and facilitates feature importance analysis. As a result, XGBoost has become a popular choice for regression tasks.
Support vector regression is developed for conducting regression within a certain allowable error range. SVR searches for a hyperplane with the smallest possible deviation from the training data and balances model generalization and accuracy. To achieve this, SVR minimizes the model's complexity by keeping the deviation between the predicted and actual values within an acceptable range [18,19]. The main goal of SVR is to find the best regression line, ensuring the line is as close as possible to all data points while allowing some points to fall within the error range. An ε-tube allows prediction errors within a range of the SVR model. The weight vector controls the complexity of the model and provides a flatter decision function. When data points fall outside the acceptable deviation, a penalty parameter regulates the tolerance. Thus, the objective becomes minimizing the target function to find the optimal parameters [20,21]. The multilayer perceptron is a feedforward artificial neural network consisting of an input layer, at least one hidden layer, and an output layer. Each neuron in a layer is connected to neurons in the subsequent layer through weighted connections, with computations incorporating non-linear activation functions [22,23]. An activation function decides whether a neuron is activated; if so, a signal is conveyed forward to the neurons connected to it. The MLP is trained using the backpropagation algorithm, which minimizes the loss function through gradient descent to reduce prediction error. Due to its capability of learning complex mappings, the MLP is widely used in classification and regression tasks, making it a foundational component in deep learning.
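The models above can all be driven through a common fit/predict interface. The sketch below sets up three of them with scikit-learn on synthetic stand-in data; the XGBRegressor (xgboost package) and LGBMRegressor (lightgbm package) would slot into the same dictionary but are omitted here to keep the example dependent on scikit-learn only, and the data are randomly generated, not the paper's FEA dataset:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(size=(300, 5))                       # stand-in for package features
y = X @ np.array([3.0, -2.0, 1.5, 0.5, 4.0]) + rng.normal(0, 0.1, 300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# xgboost.XGBRegressor() and lightgbm.LGBMRegressor() would be added to
# this dict in exactly the same way.
models = {
    "SVR": SVR(),
    "MLP": MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000, random_state=0),
    "RF": RandomForestRegressor(n_estimators=200, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, model.score(X_te, y_te))             # R^2 on the held-out split
```

Because every model exposes the same interface, the comparison across all five forecasting models reduces to a single loop.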

3.2. Machine Learning Architecture for Thermal Resistance Prediction

To mitigate the time-consuming process using finite element analysis simulation, this study employs machine learning methods to predict thermal resistance. The framework of machine learning models in forecasting thermal resistance is shown in Figure 2. First, a dataset of simulated thermal resistance values was generated using finite element analysis. This dataset includes various input features related to the packaging structure, such as chip size and temperature.
Three stages, namely data collection, model establishment, and model implementation, are included in the developed machine learning in predicting thermal resistance (MLPTR) model. In the data collection stage, the features of the packages are fed into finite element analysis, and resistance values are generated. For the QFN package type, with a total of 3234 data points, the dataset comprises twelve characteristic parameters: PKG Size, Die Size, Pad Size, Exposed Pad Size, Die Thickness, Mold Thickness, EMC K, Die K, Die Attach K, Leadframe K, Ambient Temperature, and Input Power. For the TFBGA package type, thirteen parameters, namely PKG Size, Die Size, Substrate Copper Ratio, Substrate Thickness, Die Thickness, Mold Thickness, EMC K, Die K, Thermal Via K, Die Attach K, BGA Ball, Ambient Temperature, and Input Power, are included, with a total of 683 data points. Thermal resistance values, namely θJA, θJB, θJC, ΨJT, and ΨJB, are generated by finite element analysis and serve as target values in the machine learning models. In the model establishment stage, the data are merged and invalid entries are removed. Then, the dataset is split into a training dataset and a testing dataset. The training dataset is employed for model learning, and the testing dataset is used for evaluating the performance of machine learning models. The numbers of training data points and testing data points for the QFN package type are 2587 and 647, respectively. For the TFBGA package type, 546 data points are used for model training and 137 data points for model testing. In this study, the machine learning models are first trained with default hyperparameter values and then used for prediction, after which forecasting accuracy is measured. When the forecasting accuracy is not satisfactory, the grid search algorithm is applied to select hyperparameters for the machine learning models. The final stage is the implementation phase, in which the well-trained machine learning model with the best forecasting accuracy is used to predict thermal resistance in practice.
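The grid search step in the model establishment stage can be sketched with scikit-learn's GridSearchCV. The parameter ranges below are illustrative only, since the grids actually searched in this study are not reported, and the data are synthetic:

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = rng.uniform(size=(200, 4))
y = np.sin(X[:, 0] * 3) + X[:, 1]        # hypothetical non-linear target

# Candidate hyperparameter grid (illustrative values, not the study's ranges).
param_grid = {"C": [1, 10, 100], "epsilon": [0.01, 0.1], "gamma": ["scale", 0.5]}

# 5-fold cross-validated exhaustive search, scored by MAPE (negated, since
# scikit-learn maximizes the scoring function).
search = GridSearchCV(SVR(), param_grid, cv=5,
                      scoring="neg_mean_absolute_percentage_error")
search.fit(X, y)
print(search.best_params_)
```

The best estimator found by the search is then retrained on the full training split and evaluated on the held-out test data.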

4. Numerical Results

Five machine learning models, SVR, MLP, XGB, LGBM, and RF, are used to forecast thermal resistance in this study. Three of them, namely XGB, LGBM, and RF, are tree-based models. Two measurements, namely mean absolute percentage error (MAPE) and root mean square error (RMSE), are used to evaluate the forecasting accuracy of thermal resistance for the QFN package and the TFBGA package. MAPE and RMSE are calculated as shown in Equations (2) and (3).
MAPE (%) = (100/n) Σi=1..n |(Ŷi − Yi)/Yi|(2)
RMSE = √(Σi=1..n (Ŷi − Yi)² / n)(3)
where Ŷi is the i-th predicted value, Yi is the i-th actual value, and n is the number of observations.
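Equations (2) and (3) translate directly into NumPy. The values in the example are hypothetical, chosen only to make the arithmetic easy to check by hand:

```python
import numpy as np

def mape(y_true, y_pred):
    """Mean absolute percentage error, Equation (2), in percent."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return 100.0 * np.mean(np.abs((y_pred - y_true) / y_true))

def rmse(y_true, y_pred):
    """Root mean square error, Equation (3)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_pred - y_true) ** 2)))

# Hypothetical thermal resistance values (°C/W)
actual = [100.0, 200.0]
predicted = [110.0, 190.0]
print(mape(actual, predicted))   # 7.5
print(rmse(actual, predicted))   # 10.0
```

Note that MAPE requires all actual values to be non-zero, which holds for thermal resistance data.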
Table 3 and Table 4 illustrate the prediction performance of the QFN package and the TFBGA package by machine learning models. At this stage, default values are used for the hyperparameters. The MAPE and RMSE values indicate that the three tree-based models are superior to the SVR model and the MLP model in terms of forecasting accuracy. Notably, the SVR models and the MLP models are not able to capture trends of thermal resistance for the ΨJT parameter. To improve forecasting accuracy, the grid search technique was employed to select hyperparameters [24,25] for the SVR models and the MLP models. In this study, one hidden layer was employed for the MLP models. Table 5 lists the hyperparameters selected by the grid search algorithm for the two models. Table 6 indicates the forecasting performance of SVR and MLP using hyperparameters selected by the grid search algorithm. It can be observed that the selected hyperparameters improved the forecasting accuracy for both models. Figure 3 and Figure 4 illustrate the thermal resistance predictions with hyperparameter selection for the QFN package and the TFBGA package, respectively, in terms of MAPE (%) values. Furthermore, Figure 5 and Figure 6 reveal that the MAPE (%) values provided by the three tree-based models are less than 10% for both package types. The XGB model achieved the best forecasting performance. The actual values and predicted values generated by the XGB models for 10 packaging structures are illustrated in Figure 7, Figure 8, Figure 9, Figure 10 and Figure 11 for QFN package types and Figure 12, Figure 13, Figure 14, Figure 15 and Figure 16 for TFBGA package types.

5. Conclusions

This study uses machine learning models to predict thermal resistance in IC packaging processes effectively. The forecasting results without hyperparameter selection indicate that tree-based models outperformed the SVR models and the MLP models in terms of MAPE (%) and RMSE values. Among the three tree-based models, the XGBoost model generated the best forecasting results for both the QFN package and the TFBGA package. Then, the grid search algorithm was employed to determine the hyperparameters for the SVR models and the MLP models to improve forecasting performance. Numerical results indicated that the forecasting accuracy of the SVR models and the MLP models was improved by the hyperparameter selection technique. However, the three tree-based machine learning models still outperformed the SVR models and the MLP models with hyperparameter selection in terms of MAPE (%) values and RMSE values. In addition, the XGB models are superior to the other models in almost all cases. The superior predictive performance of the XGB models rests on several factors. With regularization techniques to prevent overfitting and error correction capabilities during the learning process, XGB has been reported to perform well in forecasting tasks [26,27]. Additionally, Reddad et al. [28] highlighted that XGBoost is robust in predictive tasks due to its effective handling of parameter interactions, tree pruning, and hyperparameter tuning. Studies [26,27,28] have successfully applied XGBoost to predict thermal conductivity, phase change material melting, and the reliability of solder ball joints. Incorporating data provided by the finite element analysis technique, this study concludes that the XGB model is a feasible, effective, and promising alternative for predicting thermal resistance in the IC package and testing industry. Additionally, the presented MLPTR framework could potentially be applied to other regression problems in packaging design.

Author Contributions

Conceptualization, J.-P.L., S.L., V.L., A.K., Y.-P.W. and P.-F.P.; methodology, J.-P.L. and P.-F.P.; software, J.-P.L.; validation, S.L., V.L. and A.K.; formal analysis, J.-P.L. and P.-F.P.; data curation, S.L. and V.L.; writing—original draft preparation, J.-P.L.; writing—review and editing, P.-F.P.; visualization, J.-P.L.; supervision, Y.-P.W. and P.-F.P.; funding acquisition, Y.-P.W. and P.-F.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Siliconware Precision Industries Co., Ltd., Taiwan, grant number 113A023.

Data Availability Statement

The data presented in this study are available on reasonable request.

Conflicts of Interest

The authors declare that this study received funding from Siliconware Precision Industries Co., Ltd. Authors Shane Lin, Vito Lin, Andrew Kang, and Yu-Po Wang were employed by the Siliconware Precision Industries Co., Ltd. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest. The funder was not involved in the study design, collection, analysis, interpretation of data, the writing of this article or the decision to submit it for publication.

References

  1. Hollstein, K.; Yang, X.; Weide-Zaage, K. Thermal analysis of the design parameters of a QFN package soldered on a PCB using a simulation approach. Microelectron. Reliab. 2021, 120, 114118. [Google Scholar] [CrossRef]
  2. Qiu, B.; Xiong, J.; Wang, H.; Zhou, S.; Yang, X.; Lin, Z.; Liu, M.; Cai, N. Survey on fatigue life prediction of BGA solder joints. Electronics 2022, 11, 542. [Google Scholar] [CrossRef]
  3. Shang, R.; Yao, Y.; Bi, A.; Wang, Y.; Wang, S. Exploring modeling and testing approaches for three-dimensional integrated thermal resistance of chiplets. J. Therm. Anal. Calorim. 2024, 149, 7689–7703. [Google Scholar] [CrossRef]
  4. Chen, G.-W.; Lin, Y.-C.; Hsu, C.-H.; Chen, T.-Y.; Wang, C.-C.; Hung, C.-P.; Wang, H.-K. Thermal resistance prediction model for IC packaging optimization and design cycle reduction. In Proceedings of the 2024 IEEE 74th Electronic Components and Technology Conference (ECTC), Denver, CO, USA, 28–31 May 2024; pp. 1593–1598. [Google Scholar]
  5. Wang, N.; Jieensi, J.; Zhen, Z.; Zhou, Y.; Ju, S. Predicting the interfacial thermal resistance of electronic packaging materials via machine learning. In Proceedings of the 2022 23rd International Conference on Electronic Packaging Technology (ICEPT), Dalian, China, 10–13 August 2022; pp. 1–4. [Google Scholar]
  6. Wang, C.; Vafai, K. Heat transfer enhancement for 3D chip thermal simulation and prediction. Appl. Therm. Eng. 2024, 236, 121499. [Google Scholar] [CrossRef]
  7. Park, J.-H.; Park, H.; Kim, T.; Kim, J.; Lee, E.-H. Numerical analysis of thermal and mechanical characteristics with property maps in complex semiconductor package designs. Appl. Math. Model. 2024, 130, 140–159. [Google Scholar] [CrossRef]
  8. Stoyanov, S.; Tilford, T.; Zhang, X.; Hu, Y.; Yang, X.; Shen, Y. Physics-informed Machine Learning for predicting fatigue damage of wire bonds in power electronic modules. In Proceedings of the 2024 25th International Conference on Thermal, Mechanical and Multi-Physics Simulation and Experiments in Microelectronics and Microsystems (EuroSimE), Catania, Italy, 7–10 April 2024; pp. 1–8. [Google Scholar]
  9. Kim, M.; Moon, J.H. Deep neural network prediction for effective thermal conductivity and spreading thermal resistance for flat heat pipe. Int. J. Numer. Methods Heat Fluid Flow 2022, 33, 437–455. [Google Scholar] [CrossRef]
  10. Lai, Y.; Kataoka, J.; Pan, K.; Ha, J.; Yang, J.; Deo, K.A.; Xu, J.; Yin, P.; Cai, C.; Park, S.; et al. A deep learning approach for reflow profile prediction. In Proceedings of the 2022 IEEE 72nd Electronic Components and Technology Conference (ECTC), San Diego, CA, USA, 31 May–3 June 2022; pp. 2269–2274. [Google Scholar]
  11. Wang, Z.-Q.; Hua, Y.; Aubry, N.; Zhou, Z.-F.; Feng, F.; Wu, W.-T. Fast optimization of multichip modules using deep learning coupled with Bayesian method. Int. Commun. Heat Mass Transf. 2023, 141, 106592. [Google Scholar] [CrossRef]
  12. Wang, Y.; Wei, X.; Zhang, G.; Hu, Z.; Zhao, Z.; Wang, L. Analytical thermal resistance model for calculating mean die temperature of eccentric quad flat no-leads packaging on printed circuit board. AIP Adv. 2021, 11, 035039. [Google Scholar] [CrossRef]
  13. Rao, X.; Liu, H.; Song, J.; Jin, C.; Xiao, C. Optimizing Heat Source Arrangement for 3D ICs with irregular structures using machine learning methods. In Proceedings of the 2023 24th International Conference on Electronic Packaging Technology (ICEPT), Shihezi, China, 8–11 August 2023; pp. 1–6. [Google Scholar]
  14. Ke, G.; Meng, Q.; Finley, T.; Wang, T.; Chen, W.; Ma, W.; Ye, Q.; Liu, T.-Y. LightGBM: A highly efficient gradient boosting decision tree. Adv. Neural Inf. Process. Syst. 2017, 30, 1–9. [Google Scholar]
  15. Segal, M.R. Machine Learning Benchmarks and Random Forest Regression. 2004. Available online: https://escholarship.org/uc/item/35x3v9t4 (accessed on 6 December 2024).
  16. Breiman, L.; Friedman, J.; Olshen, R.A.; Stone, C.J. Classification and Regression Trees; Routledge: New York, NY, USA, 2017. [Google Scholar]
  17. Chen, T.; Guestrin, C. XGBoost: A scalable tree boosting system. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, 13–17 August 2016; pp. 785–794. [Google Scholar]
  18. Vapnik, V.; Golowich, S.; Smola, A. Support vector method for function approximation, regression estimation and signal processing. Adv. Neural Inf. Process. Syst. 1996, 9, 281–287. [Google Scholar]
  19. da Silva Santos, C.E.; Sampaio, R.C.; dos Santos Coelho, L.; Bestard, G.A.; Llanos, C.H. Multi-objective adaptive differential evolution for SVM/SVR hyperparameters selection. Pattern Recognit. 2021, 110, 107649. [Google Scholar] [CrossRef]
  20. Yu, H.; Kim, S. SVM Tutorial-Classification, Regression and Ranking. Handb. Nat. Comput. 2012, 1, 479–506. [Google Scholar]
  21. Cortes, C.; Vapnik, V. Support-Vector Networks. Mach. Learn. 1995, 20, 273–297. [Google Scholar] [CrossRef]
  22. Gardner, M.W.; Dorling, S. Artificial neural networks (the multilayer perceptron)—A review of applications in the atmospheric sciences. Atmos. Environ. 1998, 32, 2627–2636. [Google Scholar] [CrossRef]
  23. Rumelhart, D.E.; Hinton, G.E.; Williams, R.J. Learning representations by back-propagating errors. Nature 1986, 323, 533–536. [Google Scholar] [CrossRef]
  24. Bischl, B.; Binder, M.; Lang, M.; Pielok, T.; Richter, J.; Coors, S.; Thomas, J.; Ullmann, T.; Becker, M.; Boulesteix, A.L. Hyperparameter optimization: Foundations, algorithms, best practices, and open challenges. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 2023, 13, e1484. [Google Scholar] [CrossRef]
  25. Chen, C.-H.; Lai, J.-P.; Chang, Y.-M.; Lai, C.-J.; Pai, P.-F. A study of optimization in deep neural networks for regression. Electronics 2023, 12, 3071. [Google Scholar] [CrossRef]
  26. Bhandari, U.; Chen, Y.; Ding, H.; Zeng, C.; Emanet, S.; Gradl, P.R.; Guo, S. Machine-Learning-Based Thermal Conductivity Prediction for Additively Manufactured Alloys. J. Manuf. Mater. Process. 2023, 7, 160. [Google Scholar] [CrossRef]
  27. Kıyak, B.; Öztop, H.F.; Ertam, F.; Aksoy, İ.G. An Intelligent Approach to Investigate the Effects of Container Orientation for PCM Melting Based on an XGBoost Regression Model. Eng. Anal. Bound. Elem. 2024, 161, 202–213. [Google Scholar] [CrossRef]
  28. Reddad, H.; Zemzami, M.; El Hami, N.; Hmina, N.; Nguyen, N.Q. Reliability Assessment of Solder Ball Joints Using Finite Element Analysis and Machine Learning Techniques. In Proceedings of the 2024 10th International Conference on Control, Decision and Information Technologies (CoDIT), Valletta, Malta, 1–4 July 2024; pp. 2750–2755. [Google Scholar]
Figure 1. Critical temperature points of IC packaging.
Figure 2. The proposed MLPTR framework.
Figure 3. Visualization of MAPE (%) values for predicting the thermal resistance of the QFN package by machine learning models with hyperparameter selection.
Figure 4. Visualization of MAPE (%) values for predicting the thermal resistance of the TFBGA package by machine learning models with hyperparameter selection.
Figure 5. Visualization of MAPE (%) values for predicting the thermal resistance of the QFN package by tree-based models.
Figure 6. Visualization of MAPE (%) values for predicting the thermal resistance of the TFBGA package by tree-based models.
Figure 7. The actual values and predicted values generated by the XGB model of θJA for the QFN package type.
Figure 8. The actual values and predicted values generated by the XGB model of θJB for the QFN package type.
Figure 9. The actual values and predicted values generated by the XGB model of θJC for the QFN package type.
Figure 10. The actual values and predicted values generated by the XGB model of ΨJT for the QFN package type.
Figure 11. The actual values and predicted values generated by the XGB model of ΨJB for the QFN package type.
Figure 12. The actual values and predicted values generated by the XGB model of θJA for the TFBGA package type.
Figure 13. The actual values and predicted values generated by the XGB model of θJB for the TFBGA package type.
Figure 14. The actual values and predicted values generated by the XGB model of θJC for the TFBGA package type.
Figure 14. The actual values and predicted values generated by the XGB model of θJC for the TFBGA package type.
Micromachines 16 00350 g014
Figure 15. The actual values and predicted values generated by the XGB model of ΨJT for the TFBGA package type.
Figure 15. The actual values and predicted values generated by the XGB model of ΨJT for the TFBGA package type.
Micromachines 16 00350 g015
Figure 16. The actual values and predicted values generated by the XGB model of ΨJB for the TFBGA package type.
Figure 16. The actual values and predicted values generated by the XGB model of ΨJB for the TFBGA package type.
Micromachines 16 00350 g016
Table 1. Thermal predictions of IC packaging by machine learning-based techniques.

| Literature | Year | Application | Methods |
|---|---|---|---|
| Chen et al. [4] | 2024 | Thermal resistance prediction of IC packages | ANN |
| Wang and Vafai [6] | 2024 | Predicting changes in hot-spot temperature on 3D wafers | SVR |
| Park et al. [7] | 2024 | Predicting thermal and mechanical flux in packages | SVR, GPR, ANN |
| Stoyanov et al. [8] | 2024 | Predicting thermal fatigue damage due to the temperature of power electronic modules | ANN |
| Kim and Moon [9] | 2022 | Estimating the effective thermal conductivity of a flat heat pipe | CNN |
| Lai et al. [10] | 2022 | Predicting the reflow profile of a bulky ball grid array package | LSTM |
| Wang et al. [11] | 2022 | Optimizing the thermal layout of multi-die modules | CNN |

Note: ANN = artificial neural network; GPR = Gaussian process regression.
Table 2. Symbols and descriptions of thermal characteristics.

| Symbol | Description |
|---|---|
| θJA | Junction-to-Ambient Thermal Resistance |
| θJC | Junction-to-Case Thermal Resistance |
| θJB | Junction-to-Board Thermal Resistance |
| ΨJB | Junction-to-Board Characterization Parameter |
| ΨJT | Junction-to-Top Characterization Parameter |
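The parameters in Table 2 all share the same JEDEC-style form: a junction-to-reference temperature difference divided by the applied power, in °C/W. A minimal sketch of that calculation follows; the temperatures and power here are illustrative values, not measurements from this study.

```python
# Generic JEDEC-style thermal metric: (T_junction - T_reference) / P.
# The reference point determines which parameter results (ambient for
# theta_JA, package top for Psi_JT, and so on).

def thermal_metric(t_junction_c: float, t_reference_c: float, power_w: float) -> float:
    """Return (TJ - Tref) / P in degrees C per watt."""
    return (t_junction_c - t_reference_c) / power_w

# Illustrative numbers only (not data from the paper):
theta_ja = thermal_metric(85.0, 25.0, 2.0)  # junction-to-ambient
psi_jt = thermal_metric(85.0, 83.0, 2.0)    # junction-to-top characterization parameter
print(theta_ja, psi_jt)
```

Note that the Ψ parameters are characterization parameters rather than true resistances: the same formula is applied, but not all of the dissipated power flows through the chosen reference path.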
Table 3. MAPE (%) and RMSE values of machine learning models for predicting the QFN package’s thermal resistance.

MAPE (%)

| Model | θJA | θJB | θJC | ΨJT | ΨJB |
|---|---|---|---|---|---|
| LGBM | 0.81910 | 0.71812 | 0.25985 | 5.76161 | 0.76677 |
| RF | 3.35084 | 2.61994 | 1.29613 | 9.15216 | 2.74191 |
| XGB | 0.54722 | 0.08679 | 0.03156 | 1.12312 | 0.07795 |
| SVR | 21.56125 | 38.94282 | 51.83620 | 74.77261 | 39.93139 |
| MLP | 9.44791 | 27.44147 | 54.38130 | 219.77778 | 28.19956 |

RMSE

| Model | θJA | θJB | θJC | ΨJT | ΨJB |
|---|---|---|---|---|---|
| LGBM | 0.42701 | 0.18187 | 0.03496 | 0.08284 | 0.17618 |
| RF | 1.41133 | 0.59374 | 0.28861 | 0.31519 | 0.59424 |
| XGB | 0.29024 | 0.05605 | 0.00981 | 0.06757 | 0.02962 |
| SVR | 21.26226 | 21.61219 | 14.7613 | 3.34606 | 21.62768 |
| MLP | 5.87797 | 7.98745 | 7.08441 | 2.15471 | 7.84077 |
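The two error measures reported in Tables 3, 4, and 6 can be computed directly from their standard definitions. The sketch below uses only the standard library; the actual/predicted values are illustrative, not data from the study.

```python
import math

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    """Root-mean-square error, in the units of the target (degrees C/W here)."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

# Illustrative thermal resistance values only:
y_true = [30.0, 32.0, 28.0, 31.0]
y_pred = [30.3, 31.8, 28.2, 30.9]
print(round(mape(y_true, y_pred), 3), round(rmse(y_true, y_pred), 3))
```

Because MAPE divides by the actual value, targets with small magnitudes (such as ΨJT) can show large MAPE values even when the absolute RMSE remains small, which is consistent with the pattern in the tables.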
Table 4. MAPE (%) and RMSE values of machine learning models for predicting the TFBGA package’s thermal resistance.

MAPE (%)

| Model | θJA | θJB | θJC | ΨJT | ΨJB |
|---|---|---|---|---|---|
| LGBM | 0.54050 | 0.45001 | 0.52274 | 2.27755 | 0.50401 |
| RF | 0.83332 | 0.25039 | 0.16172 | 1.47055 | 0.19218 |
| XGB | 0.32796 | 0.05541 | 0.16484 | 0.44334 | 0.02457 |
| SVR | 6.67386 | 14.03057 | 36.35358 | 46.45564 | 14.26818 |
| MLP | 3.83322 | 14.02184 | 24.36788 | 52.48518 | 12.89667 |

RMSE

| Model | θJA | θJB | θJC | ΨJT | ΨJB |
|---|---|---|---|---|---|
| LGBM | 0.23402 | 0.10868 | 0.05360 | 0.01964 | 0.10976 |
| RF | 0.32222 | 0.10872 | 0.05480 | 0.02109 | 0.08745 |
| XGB | 0.16306 | 0.05172 | 0.05822 | 0.01496 | 0.04275 |
| SVR | 2.64245 | 3.03563 | 2.40439 | 0.25818 | 3.03495 |
| MLP | 1.39098 | 2.96866 | 1.40718 | 0.27544 | 2.68824 |
Table 5. Hyperparameters provided by the grid search for the SVR models and the MLP models.

QFN

| Model | Hyperparameter | θJA | θJB | θJC | ΨJT | ΨJB |
|---|---|---|---|---|---|---|
| SVR | C | 128 | 128 | 128 | 8 | 128 |
| SVR | ε | 0.01 | 0.01 | 0.01 | 0.1 | 0.01 |
| SVR | gamma | 0.1 | 0.1 | 0.1 | 0.1 | 0.1 |
| MLP | Number of hidden nodes | 150 | 150 | 50 | 100 | 50 |
| MLP | Activation function | tanh | tanh | tanh | tanh | tanh |
| MLP | Optimizer | adam | adam | adam | adam | adam |
| MLP | Learning rate | 0.001 | 0.001 | 0.001 | 0.001 | 0.001 |

TFBGA

| Model | Hyperparameter | θJA | θJB | θJC | ΨJT | ΨJB |
|---|---|---|---|---|---|---|
| SVR | C | 32 | 1 | 2 | 1 | 1 |
| SVR | ε | 0.1 | 1.5 | 0.01 | 0.5 | 1.5 |
| SVR | gamma | 0.1 | 0.1 | 0.1 | 0.1 | 0.1 |
| MLP | Number of hidden nodes | 100 | 150 | 100 | 50 | 100 |
| MLP | Activation function | relu | relu | relu | relu | tanh |
| MLP | Optimizer | adam | adam | adam | adam | adam |
| MLP | Learning rate | 0.001 | 0.001 | 0.05 | 0.1 | 0.001 |
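The hyperparameters in Table 5 come from a grid search: every combination in a predefined grid is scored on validation data, and the best-scoring combination is kept. A minimal stand-alone sketch of that idea is shown below. The parameter grids are loosely modeled on the SVR values in Table 5, but the scoring function is a hypothetical stand-in for training a model and measuring its validation MAPE, not the paper's actual procedure.

```python
from itertools import product

# Candidate values, loosely echoing the SVR grid implied by Table 5.
param_grid = {
    "C": [1, 2, 8, 32, 128],
    "epsilon": [0.01, 0.1, 0.5, 1.5],
    "gamma": [0.1],
}

def validation_error(c, epsilon, gamma):
    # Hypothetical stand-in: in practice this would train an SVR with
    # these hyperparameters and return its error on a validation split.
    return abs(c - 128) / 128 + epsilon + gamma

# Exhaustively evaluate every combination and keep the best one.
best = min(
    product(param_grid["C"], param_grid["epsilon"], param_grid["gamma"]),
    key=lambda combo: validation_error(*combo),
)
print(best)
```

In practice this exhaustive loop is usually delegated to a utility such as scikit-learn's GridSearchCV, which adds cross-validation on top of the same enumeration.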
Table 6. MAPE (%) and RMSE values of SVR models and MLP models for predicting the thermal resistance of the QFN package and the TFBGA package with hyperparameter selection.

QFN, MAPE (%)

| Model | θJA | θJB | θJC | ΨJT | ΨJB |
|---|---|---|---|---|---|
| SVR | 3.55823 | 5.93015 | 9.70486 | 50.39072 | 6.77025 |
| MLP | 1.96832 | 3.48456 | 10.19002 | 41.61330 | 4.48553 |

QFN, RMSE

| Model | θJA | θJB | θJC | ΨJT | ΨJB |
|---|---|---|---|---|---|
| SVR | 3.46385 | 3.35152 | 3.11810 | 1.98187 | 3.52001 |
| MLP | 0.89157 | 0.73737 | 1.25865 | 0.69823 | 0.88693 |

TFBGA, MAPE (%)

| Model | θJA | θJB | θJC | ΨJT | ΨJB |
|---|---|---|---|---|---|
| SVR | 1.34762 | 13.62841 | 27.65896 | 52.00729 | 13.86218 |
| MLP | 1.62015 | 8.07679 | 21.61422 | 48.88836 | 8.21445 |

TFBGA, RMSE

| Model | θJA | θJB | θJC | ΨJT | ΨJB |
|---|---|---|---|---|---|
| SVR | 0.58975 | 2.91447 | 1.80322 | 0.26447 | 2.91400 |
| MLP | 0.64419 | 2.24097 | 1.36479 | 0.26156 | 2.24396 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Lai, J.-P.; Lin, S.; Lin, V.; Kang, A.; Wang, Y.-P.; Pai, P.-F. Predicting Thermal Resistance of Packaging Design by Machine Learning Models. Micromachines 2025, 16, 350. https://doi.org/10.3390/mi16030350