Article

Statistical Analysis on Random Matrices of Echo State Network in PEMFC System’s Lifetime Prediction

1 School of Automation, Northwestern Polytechnical University, Road Dongxiang, Xi’an 710129, China
2 FEMTO-ST Institute, Université Bourgogne Franche-Comté, UTBM, CNRS, Rue Ernest Thierry Mieg, 90010 Belfort, France
3 FCLAB, Université Bourgogne Franche-Comté, UTBM, CNRS, Rue Ernest Thierry Mieg, 90010 Belfort, France
4 LMOPS Laboratory, Université de Lorraine & CentraleSupélec, 2 Rue Edouard Belin, 57070 Metz, France
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(7), 3421; https://doi.org/10.3390/app12073421
Submission received: 25 February 2022 / Revised: 19 March 2022 / Accepted: 25 March 2022 / Published: 28 March 2022
(This article belongs to the Special Issue Advanced Technologies and Applications of Fuel Cells for Clean Energy)

Abstract
The data-driven method of echo state network (ESN) has been successfully used in the proton exchange membrane fuel cell (PEMFC) system’s lifetime prediction area. Nevertheless, the uncertainties of the randomly generated input and internal weight matrices in ESN have not been reported yet. In view of this, an ensemble ESN structure is proposed in this paper to analyze the effects of random matrices from a statistical point of view. For each ESN, the particle swarm optimization (PSO) method is utilized to optimize the hyperparameters of the leaking rate, spectral radius, and regularization coefficient. The statistical results of each ensemble ESN are analyzed from 100 repeated tests whose weight matrices are generated randomly. The mean value of the ensemble ESN and a confidence interval (CI) of 95% are given during the long-term lifetime prediction. The effects of two different distribution shapes, i.e., uniform distribution and Gaussian distribution, are fully compared. Finally, the effects of the ensemble structure and two different distribution shapes are tested by three experimental datasets under steady-state, quasi-dynamic, and full dynamic operating conditions.

1. Introduction

Energy and the environment are focal points attracting the attention of the international community due to the depletion of fossil fuels and the increasing emissions of carbon dioxide [1,2]. Hydrogen (H2) is considered a promising carrier of sustainable clean energy in the 21st century. The proton exchange membrane fuel cell (PEMFC) system transforms the chemical energy of H2 into electricity directly, and the by-products of the electrochemical process are heat and water [3,4]. It is a promising power conversion device thanks to its satisfactory characteristics, although its high cost and limited lifetime remain obstacles. Predicting the lifetime of the PEMFC system can help users take action in advance to extend its working life.
Model-based and data-driven methods are the two typical classes of existing lifetime prediction methods [5,6]. Data-driven methods, which are mainly based on machine learning (ML) theory, construct the nonlinear degradation relations [7,8]. One type of ML model, the recurrent neural network (RNN), has shown great power in nonlinear time series prediction. Nevertheless, the RNN has some weaknesses, such as a slow convergence rate and the existence of bifurcation. To overcome the weaknesses of the RNN, the echo state network (ESN) was proposed by Prof. Jaeger et al. in [9]. The ESN has two characteristics: one is a large, randomly generated reservoir that replaces the hidden layers of the RNN; the other is the input and internal weight matrices, which are generated randomly. The ESN has a faster convergence rate and a lower computational training cost than the traditional RNN. Nevertheless, the hyperparameters of the ESN (leaking rate, spectral radius, regularization coefficient, reservoir neuron number, reservoir connectivity, etc.) need to be carefully tuned based on human expertise [10].
For the PEMFC system’s lifetime prediction application, the ESN was first applied in [11] to estimate the steady-state cell voltage with the single-step-ahead prediction pattern. Four kinds of reservoirs are compared in [12], and the delay line reservoir (DLR) and the simple cycle reservoir (SCR) achieve the best precision. A cycle reservoir with jumps (CRJ) model is proposed in [13], and this structure is validated on the steady-state datasets. The CRJ model can speed up the linear fitting process and improve the prediction accuracy. In ref. [14], the steady-state cell voltages are divided into an approximation part and a detail part by the wavelet transform, and then two ESNs with different dynamic characteristics are used to deal with the voltage signals separately. This double ESN structure improves the prediction accuracy. The hyperparameters of the ESN in [11,12,13,14] are set by the trial-and-error method. Inspired by the separation concept in [14], the wavelet transform and multiple ESNs are proposed in [15] to deal with the multi-timescale degradation features under dynamic operating conditions in the long-term prediction horizon. In ref. [15], the hyperparameters of the spectral radius and the leaking rate are optimized by the grid-search method, and the prediction accuracy is improved. To improve the prediction efficiency, the original degradation data prediction is replaced by the shortened coefficients of the discrete wavelet transform process [16]. The number of predicted data points can be shortened by a factor of 8, and the hyperparameters of the ESN (spectral radius, leaking rate, and regularization coefficient) are optimized by the genetic algorithm (GA) method. In ref. [17], the analysis of variance (ANOVA) method is applied to analyze the four hyperparameters of the reservoir neuron number, reservoir connectivity, spectral radius, and leaking rate on a steady-state dataset. The results show that the spectral radius has the largest effect on the prediction performance. Additionally, based on the ANOVA method, the effects of three hyperparameters (i.e., leaking rate, spectral radius, and regularization coefficient) have been analyzed under dynamic operating conditions [18]. The results show that the interaction of the spectral radius and the regularization coefficient has the largest effect, and the effect of the spectral radius ranks second.
Based on the ANOVA analysis results of [17], a multi-reservoir ESN structure is proposed in [19], in which each reservoir has a different spectral radius. This multi-reservoir ESN avoids the optimization of the spectral radius, and its effectiveness is validated for steady-state cell voltage prediction. As an extension of the work in [19], the ESN and Markov chains are combined to predict the cell voltage under the dynamic micro-combined heat and power (μ-CHP) load profile [20]. The RUL prediction by the combined structure can be realized with a mean relative accuracy below 17%. An ensemble ESN has recently been proposed to avoid hyperparameter optimization and improve the adaptiveness of lifetime prediction [21]. A dynamic health indicator, the “virtual steady-state voltage (VSV)”, is proposed based on linear parameter varying (LPV) models. Then, a number of ESNs with different combinations of hyperparameters (leaking rate and spectral radius) are used for the VSV prediction. Finally, the long-term RUL prediction and a confidence interval at the 95% confidence level are given in detail. In the past two years, multi-input ESN structures have been proposed to improve the RUL prediction accuracy [18,22,23,24]. The effects of the operating parameters (i.e., current, temperature, and pressure) are fully analyzed under the steady-state and dynamic operating conditions in [18,22], respectively. The multi-input ESN structure can improve the prediction accuracy, and taking the health indicator and the scheduled current as double inputs is more meaningful in practice. In ref. [23], the ambient temperature and the stack voltage are used as the double inputs of the ESN to improve the RUL prediction accuracy. The normalized root mean square error is equal to 0.098 for a prediction over 2000 h. In ref. [24], the input parameters of the ESN are evaluated and selected by the least absolute shrinkage and selection operator (LASSO), and then the degradation-related parameters are used for the RUL prediction. The results show that the ESN with eight input parameters is the optimal choice. Both [23] and [24] are validated on steady-state datasets.
In the current state of the art, the developments of the ESN for the PEMFC system’s lifetime prediction focus on the ESN structure (reservoir forms, ensemble ESN, multi-input ESN, etc.) and hyperparameter optimization. In the practical implementation of the ESN, the input and internal weight matrices (Win and W) remain unchanged after they have been drawn from a random distribution. Nevertheless, the effects of the distribution shape of Win and W on the lifetime prediction accuracy have not been reported yet. There is abundant evidence that ensemble structures can improve the prediction robustness and stability, such as the ensemble neural network model [25], particle filtering [26], and the extreme learning machine [27]. An ensemble ESN is proposed in [21], and the effects of the leaking rate and the spectral radius have been fully analyzed. Nevertheless, the statistical properties of the ESN, e.g., the uncertainty caused by the random matrices, have not been analyzed yet. Based on these two motivations, the contributions of this paper are summarized as follows:
  • The effects of two distribution shapes (uniform and Gaussian distribution) of Win and W on the prediction accuracy are explored;
  • The metaheuristic technique of particle swarm optimization (PSO) is utilized to optimize the hyperparameters of the leaking rate, spectral radius, and regularization coefficient;
  • The uncertainty of the ensemble ESN caused by the random matrices is statistically analyzed under three different operating conditions.

2. Mathematical Backgrounds

2.1. Echo State Network

Similar to the structure of the RNN, there are three parts in the ESN: the input layer, the reservoir, and the output layer. The different parts are connected by the input weight matrix Win ∈ ℝ^{N×K}, the internal weight matrix W ∈ ℝ^{N×N}, the feedback weight matrix Wfb ∈ ℝ^{N×L}, and the output weight matrix Wout ∈ ℝ^{L×(N+K)}. K is the dimension of the input signal, N is the number of neurons in the reservoir, and L is the dimension of the output signal. The spectral radius ρ is the largest absolute eigenvalue of W. Wfb is set to 0 in this task. At step n, the neuron update in the reservoir is determined by
\tilde{x}(n) = f\left( W_{in} u(n) + W x(n) \right)    (1)
where x̃(n) is the neuron update, u(n) is the input signal at step n, x(n) is the neuron state at step n, and f is the activation function. Then, the neuron state at step (n + 1) can be expressed as
x(n+1) = (1 - \alpha)\, x(n) + \alpha\, \tilde{x}(n)    (2)
where α is the leaking rate of the reservoir. The ESN output is represented by
y(n+1) = W_{out} \left[ x(n+1); u(n) \right]    (3)
where y(n + 1) is the output signal. In the training part, Wout is calculated to minimize the root mean square error (RMSE) between the predicted value y_pre(n) and the target value y_tar(n)
W_{out} = \arg\min \sqrt{ \frac{1}{m} \sum_{n=1}^{m} \left( y_{pre}(n) - y_{tar}(n) \right)^{2} }    (4)
where m is the total number of data points in the training part. When ridge regression is utilized, Wout can be calculated by linear regression
W_{out} = Y_{tar} X^{T} \left( X X^{T} + \gamma I \right)^{-1}    (5)
where X collects the neuron states and inputs [x(n + 1); u(n)], Y_tar is the target value in the training part, γ is the regularization coefficient, and I is the identity matrix. In this task, the reservoir neuron number N is 400, and the bounds of the leaking rate α, the spectral radius ρ, and the regularization coefficient γ are (0, 1), (0, 2), and (1 × 10^{-3}, 9 × 10^{-3}), respectively. The implementation of the ESN without hyperparameter optimization takes about 20 s in a Matlab 2018 environment with an Intel i7 CPU (2.6 GHz) and 16 GB RAM.
Win and W are generated randomly and sparsely. For simplicity, they are usually generated from the same distribution. According to the practical guide to applying ESN [28], the uniform distribution and the Gaussian distribution are two popular distribution shapes. The uniform distribution is considered first because of its continuity and boundedness. The distribution interval is [−0.5, 0.5], so the variance D(x) is 1/12. When the Gaussian distribution has the same variance as the uniform distribution, the weight matrix follows the normal distribution N(0, 1/12) [29]. The probability density functions of the uniform distribution and the Gaussian distribution are shown in Figure 1.
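To make Equations (1)–(5) and the two weight distributions concrete, the following Python/NumPy sketch builds the random matrices, runs the leaky reservoir update, and trains the readout by ridge regression. It is an illustrative reconstruction rather than the authors’ Matlab implementation; the tanh activation, the reservoir density of 10%, and the function names are assumptions.

```python
import numpy as np

def build_matrices(K, N, rho=0.9, density=0.1, shape="uniform", seed=None):
    """Draw Win and W randomly (illustrative sketch, not the authors' code).

    shape = "uniform": entries from U(-0.5, 0.5); shape = "gaussian": N(0, 1/12),
    i.e., the same variance as the uniform case. W is sparsified and rescaled so
    that its spectral radius (largest absolute eigenvalue) equals rho.
    """
    rng = np.random.default_rng(seed)
    if shape == "uniform":
        W_in = rng.uniform(-0.5, 0.5, size=(N, K))
        W = rng.uniform(-0.5, 0.5, size=(N, N))
    else:
        W_in = rng.normal(0.0, np.sqrt(1.0 / 12.0), size=(N, K))
        W = rng.normal(0.0, np.sqrt(1.0 / 12.0), size=(N, N))
    W = np.where(rng.random((N, N)) < density, W, 0.0)   # sparse internal matrix (density assumed)
    W *= rho / np.max(np.abs(np.linalg.eigvals(W)))      # rescale to the desired spectral radius
    return W_in, W

def train_readout(u, y_tar, W_in, W, alpha=0.5, gamma=1e-3):
    """Collect reservoir states with Eqs. (1)-(2) and solve Eq. (5) by ridge regression.

    u: inputs of shape (m, K); y_tar: targets of shape (m,)."""
    N, K = W_in.shape
    m = len(u)
    X = np.zeros((N + K, m))                     # state matrix [x(n+1); u(n)]
    x = np.zeros(N)
    for n in range(m):
        x_tilde = np.tanh(W_in @ u[n] + W @ x)   # Eq. (1), tanh activation assumed
        x = (1.0 - alpha) * x + alpha * x_tilde  # Eq. (2), leaky integration
        X[:, n] = np.concatenate([x, u[n]])
    Y_tar = y_tar.reshape(1, m)
    W_out = Y_tar @ X.T @ np.linalg.inv(X @ X.T + gamma * np.eye(N + K))   # Eq. (5)
    return W_out, X   # the ESN output of Eq. (3) is then y(n+1) = W_out @ [x(n+1); u(n)]
```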

2.2. Particle Swarm Optimization

The hyperparameters of the ESN need to be carefully tuned, including the leaking rate α, the spectral radius ρ, the regularization coefficient γ, the reservoir neuron number N, the reservoir connectivity c, etc. Some straightforward hyperparameter setting methods, such as trial-and-error [18], grid-search [15], and stochastic gradient descent [10], are computationally complex. The effectiveness of metaheuristic methods such as PSO [30], GA [31], and big bang–big crunch (BB–BC) [20] has been proven in different tasks. Optimizing all hyperparameters is time-consuming, so the three key hyperparameters of the ESN (α, ρ, and γ), which have the largest effects on the prediction results, are selected for optimization [16].
PSO is a satisfactory global search method that is easy to implement, has few parameters, is computationally inexpensive, and converges quickly [32]. A population of particles is used for searching, and each particle represents a candidate solution to the problem. The velocity v_i and position x_i of particle i at time step k are updated as
\left\{ \begin{aligned} v_i^{k+1} &= w v_i^{k} + c_1 r_1 \left( p_i^{b} - x_i^{k} \right) + c_2 r_2 \left( g_i^{b} - x_i^{k} \right) \\ x_i^{k+1} &= x_i^{k} + v_i^{k+1} \end{aligned} \right.    (6)
where p_i^b and g_i^b are the best personal and global positions, w is the inertial weight, c_1 and c_2 are the learning factors, and r_1 and r_2 are random variables uniformly distributed in [0, 1]. The objective function is to minimize the RMSE between the predicted value y_pre(n) and the target value y_tar(n), as shown in Equation (4). In this case study, the swarm size is 30, the maximum number of iterations is 600, the function tolerance is 1 × 10^{-6}, both learning factors are 1.49, and the inertial weight is 0.8.
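A minimal sketch of this PSO loop in Python/NumPy is given below, using the swarm size, learning factors, inertial weight, and function tolerance stated above. The stall-based stopping rule and the objective interface are assumptions for illustration; each particle would encode a candidate (α, ρ, γ) triple, and the objective would train the ESN readout on the training data and return the RMSE of Equation (4).

```python
import numpy as np

def pso_minimize(objective, bounds, swarm_size=30, max_iter=600,
                 w=0.8, c1=1.49, c2=1.49, tol=1e-6, seed=None):
    """Minimal PSO loop following Equation (6); the stall-based stop is an assumption."""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)              # shape (dim, 2): one row per hyperparameter
    dim = bounds.shape[0]
    x = rng.uniform(bounds[:, 0], bounds[:, 1], size=(swarm_size, dim))
    v = np.zeros_like(x)
    p_best = x.copy()                                     # personal best positions
    p_cost = np.array([objective(p) for p in x])
    g_idx = int(np.argmin(p_cost))
    g_best, g_cost = p_best[g_idx].copy(), p_cost[g_idx]  # global best
    stall = 0
    for _ in range(max_iter):
        r1 = rng.random((swarm_size, dim))
        r2 = rng.random((swarm_size, dim))
        v = w * v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)   # velocity update, Eq. (6)
        x = np.clip(x + v, bounds[:, 0], bounds[:, 1])                # position update, Eq. (6)
        cost = np.array([objective(p) for p in x])
        improved = cost < p_cost
        p_best[improved] = x[improved]
        p_cost[improved] = cost[improved]
        best = int(np.argmin(p_cost))
        stall = stall + 1 if g_cost - p_cost[best] < tol else 0       # improvement vs. tolerance
        if p_cost[best] < g_cost:
            g_best, g_cost = p_best[best].copy(), p_cost[best]
        if stall > 20:                                                # simple stopping rule (assumption)
            break
    return g_best, g_cost

# Example: bounds of (alpha, rho, gamma) as stated in Section 2.1,
# with a hypothetical esn_training_rmse(params) objective:
# best, err = pso_minimize(esn_training_rmse, [(0, 1), (0, 2), (1e-3, 9e-3)])
```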

2.3. Implementation of Ensemble ESN

The implementation process of the ensemble ESN is shown in Figure 2. There are four steps in total:
Step 1 is the training data preparation. After the data (voltage, current, etc.) acquisition, some pretreatment procedures such as filtering, resampling, and normalization are necessary. In this task, the moving average filtering method is utilized, and the length of the moving window is empirically set to 31. The resampling time interval between two adjacent points is 0.5 h. The data are normalized to the interval [0, 1] by Equation (7)
z' = \frac{z - z_{min}}{z_{max} - z_{min}}    (7)
where z’ is the normalized value, z is the original value, and zmax and zmin are the maximum and minimum values, respectively. The health indicators can be extracted, i.e., voltage for steady-state and quasi-dynamic operating conditions and the relative power loss rate (RPLR) for the dynamic operating condition.
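A short sketch of this Step 1 pretreatment under the stated settings (moving window of 31 points, 0.5 h resampling interval, min-max normalization of Equation (7)) could look as follows; the linear-interpolation resampling and the function name are assumptions.

```python
import numpy as np

def pretreat(t_hours, signal, window=31, dt=0.5):
    """Step 1 sketch: moving-average filtering, resampling to a 0.5 h grid, and
    min-max normalization to [0, 1] (Equation (7)). Linear-interpolation
    resampling is an assumption; t_hours must be increasing."""
    kernel = np.ones(window) / window
    smoothed = np.convolve(signal, kernel, mode="same")      # moving average, 31-point window
    t_grid = np.arange(t_hours[0], t_hours[-1], dt)          # uniform 0.5 h time grid
    resampled = np.interp(t_grid, t_hours, smoothed)
    z_min, z_max = resampled.min(), resampled.max()
    normalized = (resampled - z_min) / (z_max - z_min)       # Equation (7)
    return t_grid, normalized
```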
Step 2 is the degradation prediction based on individual ESNs. The random generation procedures of Win and W are similar in the ESN, so the distribution shapes of these two matrices are kept the same for convenience. For each ESNi (1 ≤ i ≤ p), the uniform and Gaussian distribution shapes of these two matrices are explored separately. The hyperparameters of the ESN are optimized by PSO in the training part, and the optimized parameter combination is used in the prediction part. Thus, each ESN has two long-term prediction results, one under each distribution shape of the random matrices. The total number of ensemble ESNs is set to 100 (p = 100) for the statistical analysis.
Step 3 is the ensemble of the prediction results. For a given distribution shape, the prediction results of the 100 repetitions are assembled to form a single new model so that the randomness can be analyzed statistically. The weight of each ESN is taken to be the same, i.e., each ESN has the same contribution to the RUL prediction. Thus, the mean value of the 100 ESNs at each time step is taken as the final prediction value.
Step 4 is the result analysis. Each ESN realizes an individual prediction, and the RMSE between its predicted signal and the target signal can be calculated by
\mathrm{RMSE} = \sqrt{ \frac{1}{m} \sum_{t=1}^{m} \left( y_{pre}(t) - y_{tar}(t) \right)^{2} }    (8)
At a certain time step t (1 ≤ t ≤ m), the mean and the variance of the predicted value at time step (s + t) can be represented as
\left\{ \begin{aligned} \bar{y}(s+t) &= \frac{1}{p} \sum_{i=1}^{p} \hat{y}_i(s+t) \\ D(s+t) &= \frac{1}{p-1} \sum_{i=1}^{p} \left( \bar{y}(s+t) - \hat{y}_i(s+t) \right)^{2} \end{aligned} \right.    (9)
where p is the total number of ensemble ESNs, ŷ_i(s + t) is the predicted value of the i-th ESN, ȳ(s + t) is the mean value, and D(s + t) is the variance. Assuming the predicted value follows a Gaussian distribution, the 95% confidence interval (CI) of the results at time step (s + t) can be expressed as
\mathrm{CI}(s+t) = \left[ \bar{y}(s+t) - 1.96\sqrt{D(s+t)},\; \bar{y}(s+t) + 1.96\sqrt{D(s+t)} \right]    (10)
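The ensemble post-processing of Steps 3 and 4 reduces to a few array operations. The sketch below computes the RMSE of Equation (8), the ensemble mean and variance of Equation (9), and the 95% CI of Equation (10) from the stacked predictions of the p individual ESNs; it is an illustration, not the authors’ code.

```python
import numpy as np

def rmse(y_pre, y_tar):
    """RMSE of one individual ESN, Equation (8)."""
    return np.sqrt(np.mean((np.asarray(y_pre) - np.asarray(y_tar)) ** 2))

def ensemble_statistics(predictions):
    """Equally weighted ensemble of Steps 3-4 (Equations (9) and (10)).

    predictions: array of shape (p, horizon), one row per individual ESN.
    Returns the mean prediction, the sample variance, and the 95% CI bounds."""
    predictions = np.asarray(predictions)
    mean = predictions.mean(axis=0)              # Eq. (9), ensemble mean
    var = predictions.var(axis=0, ddof=1)        # Eq. (9), sample variance with 1/(p-1)
    half = 1.96 * np.sqrt(var)                   # Gaussian assumption for the 95% CI
    return mean, var, mean - half, mean + half   # Eq. (10): [mean - 1.96*sqrt(D), mean + 1.96*sqrt(D)]
```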

3. Experimental Results

Three degradation datasets under different operating conditions are used for exploring the prediction effects of the two distribution shapes of the weight matrices, testing the performance improvements of the ensemble ESN, and statistically analyzing the uncertainty of the random matrices. The steady-state and quasi-dynamic datasets come from the open data of the 2014 Prognostic and Health Management Data Challenge [22]. Two PEMFC stacks are used for the 1000 h duration test under the steady-state and quasi-dynamic operating conditions. Both stacks have 5 cells with an active area of 100 cm², an operating temperature of 80 °C, and a back pressure of 0.2 MPa. For the steady-state test, the constant current is 70 A. For the quasi-dynamic test, current ripples (7 A) at different frequencies (from 50 mHz to 10 kHz) are superimposed on the constant current (70 A) for the electrochemical impedance spectroscopy (EIS) measurement. The third dataset comes from a PEMFC stack tested for 382 h under the dynamic μ-CHP load profile. The current cycles between 170 A and 85 A. The stack consists of 8 cells with an active area of 220 cm², an operating temperature of 80 °C, and a back pressure of 0.15 MPa [18]. The load profiles of the three tests are shown in Figure 3.

3.1. Steady-State Operating Condition

In the steady-state operating condition, the stack output power is used as the degradation health indicator. Half of the data (0–500 h) is used for training the ESN, and the rest of the data (501–1000 h) is used for prediction. During the prediction procedure, 100 individual ESNs have 100 groups of prediction results. Then, the mean value, variance, and 95% CI of the predicted results can be calculated at each time step. When Win and W follow the uniform distribution, the RUL prediction results are shown in Figure 4. Taking the first 10 groups of ESNs as an example, their hyperparameters and RMSE are shown in Table 1.
After changing the distribution shape of Win and W to the Gaussian distribution, the RUL prediction results under the steady-state operating condition are shown in Figure 5. The hyperparameters and RMSE of the first 10 groups of ESNs are shown in Table 2. To analyze the effects of the distribution shapes of Win and W on the prediction accuracy, the RMSE of the 100 ESNs is shown in Figure 6.
The average RMSE under the Gaussian distribution (0.57488) is lower than that under the uniform distribution (0.58602) for the steady-state test. The RMSE of the ensemble mean prediction is 0.57841 for the uniform distribution and 0.56966 for the Gaussian distribution.

3.2. Quasi-Dynamic Operating Condition

In the quasi-dynamic operating condition, the stack output power is used as the degradation-related health indicator. Similar to the steady-state operating condition, half of the data (0–500 h) is used for training the ESN, and the rest of the data (501–1000 h) is used for prediction. When Win and W follow the uniform distribution, the RUL prediction results are shown in Figure 7. Taking the first 10 groups of ESNs as an example, their hyperparameters and RMSE are shown in Table 3.
After changing the distribution shapes of Win and W to Gaussian distribution, the RUL prediction results under the quasi-dynamic operating condition are shown in Figure 8. The hyperparameters and RMSE of the first 10 groups of ESNs are shown in Table 4.
To analyze the effects of the distribution shapes of Win and W on the prediction accuracy, the RMSE of the 100 groups of ESNs is shown in Figure 9. The average RMSE under the Gaussian distribution (1.19634) is lower than that under the uniform distribution (1.27562) for the quasi-dynamic test. The RMSE of the ensemble mean prediction is 0.95657 for the uniform distribution and 0.90618 for the Gaussian distribution.

3.3. Dynamic Operating Condition

In the dynamic operating condition, the RPLR is used as the degradation health indicator, which has been described in detail in our previous work [18]. First, the polarization curve of the PEMFC stack at the beginning of life (BoL) is measured, and the BoL power at different currents is calculated to form a look-up table. Second, the power at time step t is calculated from the measured current (I_t) and voltage (U_t). Finally, the RPLR at time step t is calculated by
\mathrm{RPLR} = \left( P_t - P_0 \right) / P_t
where P_t is the power at time step t and P_0 is the BoL power, which is determined by the look-up table method. P_t and P_0 are taken at the same current level.
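The RPLR indicator can be computed as sketched below, where the BoL power at the measured current level is obtained from the polarization-curve look-up table; the interpolation of that table and the function name are assumptions.

```python
import numpy as np

def rplr(current, voltage, bol_current, bol_voltage):
    """RPLR health indicator sketch for the dynamic dataset.

    bol_current / bol_voltage describe the beginning-of-life polarization curve
    (bol_current increasing); P0 at the measured current level is read from this
    look-up table by interpolation, which is an assumption on the table lookup."""
    p_t = np.asarray(current) * np.asarray(voltage)                            # measured power P_t
    p_0 = np.asarray(current) * np.interp(current, bol_current, bol_voltage)   # BoL power P_0 at the same current
    return (p_t - p_0) / p_t                                                   # RPLR as defined above
```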
Half of the data (0–191 h) is used for training the ESN, and the rest of the data (192–382 h) is used for prediction. When Win and W follow the uniform distribution, the RUL prediction results are shown in Figure 10. Taking the first 10 groups of ESNs as an example, their hyperparameters and RMSE are shown in Table 5.
After changing the distribution shapes of Win and W to Gaussian distribution, the RUL prediction results under the dynamic operating condition are shown in Figure 11. The hyperparameters and RMSE of the first 10 groups of ESNs are shown in Table 6.
The RMSE of the 100 groups of ESNs is shown in Figure 12. The average RMSE under the Gaussian distribution (0.00489) is lower than that under the uniform distribution (0.00502) for the dynamic test. The RMSE of the ensemble mean prediction is 0.00473 for the uniform distribution and 0.00464 for the Gaussian distribution.
Among these three different operating conditions, each random generation of Win and W produces a different prediction result. The essential reason is that the randomness of the matrices changes the weights connecting the neurons, and these random weights lead to different neuron behaviors, which determine the nonlinear mapping ability of the reservoir. The numerous repeated tests also illustrate the uncertainty introduced by the random matrices. The ensemble ESN structure is utilized to statistically analyze the effects of this randomness, and the bounds of the uncertainties are obtained. Taking these uncertainties into account, the robustness of the prediction can be further improved in practical applications.

4. Conclusions

The data-driven method of echo state network (ESN) is further explored in the long-term lifetime prediction of the PEMFC system. During the implementation process of the ESN, the input and the internal weight matrices (Win and W) are randomly generated. The traditional method keeps these two matrices unchanged once they are generated. Nevertheless, the random characteristics of these matrices will affect the performance of the ESN. To statistically analyze the effects of random characteristics, an ensemble ESN is proposed and tested under three different operating conditions. The conclusions are as follows:
1. The random characteristics of Win and W affect the lifetime prediction results. The prediction results are presented in statistical form by the ensemble computing technique, which helps the user to analyze the uncertainties caused by the randomness. After the data analysis, a 95% confidence interval (CI) is given, which better quantifies the reliability of the results.
2. Based on the uniform and Gaussian distribution shapes of Win and W, the prediction performances are fully compared. To analyze the effects of the random matrices, the PSO method is used to optimize the hyperparameters of the ESN. Based on the comparison results, the Gaussian distribution of Win and W can decrease the prediction error slightly compared to the uniform distribution. However, the effects of the two distribution shapes on the prediction results are rather insignificant.
3. Combining the ESN with other methods to improve the prediction accuracy and exploring prognostic methods to realize online lifetime prediction will be the focus of our future work.

Author Contributions

Conceptualization, Z.H. and Z.Z.; Formal analysis, Z.H. and Z.Z.; Funding acquisition, M.-C.P. and F.G.; Methodology, Z.H. and Z.Z.; Project administration, M.-C.P. and F.G.; Software, Z.H. and Z.Z.; Supervision, Z.Z., M.-C.P. and F.G.; Validation, Z.H. and Z.Z.; Writing—original draft, Z.H.; Writing—review and editing, Z.H., Z.Z., M.-C.P. and F.G. All authors have read and agreed to the published version of the manuscript.

Funding

This work has been supported by the EIPHI Graduate school (contract “ANR-17-EURE-0002”).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Qiu, Y.; Li, Q.; Huang, L.; Sun, C.; Wang, T.; Chen, W. Adaptive uncertainty sets-based two-stage robust optimization for economic dispatch of microgrid with demand response. IET Renew. Power Gener. 2020, 14, 3608–3615.
2. Pu, Y.; Li, Q.; Zou, X.; Li, R.; Li, L.; Chen, W.; Liu, H. Optimal sizing for an integrated energy system considering degradation and seasonal hydrogen storage. Appl. Energy 2021, 302, 117542.
3. Li, Q.; Wang, T.; Li, S.; Chen, W.; Liu, H.; Breaz, E.; Gao, F. Online extremum seeking-based optimized energy management strategy for hybrid electric tram considering fuel cell degradation. Appl. Energy 2021, 285, 116505.
4. Li, Q.; Yin, L.; Yang, H.; Wang, T.; Qiu, Y.; Chen, W. Multiobjective optimization and data-driven constraint adaptive predictive control for efficient and stable operation of PEMFC system. IEEE Trans. Ind. Electron. 2021, 68, 12418–12429.
5. Wang, C.; Li, Z.; Outbib, R.; Dou, M.; Zhao, D. A novel long short-term memory networks-based data-driven prognostic strategy for proton exchange membrane fuel cells. Int. J. Hydrogen Energy 2022, 47, 10395–10408.
6. Hua, Z.; Zheng, Z.; Pahon, E.; Péra, M.C.; Gao, F. A review on lifetime prediction of proton exchange membrane fuel cells system. J. Power Sources 2022, 529, 231256.
7. Zuo, J.; Lv, H.; Zhou, D.; Xue, Q.; Jin, L.; Zhou, W.; Yang, D.; Zhang, C. Deep learning based prognostic framework towards proton exchange membrane fuel cell for automotive application. Appl. Energy 2021, 281, 115937.
8. Yue, M.; Li, Z.; Roche, R.; Jemei, S.; Zerhouni, N. Degradation identification and prognostics of proton exchange membrane fuel cell under dynamic load. Control Eng. Pract. 2022, 118, 104959.
9. Hadaeghi, F.; Jaeger, H. Computing optimal discrete readout weights in reservoir computing is NP-hard. Neurocomputing 2019, 338, 233–236.
10. Thiede, L.A.; Parlitz, U. Gradient based hyperparameter optimization in echo state networks. Neural Netw. 2019, 115, 23–29.
11. Morando, S.; Jemei, S.; Gouriveau, R.; Zerhouni, N.; Hissel, D. Fuel cells prognostics using echo state network. In Proceedings of the 39th Annual Conference of the IEEE Industrial Electronics Society (IECON), Vienna, Austria, 10–13 November 2013.
12. Morando, S.; Jemei, S.; Hissel, D.; Gouriveau, R.; Zerhouni, N. Predicting the remaining useful lifetime of a proton exchange membrane fuel cell using an echo state network. In Proceedings of the International Discussion on Hydrogen Energy and Applications (IDHEA), Nantes, France, 12–14 May 2014.
13. Jin, J.; Chen, Y.; Wenchao, Z.; Xie, C.; Wu, F. Remaining useful life prediction of PEMFC based on cycle reservoir with jump model. Int. J. Hydrogen Energy 2021, 46, 40001–40013.
14. Morando, S.; Jemei, S.; Hissel, D.; Gouriveau, R.; Zerhouni, N. Proton exchange membrane fuel cell ageing forecasting algorithm based on echo state network. Int. J. Hydrogen Energy 2017, 42, 1472–1480.
15. Hua, Z.; Zheng, Z.; Pahon, E.; Péra, M.C.; Gao, F. Multi-timescale lifespan prediction for PEMFC systems under dynamic operating conditions. IEEE Trans. Transp. Electrif. 2022, 8, 345–355.
16. Hua, Z.; Zheng, Z.; Pahon, E.; Péra, M.C.; Gao, F. Lifespan prediction for proton exchange membrane fuel cells based on wavelet transform and echo state network. IEEE Trans. Transp. Electrif. 2022, 8, 420–431.
17. Morando, S.; Jemei, S.; Hissel, D.; Gouriveau, R.; Zerhouni, N. ANOVA method applied to proton exchange membrane fuel cell ageing forecasting using an echo state network. Math. Comput. Simul. 2017, 131, 283–294.
18. Hua, Z.; Zheng, Z.; Pahon, E.; Péra, M.-C.; Gao, F. Remaining useful life prediction of PEMFC systems under dynamic operating conditions. Energy Convers. Manag. 2021, 231, 113825.
19. Mezzi, R.; Morando, S.; Steiner, N.Y.; Péra, M.C.; Hissel, D.; Larger, L. Multi-reservoir echo state network for proton exchange membrane fuel cell remaining useful life prediction. In Proceedings of the 44th Annual Conference of the IEEE Industrial Electronics Society (IECON), Washington, DC, USA, 21–23 October 2018.
20. Mezzi, R.; Yousfi-Steiner, N.; Péra, M.C.; Hissel, D.; Larger, L. An echo state network for fuel cell lifetime prediction under a dynamic micro-cogeneration load profile. Appl. Energy 2021, 283, 116297.
21. Li, Z.; Zheng, Z.; Outbib, R. Adaptive prognostic of fuel cells by implementing ensemble echo state networks in time varying model space. IEEE Trans. Ind. Electron. 2019, 67, 379–389.
22. Hua, Z.; Zheng, Z.; Péra, M.-C.; Gao, F. Remaining useful life prediction of PEMFC systems based on the multi-input echo state network. Appl. Energy 2020, 265, 114791.
23. Vichard, L.; Harel, F.; Ravey, A.; Venet, P.; Hissel, D. Degradation prediction of PEM fuel cell based on artificial intelligence. Int. J. Hydrogen Energy 2020, 45, 14953–14963.
24. He, K.; Mao, L.; Yu, J.; Huang, W.; He, Q.; Jackson, L. Long-term performance prediction of PEMFC based on LASSO-ESN. IEEE Trans. Instrum. Meas. 2021, 70, 1–11.
25. Napoli, G.; Ferraro, M.; Sergi, F.; Brunaccini, G.; Antonucci, V. Data driven models for a PEM fuel cell stack performance prediction. Int. J. Hydrogen Energy 2013, 38, 11628–11638.
26. Zhang, D.; Baraldi, P.; Cadet, C.; Yousfi-Steiner, N.; Bérenguer, C.; Zio, E. An ensemble of models for integrating dependent sources of information for the prognosis of the remaining useful life of proton exchange membrane fuel cells. Mech. Syst. Signal Process. 2019, 124, 479–501.
27. Javed, K.; Gouriveau, R.; Zerhouni, N.; Hissel, D. Prognostics of proton exchange membrane fuel cells stack using an ensemble of constraints based connectionist networks. J. Power Sources 2016, 324, 745–757.
28. Lukoševičius, M. A practical guide to applying echo state networks. In Neural Networks: Tricks of the Trade, 2nd ed.; Springer: Berlin/Heidelberg, Germany, 2012; pp. 659–686.
29. Wu, Q.; Kudithipudi, D. An ensemble learning approach to the predictive stability of echo state networks. J. Inform. Math. Sci. 2018, 10, 181–199.
30. Chouikhi, N.; Fdhila, R.; Ammar, B.; Rokbani, N.; Alimi, A.M. Single- and multi-objective particle swarm optimization of reservoir structure in echo state network. In Proceedings of the International Joint Conference on Neural Networks (IJCNN), Vancouver, BC, Canada, 24–29 July 2016.
31. Zhong, S.; Xie, X.; Lin, L.; Wang, F. Genetic algorithm optimized double-reservoir echo state network for multi-regime time series prediction. Neurocomputing 2017, 238, 191–204.
32. Wang, H.; Yan, X. Optimizing the echo state network with a binary particle swarm optimization algorithm. Knowl. Based Syst. 2015, 86, 182–193.
Figure 1. The probability density function of uniform distribution and Gaussian distribution.
Figure 2. The implementation process of ensemble ESN.
Figure 3. Load profiles of three tests: (a) steady-state operating condition; (b) quasi-dynamic operating condition; (c) dynamic operating condition.
Figure 4. Steady-state RUL results under the uniform distribution: (a) Prediction results of 100 individual ESNs (blue lines); (b) Mean prediction results (blue line) of ensemble ESN with 95% CI.
Figure 5. Steady-state RUL results under the Gaussian distribution: (a) Prediction results of 100 individual ESNs (blue lines); (b) Mean prediction results (blue line) of ensemble ESN with 95% CI.
Figure 6. RMSE of 100 groups of ESNs under the steady-state operating condition.
Figure 7. Quasi-dynamic RUL results under the uniform distribution: (a) Prediction results of 100 individual ESNs (blue lines); (b) Mean prediction results (blue line) of ensemble ESN with 95% CI.
Figure 8. Quasi-dynamic RUL results under the Gaussian distribution: (a) Prediction results of 100 individual ESNs (blue lines); (b) Mean prediction results (blue line) of ensemble ESN with 95% CI.
Figure 9. RMSE of 100 groups of ESNs under the quasi-dynamic operating condition.
Figure 10. Dynamic RUL results under the uniform distribution: (a) Prediction results of 100 individual ESNs (blue lines); (b) Mean prediction results (blue line) of ensemble ESN with 95% CI.
Figure 11. Dynamic RUL results under the Gaussian distribution: (a) Prediction results of 100 individual ESNs (blue lines); (b) Mean prediction results (blue line) of ensemble ESN with 95% CI.
Figure 12. RMSE of 100 groups of ESNs under the dynamic operating condition.
Table 1. Steady-state results of ESN1-ESN10 when Win and W follow the uniform distribution.

Test Group   ESN1      ESN2      ESN3      ESN4      ESN5      ESN6      ESN7      ESN8      ESN9      ESN10
α            0.80000   0.24477   0.80000   0.80000   0.10000   0.61598   0.55070   0.28000   0.10000   0.10000
ρ            0.38125   0.69227   0.79748   0.93218   0.20756   0.57188   0.56348   0.72581   0.18374   0.17359
γ            0.00100   0.00900   0.00900   0.00900   0.00164   0.00363   0.00100   0.00900   0.00136   0.00340
RMSE         0.58050   0.59487   0.59647   0.60032   0.59554   0.57550   0.57331   0.57092   0.60588   0.63377
Table 2. Steady-state results of ESN1-ESN10 when Win and W follow the Gaussian distribution.

Test Group   ESN1      ESN2      ESN3      ESN4      ESN5      ESN6      ESN7      ESN8      ESN9      ESN10
α            0.90000   0.90000   0.90000   0.85000   0.39040   0.90000   0.90000   0.71572   0.90000   0.90000
ρ            0.19560   0.33440   0.44590   0.10000   0.86250   0.10000   0.31760   0.10000   1.41060   0.10000
γ            0.00100   0.00320   0.00650   0.00660   0.00350   0.00110   0.00100   0.00100   0.00100   0.00120
RMSE         0.56508   0.57730   0.57151   0.56970   0.56199   0.56915   0.56801   0.58255   0.57192   0.56398
Table 3. Quasi-dynamic results of ESN1-ESN10 when Win and W follow the uniform distribution.

Test Group   ESN1      ESN2      ESN3      ESN4      ESN5      ESN6      ESN7      ESN8      ESN9      ESN10
α            0.29901   0.10970   0.10820   0.13740   0.10000   0.34770   0.10000   0.41590   0.10800   0.12280
ρ            1.90000   1.80230   1.74110   1.90000   1.62237   1.55670   1.90000   1.42740   1.42460   1.61780
γ            0.00299   0.00710   0.00860   0.00150   0.00900   0.00110   0.00900   0.00890   0.00900   0.00100
RMSE         1.07394   1.01290   1.02710   1.10380   1.41801   1.03080   1.88830   1.28040   1.09540   1.49140
Table 4. Quasi-dynamic results of ESN1-ESN10 when Win and W follow the Gaussian distribution.

Test Group   ESN1      ESN2      ESN3      ESN4      ESN5      ESN6      ESN7      ESN8      ESN9      ESN10
α            0.87000   0.35830   0.38420   0.10110   0.87420   0.54400   0.84620   0.82820   0.10000   0.10900
ρ            1.36500   1.80570   1.58110   1.51860   1.47940   1.90000   1.83080   1.43060   1.76060   1.86000
γ            0.00790   0.00830   0.00900   0.00900   0.00330   0.00900   0.00880   0.00100   0.00100   0.00580
RMSE         1.03890   0.93920   1.28170   0.89560   0.96380   1.37308   1.05020   0.97690   1.13970   0.90080
Table 5. Dynamic results of ESN1-ESN10 when Win and W follow the uniform distribution.

Test Group   ESN1      ESN2      ESN3      ESN4      ESN5      ESN6      ESN7      ESN8      ESN9      ESN10
α            0.12483   0.25671   0.15844   0.11213   0.34245   0.23291   0.23923   0.14834   0.26963   0.36064
ρ            1.43205   1.74959   1.35257   1.53465   1.18192   1.33707   1.90000   1.23153   1.37328   1.25512
γ            0.00218   0.00524   0.00900   0.00900   0.00129   0.00405   0.00100   0.00900   0.00165   0.00253
RMSE         0.00476   0.00527   0.00483   0.00482   0.00578   0.00523   0.00508   0.00485   0.00479   0.00501
Table 6. Dynamic results of ESN1-ESN10 when Win and W follow the Gaussian distribution.

Test Group   ESN1      ESN2      ESN3      ESN4      ESN5      ESN6      ESN7      ESN8      ESN9      ESN10
α            0.30376   0.47659   0.54372   0.50078   0.13709   0.57584   0.32100   0.28035   0.11564   0.39271
ρ            1.64620   1.29902   1.55139   1.71862   1.38232   1.44123   1.43160   1.59497   1.50729   1.90000
γ            0.00634   0.00900   0.00586   0.00900   0.00900   0.00638   0.00157   0.00625   0.00900   0.00900
RMSE         0.00511   0.00527   0.00525   0.00481   0.00413   0.00512   0.00509   0.00473   0.00569   0.00485