Supervised by: Chinese Academy of Sciences
Sponsored by: Chinese Society of Optimization, Overall Planning and Economic Mathematics
   Institutes of Science and Development, Chinese Academy of Sciences

Chinese Journal of Management Science ›› 2023, Vol. 31 ›› Issue (3): 155-166. doi: 10.16381/j.cnki.issn1003-207x.2022.0401


Lifetime Decay Prediction of Fuel Cell Based on Attention Neural Network

GAO Ming1, 2, LIU Chao1, 3, TANG Jia-fu1, SUN Si-jing4, ZOU Guang-yu5   

  1. School of Management Science and Engineering, Dongbei University of Finance and Economics, Liaoning Provincial Key Laboratory of Big Data Management and Optimization Decision, Dalian 116025, China; 2. Center for Post-doctoral Studies of Computer Science, Northeastern University, Shenyang 110819, China; 3. Linde Hydrogen FuelTech Dalian Co., Ltd, Dalian 116100, China; 4. Liaoning Dalien Law Firm, Dalian 116001, China; 5. Department of Electrical Engineering, Dalian University of Technology, Dalian 116024, China
  • Received: 2022-02-28  Revised: 2022-09-07  Published: 2023-04-03
  • Contact: GAO Ming  E-mail: gm@dufe.edu.cn

Abstract: As a green energy source, fuel cells are an important route to a low-carbon economy. However, the safety concerns, high cost, and limited durability of the mainstream PEMFC (Proton Exchange Membrane Fuel Cell) have restricted its commercialization. Effective lifetime prediction can improve reliability and maintainability and reduce the total cost of use, so lifetime decay prediction of PEMFCs has become an important issue of common concern for the fuel cell industry and academia. Effective lifetime prediction of PEMFCs is highly challenging because of the complexity of their physicochemical processes, operating states, environmental conditions, and operating conditions. Model-driven prediction methods are constrained by the difficulty of accurately modeling the complex reaction mechanisms inside the cell and by the subjectivity and one-sidedness of empirical rules, which lead to heavy simulation workloads and limited accuracy. Data-driven prediction methods based on statistical analysis require statistical modeling or stochastic-process analysis involving parameter estimation; they rely on idealized assumptions and are prone to information loss. Traditional machine learning methods, in turn, vary widely in fitting ability and impose a large model-selection workload. In particular, it is difficult for them to capture, at the same time, the complex nonlinear relationships between features and across time steps in multidimensional time series prediction and the noise and bias present in the raw data, so their prediction accuracy is limited. In recent years, deep learning has become an effective approach to highly nonlinear multidimensional time series prediction, owing to the powerful nonlinear fitting ability and flexible modeling of artificial neural networks.
In contrast, the RNNs (Recurrent Neural Networks) commonly used in existing research mainly focus on short-sequence learning; their global modeling ability on long sequences is insufficient to capture the complex interactions between multidimensional vectors at different time steps. Although the Transformer has shown great advantages in large-scale natural language processing and computer vision tasks, it suffers from overfitting under the limited sample size and performed poorly in this study. Therefore, based on the characteristics and limitations of LSTM (Long Short-Term Memory) networks and 1D-CNNs (1-Dimensional Convolutional Neural Networks), a novel composite deep neural network, AACNN-LSTM (Attention After CNN-LSTM), is proposed for multidimensional time series prediction. Feature vectors at multiple historical time points (including average voltage, current density, hydrogen pressure, air pressure, and circulating-water pressure) are constructed from a real PEMFC's 3-month lifetime test dataset as multidimensional time series inputs. The method uses a 1D-CNN for smoothing and filtering and an LSTM layer for learning the temporal relationships among the multidimensional vectors. Finally, an Attention module is introduced, which adaptively weights the multidimensional vectors at different time steps from a global perspective to decide which features play a key role in the prediction. The model uses the output voltage of the PEMFC as the prediction target for lifetime evaluation. Verification experiments at different life stages, an ablation study with multiple architecture variants, and comparisons with different types of neural networks are conducted. The results show that accuracy is significantly improved over the other methods while good computational efficiency is maintained. The generalizability and superiority of the model are also verified on the IEEE PHM 2014 fuel cell life prediction challenge dataset.
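The attention step described above — weighting the hidden states of different time steps before the output layer — can be sketched in a few lines. This is a minimal NumPy illustration, not the authors' implementation: the hidden dimension, the scoring vector `w`, and the tanh scoring function are assumptions for the sketch, and in the actual AACNN-LSTM these weights are learned jointly with the CNN and LSTM layers.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical shapes: 72 historical time steps, 8-dim LSTM hidden state.
T, d = 72, 8
H = rng.standard_normal((T, d))   # one hidden state per time step
w = rng.standard_normal(d)        # learnable scoring vector (assumed form)

scores = np.tanh(H @ w)           # un-normalized relevance of each time step
alpha = softmax(scores)           # attention weights over the 72 steps
context = alpha @ H               # weighted summary fed to the output layer

print(context.shape)              # (8,)
```

The weights `alpha` give the model a global view of the sequence: steps that matter for the voltage prediction receive larger weights, which is what lets the Attention module decide which features and time steps drive the result.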
In addition, multi-step time series prediction of PEMFC lifetime is explored: acceptable accuracy is achieved over a moderate prediction horizon (10 h) using 72 h of historical information, which has practical value and encourages longer and more reliable multi-step prediction. The proposed CNN-LSTM combination verifies that CNN's inductive-bias learning complements LSTM's sequence learning, naturally achieving end-to-end combined learning of smoothing, filtering, and sequence modeling and improving the final prediction accuracy. The complementarity and effective placement of Attention around CNN, LSTM, and GRU (Gated Recurrent Unit) are verified, and the necessity of composite deep neural networks for domain-specific problems is corroborated. It is also found that the end-to-end multi-step prediction model is more accurate than iterative multi-step prediction. The proposed method has significant technical value in the field of PEMFC lifetime prediction, e.g., for accelerated fuel cell aging tests, predictive maintenance, anomaly detection, and safety assurance. It is also applicable to other types of energy batteries with a similar data structure, such as lithium batteries.
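The distinction between iterative and end-to-end multi-step prediction can be illustrated with a toy sketch. The one-step and k-step "models" below are hypothetical stand-ins (simple window means), not the trained AACNN-LSTM; the point is the control flow: the iterative strategy feeds each prediction back as input, so errors can compound, while the end-to-end strategy emits all k steps from one forward pass.

```python
def one_step_model(window):
    # hypothetical stand-in for a trained one-step predictor
    return sum(window) / len(window)

def k_step_model(window, k):
    # hypothetical stand-in for an end-to-end k-step head: one pass, k outputs
    m = sum(window) / len(window)
    return [m] * k

# 72 h of synthetic, slowly decaying voltage history (not real PEMFC data)
history = [0.68 - 0.0002 * t for t in range(72)]
k = 10  # 10 h prediction horizon

# Iterative strategy: feed each prediction back into the input window.
window = list(history)
iterative = []
for _ in range(k):
    y = one_step_model(window[-72:])
    iterative.append(y)
    window.append(y)

# Direct (end-to-end) strategy: all k steps from a single forward pass.
direct = k_step_model(history, k)

print(len(iterative), len(direct))  # 10 10
```

With real networks, the iterative variant accumulates its own prediction error at every step, which is consistent with the finding above that the end-to-end multi-step model is more accurate.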

Key words: lifetime prediction; attention mechanism; time series prediction; LSTM
