Supervised by: Chinese Academy of Sciences
Sponsored by: Chinese Society of Optimization, Overall Planning and Economic Mathematics
   Institutes of Science and Development, Chinese Academy of Sciences

Chinese Journal of Management Science ›› 2025, Vol. 33 ›› Issue (2): 105-117. doi: 10.16381/j.cnki.issn1003-207x.2023.1927


Enhancing Short-term Tourist Flow Forecasting and Evaluation Using an Improved Transformer Framework

Xin Li1, Xu Zhang1, Lean Yu2, Shouyang Wang3

  1. School of Economics and Management, University of Science and Technology Beijing, Beijing 100083, China
    2. Business School, Sichuan University, Chengdu 610065, China
    3. Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190, China
  • Received:2023-11-16 Revised:2024-04-25 Online:2025-02-25 Published:2025-03-06
  • Contact: Xin Li E-mail:drxinli@ustb.edu.cn

Abstract:

Current research on tourism demand forecasting focuses primarily on data at annual, monthly, and daily frequencies, leaving a gap in the study of short-term, high-frequency passenger flow prediction. A framework for forecasting visitor flows at tourist attractions is developed based on improved Transformer models. Using data collected every 15 minutes from February to August 2023 at seven 5A-level tourist sites in Beijing, including the Summer Palace, the Forbidden City, and the Temple of Heaven, three Transformer-based deep learning models (Informer, Autoformer, and Fedformer) are enhanced with the Tree-structured Parzen Estimator (TPE) optimization algorithm to conduct one-step and multi-step predictions of high-frequency passenger flows at the seven scenic areas. Their accuracy is then compared with that of other deep learning models (DeepAR, TCN, LSTM), a machine learning model (GBRT), and a time series model (ARIMA) across various forecasting scenarios. Model performance is evaluated using mean squared error (MSE), mean absolute error (MAE), and improvement rate (IR), along with Diebold-Mariano (DM) tests for statistical significance. The results reveal that the three modified Transformer models consistently outperform the alternative methods; the Informer model in particular demonstrates superior accuracy across prediction horizons ranging from one step (15 minutes ahead) to 48 steps (one day ahead). In the one-step scenario, the Informer achieves an average improvement rate of 14%, rising to 23% and 40% for four and eight steps, respectively, and it maintains a substantial advantage even at the 48-step horizon, with an average improvement rate of 30%.
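To make the evaluation criteria concrete, the following is a minimal sketch (not the authors' code) of the MSE, MAE, and IR metrics and a basic Diebold-Mariano test on the loss differential between two forecasts. The simple DM variant shown here omits the autocorrelation-robust (HAC) variance correction that is usually applied for multi-step forecast horizons.

```python
import numpy as np
from scipy import stats

def mse(y_true, y_pred):
    # Mean squared error
    return np.mean((y_true - y_pred) ** 2)

def mae(y_true, y_pred):
    # Mean absolute error
    return np.mean(np.abs(y_true - y_pred))

def improvement_rate(err_benchmark, err_model):
    # IR: relative error reduction of a model over a benchmark
    # (e.g. 0.40 means the model's error is 40% lower)
    return (err_benchmark - err_model) / err_benchmark

def dm_test(y_true, pred_a, pred_b, power=2):
    """Diebold-Mariano test: is forecast A significantly better than B?

    power=2 uses squared-error loss; power=1 uses absolute-error loss.
    A negative statistic favors forecast A.
    """
    d = np.abs(y_true - pred_a) ** power - np.abs(y_true - pred_b) ** power
    n = len(d)
    dm_stat = np.mean(d) / np.sqrt(np.var(d, ddof=1) / n)
    p_value = 2 * (1 - stats.norm.cdf(abs(dm_stat)))  # two-sided
    return dm_stat, p_value
```

In this setup, a per-step IR is computed for each model against each benchmark, and the averages reported in the abstract (14%, 23%, 40%, 30%) aggregate these ratios across sites and benchmarks.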
These findings underscore the practical value of high-frequency data and enhanced Transformer models for improving operational efficiency, managing congestion, optimizing visitor experiences, and informing decision-making at tourist destinations, thereby promoting sustainable destination development. Furthermore, the work enriches the body of research on tourism demand analysis and on prediction models grounded in deep learning frameworks, providing theoretical underpinnings and practical references for advancing theories and methodologies in tourism demand forecasting.

Key words: deep learning, Transformer, Informer, short-term tourist flow, tourism forecasting
