
Chinese Journal of Management Science ›› 2025, Vol. 33 ›› Issue (2): 105-117. doi: 10.16381/j.cnki.issn1003-207x.2023.1927 cstr: 32146.14.j.cnki.issn1003-207x.2023.1927



Enhancing Short-term Tourist Flow Forecasting and Evaluation Using an Improved Transformer Framework

Xin Li1, Xu Zhang1, Lean Yu2, Shouyang Wang3

  1. School of Economics and Management, University of Science and Technology Beijing, Beijing 100083, China
    2. Business School, Sichuan University, Chengdu 610065, China
    3. Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190, China
  • Received: 2023-11-16  Revised: 2024-04-25  Online: 2025-02-25  Published: 2025-03-06
  • Contact: Xin Li  E-mail: drxinli@ustb.edu.cn
  • Funding: National Natural Science Foundation of China: General Program (72371025), Key Program (72331007), and Distinguished Young Scholars Program (72025101)


Abstract:

Current research on tourism demand forecasting primarily focuses on data with annual, monthly, and daily frequencies, leaving a gap in the study of short-term, high-frequency tourist flow prediction. This study proposes a framework for forecasting visitor flows at tourist attractions based on improved Transformer models. Using data collected every 15 minutes from February to August 2023 at seven 5A-level tourist sites in Beijing, including the Summer Palace, the Forbidden City, and the Temple of Heaven, three Transformer-based deep learning models (Informer, Autoformer, and Fedformer) are enhanced with the TPE optimization algorithm to conduct one-step and multi-step predictions of high-frequency visitor flows at the seven scenic areas. These models are then compared, in terms of accuracy, with other deep learning models (DeepAR, TCN, LSTM), a machine learning model (GBRT), and a time series model (ARIMA) across various forecasting scenarios. Performance is evaluated using Mean Squared Error (MSE), Mean Absolute Error (MAE), and Improvement Rate (IR) metrics, along with Diebold-Mariano (DM) tests for statistical significance. The results reveal that the three modified Transformer models consistently outperformed the alternative methods, particularly the Informer model, which demonstrated superior accuracy across multiple prediction horizons ranging from one step (15 minutes ahead) to 48 steps (one day ahead). In the one-step scenario, the Informer achieved an average improvement rate of 14%, rising to 23% and 40% for four and eight steps, respectively, and it maintained a substantial advantage even at the 48-step horizon, with an average improvement rate of 30%. These findings underscore the practical value of high-frequency data and enhanced Transformer models for improving operational efficiency, managing congestion, optimizing visitor experiences, and informing decision-making at tourist destinations, thereby promoting sustainable destination development. Furthermore, this work enriches the body of research on tourism demand analysis and prediction models grounded in deep learning frameworks, providing theoretical underpinnings and practical references for advancing the theories and methodologies of tourism demand forecasting.
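
As an illustration of the hyperparameter tuning step described above, the sketch below wires a TPE search around a generic Transformer-style forecaster. It is a minimal sketch, assuming Optuna's TPESampler as the TPE implementation (the abstract does not name a library), and train_and_evaluate is a hypothetical stand-in for training Informer/Autoformer/Fedformer and returning validation MSE.

    import numpy as np
    import optuna

    def train_and_evaluate(params):
        # Hypothetical stand-in: in the actual study this step would train an
        # Informer/Autoformer/Fedformer on the 15-minute tourist-flow series
        # and return validation MSE. A synthetic score keeps the sketch runnable.
        rng = np.random.default_rng(0)
        score = (np.log10(params["learning_rate"]) + 3.0) ** 2
        score += 512.0 / params["d_model"] + 0.05 * params["n_heads"]
        return score + rng.normal(0.0, 0.01)

    def objective(trial):
        # Typical Transformer hyperparameters; the paper's exact search space
        # is an assumption here.
        params = {
            "d_model": trial.suggest_categorical("d_model", [128, 256, 512]),
            "n_heads": trial.suggest_categorical("n_heads", [4, 8]),
            "seq_len": trial.suggest_int("seq_len", 48, 336, step=48),
            "learning_rate": trial.suggest_float("learning_rate", 1e-5, 1e-2, log=True),
            "dropout": trial.suggest_float("dropout", 0.0, 0.3),
        }
        return train_and_evaluate(params)  # validation MSE to minimize

    study = optuna.create_study(direction="minimize",
                                sampler=optuna.samplers.TPESampler(seed=42))
    study.optimize(objective, n_trials=50)
    print("Best hyperparameters:", study.best_params)
    print("Best validation MSE:", study.best_value)

In practice one study of this form would be run per model and per scenic area, with the best trial's configuration used for the one-step and multi-step forecasts.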
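
The evaluation metrics and the DM test can likewise be sketched in a few lines. This is a minimal sketch, assuming IR is defined as the relative error reduction over a benchmark (the exact definition is not given in the abstract) and using the standard asymptotic Diebold-Mariano statistic with squared-error loss.

    import numpy as np
    from scipy import stats

    def mse(y, yhat):
        return float(np.mean((np.asarray(y) - np.asarray(yhat)) ** 2))

    def mae(y, yhat):
        return float(np.mean(np.abs(np.asarray(y) - np.asarray(yhat))))

    def improvement_rate(err_benchmark, err_model):
        # Assumed definition: relative error reduction versus a benchmark.
        return (err_benchmark - err_model) / err_benchmark

    def dm_test(e1, e2, h=1, power=2):
        # Diebold-Mariano test on two forecast-error series.
        # power=2 gives squared-error loss; h (forecast horizon) sets the
        # autocovariance lag window for the long-run variance estimate.
        d = np.abs(np.asarray(e1)) ** power - np.abs(np.asarray(e2)) ** power
        T = len(d)
        d_bar = d.mean()
        # Long-run variance of d via autocovariances up to lag h-1.
        gamma = [np.mean((d[: T - k] - d_bar) * (d[k:] - d_bar)) for k in range(h)]
        var_d = (gamma[0] + 2.0 * sum(gamma[1:])) / T
        stat = d_bar / np.sqrt(var_d)
        p_value = 2.0 * stats.norm.sf(abs(stat))  # two-sided, asymptotic N(0,1)
        return stat, p_value

    # Usage on synthetic errors: the second model is noticeably more accurate.
    rng = np.random.default_rng(1)
    e_bench = rng.normal(0.0, 1.0, 500)
    e_model = rng.normal(0.0, 0.7, 500)
    print("IR:", improvement_rate(np.mean(e_bench ** 2), np.mean(e_model ** 2)))
    print("DM stat, p-value:", dm_test(e_bench, e_model, h=4))

A significantly positive DM statistic here indicates that the second error series has lower expected loss, which is how the pairwise comparisons between the improved Transformer models and the benchmarks would be read.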

Key words: deep learning, Transformer, Informer, short-term tourist flow, tourism forecasting

CLC Number: