Automotive Innovation ›› 2024, Vol. 7 ›› Issue (2): 258-270. doi: 10.1007/s42154-023-00269-6


Efficient Interaction-Aware Trajectory Prediction Model Based on Multi-head Attention

Zifeng Peng1, Jun Yan1, Huilin Yin1, Yurong Wen1, Wanchen Ge1, Tobias Watzel2 & Gerhard Rigoll2    

  Zifeng Peng and Jun Yan contributed equally to this work as first authors.
    1. School of Electronic and Information Engineering, Tongji University, Caoan Gonglu Street, Shanghai, 201804, China
    2. Institute for Human–Machine Communication, Technical University of Munich, Arcisstraße, 80333, Munich, Germany

  • Online: 2024-05-20  Published: 2025-04-03

Abstract: Predicting vehicle trajectories with deep learning has seen substantial progress in recent years. However, making autonomous vehicles attend to their surrounding vehicles while accounting for social interaction remains an open problem, especially in long-term prediction scenarios. Unlike autonomous vehicles, human drivers continuously observe and analyze the interactive information between their own vehicle and other traffic participants for long-term route planning. To address the requirement that trajectory prediction be interaction-aware, this study proposes a multi-head attention mechanism that boosts prediction performance by globally exploiting interactive information. Multi-dimensional spatial interactive information, encoded with vehicle type and size, assigns different weights to surrounding vehicles and thereby captures the interaction among diverse trajectories. Furthermore, the model relies on a simple data pre-processing method that surpasses the traditional grid-based data processing approach. In the experiments, the proposed model achieves strong prediction performance and outperforms state-of-the-art models, particularly on long-term prediction metrics. The code for this model is accessible at: https://github.com/pengpengjun/hybrid attention.
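
To illustrate the idea described in the abstract, the sketch below shows how a multi-head attention layer can weight surrounding-vehicle encodings for a target vehicle. This is a minimal, hypothetical example, not the authors' released code: the feature layout (position, velocity, type, size), the dimensions, and the use of PyTorch's MultiheadAttention module are assumptions made for illustration.

import torch
import torch.nn as nn


class InteractionAttention(nn.Module):
    """Weights surrounding-vehicle encodings with multi-head attention
    to produce an interaction-aware context vector for the target vehicle."""

    def __init__(self, feat_dim: int = 6, embed_dim: int = 64, num_heads: int = 4):
        super().__init__()
        # Each vehicle is described by a feature vector that may include
        # position, velocity, vehicle type, and size (assumed layout).
        self.embed = nn.Linear(feat_dim, embed_dim)
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)

    def forward(self, ego: torch.Tensor, neighbors: torch.Tensor) -> torch.Tensor:
        # ego:       (batch, feat_dim)             -- target-vehicle state
        # neighbors: (batch, n_vehicles, feat_dim) -- surrounding vehicles
        q = self.embed(ego).unsqueeze(1)           # query: (batch, 1, embed_dim)
        kv = self.embed(neighbors)                 # keys/values: (batch, n, embed_dim)
        # The attention weights assign a different importance to each neighbor.
        context, weights = self.attn(q, kv, kv)
        return context.squeeze(1)                  # (batch, embed_dim)


if __name__ == "__main__":
    model = InteractionAttention()
    ego = torch.randn(8, 6)                        # 8 target vehicles
    neighbors = torch.randn(8, 5, 6)               # 5 surrounding vehicles each
    print(model(ego, neighbors).shape)             # torch.Size([8, 64])

In a full prediction pipeline, such a context vector would typically be fed into a trajectory decoder; the details of that decoder are beyond what the abstract specifies.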