Journal of System Simulation, 2025, Vol. 37, Issue (3): 657-666. doi: 10.16182/j.issn1004731x.joss.23-1388


Fine-grained Traffic Flow Inference Model Based on Dynamic Back Projection Network

Xu Ming1, Qi Guangyao1, Qi Geqi2   

  1. College of Software, Liaoning Technical University, Huludao 125105, China
  2. School of Traffic and Transportation, Beijing Jiaotong University, Beijing 100044, China
  • Received: 2023-11-16  Revised: 2023-12-23  Online: 2025-03-17  Published: 2025-03-21
  • Contact: Qi Geqi

Abstract:

To reduce the large errors that existing fine-grained urban flow inference models produce in complex traffic areas, a fine-grained traffic flow inference model based on a dynamic back-projection network is proposed. The multi-dimensional interactions between the input coarse-grained traffic flow and external factors are computed, and the interaction results are dynamically and adaptively fused with the coarse-grained flow, so that the features can inform and adjust one another to assist model inference. Deep convolution is combined with a self-attention mechanism to learn both local and global information and to improve subsequent blocks' understanding of the input data. Through a back-projection algorithm and a gated cross-attention mechanism, the traffic flow characteristics of complex regions are learned at a fine granularity. Finally, a nonlinear transformation path is introduced on top of the flow normalization mechanism to enforce spatial structure constraints using information at different levels, thereby improving the model's inference accuracy. Experimental results demonstrate that the proposed model outperforms comparable methods in both subjective evaluation and objective metrics, with the largest gains in complex traffic areas such as city-center entrances and bridge zones.
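The back-projection step the abstract describes can be illustrated with a minimal sketch. The following PyTorch code shows one up/down projection cycle in the style of deep back-projection networks: a coarse feature map is upsampled, projected back down, and the coarse-domain residual is upsampled again to correct the fine-grained estimate. The class name `BackProjectionBlock`, the kernel sizes, and the layer layout are illustrative assumptions, not the authors' implementation, which additionally incorporates the gated cross-attention mechanism described above.

```python
import torch
import torch.nn as nn


class BackProjectionBlock(nn.Module):
    """One up/down back-projection step (illustrative sketch):
    upsample a coarse feature map, project it back down, and use the
    coarse-domain residual to refine the fine-grained estimate."""

    def __init__(self, channels: int, scale: int):
        super().__init__()
        # kernel/stride/padding chosen so spatial size scales exactly by `scale`
        k, s, p = scale * 2, scale, scale // 2
        self.up = nn.ConvTranspose2d(channels, channels, k, s, p)
        self.down = nn.Conv2d(channels, channels, k, s, p)
        self.up_res = nn.ConvTranspose2d(channels, channels, k, s, p)

    def forward(self, coarse_feat: torch.Tensor) -> torch.Tensor:
        fine = self.up(coarse_feat)          # initial fine-grained estimate
        back = self.down(fine)               # project back to the coarse grid
        residual = coarse_feat - back        # error visible at coarse scale
        return fine + self.up_res(residual)  # correct the fine estimate


# Usage: a (B, C, H, W) coarse feature map becomes (B, C, 4H, 4W)
block = BackProjectionBlock(channels=64, scale=4)
fine = block(torch.randn(2, 64, 8, 8))  # -> torch.Size([2, 64, 32, 32])
```

Iterating such blocks lets errors that are only observable at the coarse resolution repeatedly feed back into the fine-grained estimate, which is what makes back projection attractive for complex regions where a single upsampling pass underfits.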
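The flow normalization mechanism the abstract builds on is, in the fine-grained urban flow inference literature, a redistribution constraint: the N×N fine-grained cells inside a coarse cell must sum exactly to that cell's observed flow. Below is a minimal PyTorch sketch of such a constraint; `flow_normalize`, its argument shapes, and the use of non-negative allocation scores are assumptions for illustration rather than the paper's code, and the nonlinear transformation path the abstract adds on top of this mechanism is not shown.

```python
import torch
import torch.nn.functional as F


def flow_normalize(weights: torch.Tensor, coarse: torch.Tensor, n: int,
                   eps: float = 1e-9) -> torch.Tensor:
    """Distribute each coarse cell's flow over its n x n fine sub-cells in
    proportion to `weights`, so the fine map sums back exactly to `coarse`.

    weights: (B, 1, n*H, n*W) non-negative scores (e.g. after a softplus)
    coarse:  (B, 1, H, W)     observed coarse-grained traffic flow
    """
    # Total score inside each n x n block, broadcast back to fine resolution
    block_sum = F.avg_pool2d(weights, n) * (n * n)               # (B, 1, H, W)
    block_sum = F.interpolate(block_sum, scale_factor=n, mode="nearest")
    frac = weights / (block_sum + eps)   # per-cell share; sums to 1 per block
    # Scale the shares by the coarse flow of the enclosing block
    return frac * F.interpolate(coarse, scale_factor=n, mode="nearest")
```

By construction, summing the output over each n×n block recovers the coarse input (up to `eps`), which is the spatial structure constraint the abstract refers to: the model only learns how flow is distributed within a region, never how much total flow the region carries.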

Key words: fine-grained traffic flow inference, dynamic adaptive fusion, back-projection algorithm, gated cross-attention, self-attention
