Journal of System Simulation ›› 2023, Vol. 35 ›› Issue (5): 1098-1108.doi: 10.16182/j.issn1004731x.joss.22-0087


Action Recognition Method Based on Projection Subspace Views under Single Viewing Angle

Benyue Su1,2, Manzhen Sun3, Qing Ma4, Min Sheng4

  1.The Key Laboratory of Intelligent Perception and Computing of Anhui Province, Anqing Normal University, Anqing 246133, China
    2.School of Mathematics and Computer, Tongling University, Tongling 244061, China
    3.School of Computer and Information, Anqing Normal University, Anqing 246133, China
    4.School of Mathematics and Physics, Anqing Normal University, Anqing 246133, China
  • Received:2022-01-27 Revised:2022-04-27 Online:2023-05-30 Published:2023-05-22

Abstract:

To address the self-occlusion problem that arises when a depth camera tracks joint actions under a single viewing angle, a new human action recognition method based on projection subspace views is proposed. Without adding any data acquisition equipment, the method projects the three-dimensional (3D) action sequences captured under a single viewing angle into multiple two-dimensional (2D) subspaces and then seeks the maximum inter-class distance within those subspaces, so as to increase the distance between 3D action classes as much as possible through the fusion of multiple subspace views. The recognition rate on the self-built AQNU dataset is 99.69%, 1.22% higher than that of the benchmark method; the recognition rate on a subset of the public NTU-RGB+D dataset is 80.23%, 1.98% higher than that of the benchmark method. The experimental results show that the proposed method can alleviate the self-occlusion problem of single-viewing-angle datasets to a certain extent, effectively improve the recognition rate and computational efficiency, and achieve a recognition effect comparable to that obtained with multi-viewing-angle datasets.
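The core projection step described above can be sketched as follows. This is a minimal illustration only: it projects a 3D skeleton sequence onto the three orthogonal coordinate planes, whereas the paper selects projection subspaces so as to maximize inter-class distance; the array shapes and plane choices here are assumptions, not the authors' implementation.

```python
import numpy as np

def project_to_subspaces(seq):
    """Project a 3D skeleton sequence of shape (T frames, J joints, 3 coords)
    onto three coordinate planes, yielding three 2D subspace views.

    Note: coordinate-plane projections are a simplifying assumption;
    the paper's subspaces are chosen to maximize inter-class distance.
    """
    planes = {"xy": [0, 1], "xz": [0, 2], "yz": [1, 2]}
    return {name: seq[..., axes] for name, axes in planes.items()}

# Example: 50 frames of a 25-joint skeleton (25 joints as in NTU-RGB+D)
seq = np.random.rand(50, 25, 3)
views = project_to_subspaces(seq)
# Each view drops one coordinate axis: shape (50, 25, 2)
```

Each 2D view could then be fed to a separate recognition branch (e.g., a graph convolutional network), with the branch outputs fused for the final classification, mirroring the multi-view fusion the abstract describes.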

Key words: action recognition, single view, projection subspace, graph convolutional network
