Journal of System Simulation ›› 2022, Vol. 34 ›› Issue (8): 1899-1907.doi: 10.16182/j.issn1004731x.joss.21-0288

• National Economy Simulation •

Identification of Switching Operation Based on LSTM and MoE

Xiaoqing Zhang1,2(), Wanfang Xiao3, Yingjie Guo1(), Bowen Liu3, Xuesen Han3, Jingwei Ma3, Gao Gao3, He Huang3, Shihong Xia1   

  1. Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100190, China
  2. College of Computer, Beijing University of Posts and Telecommunications, Beijing 100876, China
  3. State Grid Beijing Urban Power Supply Company, Beijing 110102, China
  • Received: 2021-04-03  Revised: 2021-05-08  Online: 2022-08-30  Published: 2022-08-15
  • Contact: Yingjie Guo  E-mail: 1074913189@qq.com; guoyingjie@ict.ac.cn

Abstract:

To address the individual differences among operators performing the same switching operation, as well as the variation of a single operator performing the same operation at different times, a switching-operation recognition model (MoE-LSTM) based on the mixture-of-experts (MoE) model and the long short-term memory (LSTM) network is proposed. Built on the MoE framework, multiple LSTMs are integrated to learn the feature distributions of data from different sources. Acceleration data are collected to build a switching-operation dataset, and the action sequences are segmented and aligned with a sliding window. Each action sequence is fed into MoE-LSTM, where different LSTMs independently learn the temporal dependencies of different actions, and the gating network selects the output of the LSTM that best classifies the current input as the recognition result. After training, for action data from different times and spaces, each LSTM outperforms the others on a particular region of the feature space. Experiments on the switching-operation dataset demonstrate that the proposed method outperforms existing action recognition algorithms.
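The routing mechanism described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the toy callables below stand in for the per-expert LSTMs, and the linear gating scores stand in for the paper's gating network; the function names (`softmax`, `moe_predict`) and weights are illustrative assumptions. It shows only the hard-selection step, where the gate routes each input to the expert it trusts most.

```python
import math
from typing import Callable, List, Sequence, Tuple

def softmax(scores: Sequence[float]) -> List[float]:
    """Numerically stable softmax over gating scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def moe_predict(
    x: Sequence[float],
    experts: Sequence[Callable[[Sequence[float]], str]],
    gate_weights: Sequence[Sequence[float]],
) -> Tuple[str, int, List[float]]:
    """Route input x to one expert via a softmax gate (hard selection)."""
    # Gating scores: one linear score per expert (stand-in for a learned gate).
    gate_scores = [sum(w * xi for w, xi in zip(ws, x)) for ws in gate_weights]
    gates = softmax(gate_scores)
    # Hard selection: pick the expert with the highest gate weight.
    k = max(range(len(gates)), key=gates.__getitem__)
    return experts[k](x), k, gates

# Toy example: two "experts", each a placeholder for an LSTM classifier.
experts = [lambda x: "slow-action expert", lambda x: "fast-action expert"]
gate_weights = [[1.0, 0.0], [0.0, 1.0]]  # illustrative, normally learned
label, chosen, gates = moe_predict([0.2, 0.9], experts, gate_weights)
```

In the full model each expert would be an LSTM over the windowed acceleration sequence, and the gate weights would be trained jointly with the experts so that each LSTM specializes on a region of the feature space.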

Key words: switching operation, long short-term memory network (LSTM), mixture of experts model (MoE), neural network
