TY - GEN
T1 - An enhanced multi-view human action recognition system for virtual training simulator
AU - Kwon, Beom
AU - Kim, Junghwan
AU - Lee, Sanghoon
N1 - Publisher Copyright:
© 2016 Asia Pacific Signal and Information Processing Association.
Copyright:
Copyright 2017 Elsevier B.V., All rights reserved.
PY - 2017/1/17
Y1 - 2017/1/17
N2 - Virtual military training systems have received considerable attention as a possible substitute for conventional real-world military training. In our previous work, a human action recognition system using multiple Kinects (HARS-MK) was implemented as a prototype of a virtual military training simulator. However, the classification accuracy of HARS-MK is insufficient for use in a virtual military training simulator. In addition, the experiments were carried out on only two simple action types: walking and crouching walking. To overcome these limitations, in this paper, we propose an enhanced multi-view human action recognition system (EM-HARS). Compared to HARS-MK, the feature extractor in EM-HARS is enhanced by employing a covariance descriptor. In addition, a feasibility test of EM-HARS is conducted on various human actions, including newly captured military training actions. The experimental results show that EM-HARS achieves higher classification accuracy than HARS-MK.
AB - Virtual military training systems have received considerable attention as a possible substitute for conventional real-world military training. In our previous work, a human action recognition system using multiple Kinects (HARS-MK) was implemented as a prototype of a virtual military training simulator. However, the classification accuracy of HARS-MK is insufficient for use in a virtual military training simulator. In addition, the experiments were carried out on only two simple action types: walking and crouching walking. To overcome these limitations, in this paper, we propose an enhanced multi-view human action recognition system (EM-HARS). Compared to HARS-MK, the feature extractor in EM-HARS is enhanced by employing a covariance descriptor. In addition, a feasibility test of EM-HARS is conducted on various human actions, including newly captured military training actions. The experimental results show that EM-HARS achieves higher classification accuracy than HARS-MK.
UR - http://www.scopus.com/inward/record.url?scp=85013844705&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85013844705&partnerID=8YFLogxK
U2 - 10.1109/APSIPA.2016.7820895
DO - 10.1109/APSIPA.2016.7820895
M3 - Conference contribution
AN - SCOPUS:85013844705
T3 - 2016 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA 2016
BT - 2016 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA 2016
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2016 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA 2016
Y2 - 13 December 2016 through 16 December 2016
ER -