Natural facial and head behavior recognition using dictionary of motion primitives

Qun Shi, Norimichi Ukita, Ming-Hsuan Yang

Research output: Contribution to journal › Article › peer-review

Abstract

This paper proposes a natural facial and head behavior recognition method using hybrid dynamical systems. Most existing facial and head behavior recognition methods focus on analyzing deliberately displayed prototypical emotion patterns rather than the complex, spontaneous facial and head behaviors that arise in natural conversation. We first capture spatio-temporal features on important facial parts via dense feature extraction. Next, we cluster the spatio-temporal features using hybrid dynamical systems and construct a dictionary of motion primitives that covers all possible elemental motion dynamics underlying facial and head behaviors. With this dictionary, a facial and head behavior can be interpreted as a distribution over motion primitives. This interpretation is robust to the differing rhythms of the dynamic patterns found in complex, spontaneous facial and head behaviors. We evaluate the proposed approach under natural tele-communication scenarios and achieve promising results. Furthermore, the proposed method also performs favorably against state-of-the-art methods on three benchmark databases.
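As a rough illustration of this pipeline, the minimal sketch below (Python, all names hypothetical) builds a primitive dictionary by clustering spatio-temporal descriptors and then represents a clip as a normalized histogram over primitives; plain k-means stands in for the paper's hybrid dynamical systems clustering, which is not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans


def build_primitive_dictionary(training_feature_sets, n_primitives=64):
    """Cluster spatio-temporal descriptors from training clips into a
    dictionary of motion primitives.  k-means is only a stand-in for the
    paper's hybrid dynamical systems clustering."""
    all_features = np.vstack(training_feature_sets)    # (N, d) descriptors
    return KMeans(n_clusters=n_primitives, n_init=10).fit(all_features)


def behavior_histogram(dictionary, clip_features):
    """Interpret one behavior clip as a distribution over motion primitives."""
    labels = dictionary.predict(clip_features)          # primitive id per descriptor
    counts = np.bincount(labels, minlength=dictionary.n_clusters)
    return counts / max(counts.sum(), 1)                # normalized histogram
```

Because the histogram discards temporal ordering, clips showing the same behavior at different speeds map to similar distributions, which is the rhythm robustness the abstract refers to.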

Original language: English
Pages (from-to): 2993-3000
Number of pages: 8
Journal: IEICE Transactions on Information and Systems
Volume: E100D
Issue number: 12
DOIs
Publication status: Published - 2017 Dec

Bibliographical note

Funding Information:
This paper presented facial and head behavior recognition using dynamical systems, a method consisting of three steps: dynamic feature extraction, construction of a dictionary of motion primitives, and classification. We propose a novel dictionary based on motion primitives and show its effectiveness for recognition using statistical analysis without temporal alignment. Experimental results on four datasets demonstrate the effectiveness of the proposed method in recognizing natural and exerted facial and head behaviors. Our future work includes automatic extraction, from long video sequences, of the segments in each of which a natural facial and head behavior is observed (e.g., video spotting). This work was supported by Yanmar Innovation Lab. 2112.
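For the classification step mentioned above, one possibility, assuming the histogram representation from the earlier sketch, is to train a standard classifier directly on the primitive distributions so that no temporal alignment between sequences is needed; the RBF-kernel SVM here is an assumption, not necessarily the classifier used in the paper.

```python
from sklearn.svm import SVC


def train_behavior_classifier(histograms, labels):
    """Fit a classifier on bag-of-primitive histograms (one per clip).
    An RBF-kernel SVM is an assumed choice; any classifier over
    fixed-length distributions would fit this step."""
    return SVC(kernel="rbf", C=1.0, gamma="scale").fit(histograms, labels)
```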

Publisher Copyright:
Copyright © 2017 The Institute of Electronics, Information and Communication Engineers.

All Science Journal Classification (ASJC) codes

  • Software
  • Hardware and Architecture
  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering
  • Artificial Intelligence
