Natural facial and head behavior recognition using dictionary of motion primitives

Qun Shi, Norimichi Ukita, Ming Hsuan Yang

Research output: Contribution to journal › Article

Abstract

This paper proposes a natural facial and head behavior recognition method using hybrid dynamical systems. Most existing facial and head behavior recognition methods focus on analyzing deliberately displayed prototypical emotion patterns rather than the complex and spontaneous facial and head behaviors that occur in natural conversation environments. We first capture spatio-temporal features on important facial parts via dense feature extraction. Next, we cluster the spatio-temporal features using hybrid dynamical systems and construct a dictionary of motion primitives covering all possible elemental motion dynamics that account for facial and head behaviors. With this dictionary, a facial and head behavior can be represented as a distribution over motion primitives. This representation is robust against the different rhythms of dynamic patterns in complex and spontaneous facial and head behaviors. We evaluate the proposed approach in natural telecommunication scenarios and achieve promising results. Furthermore, the proposed method performs favorably against state-of-the-art methods on three benchmark databases.
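The dictionary-based representation described in the abstract can be sketched as follows. This is an illustrative sketch only: the paper clusters spatio-temporal features with hybrid dynamical systems, whereas here plain k-means stands in for that clustering step, and all function names and array shapes are assumptions rather than the authors' implementation. The key idea it shows is representing a behavior clip as a normalized histogram (distribution) over learned motion primitives, which is invariant to the rhythm and ordering of the underlying motions.

```python
import numpy as np

def build_dictionary(features, k, iters=20, seed=0):
    """Cluster feature vectors into k 'motion primitives'.

    Plain k-means is used here as a stand-in for the paper's
    hybrid-dynamical-system clustering.
    """
    rng = np.random.default_rng(seed)
    # Initialize primitives from randomly chosen feature vectors.
    centers = features[rng.choice(len(features), k, replace=False)]
    for _ in range(iters):
        # Assign each feature to its nearest primitive.
        dists = np.linalg.norm(features[:, None] - centers[None, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each primitive as the mean of its assigned features.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = features[labels == j].mean(axis=0)
    return centers

def primitive_distribution(features, centers):
    """Represent one behavior clip as a normalized histogram over primitives."""
    dists = np.linalg.norm(features[:, None] - centers[None, :], axis=2)
    labels = dists.argmin(axis=1)
    hist = np.bincount(labels, minlength=len(centers)).astype(float)
    return hist / hist.sum()
```

Two clips containing the same elemental motions at different speeds yield similar histograms, which is what makes this representation robust to varying rhythms of dynamic patterns.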

Original language: English
Pages (from-to): 2993-3000
Number of pages: 8
Journal: IEICE Transactions on Information and Systems
Volume: E100D
Issue number: 12
DOI: 10.1587/transinf.2017EDP7128
Publication status: Published - Dec 2017

Fingerprint

  • Glossaries
  • Dynamical systems
  • Feature extraction
  • Communication

All Science Journal Classification (ASJC) codes

  • Software
  • Hardware and Architecture
  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering
  • Artificial Intelligence

Cite this

@article{178650bfe48c4b7e99a04f5437194d56,
title = "Natural facial and head behavior recognition using dictionary of motion primitives",
abstract = "This paper proposes a natural facial and head behavior recognition method using hybrid dynamical systems. Most existing facial and head behavior recognition methods focus on analyzing deliberately displayed prototypical emotion patterns rather than the complex and spontaneous facial and head behaviors that occur in natural conversation environments. We first capture spatio-temporal features on important facial parts via dense feature extraction. Next, we cluster the spatio-temporal features using hybrid dynamical systems and construct a dictionary of motion primitives covering all possible elemental motion dynamics that account for facial and head behaviors. With this dictionary, a facial and head behavior can be represented as a distribution over motion primitives. This representation is robust against the different rhythms of dynamic patterns in complex and spontaneous facial and head behaviors. We evaluate the proposed approach in natural telecommunication scenarios and achieve promising results. Furthermore, the proposed method performs favorably against state-of-the-art methods on three benchmark databases.",
author = "Qun Shi and Norimichi Ukita and Yang, {Ming Hsuan}",
year = "2017",
month = "12",
doi = "10.1587/transinf.2017EDP7128",
language = "English",
volume = "E100D",
pages = "2993--3000",
journal = "IEICE Transactions on Information and Systems",
issn = "0916-8532",
publisher = "Maruzen Co., Ltd/Maruzen Kabushikikaisha",
number = "12",

}

Natural facial and head behavior recognition using dictionary of motion primitives. / Shi, Qun; Ukita, Norimichi; Yang, Ming Hsuan.

In: IEICE Transactions on Information and Systems, Vol. E100D, No. 12, 12.2017, p. 2993-3000.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Natural facial and head behavior recognition using dictionary of motion primitives

AU - Shi, Qun

AU - Ukita, Norimichi

AU - Yang, Ming Hsuan

PY - 2017/12

Y1 - 2017/12

N2 - This paper proposes a natural facial and head behavior recognition method using hybrid dynamical systems. Most existing facial and head behavior recognition methods focus on analyzing deliberately displayed prototypical emotion patterns rather than the complex and spontaneous facial and head behaviors that occur in natural conversation environments. We first capture spatio-temporal features on important facial parts via dense feature extraction. Next, we cluster the spatio-temporal features using hybrid dynamical systems and construct a dictionary of motion primitives covering all possible elemental motion dynamics that account for facial and head behaviors. With this dictionary, a facial and head behavior can be represented as a distribution over motion primitives. This representation is robust against the different rhythms of dynamic patterns in complex and spontaneous facial and head behaviors. We evaluate the proposed approach in natural telecommunication scenarios and achieve promising results. Furthermore, the proposed method performs favorably against state-of-the-art methods on three benchmark databases.

AB - This paper proposes a natural facial and head behavior recognition method using hybrid dynamical systems. Most existing facial and head behavior recognition methods focus on analyzing deliberately displayed prototypical emotion patterns rather than the complex and spontaneous facial and head behaviors that occur in natural conversation environments. We first capture spatio-temporal features on important facial parts via dense feature extraction. Next, we cluster the spatio-temporal features using hybrid dynamical systems and construct a dictionary of motion primitives covering all possible elemental motion dynamics that account for facial and head behaviors. With this dictionary, a facial and head behavior can be represented as a distribution over motion primitives. This representation is robust against the different rhythms of dynamic patterns in complex and spontaneous facial and head behaviors. We evaluate the proposed approach in natural telecommunication scenarios and achieve promising results. Furthermore, the proposed method performs favorably against state-of-the-art methods on three benchmark databases.

UR - http://www.scopus.com/inward/record.url?scp=85038405479&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85038405479&partnerID=8YFLogxK

U2 - 10.1587/transinf.2017EDP7128

DO - 10.1587/transinf.2017EDP7128

M3 - Article

AN - SCOPUS:85038405479

VL - E100D

SP - 2993

EP - 3000

JO - IEICE Transactions on Information and Systems

JF - IEICE Transactions on Information and Systems

SN - 0916-8532

IS - 12

ER -