Estimating the efficiency of recognizing gender and affect from biological motion

Frank E. Pollick, Vaia Lestou, Jungwon Ryu, Sung Bae Cho

Research output: Contribution to journal › Article › peer-review

103 Citations (Scopus)

Abstract

It is often claimed that point-light displays provide sufficient information to easily recognize properties of the actor and action being performed. We examined this claim by obtaining estimates of human efficiency in the categorization of movement. We began by recording a database of three-dimensional human arm movements from 13 males and 13 females that contained multiple repetitions of knocking, waving and lifting movements performed in both an angry and a neutral style. Point-light displays of each individual for all six combinations were presented to participants, who were asked to judge the gender of the model in Experiment 1 and the affect in Experiment 2. To obtain estimates of efficiency, human performance was compared to the output of automatic pattern classifiers based on artificial neural networks designed and trained to perform the same classification task on the same movements. Efficiency was expressed as the squared ratio of human sensitivity (d′) to neural network sensitivity (d′). Average results for gender recognition showed a proportion correct of 0.51 and an efficiency of 0.27%. Results for affect recognition showed a proportion correct of 0.71 and an efficiency of 32.5%. These results are discussed in the context of how different cues inform the recognition of movement style.
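For readers unfamiliar with the efficiency measure, the sketch below illustrates the computation described above: sensitivity d′ is derived from hit and false-alarm rates, and efficiency is the squared ratio of human d′ to classifier d′. The numeric values and function names are illustrative assumptions for this note, not data or code from the paper; the standard equal-variance signal-detection definition of d′ is assumed.

```python
# Minimal sketch of the efficiency measure described in the abstract.
# The hit/false-alarm rates below are illustrative placeholders, not the
# paper's data; the standard equal-variance definition of d' is assumed.
from scipy.stats import norm


def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Sensitivity: d' = z(hit rate) - z(false-alarm rate)."""
    return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)


def efficiency(d_human: float, d_model: float) -> float:
    """Efficiency as the squared ratio of human to classifier sensitivity."""
    return (d_human / d_model) ** 2


# Hypothetical values for one observer and one neural-network classifier.
d_h = d_prime(hit_rate=0.55, false_alarm_rate=0.50)  # near-chance human observer
d_m = d_prime(hit_rate=0.90, false_alarm_rate=0.10)  # well-trained classifier
print(f"human d' = {d_h:.3f}, model d' = {d_m:.3f}, "
      f"efficiency = {efficiency(d_h, d_m):.2%}")
```

With these placeholder values the script prints an efficiency well below 1%, illustrating how near-chance human performance measured against a much more sensitive classifier yields the very small efficiencies of the kind reported above for gender recognition.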

Original language: English
Pages (from-to): 2345-2355
Number of pages: 11
Journal: Vision Research
Volume: 42
Issue number: 20
DOIs
Publication status: Published - Sept 2002

Bibliographical note

Funding Information:
We would like to thank the Royal Society for support of this research through a UK–Korea Joint Project Grant as well as the Wellcome Trust and Nuffield Foundation.

All Science Journal Classification (ASJC) codes

  • Ophthalmology
  • Sensory Systems
