Video scene retrieval with interactive genetic algorithm

Hun Woo Yoo, Sung Bae Cho

Research output: Contribution to journal › Article

30 Citations (Scopus)

Abstract

This paper proposes an emotion-based video scene retrieval algorithm. First, abrupt and gradual shot boundaries are detected in a video clip representing a specific story. Then five features, namely "average color histogram," "average brightness," "average edge histogram," "average shot duration," and "gradual change rate," are extracted from each video, and an interactive genetic algorithm maps these features onto the emotional space the user has in mind. After videos that convey the target emotion are selected from the initial population, their feature vectors are treated as chromosomes and genetic crossover is applied to them. The new chromosomes produced by crossover are then compared, via a similarity function, with the feature vectors of the database videos, and the most similar videos become the next generation. Iterating this process retrieves a population of videos matching the emotion the user has in mind. To validate the method, six emotion categories are used in experiments: "action," "excitement," "suspense," "quietness," "relaxation," and "happiness." The method achieves an average retrieval effectiveness of 70% over 300 commercial videos.
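The retrieval loop the abstract describes (feature vectors as chromosomes, crossover, then similarity-based selection from the database) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the similarity measure, one-point crossover, and population size are assumptions made here for clarity.

```python
import random

def similarity(a, b):
    # Assumed similarity: negative Euclidean distance between feature vectors
    # (larger is more similar). The paper's actual function may differ.
    return -sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def crossover(parent_a, parent_b):
    # One-point crossover over the five-dimensional feature vector.
    point = random.randint(1, len(parent_a) - 1)
    return parent_a[:point] + parent_b[point:]

def next_generation(selected, database, population_size=8):
    """selected: feature vectors of videos the user judged to match the
    target emotion; database: feature vectors of all stored videos.
    Returns the database videos most similar to the crossover offspring."""
    offspring = [crossover(random.choice(selected), random.choice(selected))
                 for _ in range(population_size)]
    # For each child chromosome, retrieve the most similar database video;
    # these retrieved videos form the next generation shown to the user.
    return [max(database, key=lambda v: similarity(child, v))
            for child in offspring]
```

Iterating `next_generation` with fresh user selections each round is what makes the algorithm "interactive": the user's emotional judgment acts as the fitness function.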

Original language: English
Pages (from-to): 317-336
Number of pages: 20
Journal: Multimedia Tools and Applications
Volume: 34
Issue number: 3
DOI: 10.1007/s11042-007-0109-8
Publication status: Published - 2007 Sep 1

Fingerprint

Genetic algorithms
Chromosomes
Luminance
Color
Experiments

All Science Journal Classification (ASJC) codes

  • Software
  • Media Technology
  • Hardware and Architecture
  • Computer Networks and Communications

Cite this

@article{44b5190daa2e468eb110c9776463be45,
title = "Video scene retrieval with interactive genetic algorithm",
abstract = "This paper proposes a video scene retrieval algorithm based on emotion. First, abrupt/gradual shot boundaries are detected in the video clip of representing a specific story. Then, five video features such as {"}average color histogram,{"} {"}average brightness,{"} {"}average edge histogram,{"} {"}average shot duration,{"} and {"}gradual change rate{"} are extracted from each of the videos, and mapping through an interactive genetic algorithm is conducted between these features and the emotional space that a user has in mind. After the proposed algorithm selects the videos that contain the corresponding emotion from the initial population of videos, the feature vectors from them are regarded as chromosomes, and a genetic crossover is applied to those feature vectors. Next, new chromosomes after crossover and feature vectors in the database videos are compared based on a similarity function to obtain the most similar videos as solutions of the next generation. By iterating this process, a new population of videos that a user has in mind are retrieved. In order to show the validity of the proposed method, six example categories of {"}action,{"} {"}excitement,{"} {"}suspense,{"} {"}quietness,{"} {"}relaxation,{"} and {"}happiness{"} are used as emotions for experiments. This method of retrieval shows 70{\%} of effectiveness on the average over 300 commercial videos.",
author = "Yoo, {Hun Woo} and Cho, {Sung Bae}",
year = "2007",
month = "9",
day = "1",
doi = "10.1007/s11042-007-0109-8",
language = "English",
volume = "34",
pages = "317--336",
journal = "Multimedia Tools and Applications",
issn = "1380-7501",
publisher = "Springer Netherlands",
number = "3",

}

Video scene retrieval with interactive genetic algorithm. / Yoo, Hun Woo; Cho, Sung Bae.

In: Multimedia Tools and Applications, Vol. 34, No. 3, 01.09.2007, p. 317-336.


TY - JOUR

T1 - Video scene retrieval with interactive genetic algorithm

AU - Yoo, Hun Woo

AU - Cho, Sung Bae

PY - 2007/9/1

Y1 - 2007/9/1

N2 - This paper proposes a video scene retrieval algorithm based on emotion. First, abrupt/gradual shot boundaries are detected in the video clip of representing a specific story. Then, five video features such as "average color histogram," "average brightness," "average edge histogram," "average shot duration," and "gradual change rate" are extracted from each of the videos, and mapping through an interactive genetic algorithm is conducted between these features and the emotional space that a user has in mind. After the proposed algorithm selects the videos that contain the corresponding emotion from the initial population of videos, the feature vectors from them are regarded as chromosomes, and a genetic crossover is applied to those feature vectors. Next, new chromosomes after crossover and feature vectors in the database videos are compared based on a similarity function to obtain the most similar videos as solutions of the next generation. By iterating this process, a new population of videos that a user has in mind are retrieved. In order to show the validity of the proposed method, six example categories of "action," "excitement," "suspense," "quietness," "relaxation," and "happiness" are used as emotions for experiments. This method of retrieval shows 70% of effectiveness on the average over 300 commercial videos.

AB - This paper proposes a video scene retrieval algorithm based on emotion. First, abrupt/gradual shot boundaries are detected in the video clip of representing a specific story. Then, five video features such as "average color histogram," "average brightness," "average edge histogram," "average shot duration," and "gradual change rate" are extracted from each of the videos, and mapping through an interactive genetic algorithm is conducted between these features and the emotional space that a user has in mind. After the proposed algorithm selects the videos that contain the corresponding emotion from the initial population of videos, the feature vectors from them are regarded as chromosomes, and a genetic crossover is applied to those feature vectors. Next, new chromosomes after crossover and feature vectors in the database videos are compared based on a similarity function to obtain the most similar videos as solutions of the next generation. By iterating this process, a new population of videos that a user has in mind are retrieved. In order to show the validity of the proposed method, six example categories of "action," "excitement," "suspense," "quietness," "relaxation," and "happiness" are used as emotions for experiments. This method of retrieval shows 70% of effectiveness on the average over 300 commercial videos.

UR - http://www.scopus.com/inward/record.url?scp=34547349944&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=34547349944&partnerID=8YFLogxK

U2 - 10.1007/s11042-007-0109-8

DO - 10.1007/s11042-007-0109-8

M3 - Article

VL - 34

SP - 317

EP - 336

JO - Multimedia Tools and Applications

JF - Multimedia Tools and Applications

SN - 1380-7501

IS - 3

ER -