Automated music video generation using emotion synchronization

Ki Ho Shin, Hye Rin Kim, In Kwon Lee

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

In this paper, we present an automated music video generation framework that utilizes emotion synchronization between video and music. After a user uploads a video or music, the framework segments both the video and the music, and then predicts the emotion of each segment. The preprocessing result is stored in the server's database. The user can select a set of videos and music from the database, and the framework will generate a music video. The system finds the video segment most closely associated with each music segment by comparing certain low-level features and the emotion differences. We compare our work to a similar music video generation method by performing a user preference study, and show that our method generates a preferable result.
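The matching step described in the abstract (pairing each music segment with the video segment that minimizes a combination of emotion difference and low-level feature difference) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the 2-D valence-arousal emotion representation, the single low-level feature pair (music tempo vs. video motion), and the weights `w_emotion`/`w_feature` are all assumptions for the sake of the example.

```python
import math

def emotion_distance(a, b):
    # Euclidean distance between two emotion vectors,
    # assumed here to live in a 2-D valence-arousal space.
    return math.dist(a, b)

def match_segments(music_segments, video_segments,
                   w_emotion=0.7, w_feature=0.3):
    """For each music segment, pick the video segment that minimizes a
    weighted sum of emotion distance and a low-level feature distance
    (here, an assumed tempo-vs-motion comparison)."""
    matches = []
    for m in music_segments:
        best = min(
            video_segments,
            key=lambda v: (
                w_emotion * emotion_distance(m["emotion"], v["emotion"])
                + w_feature * abs(m["tempo"] - v["motion"])
            ),
        )
        matches.append((m["id"], best["id"]))
    return matches
```

Under this sketch, a high-arousal, fast-tempo music segment would be paired with the video segment whose predicted emotion and motion level are closest to it, which is the intuition the abstract describes.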

Original language: English
Title of host publication: 2016 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2016 - Conference Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2594-2597
Number of pages: 4
ISBN (Electronic): 9781509018970
DOI: 10.1109/SMC.2016.7844629
Publication status: Published - 2017 Feb 6
Event: 2016 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2016 - Budapest, Hungary
Duration: 2016 Oct 9 - 2016 Oct 12



All Science Journal Classification (ASJC) codes

  • Computer Vision and Pattern Recognition
  • Artificial Intelligence
  • Control and Optimization
  • Human-Computer Interaction

Cite this

Shin, K. H., Kim, H. R., & Lee, I. K. (2017). Automated music video generation using emotion synchronization. In 2016 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2016 - Conference Proceedings (pp. 2594-2597). [7844629] Institute of Electrical and Electronics Engineers Inc.. https://doi.org/10.1109/SMC.2016.7844629