Conditioned generative model via latent semantic controlling for learning deep representation of data

Jin Young Kim, Sung Bae Cho

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Learning representations of data is an important issue in machine learning. Although the generative adversarial network (GAN) has led to significant improvements in data representation, it still suffers from several problems, such as unstable training, a hidden data manifold, and a huge computational overhead. Moreover, most GANs use a large manifold, resulting in poor scalability. In this paper, we propose a novel GAN that controls the latent semantic representation, called LSC-GAN, which allows us to produce desired data and learns a representation of the data efficiently. Unlike conventional GAN models, whose latent-space distribution is hidden, we explicitly define the distributions in advance; the model is trained to generate data with the corresponding features by feeding latent variables drawn from those distributions into the generative model. Because the larger latent space caused by deploying various distributions makes training unstable, we separate the process of explicitly defining the distributions from the generation operation. We prove that a variational auto-encoder (VAE) is suitable for the former and modify the VAE loss function to map the data into the corresponding pre-defined latent space. The decoder, which generates data from the associated latent space, is then used as the generator of the LSC-GAN. Several experiments on the CelebA dataset are conducted to verify the usefulness of the proposed method. In addition, our model achieves a high compression ratio, holding about 24 pixels of information in each dimension of the latent space.
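The abstract describes modifying the VAE loss so that each sample is encoded near a pre-defined latent distribution for its class, rather than the single standard-normal prior of a plain VAE. A minimal sketch of that idea is the KL term below, computed against a class-specific Gaussian centre instead of the origin; the function name, the centres, and the unit-variance priors are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def kl_to_class_prior(mu, log_var, class_mu):
    """KL( N(mu, diag(exp(log_var))) || N(class_mu, I) ), per sample.

    Replacing the usual N(0, I) prior with a pre-defined centre per
    class/attribute (hypothetical `class_mu`) penalises encodings that
    stray from their class region, which is the kind of modified VAE
    objective the abstract sketches.
    """
    var = np.exp(log_var)
    return 0.5 * np.sum(var + (mu - class_mu) ** 2 - 1.0 - log_var, axis=-1)

# Example: two classes with fixed, well-separated prior centres (illustrative).
class_centres = np.array([[3.0, 0.0], [-3.0, 0.0]])
mu = np.array([[2.9, 0.1], [2.9, 0.1]])   # same encoder output, scored vs each centre
log_var = np.zeros((2, 2))                # unit variance

kl = kl_to_class_prior(mu, log_var, class_centres)
# Encoding near class 0's centre costs little; near class 1's centre it costs a lot,
# so minimising this term pulls each sample toward its own pre-defined region.
assert kl[0] < kl[1]
```

Training the decoder on latents drawn from these fixed regions is what would let it later act as a class-conditioned generator, as the abstract outlines.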

Original language: English
Title of host publication: Intelligent Data Engineering and Automated Learning – IDEAL 2019 - 20th International Conference, Proceedings
Editors: Hujun Yin, Richard Allmendinger, David Camacho, Peter Tino, Antonio J. Tallón-Ballesteros, Ronaldo Menezes
Publisher: Springer
Pages: 319-327
Number of pages: 9
ISBN (Print): 9783030336066
DOI: https://doi.org/10.1007/978-3-030-33607-3_35
Publication status: Published - 2019
Event: 20th International Conference on Intelligent Data Engineering and Automated Learning, IDEAL 2019 - Manchester, United Kingdom
Duration: 2019 Nov 14 – 2019 Nov 16

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11871 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 20th International Conference on Intelligent Data Engineering and Automated Learning, IDEAL 2019
Country: United Kingdom
City: Manchester
Period: 19/11/14 – 19/11/16

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Computer Science(all)


Cite this

Kim, J. Y., & Cho, S. B. (2019). Conditioned generative model via latent semantic controlling for learning deep representation of data. In H. Yin, R. Allmendinger, D. Camacho, P. Tino, A. J. Tallón-Ballesteros, & R. Menezes (Eds.), Intelligent Data Engineering and Automated Learning – IDEAL 2019 - 20th International Conference, Proceedings (pp. 319-327). (Lecture Notes in Computer Science; Vol. 11871 LNCS). Springer. https://doi.org/10.1007/978-3-030-33607-3_35