The joint optimization of representation learning and clustering in the embedding space has seen a breakthrough in recent years. Despite this advance, clustering with representation learning has been limited to flat-level categories, typically cohesive clustering focused on instance-level relations. To overcome the limitations of flat clustering, we introduce hierarchically-clustered representation learning (HCRL), which simultaneously optimizes representation learning and hierarchical clustering in the embedding space. In contrast to the few prior works, HCRL is the first to consider generating deep embeddings from every component of the hierarchy, not just the leaf components. Beyond obtaining hierarchically clustered embeddings, we can reconstruct data at various abstraction levels, infer the intrinsic hierarchical structure, and learn level-proportion features. We conducted evaluations on image and text domains, and our quantitative analyses showed competitive likelihoods and the best accuracies compared with the baselines.
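To make the contrast concrete, the flat two-stage approach the abstract moves beyond can be sketched as a toy baseline: embed the data first, then run hierarchical clustering on the fixed embeddings. This is a hypothetical illustration only, not the paper's joint HCRL method; the random linear "encoder" stands in for a learned one, and all names and data here are assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical two-stage baseline (NOT the joint HCRL method):
# step 1: project data into an embedding space with a fixed random
#         linear map, standing in for a learned encoder;
# step 2: run agglomerative (Ward) clustering on the embeddings to
#         recover a hierarchy after the fact.

rng = np.random.default_rng(0)

# Toy data: two well-separated Gaussian blobs in a 10-D input space.
X = np.vstack([
    rng.normal(0.0, 0.1, size=(20, 10)),
    rng.normal(5.0, 0.1, size=(20, 10)),
])

W = rng.normal(size=(10, 2))      # stand-in "encoder" weights
Z = X @ W                         # 2-D embeddings

tree = linkage(Z, method="ward")  # full cluster hierarchy (dendrogram)
labels = fcluster(tree, t=2, criterion="maxclust")  # cut into 2 clusters

# Each well-separated blob should land in its own cluster.
assert len(set(labels[:20])) == 1 and len(set(labels[20:])) == 1
assert labels[0] != labels[-1]
```

Because the embedding here is frozen before clustering, the hierarchy cannot influence the representation; HCRL's point is to optimize both jointly so that every internal node of the hierarchy, not just the leaves, shapes the embedding space.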
|Title of host publication||AAAI 2020 - 34th AAAI Conference on Artificial Intelligence|
|Number of pages||8|
|Publication status||Published - 2020|
|Event||34th AAAI Conference on Artificial Intelligence, AAAI 2020 - New York, United States|
Duration: 2020 Feb 7 → 2020 Feb 12
|Name||AAAI 2020 - 34th AAAI Conference on Artificial Intelligence|
|Conference||34th AAAI Conference on Artificial Intelligence, AAAI 2020|
|Period||20/2/7 → 20/2/12|
Bibliographical note: Funding Information:
Acknowledgments: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2019M3F2A1072239).
© 2020, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.
All Science Journal Classification (ASJC) codes
- Artificial Intelligence