Fast Tucker Factorization for Large-Scale Tensor Completion

Dongha Lee, Jaehyung Lee, Hwanjo Yu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Tensor completion is the task of completing multi-aspect data represented as a tensor by accurately predicting the tensor's missing entries. It is mainly solved by tensor factorization methods, and among them, Tucker factorization has attracted considerable interest due to its powerful ability to learn latent factors and even their interactions. Although several Tucker methods have been developed to reduce memory and computational complexity, the state-of-the-art method still 1) generates redundant computations and 2) cannot factorize a large tensor that exceeds the size of memory. This paper proposes FTcom, a fast and scalable Tucker factorization method for tensor completion. FTcom performs element-wise updates for factor matrices based on coordinate descent, and adopts a novel caching algorithm which stores frequently-required intermediate data. It also uses a tensor file for disk-based data processing and loads only a small part of the tensor into memory at a time. Experimental results show that FTcom is much faster and more scalable than all competitors. It significantly shortens the training time of Tucker factorization, especially on real-world tensors, and it can be executed on a single machine over a billion-scale tensor that is larger than the machine's memory capacity.
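The element-wise coordinate-descent update the abstract describes can be illustrated with a minimal sketch: each factor-matrix entry gets a closed-form ridge update over the observed entries it touches, with all other parameters held fixed. This is an assumption-laden illustration of the general technique, not FTcom itself; the function names are hypothetical, and FTcom's caching of intermediate data and disk-based tensor file are omitted.

```python
import numpy as np

def predict(G, A, B, C, i, j, k):
    """Tucker prediction x_hat[i,j,k] = sum_{p,q,r} G[p,q,r] A[i,p] B[j,q] C[k,r]."""
    return float(np.einsum('pqr,p,q,r->', G, A[i], B[j], C[k]))

def sweep_mode1(obs, G, A, B, C, lam=0.1):
    """One coordinate-descent sweep over the entries of the mode-1 factor A.

    obs: list of (i, j, k, value) tuples for the observed tensor entries.
    Each A[i, p] receives a closed-form regularized least-squares update,
    the element-wise update style the abstract refers to (sketch only)."""
    for i in range(A.shape[0]):
        rows = [(j, k, x) for (ii, j, k, x) in obs if ii == i]
        for p in range(A.shape[1]):
            num, den = 0.0, 0.0
            for j, k, x in rows:
                # delta = d x_hat / d A[i,p]; FTcom would cache such terms
                delta = float(np.einsum('qr,q,r->', G[p], B[j], C[k]))
                resid = x - predict(G, A, B, C, i, j, k) + A[i, p] * delta
                num += delta * resid
                den += delta * delta
            if den > 0.0:
                A[i, p] = num / (lam + den)

def loss(obs, G, A, B, C):
    return sum((x - predict(G, A, B, C, i, j, k)) ** 2 for i, j, k, x in obs)

# Tiny demo: fit A against a synthetic rank-(2,2,2) Tucker tensor.
rng = np.random.default_rng(0)
G = rng.normal(size=(2, 2, 2))
A_true = rng.normal(size=(5, 2))
B = rng.normal(size=(4, 2))
C = rng.normal(size=(4, 2))
obs = [(i, j, k, float(np.einsum('pqr,p,q,r->', G, A_true[i], B[j], C[k])))
       for i in range(5) for j in range(4) for k in range(4)]

A = rng.normal(size=(5, 2))          # random initialization
before_loss = loss(obs, G, A, B, C)
sweep_mode1(obs, G, A, B, C)
after_loss = loss(obs, G, A, B, C)   # should drop after one sweep
```

In a full method, analogous sweeps run over every factor matrix and the core tensor; FTcom's contribution lies in avoiding the redundant recomputation of the `delta`-style intermediate terms and in streaming the tensor from disk.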

Original language: English
Title of host publication: 2018 IEEE International Conference on Data Mining, ICDM 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 6
ISBN (Electronic): 9781538691588
Publication status: Published - 2018 Dec 27
Event: 18th IEEE International Conference on Data Mining, ICDM 2018 - Singapore, Singapore
Duration: 2018 Nov 17 - 2018 Nov 20

Publication series

Name: Proceedings - IEEE International Conference on Data Mining, ICDM
ISSN (Print): 1550-4786


Conference: 18th IEEE International Conference on Data Mining, ICDM 2018

Bibliographical note

Funding Information:
This research was supported by the NRF grants funded by the MSIT (No. 2016R1E1A1A01942642 and No. 2017M3C4A7063570) and the IITP grants funded by the MSIT (No. 2018-0-00584 and IITP-2018-2011-1-00783).

Publisher Copyright:
© 2018 IEEE.

All Science Journal Classification (ASJC) codes

  • Engineering (all)


