Recent studies have shown that sequential recommendation is improved by attention mechanisms. Following this development, we propose Relation-Aware Kernelized Self-Attention (RKSA), which adopts the self-attention mechanism of the Transformer augmented with a probabilistic model. The original self-attention of the Transformer is a deterministic measure without relation awareness. Therefore, we introduce a latent space to the self-attention, and the latent space models the recommendation context from relations as a multivariate skew-normal distribution with a kernelized covariance matrix derived from co-occurrences, item characteristics, and user information. This work merges the self-attention of the Transformer with sequential recommendation by adding a probabilistic model of the recommendation task's specifics. We evaluated RKSA on benchmark datasets, and RKSA shows significant improvements over recent baseline models. RKSA was also able to produce a latent space model that explains the reasons behind its recommendations.
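To make the relation-aware idea concrete, here is a minimal NumPy sketch, not the paper's model: it biases standard scaled dot-product self-attention logits with a relation kernel built from hypothetical item co-occurrence counts. The names `relation_aware_attention`, `cooc`, and `R` are illustrative assumptions; RKSA itself draws the kernelized covariance through a multivariate skew-normal latent variable, which this sketch omits.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def relation_aware_attention(X, Wq, Wk, Wv, R):
    """Scaled dot-product self-attention whose logits are biased by a
    relation kernel R (here derived from co-occurrence counts).
    Illustrative simplification only, not the RKSA formulation."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    logits = (Q @ K.T) / np.sqrt(d) + R   # R: (seq_len, seq_len) relation bias
    return softmax(logits) @ V

# toy example: a sequence of 5 items with embedding dimension 8
rng = np.random.default_rng(0)
seq_len, d_model = 5, 8
X = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
cooc = rng.integers(0, 10, size=(seq_len, seq_len))  # hypothetical co-occurrence counts
R = np.log1p((cooc + cooc.T) / 2)                    # symmetric relation kernel
out = relation_aware_attention(X, Wq, Wk, Wv, R)
print(out.shape)  # (5, 8)
```

In this sketch the relation signal enters as an additive bias on the attention logits; the paper instead treats the attention context probabilistically, so the example should be read only as an intuition for where relation information can enter self-attention.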
Title of host publication: AAAI 2020 - 34th AAAI Conference on Artificial Intelligence
Number of pages: 8
Publication status: Published - 2020
Event: 34th AAAI Conference on Artificial Intelligence, AAAI 2020 - New York, United States
Duration: 2020 Feb 7 → 2020 Feb 12
Name: AAAI 2020 - 34th AAAI Conference on Artificial Intelligence
Conference: 34th AAAI Conference on Artificial Intelligence, AAAI 2020
Period: 2020 Feb 7 → 2020 Feb 12
Bibliographical note:
Funding Information: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (NRF-2018R1C1B6008652).
© 2020, Association for the Advancement of Artificial Intelligence.
All Science Journal Classification (ASJC) codes
- Artificial Intelligence