Abstract
Although the kernel support vector machine (SVM) outperforms the linear SVM, its application to real-world problems is limited because evaluating its decision function is computationally expensive due to the kernel expansion. In contrast, additive kernel (AK) SVMs enable fast evaluation of the decision function using look-up tables (LUTs). AKs, however, assume a specific functional form for the kernel, such as the intersection kernel (IK) or the χ2 kernel, so their performance degrades seriously when a given problem is highly nonlinear. To address this issue, an optimal additive kernel (OAK) is proposed in this paper. The OAK does not assume any specific kernel form; instead, the kernel is represented by a quantized Gram table. Training the OAK SVM is formulated as semi-definite programming (SDP) and solved by convex optimization. In the experiments, the proposed method is tested on 2D synthetic datasets, UCI repository datasets, and LIBSVM datasets. The results show that the proposed OAK SVM outperforms the previous AKs and the RBF kernel while maintaining fast computation using look-up tables.
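The LUT-based evaluation the abstract refers to exploits the fact that an additive kernel decomposes across dimensions, so the decision function f(x) = Σ_i α_i y_i K(x, x_i) + b can be rewritten as f(x) = b + Σ_d h_d(x_d), where each one-dimensional function h_d is precomputed on a grid. Below is a minimal sketch of this idea for the intersection kernel; the support vectors, coefficients, and quantization level are illustrative, not from the paper.

```python
# Sketch: fast additive-kernel SVM evaluation via per-dimension look-up tables.
# Uses the intersection kernel k(a, b) = min(a, b); inputs assumed in [0, 1].
# Support vectors `sv` and coefficients `coef` (alpha_i * y_i) are made up.

def build_luts(sv, coef, bins=100):
    """Precompute h_d(x) = sum_i coef_i * min(x, sv[i][d]) on a grid, per dimension."""
    dims = len(sv[0])
    grid = [b / (bins - 1) for b in range(bins)]
    return [[sum(c * min(x, s[d]) for c, s in zip(coef, sv)) for x in grid]
            for d in range(dims)]

def decision_lut(x, luts, bias=0.0, bins=100):
    """Evaluate f(x) with one table lookup per dimension (cost O(D), not O(N*D))."""
    f = bias
    for d, xd in enumerate(x):
        idx = min(int(round(xd * (bins - 1))), bins - 1)  # quantize coordinate
        f += luts[d][idx]
    return f

def decision_exact(x, sv, coef, bias=0.0):
    """Reference: exact decision value from the full kernel expansion."""
    return bias + sum(c * sum(min(xd, sd) for xd, sd in zip(x, s))
                      for c, s in zip(coef, sv))
```

With enough bins the lookup result matches the exact expansion to within quantization error, while the per-query cost no longer grows with the number of support vectors. The OAK generalizes this: rather than fixing k(a, b) = min(a, b), the table entries themselves are learned via SDP.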
Original language | English |
---|---|
Pages (from-to) | 279-299 |
Number of pages | 21 |
Journal | Neurocomputing |
Volume | 329 |
DOIs | |
Publication status | Published - 2019 Feb 15 |
Bibliographical note
Funding Information: This research was supported by the Next-Generation Information Computing Development Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Science and ICT (NRF-2017M3C4A7069370).
Publisher Copyright:
© 2018 Elsevier B.V.
All Science Journal Classification (ASJC) codes
- Computer Science Applications
- Cognitive Neuroscience
- Artificial Intelligence