General Dimensional Multiple-Output Support Vector Regressions and Their Multiple Kernel Learning

Wooyong Chung, Jisu Kim, Heejin Lee, Euntai Kim

Research output: Contribution to journal › Article

12 Citations (Scopus)

Abstract

Support vector regression (SVR) is widely regarded as one of the most important regression and function-approximation methodologies across a variety of fields. In this paper, two new general-dimensional multiple-output support vector regressions (MSVRs), named SOCPL1 and SOCPL2, are proposed. The proposed methods are formulated in the dual space, and their relationship with previous work is investigated in detail. Further, the proposed MSVRs are extended to multiple kernel learning, and their training is implemented with off-the-shelf convex optimization tools. The proposed MSVRs are applied to benchmark problems, and their performance is compared with that of previous methods in the experimental section.
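The abstract notes that the MSVRs apply an ε-insensitive loss to a vector of outputs and are trained with off-the-shelf convex optimization tools. The following minimal sketch illustrates the general multiple-output ε-insensitive idea only: it fits a kernel model by plain primal gradient descent on a squared hinge of the joint output-error norm. It is not the paper's SOCPL1/SOCPL2 dual SOCP formulations, and every function name and hyperparameter below is an illustrative assumption.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||A[i] - B[j]||^2)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def fit_msvr(X, Y, gamma=1.0, C=5.0, eps=0.1, lr=0.01, steps=3000):
    """Minimize 0.5 * sum_j Beta_j^T K Beta_j + C * sum_i max(0, ||e_i|| - eps)^2,
    where e_i is the vector of residuals over ALL outputs for sample i
    (the joint eps-tube that distinguishes MSVR from per-output SVRs)."""
    n, q = X.shape[0], Y.shape[1]
    K = rbf_kernel(X, X, gamma)
    Beta = np.zeros((n, q))   # one coefficient column per output dimension
    b = np.zeros(q)
    for _ in range(steps):
        E = K @ Beta + b - Y                  # residuals, shape (n, q)
        r = np.linalg.norm(E, axis=1)         # joint error norm per sample
        out = r > eps                         # samples outside the eps-tube
        G = np.zeros_like(E)                  # dLoss/dE (zero inside the tube)
        G[out] = (2.0 * (r[out] - eps) / r[out])[:, None] * E[out]
        Beta -= lr * (K @ Beta + C * (K @ G)) / n
        b -= lr * C * G.sum(axis=0) / n
    return Beta, b, K

# Tiny demo: two coupled outputs of a scalar input.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 3.0, size=(30, 1))
Y = np.column_stack([np.sin(X[:, 0]), np.cos(X[:, 0])])
Beta, b, K = fit_msvr(X, Y)
resid = np.linalg.norm(K @ Beta + b - Y, axis=1)
```

The design point the sketch preserves is that the ε-tube is placed around the norm of the whole residual vector rather than around each output separately, so the outputs share one set of support patterns; the paper instead solves this class of problem exactly as a convex (second-order cone) program in the dual.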

Original language: English
Article number: 6994253
Pages (from-to): 2572-2584
Number of pages: 13
Journal: IEEE Transactions on Cybernetics
Volume: 45
Issue number: 11
DOI: 10.1109/TCYB.2014.2377016
Publication status: Published - 2015 Nov 1

All Science Journal Classification (ASJC) codes

  • Software
  • Control and Systems Engineering
  • Information Systems
  • Human-Computer Interaction
  • Computer Science Applications
  • Electrical and Electronic Engineering

Cite this

@article{67c61231c633406195d15771d00e9c7e,
title = "General Dimensional Multiple-Output Support Vector Regressions and Their Multiple Kernel Learning",
abstract = "Support vector regression (SVR) is widely regarded as one of the most important regression and function-approximation methodologies across a variety of fields. In this paper, two new general-dimensional multiple-output support vector regressions (MSVRs), named SOCPL1 and SOCPL2, are proposed. The proposed methods are formulated in the dual space, and their relationship with previous work is investigated in detail. Further, the proposed MSVRs are extended to multiple kernel learning, and their training is implemented with off-the-shelf convex optimization tools. The proposed MSVRs are applied to benchmark problems, and their performance is compared with that of previous methods in the experimental section.",
author = "Wooyong Chung and Jisu Kim and Heejin Lee and Euntai Kim",
year = "2015",
month = "11",
day = "1",
doi = "10.1109/TCYB.2014.2377016",
language = "English",
volume = "45",
pages = "2572--2584",
journal = "IEEE Transactions on Cybernetics",
issn = "2168-2267",
publisher = "IEEE Advancing Technology for Humanity",
number = "11",
}
