Geometric approach to train support vector machines

Ming Hsuan Yang, Narendra Ahuja

Research output: Contribution to journal › Conference article

27 Citations (Scopus)

Abstract

Support Vector Machines (SVMs) have shown great potential in numerous visual learning and pattern recognition problems. The optimal decision surface of an SVM is constructed from its support vectors, which are conventionally determined by solving a quadratic programming (QP) problem. However, solving a large optimization problem is challenging since it is computationally intensive and the memory requirement grows with the square of the number of training vectors. In this paper, we propose a geometric method to extract a small superset of support vectors, which we call guard vectors, to construct the optimal decision surface. Specifically, the guard vectors are found by solving a set of linear programming problems. Experimental results on synthetic and real data sets show that the proposed method is more efficient than conventional methods using QPs and requires much less memory.
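The abstract states that the guard vectors are found geometrically by solving linear programs rather than one large QP. As an illustrative sketch only (not the authors' exact formulation), the geometric intuition can be demonstrated with an LP feasibility test: a training point that can be written as a convex combination of the other points of its class lies inside the class's convex hull and cannot be a support vector, so the hull's extreme points form a candidate superset. The function name `is_extreme_point` and the toy data are hypothetical:

```python
import numpy as np
from scipy.optimize import linprog

def is_extreme_point(i, X):
    """LP feasibility test: can X[i] be written as a convex
    combination of the other points in X?  If not, X[i] is an
    extreme point (a convex-hull vertex) of its class."""
    others = np.delete(X, i, axis=0)      # candidate mixture points
    n = len(others)
    # Decision variables: mixture weights w >= 0.
    # Equality constraints: others.T @ w = X[i]  and  sum(w) = 1.
    A_eq = np.vstack([others.T, np.ones((1, n))])
    b_eq = np.append(X[i], 1.0)
    res = linprog(c=np.zeros(n), A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * n, method="highs")
    return not res.success  # infeasible => extreme point

# Toy class: the four corners of a square plus one interior point.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0],
              [0.5, 0.5]])
candidates = [i for i in range(len(X)) if is_extreme_point(i, X)]
print(candidates)  # the four corners; the interior point is excluded
```

Each test is a small LP over one class, so memory stays linear in the number of training vectors, which matches the efficiency argument made in the abstract.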

Original language: English
Pages (from-to): 430-437
Number of pages: 8
Journal: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
Volume: 1
Publication status: Published - 2000 Jan 1
Event: CVPR '2000: IEEE Conference on Computer Vision and Pattern Recognition - Hilton Head Island, SC, USA
Duration: 2000 Jun 13 - 2000 Jun 15

Fingerprint

Support vector machines
Data storage equipment
Quadratic programming
Linear programming
Pattern recognition

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Vision and Pattern Recognition

Cite this

@article{72ea6aa4d211458aacfda76cb19cddbd,
title = "Geometric approach to train support vector machines",
abstract = "Support Vector Machines (SVMs) have shown great potential in numerous visual learning and pattern recognition problems. The optimal decision surface of an SVM is constructed from its support vectors, which are conventionally determined by solving a quadratic programming (QP) problem. However, solving a large optimization problem is challenging since it is computationally intensive and the memory requirement grows with the square of the number of training vectors. In this paper, we propose a geometric method to extract a small superset of support vectors, which we call guard vectors, to construct the optimal decision surface. Specifically, the guard vectors are found by solving a set of linear programming problems. Experimental results on synthetic and real data sets show that the proposed method is more efficient than conventional methods using QPs and requires much less memory.",
author = "Yang, {Ming Hsuan} and Narendra Ahuja",
year = "2000",
month = "1",
day = "1",
language = "English",
volume = "1",
pages = "430--437",
journal = "Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition",
issn = "1063-6919",
publisher = "IEEE Computer Society",

}


TY - JOUR

T1 - Geometric approach to train support vector machines

AU - Yang, Ming Hsuan

AU - Ahuja, Narendra

PY - 2000/1/1

Y1 - 2000/1/1

AB - Support Vector Machines (SVMs) have shown great potential in numerous visual learning and pattern recognition problems. The optimal decision surface of an SVM is constructed from its support vectors, which are conventionally determined by solving a quadratic programming (QP) problem. However, solving a large optimization problem is challenging since it is computationally intensive and the memory requirement grows with the square of the number of training vectors. In this paper, we propose a geometric method to extract a small superset of support vectors, which we call guard vectors, to construct the optimal decision surface. Specifically, the guard vectors are found by solving a set of linear programming problems. Experimental results on synthetic and real data sets show that the proposed method is more efficient than conventional methods using QPs and requires much less memory.

UR - http://www.scopus.com/inward/record.url?scp=0033725293&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0033725293&partnerID=8YFLogxK

M3 - Conference article

AN - SCOPUS:0033725293

VL - 1

SP - 430

EP - 437

JO - Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition

JF - Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition

SN - 1063-6919

ER -