Geometric approach to train support vector machines

Ming-Hsuan Yang, Narendra Ahuja

Research output: Contribution to journal (Conference article)

29 Citations (Scopus)

Abstract

Support Vector Machines (SVMs) have shown great potential in numerous visual learning and pattern recognition problems. The optimal decision surface of an SVM is constructed from its support vectors, which are conventionally determined by solving a quadratic programming (QP) problem. However, solving a large optimization problem is challenging since it is computationally intensive and the memory requirement grows with the square of the number of training vectors. In this paper, we propose a geometric method to extract a small superset of support vectors, which we call guard vectors, to construct the optimal decision surface. Specifically, the guard vectors are found by solving a set of linear programming problems. Experimental results on synthetic and real data sets show that the proposed method is more efficient than conventional methods using QPs and requires much less memory.
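The abstract contrasts the conventional QP formulation with a geometric, LP-based selection of a small superset of support vectors. The sketch below is a rough illustration of that flavor of approach only, not the paper's guard-vector construction: it uses scipy.optimize.linprog to test whether each training point is a vertex of its class's convex hull, so candidate support vectors are found by solving many small linear programs instead of one large QP. The function names is_hull_vertex and candidate_support_vectors are hypothetical.

```python
# Hedged sketch: an LP-based geometric pre-filter for support-vector candidates.
# NOTE: this is not the paper's guard-vector algorithm; it only illustrates
# replacing one large QP with a set of small linear programs.
import numpy as np
from scipy.optimize import linprog

def is_hull_vertex(X, i):
    """Check whether X[i] is a vertex of the convex hull of the rows of X.

    X[i] is a vertex iff it cannot be written as a convex combination of the
    remaining points, i.e. the LP {w >= 0, sum(w) = 1, others^T w = X[i]}
    is infeasible.
    """
    others = np.delete(X, i, axis=0)           # all points except X[i]
    n = others.shape[0]
    # Equality constraints: others^T w = X[i]  and  sum(w) = 1
    A_eq = np.vstack([others.T, np.ones((1, n))])
    b_eq = np.concatenate([X[i], [1.0]])
    res = linprog(c=np.zeros(n), A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * n, method="highs")
    return not res.success                     # infeasible -> hull vertex

def candidate_support_vectors(X, y):
    """Keep only the convex-hull vertices of each class as SV candidates."""
    keep = []
    for label in np.unique(y):
        idx = np.where(y == label)[0]
        Xc = X[idx]
        keep.extend(idx[j] for j in range(len(idx)) if is_hull_vertex(Xc, j))
    return np.array(sorted(keep))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
    y = np.array([0] * 100 + [1] * 100)
    cand = candidate_support_vectors(X, y)
    print(f"kept {len(cand)} of {len(X)} points as SV candidates")
```

For linearly separable data the support vectors lie on the class convex hulls, so discarding interior points before solving the QP shrinks the optimization; the paper's guard vectors play an analogous role as a small superset of the true support vectors.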

Original language: English
Pages (from-to): 430-437
Number of pages: 8
Journal: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
Volume: 1
Publication status: Published - 2000 Jan 1
Event: CVPR '2000: IEEE Conference on Computer Vision and Pattern Recognition - Hilton Head Island, SC, USA
Duration: 2000 Jun 13 - 2000 Jun 15

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Vision and Pattern Recognition

