Feature Extraction Based on Decision Boundaries

Chul Hee Lee, David A. Landgrebe

Research output: Contribution to journal › Article

274 Citations (Scopus)

Abstract

In this paper, a novel approach to feature extraction for classification is proposed based directly on the decision boundaries. We note that feature extraction is equivalent to retaining informative features or eliminating redundant features; thus, the terms “discriminantly informative feature” and “discriminantly redundant feature” are first defined relative to feature extraction for classification. Next, it is shown how discriminantly redundant features and discriminantly informative features are related to decision boundaries. A novel characteristic of the proposed method arises by noting that usually only a portion of the decision boundary is effective in discriminating between classes, and the concept of the effective decision boundary is therefore introduced. Next, a procedure to extract discriminantly informative features based on a decision boundary is proposed. The proposed feature extraction algorithm has several desirable properties: 1) It predicts the minimum number of features necessary to achieve the same classification accuracy as in the original space for a given pattern recognition problem; 2) it finds the necessary feature vectors. The proposed algorithm does not deteriorate under the circumstances of equal class means or equal class covariances as some previous algorithms do. Experiments show that the performance of the proposed algorithm compares favorably with those of previous algorithms.
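
To make the procedure described in the abstract concrete, the following is a minimal sketch of the decision-boundary idea for two Gaussian classes: locate points on the decision boundary between differently classified samples, take the unit normals to the boundary at those points, average their outer products into a decision boundary feature matrix, and read the informative features off its eigenvectors. The synthetic data, the bisection search, and the helper names (boundary_point, unit_normal) are assumptions for illustration, not the authors' implementation.

    # Minimal sketch of decision-boundary feature extraction for two Gaussian
    # classes. Data, helper names, and tolerances are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    # Two classes that differ only along the first axis, so the decision
    # boundary is a plane and only one feature should be informative.
    mean1, cov1 = np.array([0.0, 0.0, 0.0]), np.diag([1.0, 1.0, 4.0])
    mean2, cov2 = np.array([2.0, 0.0, 0.0]), np.diag([1.0, 1.0, 4.0])
    X1 = rng.multivariate_normal(mean1, cov1, 300)
    X2 = rng.multivariate_normal(mean2, cov2, 300)

    def discriminant(x, mean, cov):
        # Gaussian maximum-likelihood discriminant (log-likelihood up to a constant).
        d = x - mean
        return -0.5 * d @ np.linalg.solve(cov, d) - 0.5 * np.log(np.linalg.det(cov))

    def h(x):
        # Signed difference of the two discriminants; h(x) = 0 on the decision boundary.
        return discriminant(x, mean1, cov1) - discriminant(x, mean2, cov2)

    def boundary_point(a, b, tol=1e-8):
        # Bisect along the segment a-b, whose endpoints lie on opposite sides
        # of the boundary, until the point is numerically on the boundary.
        lo, hi = 0.0, 1.0
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if np.sign(h(a + mid * (b - a))) == np.sign(h(a)):
                lo = mid
            else:
                hi = mid
        return a + 0.5 * (lo + hi) * (b - a)

    def unit_normal(x, eps=1e-5):
        # Numerical gradient of h at x, normalized: the normal to the boundary.
        grad = np.array([(h(x + eps * e) - h(x - eps * e)) / (2 * eps)
                         for e in np.eye(len(x))])
        return grad / np.linalg.norm(grad)

    # Accumulate a decision boundary feature matrix from sample pairs that
    # fall on opposite sides of the boundary.
    dim = X1.shape[1]
    M = np.zeros((dim, dim))
    count = 0
    for a, b in zip(X1, X2):
        if np.sign(h(a)) != np.sign(h(b)):
            n = unit_normal(boundary_point(a, b))
            M += np.outer(n, n)
            count += 1
    M /= max(count, 1)

    # Eigenvectors with clearly nonzero eigenvalues are the discriminantly
    # informative features; their number estimates how many features are needed.
    eigvals, eigvecs = np.linalg.eigh(M)
    order = np.argsort(eigvals)[::-1]
    print("eigenvalues:", np.round(eigvals[order], 4))
    print("leading feature vector:", np.round(eigvecs[:, order[0]], 4))

In this toy setting the classes differ only along the first axis, so the accumulated matrix has numerical rank one and the leading eigenvector is approximately (1, 0, 0), matching the single discriminantly informative feature.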

Original language: English
Pages (from-to): 388-400
Number of pages: 13
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 15
Issue number: 4
DOIs: 10.1109/34.206958
Publication status: Published - 1993 Jan 1

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Computational Theory and Mathematics
  • Computer Vision and Pattern Recognition
  • Software
  • Applied Mathematics
  • Control and Systems Engineering
  • Electrical and Electronic Engineering

Cite this

@article{d7d38e97a8904e06a35a27e5970f4e4b,
title = "Feature Extraction Based on Decision Boundaries",
abstract = "In this paper, a novel approach to feature extraction for classification is proposed based directly on the decision boundaries. We note that feature extraction is equivalent to retaining informative features or eliminating redundant features; thus, the terms “discriminantly information feature” and “discriminantly redundant feature” are first defined relative to feature extraction for classification. Next, it is shown how discriminantly redundant features and discriminantly informative features are related to decision boundaries. A novel characteristic of the proposed method arises by noting that usually only a portion of the decision boundary is effective in discriminating between classes, and the concept of the effective decision boundary is therefore introduced. Next, a procedure to extract discriminantly informative features based on a decision boundary is proposed. The proposed feature extraction algorithm has several desirable properties: 1) It predicts the minimum number of features necessary to achieve the same classification accuracy as in the original space for a given pattern recognition problem; 2) it finds the necessary feature vectors. The proposed algorithm does not deteriorate under the circumstances of equal class means or equal class covariances as some previous algorithms do. Experiments show that the performance of the proposed algorithm compares favorably with those of previous algorithms.",
author = "Lee, {Chul Hee} and Landgrebe, {David A.}",
year = "1993",
month = "1",
day = "1",
doi = "10.1109/34.206958",
language = "English",
volume = "15",
pages = "388--400",
journal = "IEEE Transactions on Pattern Analysis and Machine Intelligence",
issn = "0162-8828",
publisher = "IEEE Computer Society",
number = "4",

}

Feature Extraction Based on Decision Boundaries. / Lee, Chul Hee; Landgrebe, David A.

In: IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 15, No. 4, 01.01.1993, p. 388-400.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Feature Extraction Based on Decision Boundaries

AU - Lee, Chul Hee

AU - Landgrebe, David A.

PY - 1993/1/1

Y1 - 1993/1/1

N2 - In this paper, a novel approach to feature extraction for classification is proposed based directly on the decision boundaries. We note that feature extraction is equivalent to retaining informative features or eliminating redundant features; thus, the terms “discriminantly informative feature” and “discriminantly redundant feature” are first defined relative to feature extraction for classification. Next, it is shown how discriminantly redundant features and discriminantly informative features are related to decision boundaries. A novel characteristic of the proposed method arises by noting that usually only a portion of the decision boundary is effective in discriminating between classes, and the concept of the effective decision boundary is therefore introduced. Next, a procedure to extract discriminantly informative features based on a decision boundary is proposed. The proposed feature extraction algorithm has several desirable properties: 1) It predicts the minimum number of features necessary to achieve the same classification accuracy as in the original space for a given pattern recognition problem; 2) it finds the necessary feature vectors. The proposed algorithm does not deteriorate under the circumstances of equal class means or equal class covariances as some previous algorithms do. Experiments show that the performance of the proposed algorithm compares favorably with those of previous algorithms.

AB - In this paper, a novel approach to feature extraction for classification is proposed based directly on the decision boundaries. We note that feature extraction is equivalent to retaining informative features or eliminating redundant features; thus, the terms “discriminantly informative feature” and “discriminantly redundant feature” are first defined relative to feature extraction for classification. Next, it is shown how discriminantly redundant features and discriminantly informative features are related to decision boundaries. A novel characteristic of the proposed method arises by noting that usually only a portion of the decision boundary is effective in discriminating between classes, and the concept of the effective decision boundary is therefore introduced. Next, a procedure to extract discriminantly informative features based on a decision boundary is proposed. The proposed feature extraction algorithm has several desirable properties: 1) It predicts the minimum number of features necessary to achieve the same classification accuracy as in the original space for a given pattern recognition problem; 2) it finds the necessary feature vectors. The proposed algorithm does not deteriorate under the circumstances of equal class means or equal class covariances as some previous algorithms do. Experiments show that the performance of the proposed algorithm compares favorably with those of previous algorithms.

UR - http://www.scopus.com/inward/record.url?scp=0027579237&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0027579237&partnerID=8YFLogxK

U2 - 10.1109/34.206958

DO - 10.1109/34.206958

M3 - Article

VL - 15

SP - 388

EP - 400

JO - IEEE Transactions on Pattern Analysis and Machine Intelligence

JF - IEEE Transactions on Pattern Analysis and Machine Intelligence

SN - 0162-8828

IS - 4

ER -