Optimal feature extraction for normally distributed data

Chul Hee Lee, Euisun Choi, Jaehong Kim

Research output: Contribution to journal › Conference article

Abstract

In this paper, we propose an optimal feature extraction method for normally distributed data. The feature extraction algorithm is optimal in the sense that we search the whole feature space to find the set of features that gives the smallest classification error for the Gaussian ML classifier. Initially, we start with an arbitrary feature vector. Assuming that the feature vector is used for classification, we compute the classification error. Then we move the feature vector slightly in the direction in which the classification error decreases most rapidly. This can be done by taking the gradient. We propose two search methods: sequential search and global search. In the sequential search, if more features are needed, we try to find an additional feature that gives the best classification accuracy together with the already chosen features. In the global search, we are not restricted to using the already chosen features. Experimental results show that the proposed method outperforms conventional feature extraction algorithms.
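The abstract describes the procedure only at a high level, so the following sketch (not the authors' code) illustrates the idea for two normally distributed classes and a linear feature mapping y = Wᵀx, with the sequential search from the abstract (a new feature is refined while the already chosen one stays fixed). Where the paper takes the gradient of the classification error itself, this sketch approximates it with finite differences of a smooth error surrogate (the mean posterior mass on the wrong classes), since the empirical error rate is piecewise constant; all function names, the toy data, and the hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)


def class_models(W, X_list):
    """Fit a Gaussian (mean, covariance) to each class in the projected space Y = X @ W."""
    models = []
    for X in X_list:
        Y = X @ W
        C = np.atleast_2d(np.cov(Y, rowvar=False)) + 1e-9 * np.eye(W.shape[1])
        models.append((Y.mean(axis=0), C))
    return models


def log_likelihoods(Y, models):
    """Log-likelihood of every projected sample under every class model (n_samples x n_classes)."""
    cols = []
    for m, C in models:
        d = Y - m
        cols.append(-0.5 * np.sum(d @ np.linalg.inv(C) * d, axis=1)
                    - 0.5 * np.log(np.linalg.det(C)))
    return np.column_stack(cols)


def hard_error(W, X_list):
    """Empirical error rate of the Gaussian ML classifier on the projected data."""
    models = class_models(W, X_list)
    wrong = total = 0
    for c, X in enumerate(X_list):
        wrong += np.sum(log_likelihoods(X @ W, models).argmax(axis=1) != c)
        total += len(X)
    return wrong / total


def soft_error(W, X_list):
    """Smooth surrogate of the classification error (mean posterior mass on the
    wrong classes); used here only so a finite-difference gradient is informative."""
    models = class_models(W, X_list)
    loss = total = 0.0
    for c, X in enumerate(X_list):
        ll = log_likelihoods(X @ W, models)
        p = np.exp(ll - ll.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        loss += np.sum(1.0 - p[:, c])
        total += len(X)
    return loss / total


def refine_last_feature(W, X_list, steps=150, lr=0.5, eps=1e-4):
    """Move the newest feature vector slightly in the direction in which the
    (surrogate) classification error decreases most rapidly."""
    for _ in range(steps):
        grad = np.zeros(W.shape[0])
        for i in range(W.shape[0]):
            Wp, Wm = W.copy(), W.copy()
            Wp[i, -1] += eps
            Wm[i, -1] -= eps
            grad[i] = (soft_error(Wp, X_list) - soft_error(Wm, X_list)) / (2 * eps)
        W[:, -1] -= lr * grad
        W[:, -1] /= np.linalg.norm(W[:, -1])  # keep the feature unit-length
    return W


# Toy data: two 4-dimensional normal classes differing in mean and covariance.
X0 = rng.multivariate_normal([0.0, 0.0, 0.0, 0.0], np.diag([1.0, 1.0, 4.0, 4.0]), 500)
X1 = rng.multivariate_normal([1.5, 0.0, 0.0, 0.0], np.diag([1.0, 4.0, 4.0, 4.0]), 500)
data = [X0, X1]

# Sequential search: refine one feature, then add a second feature and refine it
# while the already chosen one stays fixed.  (A global search would instead
# re-optimise all columns of W jointly.)
W = rng.standard_normal((4, 1))
W /= np.linalg.norm(W)
W = refine_last_feature(W, data)
print("error with 1 feature :", hard_error(W, data))

W = np.hstack([W, rng.standard_normal((4, 1))])
W[:, 1] /= np.linalg.norm(W[:, 1])
W = refine_last_feature(W, data)
print("error with 2 features:", hard_error(W, data))
```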

Original language: English
Pages (from-to): 223-232
Number of pages: 10
Journal: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 3372
DOI: 10.1117/12.312603
Publication status: Published - 1998 Dec 1
Event: Algorithms for Multispectral and Hyperspectral Imagery IV - Orlando, FL, United States
Duration: 1998 Apr 13 - 1998 Apr 14

Fingerprint

Pattern recognition
Feature extraction
Feature vector
Global search
Classifiers
Feature space
Search methods
Gradient
Decrease
Arbitrary
Experiments

All Science Journal Classification (ASJC) codes

  • Electronic, Optical and Magnetic Materials
  • Condensed Matter Physics
  • Computer Science Applications
  • Applied Mathematics
  • Electrical and Electronic Engineering

Cite this

@article{e53fd28c42f04499a7d03d38db68c533,
title = "Optimal feature extraction for normally distributed data",
abstract = "In this paper, we propose an optimal feature extraction method for normally distributed data. The feature extraction algorithm is optimal in the sense that we search the whole feature space to find the set of features that gives the smallest classification error for the Gaussian ML classifier. Initially, we start with an arbitrary feature vector. Assuming that the feature vector is used for classification, we compute the classification error. Then we move the feature vector slightly in the direction in which the classification error decreases most rapidly. This can be done by taking the gradient. We propose two search methods: sequential search and global search. In the sequential search, if more features are needed, we try to find an additional feature that gives the best classification accuracy together with the already chosen features. In the global search, we are not restricted to using the already chosen features. Experimental results show that the proposed method outperforms conventional feature extraction algorithms.",
author = "Lee, {Chul Hee} and Euisun Choi and Jaehong Kim",
year = "1998",
month = "12",
day = "1",
doi = "10.1117/12.312603",
language = "English",
volume = "3372",
pages = "223--232",
journal = "Proceedings of SPIE - The International Society for Optical Engineering",
issn = "0277-786X",
publisher = "SPIE",

}

Optimal feature extraction for normally distributed data. / Lee, Chul Hee; Choi, Euisun; Kim, Jaehong.

In: Proceedings of SPIE - The International Society for Optical Engineering, Vol. 3372, 01.12.1998, p. 223-232.

Research output: Contribution to journal › Conference article

TY - JOUR

T1 - Optimal feature extraction for normally distributed data

AU - Lee, Chul Hee

AU - Choi, Euisun

AU - Kim, Jaehong

PY - 1998/12/1

Y1 - 1998/12/1

N2 - In this paper, we propose an optimal feature extraction method for normally distributed data. The feature extraction algorithm is optimal in the sense that we search the whole feature space to find the set of features that gives the smallest classification error for the Gaussian ML classifier. Initially, we start with an arbitrary feature vector. Assuming that the feature vector is used for classification, we compute the classification error. Then we move the feature vector slightly in the direction in which the classification error decreases most rapidly. This can be done by taking the gradient. We propose two search methods: sequential search and global search. In the sequential search, if more features are needed, we try to find an additional feature that gives the best classification accuracy together with the already chosen features. In the global search, we are not restricted to using the already chosen features. Experimental results show that the proposed method outperforms conventional feature extraction algorithms.

AB - In this paper, we propose an optimal feature extraction method for normally distributed data. The feature extraction algorithm is optimal in the sense that we search the whole feature space to find the set of features that gives the smallest classification error for the Gaussian ML classifier. Initially, we start with an arbitrary feature vector. Assuming that the feature vector is used for classification, we compute the classification error. Then we move the feature vector slightly in the direction in which the classification error decreases most rapidly. This can be done by taking the gradient. We propose two search methods: sequential search and global search. In the sequential search, if more features are needed, we try to find an additional feature that gives the best classification accuracy together with the already chosen features. In the global search, we are not restricted to using the already chosen features. Experimental results show that the proposed method outperforms conventional feature extraction algorithms.

UR - http://www.scopus.com/inward/record.url?scp=0032404438&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0032404438&partnerID=8YFLogxK

U2 - 10.1117/12.312603

DO - 10.1117/12.312603

M3 - Conference article

VL - 3372

SP - 223

EP - 232

JO - Proceedings of SPIE - The International Society for Optical Engineering

JF - Proceedings of SPIE - The International Society for Optical Engineering

SN - 0277-786X

ER -