Abstract
In this paper, we propose an optimal feature extraction method for normally distributed data. The algorithm is optimal in the sense that it searches the whole feature space for the set of features that yields the smallest classification error for the Gaussian ML classifier. We start with an arbitrary feature vector and, assuming this vector is used for classification, compute the classification error. We then move the feature vector slightly in the direction in which the classification error decreases most rapidly; this direction is obtained by computing the gradient of the error with respect to the feature vector. We propose two search strategies, sequential search and global search. In the sequential search, when more features are needed, we find an additional feature that gives the best classification accuracy in combination with the features already chosen. In the global search, we are not restricted to retaining the already chosen features. Experimental results show that the proposed method outperforms conventional feature extraction algorithms.
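The abstract does not give the error criterion or the analytic gradient the authors use, so the following is only a rough, hypothetical sketch of the idea for the two-class case: project the data onto a candidate feature vector, fit a one-dimensional Gaussian to each class, estimate the resulting ML-classifier error from the overlap of the two fitted densities, and nudge the vector along a finite-difference approximation of the error gradient. The function names (`gaussian_ml_error`, `gradient_feature_search`), the finite-difference step, and all parameter values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np


def gaussian_ml_error(w, class_data):
    """Model-based error estimate for two classes: fit a 1-D Gaussian to each
    class after projecting onto w, then integrate the overlap of the densities
    (equal priors assumed), which is the error of the Gaussian ML rule."""
    w = w / np.linalg.norm(w)
    (m0, v0), (m1, v1) = [(np.mean(X @ w), np.var(X @ w) + 1e-12) for X in class_data]
    lo = min(m0 - 6 * np.sqrt(v0), m1 - 6 * np.sqrt(v1))
    hi = max(m0 + 6 * np.sqrt(v0), m1 + 6 * np.sqrt(v1))
    x = np.linspace(lo, hi, 2000)
    p0 = np.exp(-(x - m0) ** 2 / (2 * v0)) / np.sqrt(2 * np.pi * v0)
    p1 = np.exp(-(x - m1) ** 2 / (2 * v1)) / np.sqrt(2 * np.pi * v1)
    # Wherever the smaller density belongs to the true class, the ML rule errs.
    return 0.5 * np.trapz(np.minimum(p0, p1), x)


def gradient_feature_search(class_data, dim, steps=200, lr=0.5, eps=1e-3, seed=0):
    """Start from an arbitrary feature vector and repeatedly move it a small
    amount in the direction that decreases the estimated error most rapidly."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=dim)
    for _ in range(steps):
        base = gaussian_ml_error(w, class_data)
        grad = np.zeros(dim)
        for i in range(dim):  # finite-difference stand-in for the analytic gradient
            w_step = w.copy()
            w_step[i] += eps
            grad[i] = (gaussian_ml_error(w_step, class_data) - base) / eps
        w -= lr * grad
        w /= np.linalg.norm(w)
    return w, gaussian_ml_error(w, class_data)


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Two synthetic Gaussian classes in 5 dimensions with shifted means.
    c0 = rng.normal(loc=0.0, size=(200, 5))
    c1 = rng.normal(loc=0.8, size=(200, 5))
    w, err = gradient_feature_search([c0, c1], dim=5)
    print("feature vector:", np.round(w, 3), "estimated error:", round(err, 3))
```

A sequential search in the paper's sense would repeat this step for one new feature vector at a time while keeping the previously found vectors fixed, whereas a global search would re-optimize all feature vectors jointly.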
Original language | English |
---|---|
Pages (from-to) | 223-232 |
Number of pages | 10 |
Journal | Proceedings of SPIE - The International Society for Optical Engineering |
Volume | 3372 |
Publication status | Published - 1998 |
Event | Algorithms for Multispectral and Hyperspectral Imagery IV - Orlando, FL, United States; Duration: 1998 Apr 13 → 1998 Apr 14 |
All Science Journal Classification (ASJC) codes
- Electronic, Optical and Magnetic Materials
- Condensed Matter Physics
- Computer Science Applications
- Applied Mathematics
- Electrical and Electronic Engineering