A novel approach to feature selection for classification is proposed, based directly on decision boundaries. It is shown how discriminantly redundant features and discriminantly informative features are related to the decision boundary, and it is noted that only a portion of the decision boundary is effective in discriminating between classes. A procedure is proposed to extract discriminantly informative features from a decision boundary. The proposed feature selection algorithm predicts the minimum number of features needed to achieve the same classification accuracy as in the original space, and it finds those feature vectors. Experiments show that its performance compares favorably with that of previous algorithms.
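The idea described above can be illustrated with a minimal sketch: normals to the decision boundary span the discriminantly informative directions, so averaging their outer products and eigendecomposing the result reveals how many features are actually needed. The sketch below is an illustrative assumption, not the paper's exact procedure; it uses a hypothetical minimum-distance-to-mean discriminant `h`, locates boundary points by bisection between opposite-class samples, and estimates normals numerically. Because the two classes differ only along the first axis, the normal matrix should have a single dominant eigenvalue, predicting that one feature suffices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two 3-D Gaussian classes that differ only along the first axis,
# so a single feature should suffice for discrimination.
X0 = rng.normal([0.0, 0.0, 0.0], 1.0, size=(200, 3))
X1 = rng.normal([4.0, 0.0, 0.0], 1.0, size=(200, 3))
m0, m1 = X0.mean(axis=0), X1.mean(axis=0)

def h(x):
    # Minimum-distance-to-mean discriminant; h(x) = 0 is the decision boundary.
    return np.sum((x - m0) ** 2) - np.sum((x - m1) ** 2)

def boundary_point(a, b, iters=50):
    # Bisect along the segment from a (class 0) to b (class 1) to find h = 0.
    lo, hi = 0.0, 1.0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if h(a + mid * (b - a)) < 0:
            lo = mid
        else:
            hi = mid
    return a + 0.5 * (lo + hi) * (b - a)

def normal_at(x, eps=1e-4):
    # Numerical gradient of h gives the (unit) normal to the boundary at x.
    g = np.array([(h(x + eps * e) - h(x - eps * e)) / (2 * eps)
                  for e in np.eye(3)])
    return g / np.linalg.norm(g)

# Average outer product of boundary normals; its rank estimates the number
# of discriminantly informative features.
N = np.array([normal_at(boundary_point(a, b))
              for a, b in zip(X0[:100], X1[:100])])
M = N.T @ N / len(N)

eigvals = np.linalg.eigh(M)[0]          # ascending order
print(np.round(eigvals[::-1], 3))       # one dominant eigenvalue expected
```

Eigenvectors with nonzero eigenvalues give the informative feature vectors; directions with near-zero eigenvalues are discriminantly redundant and can be dropped without loss of classification accuracy.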