Abstract
Many data sets show significant correlations between input variables, and much useful information is hidden in the data in a nonlinear form. It has been shown that a neural network is better than a direct application of induction trees at modeling the nonlinear characteristics of sample data. We have extracted a compact set of rules to support data with input-variable relations over continuous-valued attributes. Those relations, as a set of linear classifiers, can be obtained from neural network modeling based on back-propagation. It is shown in this paper that variable thresholds play an important role in constructing linear classifier rules when we use a decision tree over linear classifiers extracted from a multilayer perceptron. We have tested this scheme on several data sets to compare it with decision tree results.
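The pipeline the abstract describes can be sketched as follows: train a multilayer perceptron with back-propagation, read each hidden unit off as a linear classifier (a hyperplane threshold over the continuous inputs), then grow a decision tree over the binary outputs of those classifiers. This is a minimal illustrative sketch, not the paper's implementation: the XOR-style toy data, the hidden-layer size `H = 4`, the sigmoid activations, and the information-gain root-split helper are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D data with an XOR-like (nonlinear) class boundary -- assumed, not from the paper.
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)

# 1. Train a one-hidden-layer perceptron with plain back-propagation.
H = 4  # hidden units -- an assumed size
W1 = rng.normal(0.0, 1.0, (2, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 1.0, H);      b2 = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(5000):
    h = sigmoid(X @ W1 + b1)      # hidden activations
    p = sigmoid(h @ W2 + b2)      # predicted class probability
    d_out = p - y                 # cross-entropy gradient w.r.t. output logit
    dW2 = h.T @ d_out / len(X)
    db2 = d_out.mean()
    d_hid = np.outer(d_out, W2) * h * (1.0 - h)
    dW1 = X.T @ d_hid / len(X)
    db1 = d_hid.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

# 2. Each hidden unit defines a linear classifier: it "fires" when
#    w.x + b >= 0 (the sigmoid crossing its 0.5 activation threshold).
rules = [f"unit {j}: {W1[0, j]:+.2f}*x1 {W1[1, j]:+.2f}*x2 {b1[j]:+.2f} >= 0"
         for j in range(H)]
Z = (X @ W1 + b1 >= 0.0).astype(int)  # binary features from the linear rules

# 3. Choose the root split of a decision tree over these binary features
#    by information gain (a C4.5-style criterion).
def entropy(labels):
    if len(labels) == 0:
        return 0.0
    q = labels.mean()
    if q in (0.0, 1.0):
        return 0.0
    return -q * np.log2(q) - (1.0 - q) * np.log2(1.0 - q)

def info_gain(feature, labels):
    left, right = labels[feature == 0], labels[feature == 1]
    remainder = (len(left) * entropy(left)
                 + len(right) * entropy(right)) / len(labels)
    return entropy(labels) - remainder

gains = [info_gain(Z[:, j], y) for j in range(H)]
root = int(np.argmax(gains))
print("extracted linear-classifier rules:")
for r in rules:
    print(" ", r)
print("tree root splits on", rules[root])
```

The key point the example mirrors is that the split thresholds of the tree are not raw attribute cut-points but hyperplanes learned by the network, which is what lets the tree capture relations between input variables.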
| Original language | English |
| --- | --- |
| Title of host publication | Advances in Knowledge Discovery and Data Mining - 5th Pacific-Asia Conference, PAKDD 2001, Proceedings |
| Editors | David Cheung, Graham J. Williams, Qing Li |
| Publisher | Springer Verlag |
| Pages | 538-549 |
| Number of pages | 12 |
| ISBN (Print) | 3540419101, 9783540419105 |
| DOIs | |
| Publication status | Published - 2001 |
| Event | 5th Pacific-Asia Conference on Knowledge Discovery and Data Mining, PAKDD 2001 - Kowloon, Hong Kong |
| Duration | 2001 Apr 16 → 2001 Apr 18 |
Publication series
| Name | Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science) |
| --- | --- |
| Volume | 2035 |
| ISSN (Print) | 0302-9743 |
Other
| Other | 5th Pacific-Asia Conference on Knowledge Discovery and Data Mining, PAKDD 2001 |
| --- | --- |
| Country/Territory | Hong Kong |
| City | Kowloon |
| Period | 01/4/16 → 01/4/18 |
Bibliographical note
Publisher Copyright: © Springer-Verlag Berlin Heidelberg 2001.
All Science Journal Classification (ASJC) codes
- Theoretical Computer Science
- Computer Science (all)