Rule reduction over numerical attributes in decision trees using multilayer perceptron

DaeEun Kim, Jaeho Lee

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

Many data sets show significant correlations between input variables, and much useful information is hidden in the data in a nonlinear format. It has been shown that a neural network is better than a direct application of induction trees in modeling nonlinear characteristics of sample data. We have extracted a compact set of rules to support data with input variable relations over continuous-valued attributes. Those relations as a set of linear classifiers can be obtained from neural network modeling based on back-propagation. It is shown in this paper that variable thresholds play an important role in constructing linear classifier rules when we use a decision tree over linear classifiers extracted from a multilayer perceptron. We have tested this scheme over several data sets to compare it with the decision tree results.
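The general idea in the abstract can be sketched as follows: train a multilayer perceptron with back-propagation, read each hidden unit's weights as a linear classifier over the continuous attributes, and then grow a decision tree over those linear projections rather than over the raw attributes. This is a minimal illustration of that scheme, not the authors' exact algorithm; the dataset, network size, and scikit-learn classes used here are assumptions for the sake of a runnable example.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic data standing in for a set of continuous-valued attributes.
X, y = make_classification(n_samples=400, n_features=4, random_state=0)

# Back-propagation-trained MLP; each hidden unit defines a hyperplane
# w.x + b, i.e. a linear classifier over the input attributes.
mlp = MLPClassifier(hidden_layer_sizes=(3,), max_iter=2000,
                    random_state=0).fit(X, y)
W, b = mlp.coefs_[0], mlp.intercepts_[0]  # W: (4, 3), b: (3,)

# Project samples onto the hidden-unit hyperplanes; the tree's split
# thresholds on these projections play the role of the "variable
# thresholds" in the linear-classifier rules.
Z = X @ W + b
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(Z, y)
print(tree.score(Z, y))
```

Each root-to-leaf path of the resulting tree reads as a conjunction of linear inequalities over the original attributes, which is what makes the rule set more compact than axis-parallel splits when the input variables are correlated.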

Original language: English
Title of host publication: Advances in Knowledge Discovery and Data Mining - 5th Pacific-Asia Conference, PAKDD 2001, Proceedings
Editors: Graham J. Williams, Qing Li, David Cheung
Publisher: Springer Verlag
Pages: 538-549
Number of pages: 12
ISBN (Print): 3540419101, 9783540419105
Publication status: Published - 2001 Jan 1
Event: 5th Pacific-Asia Conference on Knowledge Discovery and Data Mining, PAKDD 2001 - Kowloon, Hong Kong
Duration: 2001 Apr 16 - 2001 Apr 18

Publication series

Name: Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science)
Volume: 2035
ISSN (Print): 0302-9743

Other

Other: 5th Pacific-Asia Conference on Knowledge Discovery and Data Mining, PAKDD 2001
Country: Hong Kong
City: Kowloon
Period: 01/4/16 - 01/4/18


All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Computer Science (all)

Cite this

Kim, D., & Lee, J. (2001). Rule reduction over numerical attributes in decision trees using multilayer perceptron. In G. J. Williams, Q. Li, & D. Cheung (Eds.), Advances in Knowledge Discovery and Data Mining - 5th Pacific-Asia Conference, PAKDD 2001, Proceedings (pp. 538-549). (Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science); Vol. 2035). Springer Verlag.
Kim, DaeEun ; Lee, Jaeho. / Rule reduction over numerical attributes in decision trees using multilayer perceptron. Advances in Knowledge Discovery and Data Mining - 5th Pacific-Asia Conference, PAKDD 2001, Proceedings. editor / Graham J. Williams ; Qing Li ; David Cheung. Springer Verlag, 2001. pp. 538-549 (Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science)).
@inproceedings{047ec8b8b9df4e5db473c8672965dc46,
title = "Rule reduction over numerical attributes in decision trees using multilayer perceptron",
abstract = "Many data sets show significant correlations between input variables, and much useful information is hidden in the data in a nonlinear format. It has been shown that a neural network is better than a direct application of induction trees in modeling nonlinear characteristics of sample data. We have extracted a compact set of rules to support data with input variable relations over continuous-valued attributes. Those relations as a set of linear classifiers can be obtained from neural network modeling based on back-propagation. It is shown in this paper that variable thresholds play an important role in constructing linear classifier rules when we use a decision tree over linear classifiers extracted from a multilayer perceptron. We have tested this scheme over several data sets to compare it with the decision tree results.",
author = "DaeEun Kim and Jaeho Lee",
year = "2001",
month = "1",
day = "1",
language = "English",
isbn = "3540419101",
series = "Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science)",
publisher = "Springer Verlag",
pages = "538--549",
editor = "Williams, {Graham J.} and Qing Li and David Cheung",
booktitle = "Advances in Knowledge Discovery and Data Mining - 5th Pacific-Asia Conference, PAKDD 2001, Proceedings",
address = "Germany",

}

Kim, D & Lee, J 2001, Rule reduction over numerical attributes in decision trees using multilayer perceptron. in GJ Williams, Q Li & D Cheung (eds), Advances in Knowledge Discovery and Data Mining - 5th Pacific-Asia Conference, PAKDD 2001, Proceedings. Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science), vol. 2035, Springer Verlag, pp. 538-549, 5th Pacific-Asia Conference on Knowledge Discovery and Data Mining, PAKDD 2001, Kowloon, Hong Kong, 01/4/16.

Rule reduction over numerical attributes in decision trees using multilayer perceptron. / Kim, DaeEun; Lee, Jaeho.

Advances in Knowledge Discovery and Data Mining - 5th Pacific-Asia Conference, PAKDD 2001, Proceedings. ed. / Graham J. Williams; Qing Li; David Cheung. Springer Verlag, 2001. p. 538-549 (Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science); Vol. 2035).

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN

T1 - Rule reduction over numerical attributes in decision trees using multilayer perceptron

AU - Kim, DaeEun

AU - Lee, Jaeho

PY - 2001/1/1

Y1 - 2001/1/1

N2 - Many data sets show significant correlations between input variables, and much useful information is hidden in the data in a nonlinear format. It has been shown that a neural network is better than a direct application of induction trees in modeling nonlinear characteristics of sample data. We have extracted a compact set of rules to support data with input variable relations over continuous-valued attributes. Those relations as a set of linear classifiers can be obtained from neural network modeling based on back-propagation. It is shown in this paper that variable thresholds play an important role in constructing linear classifier rules when we use a decision tree over linear classifiers extracted from a multilayer perceptron. We have tested this scheme over several data sets to compare it with the decision tree results.

AB - Many data sets show significant correlations between input variables, and much useful information is hidden in the data in a nonlinear format. It has been shown that a neural network is better than a direct application of induction trees in modeling nonlinear characteristics of sample data. We have extracted a compact set of rules to support data with input variable relations over continuous-valued attributes. Those relations as a set of linear classifiers can be obtained from neural network modeling based on back-propagation. It is shown in this paper that variable thresholds play an important role in constructing linear classifier rules when we use a decision tree over linear classifiers extracted from a multilayer perceptron. We have tested this scheme over several data sets to compare it with the decision tree results.

UR - http://www.scopus.com/inward/record.url?scp=84942926848&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84942926848&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:84942926848

SN - 3540419101

SN - 9783540419105

T3 - Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science)

SP - 538

EP - 549

BT - Advances in Knowledge Discovery and Data Mining - 5th Pacific-Asia Conference, PAKDD 2001, Proceedings

A2 - Williams, Graham J.

A2 - Li, Qing

A2 - Cheung, David

PB - Springer Verlag

ER -

Kim D, Lee J. Rule reduction over numerical attributes in decision trees using multilayer perceptron. In Williams GJ, Li Q, Cheung D, editors, Advances in Knowledge Discovery and Data Mining - 5th Pacific-Asia Conference, PAKDD 2001, Proceedings. Springer Verlag. 2001. p. 538-549. (Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science)).