Evolutionarily optimized features in functional link neural network for classification

Satchidananda Dehuri, Sung-Bae Cho

Research output: Contribution to journal › Article

49 Citations (Scopus)

Abstract

In this paper, an adequate set of input features is selected genetically for functional expansion, in order to solve classification problems in data mining with a functional link neural network. The proposed method, named HFLNN, chooses an optimal subset of input features by eliminating features with little or no predictive information, yielding a more compact classifier. With an adequate set of basis functions, HFLNN overcomes problem non-linearity, a common limitation of single-layer neural networks. Properties such as the simplicity of the architecture (i.e., no hidden layer) and the low computational complexity of the network (i.e., fewer weights to be learned) make it attractive for classification tasks in data mining. We present a mathematical analysis of the stability and convergence of the proposed method. Furthermore, the issue of statistical tests for comparing algorithms on multiple datasets, which is especially important in data mining studies, has been all but ignored. We therefore recommend a set of simple yet safe, robust, non-parametric tests for statistical comparison of HFLNN with functional link neural network (FLNN) and radial basis function network (RBFN) classifiers over multiple datasets, supported by an extensive set of simulation studies.
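The core idea of the abstract — a genetically selected feature mask feeding a functional expansion, with a single weight layer replacing hidden layers — can be sketched as follows. The trigonometric basis and the binary mask encoding here are illustrative assumptions; the paper's exact basis functions and genetic representation may differ.

```python
import numpy as np

def functional_expansion(x, mask):
    """Expand a selected subset of input features with a trigonometric
    basis (a common FLNN choice; illustrative, not the paper's exact design)."""
    sel = x[mask]  # keep only the features the genetic search selected
    return np.concatenate([sel, np.sin(np.pi * sel), np.cos(np.pi * sel)])

# One 3-feature pattern; a hypothetical GA mask drops the middle feature.
x = np.array([0.0, 0.5, 1.0])
mask = np.array([True, False, True])
phi = functional_expansion(x, mask)  # 2 selected features -> 6 expanded terms

# A single layer of weights on the expanded pattern stands in for hidden
# layers: the expansion itself supplies the non-linearity.
w = np.zeros(phi.size)
y = 1.0 / (1.0 + np.exp(-(w @ phi)))  # class probability from one weight layer
```

Because the network has only this one weight layer, eliminating an input feature via the mask directly shrinks the number of trainable weights, which is the compactness argument the abstract makes.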
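The non-parametric comparison over multiple datasets that the abstract recommends is commonly carried out with a Friedman rank test. The sketch below computes the Friedman chi-square statistic directly; the accuracy figures are hypothetical illustrations, not results from the paper, and the paper's exact choice of tests may differ.

```python
import numpy as np

def friedman_statistic(scores):
    """Friedman chi-square statistic for k classifiers over n datasets
    (assumes no tied scores within a dataset)."""
    n, k = scores.shape
    # Rank classifiers within each dataset: rank 1 = highest score.
    ranks = (-scores).argsort(axis=1).argsort(axis=1) + 1.0
    avg = ranks.mean(axis=0)  # average rank of each classifier
    return 12.0 * n / (k * (k + 1)) * (np.sum(avg ** 2) - k * (k + 1) ** 2 / 4.0)

# Hypothetical accuracies (rows: 6 datasets; cols: HFLNN, FLNN, RBFN).
acc = np.array([
    [0.95, 0.93, 0.90],
    [0.91, 0.89, 0.88],
    [0.88, 0.85, 0.86],
    [0.92, 0.90, 0.89],
    [0.97, 0.95, 0.93],
    [0.90, 0.88, 0.87],
])
chi2 = friedman_statistic(acc)  # compare against chi-square with k-1 df
```

Working on ranks rather than raw accuracies is what makes the test safe across datasets of very different difficulty: a large statistic (here, above the chi-square critical value for k-1 = 2 degrees of freedom, 5.991 at the 0.05 level) indicates the classifiers' average ranks differ significantly.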

Original language: English
Pages (from-to): 4379-4391
Number of pages: 13
Journal: Expert Systems with Applications
Volume: 37
Issue number: 6
DOI: 10.1016/j.eswa.2009.11.090
Publication status: Published - 2010 Jun 1

Fingerprint

  • Data mining
  • Neural networks
  • Classifiers
  • Radial basis function networks
  • Statistical tests
  • Computational complexity

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Computer Science Applications
  • Engineering (all)

Cite this

@article{a0d42781e29c43fa830f17146c4aba9c,
title = "Evolutionarily optimized features in functional link neural network for classification",
abstract = "In this paper, an adequate set of input features is selected for functional expansion genetically for the purpose of solving the problem of classification in data mining using functional link neural network. The proposed method named as HFLNN aims to choose an optimal subset of input features by eliminating features with little or no predictive information and designs a more compact classifier. With an adequate set of basis functions, HFLNN overcomes the non-linearity of problems, which is a common phenomenon in single layer neural networks. The properties like simplicity of the architecture (i.e., no hidden layer) and the low computational complexity of the network (i.e., less number of weights to be learned) encourage us to use it in classification task of data mining. We present a mathematical analysis of the stability and convergence of the proposed method. Further the issue of statistical tests for comparison of algorithms on multiple datasets, which is even more essential in data mining studies, has been all but ignored. In this paper, we recommend a set of simple, yet safe, robust and non-parametric tests for statistical comparisons of the HFLNN with functional link neural network (FLNN) and radial basis function network (RBFN) classifiers over multiple datasets by an extensive set of simulation studies.",
author = "Satchidananda Dehuri and Sung-Bae Cho",
year = "2010",
month = "6",
day = "1",
doi = "10.1016/j.eswa.2009.11.090",
language = "English",
volume = "37",
pages = "4379--4391",
journal = "Expert Systems with Applications",
issn = "0957-4174",
publisher = "Elsevier Limited",
number = "6",

}

Evolutionarily optimized features in functional link neural network for classification. / Dehuri, Satchidananda; Cho, Sung-Bae.

In: Expert Systems with Applications, Vol. 37, No. 6, 01.06.2010, p. 4379-4391.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Evolutionarily optimized features in functional link neural network for classification

AU - Dehuri, Satchidananda

AU - Cho, Sung-Bae

PY - 2010/6/1

Y1 - 2010/6/1

N2 - In this paper, an adequate set of input features is selected for functional expansion genetically for the purpose of solving the problem of classification in data mining using functional link neural network. The proposed method named as HFLNN aims to choose an optimal subset of input features by eliminating features with little or no predictive information and designs a more compact classifier. With an adequate set of basis functions, HFLNN overcomes the non-linearity of problems, which is a common phenomenon in single layer neural networks. The properties like simplicity of the architecture (i.e., no hidden layer) and the low computational complexity of the network (i.e., less number of weights to be learned) encourage us to use it in classification task of data mining. We present a mathematical analysis of the stability and convergence of the proposed method. Further the issue of statistical tests for comparison of algorithms on multiple datasets, which is even more essential in data mining studies, has been all but ignored. In this paper, we recommend a set of simple, yet safe, robust and non-parametric tests for statistical comparisons of the HFLNN with functional link neural network (FLNN) and radial basis function network (RBFN) classifiers over multiple datasets by an extensive set of simulation studies.

AB - In this paper, an adequate set of input features is selected for functional expansion genetically for the purpose of solving the problem of classification in data mining using functional link neural network. The proposed method named as HFLNN aims to choose an optimal subset of input features by eliminating features with little or no predictive information and designs a more compact classifier. With an adequate set of basis functions, HFLNN overcomes the non-linearity of problems, which is a common phenomenon in single layer neural networks. The properties like simplicity of the architecture (i.e., no hidden layer) and the low computational complexity of the network (i.e., less number of weights to be learned) encourage us to use it in classification task of data mining. We present a mathematical analysis of the stability and convergence of the proposed method. Further the issue of statistical tests for comparison of algorithms on multiple datasets, which is even more essential in data mining studies, has been all but ignored. In this paper, we recommend a set of simple, yet safe, robust and non-parametric tests for statistical comparisons of the HFLNN with functional link neural network (FLNN) and radial basis function network (RBFN) classifiers over multiple datasets by an extensive set of simulation studies.

UR - http://www.scopus.com/inward/record.url?scp=77249161596&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=77249161596&partnerID=8YFLogxK

U2 - 10.1016/j.eswa.2009.11.090

DO - 10.1016/j.eswa.2009.11.090

M3 - Article

VL - 37

SP - 4379

EP - 4391

JO - Expert Systems with Applications

JF - Expert Systems with Applications

SN - 0957-4174

IS - 6

ER -