Stochastic and non-stochastic feature selection

Antonio J. Tallón-Ballesteros, Luís Correia, Sung Bae Cho

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

Feature selection has been applied in many areas of science and engineering for a long time. This kind of pre-processing is almost mandatory in problems with huge numbers of features, which entail a very high computational cost and are frequently further complicated by more than two classes and a large number of instances. The general taxonomy divides the approaches into two groups: filters and wrappers. This paper introduces a methodology that refines the feature subset with an additional feature selection approach. It reviews the possibilities and explores a new class of algorithms based on refining an initial search with another method. We apply an approximate procedure and an exact procedure sequentially. The research is supported by empirical results, and some guidelines are drawn as conclusions.
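
The abstract does not spell out the concrete procedures, but the idea of chaining an approximate (stochastic) search with an exact refinement can be illustrated with a short sketch. The choices below (a mutual-information filter as the approximate stage and an exhaustive, cross-validated wrapper search with a k-NN classifier as the exact stage) are assumptions for illustration only, not the methods evaluated in the paper.

# A minimal sketch of the two-stage idea, under the assumptions stated above:
# an approximate/stochastic filter first, then an exact wrapper refinement of
# the surviving candidates. Not the authors' actual procedures.
from itertools import combinations

import numpy as np
from sklearn.datasets import load_wine
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_wine(return_X_y=True)

# Stage 1 (approximate, stochastic): rank features by a randomized
# mutual-information estimate and keep the top k candidates.
k_candidates = 6
stage1 = SelectKBest(mutual_info_classif, k=k_candidates).fit(X, y)
candidates = np.flatnonzero(stage1.get_support())

# Stage 2 (exact): exhaustively evaluate every subset of the candidates with a
# wrapper criterion (cross-validated accuracy of a k-NN classifier).
clf = KNeighborsClassifier(n_neighbors=3)
best_score, best_subset = -np.inf, tuple(candidates)
for r in range(1, len(candidates) + 1):
    for subset in combinations(candidates, r):
        score = cross_val_score(clf, X[:, list(subset)], y, cv=5).mean()
        if score > best_score:
            best_score, best_subset = score, subset

print("Selected features:", list(best_subset), "- CV accuracy: %.3f" % best_score)

Restricting the exhaustive second stage to the small candidate set produced by the first stage is what keeps the exact search tractable.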

Original language: English
Title of host publication: Intelligent Data Engineering and Automated Learning – IDEAL 2017 - 18th International Conference, Proceedings
Editors: Hujun Yin, Minling Zhang, Yimin Wen, Guoyong Cai, Tianlong Gu, Antonio J. Tallon-Ballesteros, Junping Du, Yang Gao, Songcan Chen
Publisher: Springer Verlag
Pages: 592-598
Number of pages: 7
ISBN (Print): 9783319689340
DOI: https://doi.org/10.1007/978-3-319-68935-7_64
Publication status: Published - 2017 Jan 1
Event: 18th International Conference on Intelligent Data Engineering and Automated Learning, IDEAL 2017 - Guilin, China
Duration: 2017 Oct 30 – 2017 Nov 1

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 10585 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 18th International Conference on Intelligent Data Engineering and Automated Learning, IDEAL 2017
Country: China
City: Guilin
Period: 17/10/30 → 17/11/1

Fingerprint

  • Feature Selection
  • Feature extraction
  • Wrapper
  • Taxonomies
  • Taxonomy
  • Preprocessing
  • Divides
  • Computational Cost
  • Refinement
  • Filter
  • Engineering
  • Subset
  • Methodology
  • Processing
  • Costs
  • Class
  • Review

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Computer Science (all)

Cite this

Tallón-Ballesteros, A. J., Correia, L., & Cho, S. B. (2017). Stochastic and non-stochastic feature selection. In H. Yin, M. Zhang, Y. Wen, G. Cai, T. Gu, A. J. Tallon-Ballesteros, J. Du, Y. Gao, ... S. Chen (Eds.), Intelligent Data Engineering and Automated Learning – IDEAL 2017 - 18th International Conference, Proceedings (pp. 592-598). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 10585 LNCS). Springer Verlag. https://doi.org/10.1007/978-3-319-68935-7_64