Deterministic neural classification

Research output: Contribution to journal › Article › peer-review

85 Citations (Scopus)

Abstract

This letter presents a minimum classification error learning formulation for a single-layer feedforward network (SLFN). By approximating the nonlinear counting step function with a quadratic function, the classification error rate is shown to be deterministically solvable. Essentially, the derived solution is related to an existing weighted least-squares method with class-specific weights set according to the size of each class's data set. By treating the class-specific weights as adjustable parameters, the learning formulation extends the classification robustness of the SLFN without sacrificing its intrinsic advantage of being a closed-form algorithm. While the method is applicable to other linear formulations, our empirical results indicate the SLFN's effectiveness in classification generalization.
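The abstract relates the formulation to weighted least squares with class-specific weights solved in closed form. A minimal sketch of that idea, assuming a linear classifier with labels in {-1, +1} and per-class sample weights (the function names, the regularizer `reg`, and the weight scheme are illustrative, not the paper's exact formulation):

```python
import numpy as np

def weighted_ls_classifier(X, y, class_weights, reg=1e-6):
    """Closed-form weighted least-squares fit of a linear classifier.

    X : (n, d) feature matrix; a bias column is appended internally.
    y : (n,) labels in {-1, +1}.
    class_weights : dict mapping each label to its class-specific weight.
    reg : small ridge term for numerical stability (an assumption here).
    """
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])        # append bias term
    d = np.array([class_weights[label] for label in y])  # per-sample weight
    # Closed-form solution: w = (Xb^T D Xb + reg*I)^{-1} Xb^T D y
    A = Xb.T @ (d[:, None] * Xb) + reg * np.eye(Xb.shape[1])
    b = Xb.T @ (d * y)
    return np.linalg.solve(A, b)

def predict(w, X):
    """Classify by the sign of the linear response."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.sign(Xb @ w)
```

Making `class_weights` an argument rather than fixing it from class counts mirrors the letter's point: the weights become adjustable parameters, while the solution stays a single linear solve rather than an iterative training loop.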

Original language: English
Pages (from-to): 1565-1595
Number of pages: 31
Journal: Neural Computation
Volume: 20
Issue number: 6
DOIs
Publication status: Published - 2008 Jun

All Science Journal Classification (ASJC) codes

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience
