Parallel, Self-Organizing, Hierarchical Neural Networks—II

Okan K. Ersoy, Daesik Hong

Research output: Contribution to journal › Article

6 Citations (Scopus)

Abstract

Parallel, self-organizing hierarchical neural networks (PSHNN's) involve a number of stages with error detection at the end of each stage and rejection of error-causing vectors, which are then fed into the next stage after a nonlinear transformation. The stages operate in parallel during testing. The statistical properties and the vector-rejection mechanisms of the PSHNN are discussed in comparison with the maximum likelihood method and the backpropagation network. The PSHNN is highly fault tolerant and robust against errors in the weight values, because the error detection bounds can be adjusted to compensate for such errors. These properties are exploited to develop architectures for programmable implementations in which the programmable parts are reduced to on-off or bipolar switching operations for bulk computations and attenuators for pointwise operations.
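The staging scheme the abstract describes (train a stage, reject vectors that fall inside the error-detection bounds, nonlinearly transform the rejects, and feed them to the next stage) can be sketched as follows. This is a minimal illustration only, not the paper's method: the delta-rule stage, the `tanh` output, the random sign-projection transform, and all names (`fit_stage`, `train_pshnn`, `bound`) are assumptions made for the sketch.

```python
# Hedged sketch of the PSHNN staging idea: every function and parameter
# name here is hypothetical, chosen only to illustrate the abstract.
import numpy as np

rng = np.random.default_rng(0)

def fit_stage(X, y, lr=0.1, epochs=200):
    """Train one linear stage with the delta rule (an assumed stage type)."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            w += lr * (yi - np.tanh(xi @ w)) * xi
    return w

def stage_output(w, X):
    return np.tanh(X @ w)

def train_pshnn(X, y, n_stages=3, bound=0.5):
    """Grow stages until nothing is rejected or n_stages is reached.
    `bound` plays the role of the error-detection bound on |output|."""
    stages = []                      # (input transform, weights) per stage
    transform = lambda Z: Z          # identity for the first stage
    idx = np.arange(len(X))          # rows of X still in play
    for _ in range(n_stages):
        Xs, ys = transform(X[idx]), y[idx]
        w = fit_stage(Xs, ys)
        stages.append((transform, w))
        out = stage_output(w, Xs)
        # Reject vectors inside the detection bound or misclassified.
        rejected = (np.abs(out) < bound) | (np.sign(out) != np.sign(ys))
        if not rejected.any():
            break
        idx = idx[rejected]
        # Nonlinear transformation of rejected vectors before the next
        # stage: a fixed random sign projection, purely for illustration.
        P = rng.standard_normal((X.shape[1], X.shape[1]))
        prev = transform
        transform = lambda Z, P=P, prev=prev: np.sign(prev(Z) @ P)
    return stages

def predict(stages, X, bound=0.5):
    """Evaluate all stages (conceptually in parallel) and accept, per
    vector, the first stage whose output clears the detection bound."""
    preds = np.zeros(len(X))
    undecided = np.ones(len(X), dtype=bool)
    for transform, w in stages:
        out = stage_output(w, transform(X))
        accept = undecided & (np.abs(out) >= bound)
        preds[accept] = np.sign(out[accept])
        undecided &= ~accept
    if undecided.any():              # fall back to the last stage
        transform, w = stages[-1]
        out = stage_output(w, transform(X))
        preds[undecided] = np.sign(out[undecided])
    return preds
```

A usage sketch: on a simple separable problem the first stage accepts most vectors, and only the residue flows through the transformed later stages, which is the fault-tolerance mechanism the abstract attributes to adjustable error-detection bounds.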

Original language: English
Pages (from-to): 218-227
Number of pages: 10
Journal: IEEE Transactions on Industrial Electronics
Volume: 40
Issue number: 2
DOI: 10.1109/41.222643
Publication status: Published - 1993 Jan 1

Fingerprint

Error detection
Neural networks
Backpropagation
Maximum likelihood
Testing

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Electrical and Electronic Engineering

Cite this

@article{782ff8f5d686424eaefb8a27e343ed63,
title = "Parallel, Self-Organizing, Hierarchical Neural Networks—II",
abstract = "Parallel, self-organizing hierarchical neural networks (PSHNN's) involve a number of stages with error detection at the end of each stage and rejection of error-causing vectors, which are then fed into the next stage after a nonlinear transformation. The stages operate in parallel during testing. The statistical properties and the vector-rejection mechanisms of the PSHNN are discussed in comparison with the maximum likelihood method and the backpropagation network. The PSHNN is highly fault tolerant and robust against errors in the weight values, because the error detection bounds can be adjusted to compensate for such errors. These properties are exploited to develop architectures for programmable implementations in which the programmable parts are reduced to on-off or bipolar switching operations for bulk computations and attenuators for pointwise operations.",
author = "Ersoy, {Okan K.} and Daesik Hong",
year = "1993",
month = "1",
day = "1",
doi = "10.1109/41.222643",
language = "English",
volume = "40",
pages = "218--227",
journal = "IEEE Transactions on Industrial Electronics",
issn = "0278-0046",
publisher = "IEEE Industrial Electronics Society",
number = "2",

}


TY - JOUR

T1 - Parallel, Self-Organizing, Hierarchical Neural Networks—II

AU - Ersoy, Okan K.

AU - Hong, Daesik

PY - 1993/1/1

Y1 - 1993/1/1

N2 - Parallel, self-organizing hierarchical neural networks (PSHNN's) involve a number of stages with error detection at the end of each stage and rejection of error-causing vectors, which are then fed into the next stage after a nonlinear transformation. The stages operate in parallel during testing. The statistical properties and the vector-rejection mechanisms of the PSHNN are discussed in comparison with the maximum likelihood method and the backpropagation network. The PSHNN is highly fault tolerant and robust against errors in the weight values, because the error detection bounds can be adjusted to compensate for such errors. These properties are exploited to develop architectures for programmable implementations in which the programmable parts are reduced to on-off or bipolar switching operations for bulk computations and attenuators for pointwise operations.

AB - Parallel, self-organizing hierarchical neural networks (PSHNN's) involve a number of stages with error detection at the end of each stage and rejection of error-causing vectors, which are then fed into the next stage after a nonlinear transformation. The stages operate in parallel during testing. The statistical properties and the vector-rejection mechanisms of the PSHNN are discussed in comparison with the maximum likelihood method and the backpropagation network. The PSHNN is highly fault tolerant and robust against errors in the weight values, because the error detection bounds can be adjusted to compensate for such errors. These properties are exploited to develop architectures for programmable implementations in which the programmable parts are reduced to on-off or bipolar switching operations for bulk computations and attenuators for pointwise operations.

UR - http://www.scopus.com/inward/record.url?scp=0027580440&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0027580440&partnerID=8YFLogxK

U2 - 10.1109/41.222643

DO - 10.1109/41.222643

M3 - Article

VL - 40

SP - 218

EP - 227

JO - IEEE Transactions on Industrial Electronics

JF - IEEE Transactions on Industrial Electronics

SN - 0278-0046

IS - 2

ER -