Parallel, self-organizing hierarchical neural networks

O. K. Ersoy, D. Hong

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Citations (Scopus)

Abstract

A neural network architecture called the parallel self-organizing hierarchical neural network (PSHNN) is discussed. The PSHNN involves a number of stages, each of which can be a particular neural network (SNN). At the end of each SNN, error detection is carried out and a number of input vectors are rejected. Between two SNNs there is a nonlinear transformation of the input vectors rejected by the first SNN. The PSHNN offers optimized system complexity in the sense of a self-organizing, minimized number of stages, high classification accuracy, minimized learning and recall times, and a parallel architecture in which all SNNs operate simultaneously, without waiting for data from each other, during testing. In classification experiments with aircraft and satellite remote-sensing data, the PSHNN is compared to multilayer networks trained with backpropagation.
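The staged reject-and-retrain scheme the abstract describes can be sketched roughly as follows. This is a hypothetical toy sketch, not the authors' implementation: Stage is a stand-in nearest-centroid classifier playing the role of an SNN, error detection is simplified to "reject what the stage misclassifies", and nonlinear_transform is an arbitrary choice of nonlinearity.

```python
# Toy sketch of a PSHNN-style cascade (illustrative names and interfaces;
# not the paper's implementation). Each stage is a simple classifier; input
# vectors rejected by a stage are nonlinearly transformed and used to train
# the next stage. New stages are added until all training vectors are
# accepted or a stage limit is reached.

import math

def nonlinear_transform(x):
    """Nonlinear map applied to vectors rejected by a stage (illustrative)."""
    return [math.tanh(v) for v in x]

class Stage:
    """A trivial nearest-centroid classifier standing in for an SNN."""
    def __init__(self):
        self.centroids = {}

    def fit(self, X, y):
        sums, counts = {}, {}
        for x, label in zip(X, y):
            s = sums.setdefault(label, [0.0] * len(x))
            for i, v in enumerate(x):
                s[i] += v
            counts[label] = counts.get(label, 0) + 1
        self.centroids = {c: [v / counts[c] for v in s]
                          for c, s in sums.items()}

    def predict(self, x):
        return min(self.centroids,
                   key=lambda c: sum((a - b) ** 2
                                     for a, b in zip(self.centroids[c], x)))

def train_pshnn(X, y, max_stages=3):
    """Grow stages; each stage trains only on what the previous one rejected."""
    stages = []
    data = list(zip(X, y))
    while data and len(stages) < max_stages:
        stage = Stage()
        Xs, ys = zip(*data)
        stage.fit(Xs, ys)
        # Simplified error detection: reject misclassified vectors and pass
        # them through the nonlinear transformation to the next stage.
        rejected = [(nonlinear_transform(x), label)
                    for x, label in data if stage.predict(x) != label]
        stages.append(stage)
        data = rejected
    return stages
```

Because each stage is trained independently on its own (transformed) data, all stages can be evaluated simultaneously at recall time, which is the source of the parallelism the abstract claims.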

Original language: English
Title of host publication: Proceedings of the Hawaii International Conference on System Science
Editors: Lee W. Hoevel, Bruce D. Shriver, Jay F. Nunamaker Jr., Ralph H. Sprague Jr., Velijko Milutinovic
Publisher: Western Periodicals Co
Pages: 158-169
Number of pages: 12
Volume: 1
ISBN (Print): 0818620080
Publication status: Published - 1990 Jan 1
Event: Proceedings of the Twenty-Third Annual Hawaii International Conference on System Sciences. Volume 1: Architecture Track - Kailua-Kona, HI, USA
Duration: 1990 Jan 2 - 1990 Jan 5



All Science Journal Classification (ASJC) codes

  • Engineering (all)
  • Software
  • Industrial and Manufacturing Engineering

Cite this

Ersoy, O. K., & Hong, D. (1990). Parallel, self-organizing hierarchical neural networks. In L. W. Hoevel, B. D. Shriver, J. F. Nunamaker Jr., R. H. Sprague Jr., & V. Milutinovic (Eds.), Proceedings of the Hawaii International Conference on System Science (Vol. 1, pp. 158-169). Western Periodicals Co.