There is no consensus on how to measure the distance between two different neural network architectures. Two classes of methods are used for this purpose: structural and behavioral distance measures. In this paper, we focus on the latter, which compares networks based on their output responses to the same inputs. A neural network's output can be interpreted as a probability distribution over the input signals if it is normalized to sum to 1, and information-theoretic distance measures are widely used to quantify the difference between two probability distributions. Within the framework of evolving diverse neural networks, we adopt information-theoretic distance measures to improve performance. Experimental results on UCI benchmark datasets show the promise of the approach.
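The abstract does not specify which information-theoretic measure is used; as a minimal sketch, assuming a symmetrized Kullback-Leibler divergence over the two networks' normalized outputs on a shared input set, the behavioral distance could be computed as follows (the function names and the example outputs are hypothetical):

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete distributions
    (e.g. normalized network outputs). eps avoids log(0)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def behavioral_distance(outputs_a, outputs_b):
    """Average symmetrized KL divergence between two networks'
    output distributions over the same set of inputs."""
    dists = [0.5 * (kl_divergence(p, q) + kl_divergence(q, p))
             for p, q in zip(outputs_a, outputs_b)]
    return float(np.mean(dists))

# Hypothetical outputs of two networks on the same three inputs
net_a = [[0.7, 0.2, 0.1], [0.1, 0.8, 0.1], [0.3, 0.3, 0.4]]
net_b = [[0.6, 0.3, 0.1], [0.2, 0.7, 0.1], [0.3, 0.4, 0.3]]
print(behavioral_distance(net_a, net_b))
```

The symmetrization makes the measure usable as a distance in a diversity-preserving evolutionary framework, since plain KL divergence is asymmetric; identical networks yield a distance of zero.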
Title of host publication: Neural Information Processing - 14th International Conference, ICONIP 2007, Revised Selected Papers
Number of pages: 10
Publication status: Published - 2008
Event: 14th International Conference on Neural Information Processing, ICONIP 2007 - Kitakyushu, Japan
Duration: 13 November 2007 - 16 November 2007
Series: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Bibliographical note (funding information):
This research was supported by the Brain Science and Engineering Research Program sponsored by the Korean Ministry of Commerce, Industry and Energy.
All Science Journal Classification (ASJC) codes
- Theoretical Computer Science
- Computer Science (all)