Analyzing weight distribution of feedforward neural networks and efficient weight initialization

Jinwook Go, Byungjoon Baek, Chulhee Lee

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

In this paper, we investigate and analyze the weight distribution of feedforward two-layer neural networks in order to understand and improve their time-consuming training process. Generally, training a neural network takes a long time, and when a new problem is presented, the network must be trained again without any benefit from previous training. To address this problem, we view the training process as finding a solution weight point in a weight space and analyze the distribution of such solution weight points. We then propose a weight initialization method that uses the information on the distribution of the solution weight points. Experimental results show that, in terms of convergence speed, the proposed weight initialization method outperforms the conventional method that uses a random number generator.
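The abstract describes the approach only at a high level. As a rough illustration (not the authors' actual procedure), the following Python/NumPy sketch contrasts conventional uniform random initialization with initializing a new network from empirical per-weight statistics of previously found solution weight points; the helper names (init_from_solution_distribution, init_uniform, prior_solutions) and the Gaussian fit are assumptions made for this sketch.

import numpy as np

rng = np.random.default_rng(0)

def init_from_solution_distribution(solution_weights, shape):
    # Hypothetical helper: fit per-weight mean and spread over solution
    # weight points collected from earlier trainings, then sample the
    # initial weights from that empirical distribution.
    stacked = np.stack(solution_weights)      # (n_solutions, *shape)
    mu = stacked.mean(axis=0)                 # empirical mean per weight
    sigma = stacked.std(axis=0) + 1e-6        # empirical spread per weight
    return rng.normal(mu, sigma, size=shape)

def init_uniform(shape, scale=0.5):
    # Conventional baseline: uniform random initialization.
    return rng.uniform(-scale, scale, size=shape)

# Toy example: pretend three earlier trainings produced these hidden-layer
# weight matrices (purely illustrative numbers).
prior_solutions = [rng.normal(0.3, 0.1, size=(4, 2)) for _ in range(3)]

W_init = init_from_solution_distribution(prior_solutions, shape=(4, 2))
W_base = init_uniform((4, 2))
print(W_init)
print(W_base)

Starting the search near where previous solutions clustered is what would plausibly shorten convergence relative to the uniform baseline; the paper's own experiments are on two-layer feedforward networks.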

Original language: English
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Editors: Ana Fred, Terry Caelli, Robert P.W. Duin, Dick de Ridder, Aurelio Campilho
Publisher: Springer Verlag
Pages: 840-849
Number of pages: 10
ISBN (Print): 9783540225706
DOIs
Publication status: Published - 2004

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 3138
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Computer Science(all)
