In this paper, we propose to train the RBF neural network using a global descent method. Essentially, the method imposes a monotonic transformation on the training objective to improve numerical sensitivity without altering the relative ordering of its local extrema. A gradient descent search that inherits the global descent property is derived to locate the global minimum of the error objective. Numerical examples comparing the global descent algorithm with a gradient-based line-search algorithm show the superiority of the proposed method in both speed of convergence and quality of the solutions found.
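The core idea can be illustrated in a minimal sketch: because a strictly increasing transform g applied to the error E leaves the locations and relative ordering of all extrema unchanged, gradient descent on g(E) only rescales the raw gradient by g'(E) (chain rule). The transform g(E) = log(1 + E), the Gaussian basis setup, and all parameter values below are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

# Illustrative sketch: gradient descent on a monotonically transformed
# error objective for the output weights of an RBF network.
# g(E) = log(1 + E) is a stand-in transform, NOT the paper's method.

rng = np.random.default_rng(0)

# Toy 1-D regression data
X = np.linspace(-3, 3, 60)[:, None]
y = np.sin(X[:, 0])

# Fixed Gaussian RBF centers and width (assumed for this example)
centers = np.linspace(-3, 3, 10)[None, :]
width = 0.8
Phi = np.exp(-((X - centers) ** 2) / (2 * width ** 2))  # (60, 10) design matrix

w = np.zeros(10)     # output-layer weights
lr = 0.1
for _ in range(2000):
    resid = Phi @ w - y
    E = np.mean(resid ** 2)                  # untransformed MSE
    grad_E = 2.0 * Phi.T @ resid / len(y)    # dE/dw
    # Chain rule: d g(E)/dw = g'(E) * dE/dw with g(E) = log(1 + E).
    # g is strictly increasing, so minima of g(E) coincide with minima of E;
    # only the step size along the gradient direction is rescaled.
    w -= lr * (1.0 / (1.0 + E)) * grad_E

final_mse = np.mean((Phi @ w - y) ** 2)
```

Note the design point: any strictly increasing, differentiable g preserves every stationary point of E, so the search can reshape the objective's numerical behavior (e.g., flatten steep regions) without relocating the global minimum.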
Number of pages: 4
Journal: Proceedings - International Conference on Pattern Recognition
Publication status: Published - 2002 Dec 1

All Science Journal Classification (ASJC) codes:
- Computer Vision and Pattern Recognition