Abstract
This paper addresses the problem of training feedforward neural networks (FNNs) by global optimization. Our main contributions are: (i) a generic global optimality condition in which the function to be optimized need not be continuous, and (ii) a global descent algorithm for solving the network training problem. A network with a single hidden layer and a single output unit is considered. An explicit expression for the Jacobian of the network is first presented. Then, by means of a convex monotonic transformation, we prove a necessary and sufficient global optimality condition. Based on this fundamental result, the characterization of global optimality is specialized to network training. Two penalty-based algorithms are then formulated to constrain the search to regions containing the global minima. Comparisons on benchmark problems from the neural network literature show the superiority of the proposed algorithms in terms of both convergence speed and the percentage of trials attaining the desired solutions.
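As a rough illustration of the network class considered (single hidden layer, single output unit) and of what an explicit Jacobian with respect to the weights looks like, the sketch below may be helpful. It is not taken from the paper: the sigmoid activation, the absence of bias terms, and all names (`forward`, `jacobian`, `W`, `v`) are assumptions made for the example.

```python
import numpy as np

def sigmoid(z):
    # Assumed hidden-unit activation; the paper's choice may differ.
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W, v):
    """Single-hidden-layer, single-output network: y = v^T sigmoid(W x).

    x: (D,) input, W: (H, D) hidden-layer weights, v: (H,) output weights.
    Bias terms are omitted here for brevity (an assumption, not the paper's model).
    """
    h = sigmoid(W @ x)      # hidden activations
    return v @ h, h         # scalar output and hidden activations

def jacobian(x, W, v):
    """Derivatives of the scalar output with respect to all weights,
    obtained by the chain rule (one row of the Jacobian over a data set)."""
    y, h = forward(x, W, v)
    dv = h                                              # dy / dv_j = h_j
    dW = (v * h * (1.0 - h))[:, None] * x[None, :]      # dy / dW_jk = v_j h_j (1 - h_j) x_k
    return np.concatenate([dW.ravel(), dv])

# Example: evaluate the output and its weight-gradient for a random network.
rng = np.random.default_rng(0)
x = rng.normal(size=4)
W = rng.normal(size=(3, 4))
v = rng.normal(size=3)
y, _ = forward(x, W, v)
g = jacobian(x, W, v)
print(y, g.shape)   # scalar output and a (3*4 + 3,)-dimensional gradient
```

Stacking such rows over all training examples gives the Jacobian of the network outputs with respect to the weight vector, the quantity whose explicit form the paper presents before developing the global optimality condition.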
| Original language | English |
| --- | --- |
| Pages (from-to) | V-413 - V-418 |
| Journal | Proceedings of the IEEE International Conference on Systems, Man and Cybernetics |
| Volume | 5 |
| Publication status | Published - 1999 |
| Event | 1999 IEEE International Conference on Systems, Man, and Cybernetics 'Human Communication and Cybernetics' - Tokyo, Japan. Duration: 1999 Oct 12 → 1999 Oct 15 |
All Science Journal Classification (ASJC) codes
- Control and Systems Engineering
- Hardware and Architecture