Fast deterministic global optimization for FNN training

Research output: Contribution to journal › Conference article › peer-review

3 Citations (Scopus)


This paper addresses the problem of training Feedforward Neural Networks (FNN) by global optimization. Our main contributions are: (i) a generic global optimality condition in which the function to be optimized need not be continuous, and (ii) a global descent algorithm for solving the network training problem. A network with a single hidden layer and a single output unit is considered. An explicit expression for the Jacobian of the network is first presented. Then, by means of a convex monotonic transformation, we prove a necessary and sufficient global optimality condition. Based on this fundamental result, the characterization of global optimality is specialized to network training. Two penalty-based algorithms are then formulated that constrain the search to those regions containing the global minima. Comparison on benchmark problems from the neural network literature shows the superiority of the proposed algorithms, both in the speed of convergence and in the percentage of trials attaining the desired solutions.
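The abstract's explicit Jacobian of a single hidden-layer, single-output network can be sketched as follows. This is a minimal NumPy illustration, not the paper's formulation: it assumes a sigmoid activation and the parameterization y = v·σ(Wx + b) + c, and the function names (`forward`, `jacobian`) are hypothetical.

```python
import numpy as np

def sigmoid(z):
    """Logistic activation, assumed here for the hidden layer."""
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W, b, v, c):
    """Single hidden-layer, single-output FNN: y = v . sigmoid(W x + b) + c."""
    h = sigmoid(W @ x + b)      # hidden activations
    return v @ h + c, h

def jacobian(x, W, b, v, c):
    """Derivatives of the scalar output w.r.t. each parameter block.

    Since the network has one output, the Jacobian w.r.t. the
    parameters reduces to a gradient, computed in closed form.
    """
    _, h = forward(x, W, b, v, c)
    s = h * (1.0 - h)           # sigmoid'(Wx + b), using sigmoid' = s(1 - s)
    dW = np.outer(v * s, x)     # dy/dW: chain rule through the hidden layer
    db = v * s                  # dy/db
    dv = h                      # dy/dv: output is linear in v
    dc = 1.0                    # dy/dc
    return dW, db, dv, dc
```

A finite-difference check on any single weight confirms the closed-form entries; in training, this gradient is what a (global or local) descent step would use.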

Original language: English
Pages (from-to): V-413 - V-418
Journal: Proceedings of the IEEE International Conference on Systems, Man and Cybernetics
Publication status: Published - 1999
Event: 1999 IEEE International Conference on Systems, Man, and Cybernetics 'Human Communication and Cybernetics' - Tokyo, Japan
Duration: 1999 Oct 12 - 1999 Oct 15

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Hardware and Architecture
