Fast deterministic global optimization for FNN training

Research output: Contribution to journal › Conference article

3 Citations (Scopus)

Abstract

This paper addresses the issue of training Feedforward Neural Networks (FNN) by global optimization. Our main contributions include: (i) proposal of a generic global optimality condition where the function to be optimized need not be continuous, and (ii) formulation of a global descent algorithm to solve the network training problem. A network with a single hidden layer and a single output unit is considered. An explicit expression for the Jacobian of the network is first presented. Then, by means of a convex monotonic transformation, we prove a necessary and sufficient global optimality condition. Based on this fundamental result, the characterization of global optimality is specialized to network training. Two penalty-based algorithms are then formulated to constrain the search within the regions containing the global minima. Comparison on benchmark problems from the neural network literature shows the superiority of the proposed algorithms in both speed of convergence and the percentage of trials attaining the desired solutions.
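The abstract refers to an explicit Jacobian expression but does not reproduce it. The sketch below is only an illustration of the network setting described (a single hidden layer feeding one linear output unit): it assumes sigmoid hidden activations and computes the output together with its derivatives with respect to every weight and bias. The activation choice, variable names, and shapes are assumptions for illustration, not details taken from the paper.

    # Minimal sketch (not the paper's formulation): single-hidden-layer FNN with one
    # linear output unit, sigmoid hidden activations, and the Jacobian of the output
    # with respect to all parameters. Names and the activation choice are assumed.
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def fnn_output_and_jacobian(x, W, b, v, c):
        # x: (d,) input; W: (h, d) hidden weights; b: (h,) hidden biases;
        # v: (h,) output weights; c: scalar output bias.
        z = W @ x + b                    # hidden pre-activations, shape (h,)
        a = sigmoid(z)                   # hidden activations, shape (h,)
        y = v @ a + c                    # single linear output unit (scalar)

        da_dz = a * (1.0 - a)            # sigmoid derivative, shape (h,)
        dy_dW = np.outer(v * da_dz, x)   # d y / d W, shape (h, d)
        dy_db = v * da_dz                # d y / d b, shape (h,)
        dy_dv = a                        # d y / d v, shape (h,)
        dy_dc = np.array([1.0])          # d y / d c

        # Jacobian (a gradient here, since the output is scalar) over all parameters.
        jac = np.concatenate([dy_dW.ravel(), dy_db, dy_dv, dy_dc])
        return y, jac

    # Example usage on a randomly initialized network.
    rng = np.random.default_rng(0)
    d, h = 3, 4
    x = rng.standard_normal(d)
    W = rng.standard_normal((h, d))
    b = rng.standard_normal(h)
    v = rng.standard_normal(h)
    c = 0.1
    y, jac = fnn_output_and_jacobian(x, W, b, v, c)
    print(y, jac.shape)                  # scalar output; gradient of length h*d + 2*h + 1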

Original language: English
Pages (from-to): V-413 - V-418
Journal: Proceedings of the IEEE International Conference on Systems, Man and Cybernetics
Volume: 5
Publication status: Published - 1999 Dec 1
Event: 1999 IEEE International Conference on Systems, Man, and Cybernetics 'Human Communication and Cybernetics' - Tokyo, Japan
Duration: 1999 Oct 12 - 1999 Oct 15


All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Hardware and Architecture

Cite this

@article{f5b1cef3b660401d983366010a01be88,
title = "Fast deterministic global optimization for FNN training",
abstract = "This paper addresses the issue of training Feedforward Neural Networks (FNN) by global optimization. Our main contributions include: (i) proposal of a generic global optimality condition where the function to be optimized needs not be continuous, and (ii) formulation of a global descent algorithm to solve the network training problem. A network with a single hidden-layer and a single output-unit is considered. An explicit expression for the Jacobian of the network is first presented. Then, by means of convex monotonic transformation, we prove a necessary and sufficient global optimality condition. Based on this fundamental result, the characterization of global optimality is specialized to network training. Two penalty-based algorithms are then formulated constraining the search within those regions containing the global minima. Comparison with benchmark problems in the neural network literature shows superiority of the proposed algorithms both in terms of the speed of convergence and the percentage of trials attaining the desired solutions.",
author = "Toh, {Kar Ann}",
year = "1999",
month = "12",
day = "1",
language = "English",
volume = "5",
pages = "V--413 -- V--418",
journal = "Proceedings of the IEEE International Conference on Systems, Man and Cybernetics",
issn = "0884-3627",
publisher = "Institute of Electrical and Electronics Engineers Inc.",

}

Fast deterministic global optimization for FNN training. / Toh, Kar Ann.

In: Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, Vol. 5, 01.12.1999, p. V-413 - V-418.

Research output: Contribution to journal › Conference article

TY - JOUR

T1 - Fast deterministic global optimization for FNN training

AU - Toh, Kar Ann

PY - 1999/12/1

Y1 - 1999/12/1


UR - http://www.scopus.com/inward/record.url?scp=0033332984&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0033332984&partnerID=8YFLogxK

M3 - Conference article

AN - SCOPUS:0033332984

VL - 5

SP - V-413 - V-418

JO - Proceedings of the IEEE International Conference on Systems, Man and Cybernetics

JF - Proceedings of the IEEE International Conference on Systems, Man and Cybernetics

SN - 0884-3627

ER -