Improving generalization capability of neural networks based on simulated annealing

Yeejin Lee, Jong Seok Lee, Sun Young Lee, Cheol Hoon Park

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

7 Citations (Scopus)

Abstract

This paper presents single-objective and multiobjective stochastic optimization algorithms, based on simulated annealing, for the global training of neural networks. The algorithms overcome the local-optimization limitation of conventional gradient-based training methods and globally optimize the weights of the neural networks. In particular, the multiobjective training algorithm enhances the generalization capability of the trained networks by simultaneously minimizing the training error and the dynamic range of the network weights. For fast convergence and good solution quality, we propose a hybrid simulated annealing algorithm that incorporates a gradient-based local optimization method. Experimental results show that networks trained by the proposed methods outperform networks trained by the gradient-based local training algorithm and, moreover, that their generalization capability is significantly improved because overfitting is prevented.
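
The abstract sketches a hybrid scheme: simulated annealing explores the weight space globally, a gradient-based method refines candidates locally, and the multiobjective variant trades off training error against the dynamic range of the weights. A minimal sketch of that idea follows, assuming a one-hidden-layer tanh network and a weighted-sum scalarization f(w) = E_train(w) + lambda * (max_i w_i - min_i w_i); the perturbation scale, geometric cooling schedule, trade-off weight `lam`, and the finite-difference local refinement are illustrative stand-ins, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def unpack(w, n_in, n_hid):
    # Split a flat weight vector into the parameters of a
    # one-hidden-layer tanh network (hypothetical packing scheme).
    i = n_in * n_hid
    w1 = w[:i].reshape(n_in, n_hid)
    b1 = w[i:i + n_hid]
    w2 = w[i + n_hid:i + 2 * n_hid]
    b2 = w[-1]
    return w1, b1, w2, b2

def cost(w, x, y, n_in, n_hid, lam=0.1):
    # Scalarized two-objective cost: training MSE plus a penalty on the
    # dynamic range of the weights. `lam` is an illustrative trade-off
    # weight, not a value from the paper.
    w1, b1, w2, b2 = unpack(w, n_in, n_hid)
    pred = np.tanh(x @ w1 + b1) @ w2 + b2
    return np.mean((pred - y) ** 2) + lam * (w.max() - w.min())

def local_refine(w, f, lr=0.05, iters=5, eps=1e-4):
    # Stand-in for the paper's gradient-based local optimizer: a few
    # steps of finite-difference gradient descent on the cost f.
    w = w.copy()
    for _ in range(iters):
        c0, g = f(w), np.zeros_like(w)
        for i in range(len(w)):
            w[i] += eps
            g[i] = (f(w) - c0) / eps
            w[i] -= eps
        w -= lr * g
    return w

def hybrid_sa(f, n_w, t0=1.0, alpha=0.99, steps=500):
    # Simulated annealing over the flat weight vector, refining each
    # accepted candidate with the local optimizer above.
    w = rng.normal(scale=0.5, size=n_w)
    best_w, best_c = w.copy(), f(w)
    t = t0
    for _ in range(steps):
        cand = w + rng.normal(scale=0.1, size=n_w)  # random perturbation
        dc = f(cand) - f(w)
        # Metropolis rule: always accept improvements, occasionally
        # accept uphill moves with probability exp(-dc / t).
        if dc < 0 or rng.random() < np.exp(-dc / t):
            w = local_refine(cand, f)
            if f(w) < best_c:
                best_w, best_c = w.copy(), f(w)
        t *= alpha  # geometric cooling schedule
    return best_w, best_c

# Toy usage: fit y = sin(x) with 1 input, 6 hidden units, 1 output.
x = np.linspace(-2, 2, 40).reshape(-1, 1)
y = np.sin(x).ravel()
f = lambda w: cost(w, x, y, n_in=1, n_hid=6)
n_w = 1 * 6 + 6 + 6 + 1  # sizes of w1 + b1 + w2 + b2
w_best, c_best = hybrid_sa(f, n_w)
print(f"final scalarized cost: {c_best:.4f}")
```

A faithful multiobjective version would maintain a set of non-dominated weight vectors rather than collapsing both objectives into one scalar; the weighted sum above is simply the shortest way to exercise the hybrid global-plus-local training loop.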

Original language: English
Title of host publication: 2007 IEEE Congress on Evolutionary Computation, CEC 2007
Pages: 3447-3453
Number of pages: 7
DOI: https://doi.org/10.1109/CEC.2007.4424918
Publication status: Published - 2007 Dec 1
Event: 2007 IEEE Congress on Evolutionary Computation, CEC 2007 - Singapore
Duration: 2007 Sep 25 → 2007 Sep 28

Publication series

Name: 2007 IEEE Congress on Evolutionary Computation, CEC 2007

Other

Other: 2007 IEEE Congress on Evolutionary Computation, CEC 2007
Country: Singapore
Period: 07/9/25 → 07/9/28

Fingerprint

Simulated annealing
Neural networks
Local optimization
Training algorithm
Gradient
Local algorithms
Overfitting
Stochastic algorithms
Simulated annealing algorithm
Stochastic optimization
Dynamic range
Hybrid algorithm
Multi-objective optimization
Global optimization
Optimization methods
Optimization algorithm
Generalization

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Software
  • Theoretical Computer Science

Cite this

Lee, Y., Lee, J. S., Lee, S. Y., & Park, C. H. (2007). Improving generalization capability of neural networks based on simulated annealing. In 2007 IEEE Congress on Evolutionary Computation, CEC 2007 (pp. 3447-3453). [4424918] (2007 IEEE Congress on Evolutionary Computation, CEC 2007). https://doi.org/10.1109/CEC.2007.4424918
Lee, Yeejin ; Lee, Jong Seok ; Lee, Sun Young ; Park, Cheol Hoon. / Improving generalization capability of neural networks based on simulated annealing. 2007 IEEE Congress on Evolutionary Computation, CEC 2007. 2007. pp. 3447-3453 (2007 IEEE Congress on Evolutionary Computation, CEC 2007).
@inproceedings{d65cab120e964a89bb5ee0fa211f37cc,
title = "Improving generalization capability of neural networks based on simulated annealing",
abstract = "This paper presents single-objective and multiobjective stochastic optimization algorithms, based on simulated annealing, for the global training of neural networks. The algorithms overcome the local-optimization limitation of conventional gradient-based training methods and globally optimize the weights of the neural networks. In particular, the multiobjective training algorithm enhances the generalization capability of the trained networks by simultaneously minimizing the training error and the dynamic range of the network weights. For fast convergence and good solution quality, we propose a hybrid simulated annealing algorithm that incorporates a gradient-based local optimization method. Experimental results show that networks trained by the proposed methods outperform networks trained by the gradient-based local training algorithm and, moreover, that their generalization capability is significantly improved because overfitting is prevented.",
author = "Lee, {Yeejin} and Lee, {Jong Seok} and Lee, {Sun Young} and Park, {Cheol Hoon}",
year = "2007",
month = "12",
day = "1",
doi = "10.1109/CEC.2007.4424918",
language = "English",
isbn = "1424413400",
series = "2007 IEEE Congress on Evolutionary Computation, CEC 2007",
pages = "3447--3453",
booktitle = "2007 IEEE Congress on Evolutionary Computation, CEC 2007",
}

Lee, Y, Lee, JS, Lee, SY & Park, CH 2007, Improving generalization capability of neural networks based on simulated annealing. in 2007 IEEE Congress on Evolutionary Computation, CEC 2007., 4424918, 2007 IEEE Congress on Evolutionary Computation, CEC 2007, pp. 3447-3453, 2007 IEEE Congress on Evolutionary Computation, CEC 2007, Singapore, 07/9/25. https://doi.org/10.1109/CEC.2007.4424918

Improving generalization capability of neural networks based on simulated annealing. / Lee, Yeejin; Lee, Jong Seok; Lee, Sun Young; Park, Cheol Hoon.

2007 IEEE Congress on Evolutionary Computation, CEC 2007. 2007. p. 3447-3453 4424918 (2007 IEEE Congress on Evolutionary Computation, CEC 2007).

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN
T1 - Improving generalization capability of neural networks based on simulated annealing
AU - Lee, Yeejin
AU - Lee, Jong Seok
AU - Lee, Sun Young
AU - Park, Cheol Hoon
PY - 2007/12/1
Y1 - 2007/12/1
N2 - This paper presents single-objective and multiobjective stochastic optimization algorithms, based on simulated annealing, for the global training of neural networks. The algorithms overcome the local-optimization limitation of conventional gradient-based training methods and globally optimize the weights of the neural networks. In particular, the multiobjective training algorithm enhances the generalization capability of the trained networks by simultaneously minimizing the training error and the dynamic range of the network weights. For fast convergence and good solution quality, we propose a hybrid simulated annealing algorithm that incorporates a gradient-based local optimization method. Experimental results show that networks trained by the proposed methods outperform networks trained by the gradient-based local training algorithm and, moreover, that their generalization capability is significantly improved because overfitting is prevented.
AB - This paper presents single-objective and multiobjective stochastic optimization algorithms, based on simulated annealing, for the global training of neural networks. The algorithms overcome the local-optimization limitation of conventional gradient-based training methods and globally optimize the weights of the neural networks. In particular, the multiobjective training algorithm enhances the generalization capability of the trained networks by simultaneously minimizing the training error and the dynamic range of the network weights. For fast convergence and good solution quality, we propose a hybrid simulated annealing algorithm that incorporates a gradient-based local optimization method. Experimental results show that networks trained by the proposed methods outperform networks trained by the gradient-based local training algorithm and, moreover, that their generalization capability is significantly improved because overfitting is prevented.
UR - http://www.scopus.com/inward/record.url?scp=78751624990&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=78751624990&partnerID=8YFLogxK
U2 - 10.1109/CEC.2007.4424918
DO - 10.1109/CEC.2007.4424918
M3 - Conference contribution
AN - SCOPUS:78751624990
SN - 1424413400
SN - 9781424413409
T3 - 2007 IEEE Congress on Evolutionary Computation, CEC 2007
SP - 3447
EP - 3453
BT - 2007 IEEE Congress on Evolutionary Computation, CEC 2007
ER -

Lee Y, Lee JS, Lee SY, Park CH. Improving generalization capability of neural networks based on simulated annealing. In 2007 IEEE Congress on Evolutionary Computation, CEC 2007. 2007. p. 3447-3453. 4424918. (2007 IEEE Congress on Evolutionary Computation, CEC 2007). https://doi.org/10.1109/CEC.2007.4424918