Fast implementation of neural network classification

Guiwon Seo, Jiheon Ok, Chulhee Lee

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Most artificial neural networks use nonlinear activation functions such as the sigmoid and hyperbolic tangent, which incur high computational costs, particularly in hardware implementations. In this paper, we propose new polynomial approximation methods for nonlinear activation functions that substantially reduce complexity without sacrificing performance. The proposed approximation methods were applied to pattern classification problems. In computer simulations, the processing time was reduced by up to 50% without any performance degradation.
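The abstract does not specify the polynomial degree, fitting interval, or fitting criterion used by the authors, so the sketch below is only a minimal illustration of the general idea: replacing the hyperbolic tangent with a low-degree polynomial fitted over a bounded input range and saturated outside it. It is written in Python/NumPy; the function names, the degree, and the interval are assumptions made for illustration, not the paper's method.

```python
import numpy as np

def fit_poly_tanh(degree=5, lo=-4.0, hi=4.0, n=1000):
    """Fit a polynomial to tanh on [lo, hi] by least squares.

    The degree and interval are illustrative assumptions, not values
    taken from the paper.
    """
    x = np.linspace(lo, hi, n)
    return np.polyfit(x, np.tanh(x), degree)

def poly_tanh(x, coeffs, lo=-4.0, hi=4.0):
    """Approximate tanh: evaluate the fitted polynomial inside [lo, hi]
    and saturate the output to [-1, 1] outside that range."""
    y = np.polyval(coeffs, np.clip(x, lo, hi))
    return np.clip(y, -1.0, 1.0)

if __name__ == "__main__":
    coeffs = fit_poly_tanh()
    x = np.linspace(-6.0, 6.0, 121)
    max_err = np.max(np.abs(poly_tanh(x, coeffs) - np.tanh(x)))
    print("max abs error of polynomial tanh:", max_err)
```

In a classifier, such a polynomial would replace the exact activation only at inference time, trading a small, bounded approximation error for cheaper arithmetic (multiplications and additions instead of exponentials).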

Original language: English
Title of host publication: Satellite Data Compression, Communications, and Processing IX
Publisher: SPIE
ISBN (Print): 9780819497215
DOIs
Publication status: Published - 2013
Event: Satellite Data Compression, Communications, and Processing IX - San Diego, CA, United States
Duration: 2013 Aug 26 → 2013 Aug 27

Publication series

Name: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 8871
ISSN (Print): 0277-786X
ISSN (Electronic): 1996-756X

Other

Other: Satellite Data Compression, Communications, and Processing IX
Country/Territory: United States
City: San Diego, CA
Period: 13/8/26 → 13/8/27

All Science Journal Classification (ASJC) codes

  • Electronic, Optical and Magnetic Materials
  • Condensed Matter Physics
  • Computer Science Applications
  • Applied Mathematics
  • Electrical and Electronic Engineering
