A neural network accelerator for mobile application processors

Doo Young Kim, Jin Min Kim, Hakbeom Jang, Jinkyu Jeong, Jae W. Lee

    Research output: Contribution to journal › Article › peer-review

    Abstract

    Today's mobile consumer electronics devices, such as smartphones and tablets, are required to execute a wide variety of applications efficiently. To this end, modern application processors integrate both general-purpose CPU cores and specialized accelerators. Energy efficiency is the primary design goal for these processors, which has recently rekindled interest in neural network accelerators. Neural network accelerators trade the accuracy of computation for performance and energy efficiency and are suitable for error-tolerant media applications such as video and audio processing. However, most existing accelerators exploit only inter-neuron parallelism and leave processing elements underutilized when the number of neurons in a layer is small. Thus, this paper proposes a novel neural network accelerator that can efficiently exploit both inter- and intra-neuron parallelism. For five applications, the proposed accelerator achieves average speedups of 126% and 23% over a general-purpose CPU and a state-of-the-art accelerator exploiting inter-neuron parallelism only, respectively. In addition, the proposed accelerator reduces energy consumption by 22% compared with the state-of-the-art accelerator.
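
    The distinction between inter-neuron parallelism alone and combined inter- and intra-neuron parallelism can be illustrated with a short sketch. The snippet below is a conceptual model only, not the authors' hardware design; the number of processing elements (NUM_PES) and the layer sizes are illustrative assumptions. It shows how assigning one neuron per PE leaves PEs idle for small layers, while additionally splitting each neuron's dot product into partial sums keeps them busy.

    # Conceptual sketch (not the authors' design): contrast PE utilization under
    # inter-neuron parallelism only vs. combined inter- and intra-neuron parallelism.
    # NUM_PES and the layer sizes below are illustrative assumptions.
    import numpy as np

    NUM_PES = 8  # hypothetical number of processing elements

    def inter_neuron_only(weights, inputs):
        """Assign one neuron per PE; with fewer neurons than PEs, some PEs idle."""
        num_neurons = weights.shape[0]
        outputs = np.zeros(num_neurons)
        busy_pes = min(num_neurons, NUM_PES)
        for n in range(num_neurons):          # each neuron handled by one PE
            outputs[n] = weights[n] @ inputs  # full dot product on a single PE
        utilization = busy_pes / NUM_PES
        return outputs, utilization

    def inter_and_intra_neuron(weights, inputs):
        """Also split each neuron's dot product across otherwise-idle PEs."""
        num_neurons, num_inputs = weights.shape
        pes_per_neuron = max(1, NUM_PES // num_neurons)  # spread inputs over PEs
        outputs = np.zeros(num_neurons)
        for n in range(num_neurons):
            chunks = np.array_split(np.arange(num_inputs), pes_per_neuron)
            # each chunk is a partial dot product computed on a separate PE,
            # then accumulated (an adder tree in hardware)
            outputs[n] = sum(weights[n, c] @ inputs[c] for c in chunks)
        utilization = min(num_neurons * pes_per_neuron, NUM_PES) / NUM_PES
        return outputs, utilization

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        W = rng.standard_normal((2, 64))   # small layer: only 2 neurons
        x = rng.standard_normal(64)
        _, u1 = inter_neuron_only(W, x)
        _, u2 = inter_and_intra_neuron(W, x)
        print(f"inter-neuron only utilization:  {u1:.0%}")  # 25% with 8 PEs
        print(f"inter+intra-neuron utilization: {u2:.0%}")  # 100%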

    Original language: English
    Article number: 7389812
    Pages (from-to): 555-563
    Number of pages: 9
    Journal: IEEE Transactions on Consumer Electronics
    Volume: 61
    Issue number: 4
    DOIs
    Publication status: Published - 2015 Nov

    Bibliographical note

    Publisher Copyright:
    © 2015 IEEE.

    All Science Journal Classification (ASJC) codes

    • Media Technology
    • Electrical and Electronic Engineering
