Improving the Accuracy and Hardware Efficiency of Neural Networks Using Approximate Multipliers

Result identifiers

  • Result code in IS VaVaI

    <a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216305%3A26230%2F20%3APU134964" target="_blank" >RIV/00216305:26230/20:PU134964 - isvavai.cz</a>

  • Result on the web

    <a href="https://www.fit.vut.cz/research/publication/12066/" target="_blank" >https://www.fit.vut.cz/research/publication/12066/</a>

  • DOI - Digital Object Identifier

    <a href="http://dx.doi.org/10.1109/TVLSI.2019.2940943" target="_blank" >10.1109/TVLSI.2019.2940943</a>

Alternative languages

  • Result language

    English

  • Original language name

    Improving the Accuracy and Hardware Efficiency of Neural Networks Using Approximate Multipliers

  • Original language description

    Improving the accuracy of a neural network (NN) usually requires using larger hardware that consumes more energy. However, the error tolerance of NNs and their applications allows approximate computing techniques to be applied to reduce implementation costs. Given that multiplication is the most resource-intensive and power-hungry operation in NNs, more economical approximate multipliers (AMs) can significantly reduce hardware costs. In this article, we show that using AMs can also improve the NN accuracy by introducing noise. We consider two categories of AMs: 1) deliberately designed and 2) Cartesian genetic programming (CGP)-based AMs. The exact multipliers in two representative NNs, a multilayer perceptron (MLP) and a convolutional NN (CNN), are replaced with approximate designs to evaluate their effect on the classification accuracy of the Modified National Institute of Standards and Technology (MNIST) and Street View House Numbers (SVHN) data sets, respectively. Interestingly, up to 0.63% improvement in the classification accuracy is achieved with reductions of 71.45% and 61.55% in the energy consumption and area, respectively. Finally, the features in an AM are identified that tend to make one design outperform others with respect to NN accuracy. Those features are then used to train a predictor that indicates how well an AM is likely to work in an NN.

  • Czech name

  • Czech description
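
The description above rests on one operation: swapping the exact multiplications inside a neural network's dot products for approximate ones and observing how classification accuracy shifts. The following is a minimal illustrative sketch of that idea, not the paper's setup: an 8-bit truncation-based multiplier stands in for the deliberately designed and CGP-based AMs evaluated in the article, and a toy fully connected layer stands in for the MLP/CNN; all names, sizes, and parameters are assumptions for illustration only.

```python
# Minimal sketch (illustrative assumptions, not the paper's designs):
# simulate an 8-bit truncation-based approximate multiplier and compare
# a toy fully connected layer computed with exact vs. approximate products.
import numpy as np

def approx_mul(a, b, drop_bits=3):
    """8-bit signed multiply with the lowest `drop_bits` of each operand
    truncated -- a simple stand-in for an approximate multiplier."""
    a_t = (a >> drop_bits) << drop_bits
    b_t = (b >> drop_bits) << drop_bits
    return a_t.astype(np.int32) * b_t.astype(np.int32)

def quantise(x, scale=32):
    """Map float values to 8-bit signed integers (illustrative scale)."""
    return np.clip(np.round(x * scale), -128, 127).astype(np.int8)

def forward(x_q, w_q, multiplier):
    """One fully connected layer whose element-wise multiplication
    can be swapped between exact and approximate."""
    prods = multiplier(x_q[:, None, :], w_q[None, :, :])
    return prods.sum(axis=-1)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 16))          # 4 toy input vectors
w = rng.normal(size=(10, 16)) * 0.3   # weights of a 16 -> 10 layer

x_q, w_q = quantise(x), quantise(w)
exact  = forward(x_q, w_q, lambda a, b: a.astype(np.int32) * b.astype(np.int32))
approx = forward(x_q, w_q, approx_mul)

# The article's observation is that small multiplication errors act like
# noise; here we only check how often the predicted class (argmax) changes.
print("argmax agreement:", np.mean(exact.argmax(1) == approx.argmax(1)))
```

In the study itself the approximate designs come from hand-crafted and CGP-evolved multiplier circuits and are evaluated on full MNIST/SVHN networks; the truncation scheme above is only the simplest way to reproduce the qualitative effect.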

Classification

  • Type

    Jimp - Article in a specialist periodical, which is included in the Web of Science database

  • CEP classification

  • OECD FORD branch

    10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)

Result continuities

  • Project

    <a href="/en/project/LTC18053" target="_blank" >LTC18053: Advanced Methods of Nature-Inspired Optimisation and HPC Implementation for the Real-Life Applications</a><br>

  • Continuities

    P - Research and development project financed from public sources (with a link to CEP)

Others

  • Publication year

    2020

  • Confidentiality

    S - Complete and accurate data on the project are not subject to protection under special legal regulations

Data specific for result type

  • Name of the periodical

    IEEE Transactions on Very Large Scale Integration (VLSI) Systems

  • ISSN

    1063-8210

  • e-ISSN

    1557-9999

  • Volume of the periodical

    28

  • Issue of the periodical within the volume

    2

  • Country of publishing house

    US - UNITED STATES

  • Number of pages

    12

  • Pages from-to

    317-328

  • UT code for WoS article

    000510674300002

  • EID of the result in the Scopus database

    2-s2.0-85078705685