NeuroPower: Designing Energy Efficient Convolutional Neural Network Architecture for Embedded Systems
Mälardalen University, Sweden.
Shiraz University of Technology, Iran.
Mälardalen University, Sweden. ORCID iD: 0000-0001-5951-9374
Mälardalen University, Sweden.
2019 (English). In: Lecture Notes in Computer Science, ISSN 0302-9743, E-ISSN 1611-3349, Vol. 11727 LNCS, p. 208-222. Article in journal (Refereed). Published.
Abstract [en]

Convolutional Neural Networks (CNNs) suffer from energy-hungry implementations due to their computation- and memory-intensive processing patterns. This problem becomes even more significant with the proliferation of CNNs on embedded platforms. To overcome it, we offer NeuroPower, an automatic framework that designs a highly optimized and energy-efficient set of CNN architectures for embedded systems. NeuroPower explores and prunes the design space to find an improved set of neural architectures. Toward this aim, a multi-objective optimization strategy is integrated to solve the Neural Architecture Search (NAS) problem by near-optimally tuning network hyperparameters. The main objectives of the optimization algorithm are network accuracy and the number of parameters in the network. The evaluation results show the effectiveness of NeuroPower in energy consumption, compression rate, and inference time compared to other cutting-edge approaches. In comparison with the best results on the CIFAR-10/CIFAR-100 datasets, a network generated by NeuroPower achieves up to 2.1x/1.56x compression rate, 1.59x/3.46x speedup, and 1.52x/1.82x power savings, while losing only 2.4%/0.6% accuracy, respectively.
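The abstract's multi-objective trade-off (maximize accuracy, minimize parameter count) can be illustrated with a minimal Pareto-front selection sketch. This is not NeuroPower's actual search algorithm, which the record does not specify; the candidate networks and their scores below are hypothetical, purely to show how non-dominated architectures are kept.

```python
def pareto_front(candidates):
    """Keep candidates that no other candidate dominates.

    A candidate (acc2, params2) dominates (acc1, params1) if it is at
    least as accurate AND at least as small, and strictly better in
    at least one of the two objectives.
    """
    front = []
    for acc, params, name in candidates:
        dominated = any(
            a >= acc and p <= params and (a > acc or p < params)
            for a, p, _ in candidates
        )
        if not dominated:
            front.append((acc, params, name))
    return front

# Hypothetical (accuracy, parameter-count, label) tuples for illustration.
candidates = [
    (0.92, 5.0e6, "net-A"),
    (0.90, 1.2e6, "net-B"),  # far fewer parameters, slightly less accurate
    (0.89, 3.0e6, "net-C"),  # dominated by net-B on both objectives
    (0.94, 9.0e6, "net-D"),  # most accurate, but largest
]

front = pareto_front(candidates)  # net-A, net-B, net-D survive
```

In a NAS setting such as the one the abstract describes, each candidate would correspond to one hyperparameter configuration, and the optimizer would iterate this kind of selection while generating new configurations.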

Place, publisher, year, edition, pages
Springer Verlag, 2019. Vol. 11727 LNCS, p. 208-222.
Keywords [en]
Convolution; Embedded systems; Energy efficiency; Energy utilization; Multiobjective optimization; Neural networks; Architectures for embedded systems; Compression rates; Convolutional neural network; Embedded platforms; Energy efficient; Evaluation results; Neural architectures; Optimization algorithms; Network architecture
National Category
Computer and Information Sciences
Identifiers
URN: urn:nbn:se:ri:diva-67478
DOI: 10.1007/978-3-030-30487-4_17
Scopus ID: 2-s2.0-85072863572
OAI: oai:DiVA.org:ri-67478
DiVA, id: diva2:1802965
Available from: 2023-10-06. Created: 2023-10-06. Last updated: 2023-10-06. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Sinaei, Sima
