DeepMaker: A multi-objective optimization framework for deep neural networks in embedded systems
Mälardalen University, Sweden.
Mälardalen University, Sweden. ORCID iD: 0000-0001-5951-9374
Shiraz University of Technology, Iran.
Mälardalen University, Sweden.
2020 (English). In: Microprocessors and Microsystems, ISSN 0141-9331, E-ISSN 1872-9436, Vol. 73. Article in journal (Refereed). Published.
Abstract [en]

Deep Neural Networks (DNNs) are compute-intensive learning models with growing applicability in a wide range of domains. Due to their computational complexity, DNNs benefit from implementations that use custom hardware accelerators to meet performance and response-time requirements alongside classification-accuracy constraints. In this paper, we propose the DeepMaker framework, which automatically designs a set of highly robust DNN architectures for embedded devices, the processing units closest to the sensors. DeepMaker explores and prunes the design space to find improved neural architectures. The framework uses a multi-objective evolutionary approach that exploits a pruned design space inspired by a dense architecture. DeepMaker treats accuracy and network size as two objectives, building a highly optimized network that fits within limited computational resource budgets while delivering an acceptable accuracy level. Compared with the best result on the CIFAR-10 dataset, a network generated by DeepMaker achieves up to a 26.4x compression rate while losing only 4% accuracy. In addition, DeepMaker maps the generated CNN onto programmable commodity devices, including an ARM processor, a high-performance CPU, a GPU, and an FPGA.
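The two-objective selection the abstract describes (maximize accuracy, minimize network size) can be illustrated with a minimal Pareto-dominance sketch. The candidate values below are hypothetical toy numbers, and this is not DeepMaker's actual evolutionary implementation, only the selection principle it relies on:

```python
def dominates(a, b):
    """Candidate a dominates b if a is no worse in both objectives
    (higher accuracy, smaller size) and strictly better in at least one."""
    acc_a, size_a = a
    acc_b, size_b = b
    return (acc_a >= acc_b and size_a <= size_b) and (acc_a > acc_b or size_a < size_b)

def pareto_front(population):
    """Keep only candidates that no other candidate dominates."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q != p)]

# Toy candidates: (test accuracy, parameter count in millions).
candidates = [
    (0.92, 5.0),   # accurate but large
    (0.88, 1.2),   # smaller, slightly less accurate
    (0.85, 1.5),   # dominated by (0.88, 1.2): worse on both objectives
    (0.80, 0.4),   # tiny network, lowest accuracy
]

front = pareto_front(candidates)  # the accuracy/size trade-off frontier
```

An evolutionary search such as the one the paper describes would repeatedly mutate candidate architectures, evaluate both objectives, and carry the non-dominated set forward, leaving the final choice on the frontier to the resource budget.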

Place, publisher, year, edition, pages
Elsevier B.V., 2020. Vol. 73
Keywords [en]
Budget control; Embedded systems; Evolutionary algorithms; Integrated circuit design; Multi-objective optimization; Network architecture; Neural networks; Classification accuracy; Compression rates; Computational resources; Convolutional neural networks; Custom hardware; Design space exploration; Multi-objective evolutionary; Neural architectures; Deep neural networks
National Category
Computer and Information Sciences
Identifiers
URN: urn:nbn:se:ri:diva-67474
DOI: 10.1016/j.micpro.2020.102989
Scopus ID: 2-s2.0-85077516447
OAI: oai:DiVA.org:ri-67474
DiVA, id: diva2:1802970
Available from: 2023-10-06. Created: 2023-10-06. Last updated: 2023-10-10. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Sinaei, Sima
