On Effects of Compression with Hyperdimensional Computing in Distributed Randomized Neural Networks
University of Rome “La Sapienza”, Italy.
University of Rome “La Sapienza”, Italy.
Luleå University of Technology, Sweden.
RISE Research Institutes of Sweden, Digital Systems, Data Science; University of California, USA. ORCID iD: 0000-0002-6032-6155
2021 (English). In: Lecture Notes in Computer Science, ISSN 0302-9743, E-ISSN 1611-3349, Vol. 12862 LNCS, p. 155-167. Article in journal (Refereed). Published.
Abstract [en]

A shift away from the prevalent supervised learning techniques is foreseeable in the near future: from complex, computationally expensive algorithms towards more flexible and elementary training schemes. The strong revitalization of randomized algorithms can be framed within this shift. We recently proposed a model for distributed classification based on randomized neural networks and hyperdimensional computing, which takes into account the cost of information exchange between agents by using compression. Compression is important because it addresses the communication bottleneck; however, the original approach is rigid in how compression is applied. In this work, we therefore propose a more flexible approach to compression and compare it to conventional compression algorithms, dimensionality reduction, and quantization techniques.
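The compression idea referred to in the abstract can be illustrated with a minimal hyperdimensional-computing sketch (this is not the paper's actual method, and all names and parameter choices are illustrative): feature vectors are projected into high-dimensional bipolar hypervectors, class prototypes are formed by bundling (summing) encoded training samples, and quantizing the prototypes to {-1, +1} stands in for a compression step applied before agents exchange them.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000          # hypervector dimensionality (illustrative choice)
n_features = 16

# Random bipolar projection matrix mapping features to hypervectors.
proj = rng.choice([-1.0, 1.0], size=(D, n_features))

def encode(x):
    # Encode a feature vector as a bipolar hypervector.
    return np.sign(proj @ x)

def quantize(v):
    # Binarize a bundled prototype to {-1, +1}; a crude stand-in for
    # the compression applied before information exchange.
    return np.where(v >= 0, 1, -1)

# Two toy classes: samples clustered around +1s and around -1s.
train_a = rng.normal(loc=1.0, size=(20, n_features))
train_b = rng.normal(loc=-1.0, size=(20, n_features))

# Bundle (sum) the encodings of each class, then compress by quantization.
proto_a = quantize(sum(encode(x) for x in train_a))
proto_b = quantize(sum(encode(x) for x in train_b))

def classify(x):
    # Assign the class whose prototype has the larger dot-product similarity.
    hv = encode(x)
    return "A" if hv @ proto_a >= hv @ proto_b else "B"
```

Despite the aggressive one-bit quantization of the prototypes, classification still works because the similarity signal is spread redundantly across the D dimensions, which is the property that makes hyperdimensional representations attractive for communication-constrained distributed settings.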

Place, publisher, year, edition, pages
Springer Science and Business Media Deutschland GmbH, 2021. Vol. 12862 LNCS, p. 155-167.
Keywords [en]
Classification (of information); Dimensionality reduction; Distributed computer systems; Intelligent computing; Supervised learning; Compression algorithms; Distributed classification; Information exchanges; Randomized algorithms; Neural networks
National Category
Computer and Information Sciences
Identifiers
URN: urn:nbn:se:ri:diva-67780
DOI: 10.1007/978-3-030-85099-9_13
Scopus ID: 2-s2.0-85115177088
OAI: oai:DiVA.org:ri-67780
DiVA, id: diva2:1813831
Conference
16th International Work-Conference on Artificial Neural Networks (IWANN 2021), virtual/online, 16-18 June 2021.
Available from: 2023-11-22. Created: 2023-11-22. Last updated: 2023-12-12. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Kleyko, Denis
