Few-shot Federated Learning in Randomized Neural Networks via Hyperdimensional Computing
University of Rome 'La Sapienza', Italy.
University of Rome 'La Sapienza', Italy.
Luleå University of Technology, Sweden.
RISE Research Institutes of Sweden, Digital Systems, Data Science; University of California, USA. ORCID iD: 0000-0002-6032-6155
2022 (English). In: Proceedings of the International Joint Conference on Neural Networks, Institute of Electrical and Electronics Engineers Inc., 2022. Conference paper, Published paper (Refereed)
Abstract [en]

The recent interest in federated learning has spurred the investigation of efficient models deployable in scenarios with strict communication and computational constraints. Moreover, the inherent privacy concerns of decentralized and federated learning call for efficient distribution of information across a network of interconnected agents. We therefore propose a novel distributed classification solution based on shallow randomized networks equipped with a compression mechanism for sharing the local model in the federated context. We make extensive use of hyperdimensional computing both in the local network model and in the compressed communication protocol, which is enabled by the binding and superposition operations. The accuracy, precision, and stability of the proposed approach are demonstrated on a collection of datasets with several network topologies and different data partitioning schemes.
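The binding and superposition operations mentioned in the abstract are the two core primitives of hyperdimensional computing. A minimal sketch of how they can pack several (key, value) pairs into a single hypervector and later recover a noisy copy of a value is shown below; the dimensionality, the bipolar encoding, and the parameter names (`w1`, `w2`, `w3`) are illustrative assumptions, not the paper's actual compression protocol.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality (illustrative choice)

# Random bipolar (+1/-1) key hypervectors, one per hypothetical parameter slot.
keys = {name: rng.choice([-1, 1], size=D) for name in ["w1", "w2", "w3"]}

# Toy "local model" content, also encoded as bipolar hypervectors.
values = {name: rng.choice([-1, 1], size=D) for name in keys}

# Binding: elementwise multiplication associates each key with its value.
# Superposition: summing the bound pairs packs them into one vector,
# which np.sign re-quantizes to a bipolar hypervector.
packed = np.sign(sum(keys[n] * values[n] for n in keys))

# Unbinding: multiplying by a key recovers a noisy copy of its value.
recovered = packed * keys["w2"]
similarity = (recovered @ values["w2"]) / D  # near 0 for unrelated vectors
print(similarity)
```

Because binding with a bipolar key is its own inverse, a single shared hypervector can stand in for the whole set of pairs, and the recovered copy stays well correlated with the original as long as the dimensionality is large relative to the number of superposed items.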

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers Inc., 2022.
Keywords [en]
Communication constraints, Compression mechanism, Computational constraints, Decentralised, Distributed classification, Local model, Local networks, Network models, Neural networks, Privacy concerns, Network topology
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
URN: urn:nbn:se:ri:diva-61440
DOI: 10.1109/IJCNN55064.2022.9892007
Scopus ID: 2-s2.0-85140772045
ISBN: 9781728186719 (print)
OAI: oai:DiVA.org:ri-61440
DiVA, id: diva2:1718015
Conference
2022 International Joint Conference on Neural Networks, IJCNN 2022, 18 July 2022 through 23 July 2022
Available from: 2022-12-12. Created: 2022-12-12. Last updated: 2023-12-12. Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Kleyko, Denis
