Generalized Key-Value Memory to Flexibly Adjust Redundancy in Memory-Augmented Networks
RISE Research Institutes of Sweden, Digital Systems, Data Science. University of California, USA. ORCID iD: 0000-0002-6032-6155
IBM Research, Switzerland. ORCID iD: 0000-0002-0805-4789
IBM Research, Switzerland.
IBM Research, Switzerland. ORCID iD: 0000-0001-5603-5243
2023 (English). In: IEEE Transactions on Neural Networks and Learning Systems, ISSN 2162-237X, E-ISSN 2162-2388, Vol. 34, no 12, p. 10993-10998. Article in journal (Refereed). Published.
Abstract [en]

Memory-augmented neural networks enhance a neural network with an external key-value (KV) memory whose complexity is typically dominated by the number of support vectors in the key memory. We propose a generalized KV memory that decouples its dimension from the number of support vectors by introducing a free parameter that can arbitrarily add redundancy to, or remove it from, the key memory representation. In effect, it provides an additional degree of freedom to flexibly control the tradeoff between robustness and the resources required to store and compute the generalized KV memory. This is particularly useful for realizing the key memory on in-memory computing hardware, where it exploits nonideal but extremely efficient nonvolatile memory devices for dense storage and computation. Experimental results show that adapting this parameter on demand effectively mitigates up to 44% of nonidealities, at equal accuracy and number of devices, and without any need for neural network retraining.
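The Python sketch below illustrates the abstract's idea only at a conceptual level; it is not the paper's construction. It assumes a dot-product key lookup with a softmax-weighted value readout, uses a random linear map B as the free redundancy parameter n that decouples the stored key-memory size from the number of support vectors m, and models nonideal analog storage as additive Gaussian noise. The class name GeneralizedKVMemory and all parameter names are hypothetical, not the paper's or any library's API.

# Conceptual sketch only (assumptions noted above), not the paper's method.
import numpy as np


def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()


class GeneralizedKVMemory:
    """Key-value memory whose stored key dimension n is decoupled from the
    number of support vectors m via a random map B of shape (n, m)."""

    def __init__(self, K, V, n, noise_std=0.0, seed=None):
        rng = np.random.default_rng(seed)
        m, d = K.shape
        self.m, self.n = m, n
        # Entries ~ N(0, 1/m) so that (m/n) * B.T @ B approximates the identity.
        self.B = rng.standard_normal((n, m)) / np.sqrt(m)
        # Redundant (n x d) key memory, perturbed to emulate nonideal devices.
        self.G = self.B @ K + noise_std * rng.standard_normal((n, d))
        self.V = V

    def read(self, q):
        s = self.G @ q                                # n noisy analog dot products
        scores = (self.m / self.n) * (self.B.T @ s)   # recover ~K @ q (m scores)
        return softmax(scores) @ self.V               # attention-weighted readout


# Toy usage: a larger n (more redundancy, more devices) typically brings the
# noisy readout closer to the ideal noise-free lookup; a smaller n saves
# devices at the cost of accuracy.
rng = np.random.default_rng(0)
m, d, c = 32, 64, 10
K = rng.standard_normal((m, d))
V = rng.standard_normal((m, c))
queries = rng.standard_normal((100, d))

ideal = np.array([softmax(K @ q) @ V for q in queries])
for n in (16, 32, 128):
    mem = GeneralizedKVMemory(K, V, n=n, noise_std=0.3, seed=1)
    approx = np.array([mem.read(q) for q in queries])
    err = np.mean(np.linalg.norm(approx - ideal, axis=1))
    print(f"n = {n:4d}   mean readout error = {err:.3f}")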

Place, publisher, year, edition, pages
2023. Vol. 34, no 12, p. 10993-10998
National Category
Computer Systems
Identifiers
URN: urn:nbn:se:ri:diva-63326
DOI: 10.1109/tnnls.2022.3159445
OAI: oai:DiVA.org:ri-63326
DiVA, id: diva2:1732120
Note

The work of Denis Kleyko was supported in part by the European Union's Horizon 2020 Programme through the Marie Skłodowska-Curie Individual Fellowship under Grant 839179, in part by the Defense Advanced Research Projects Agency's (DARPA's) Artificial Intelligence Exploration (AIE) HyDDENN Project, and in part by the Air Force Office of Scientific Research (AFOSR) under Grant FA9550-19-1-0241. The work of Geethan Karunaratne and Abu Sebastian was supported in part by the European Research Council (ERC) through the European Union's Horizon 2020 Research and Innovation Program under Grant 682675. The work of Jan M. Rabaey was supported in part by DARPA's AIE HyDDENN Project.

Available from: 2023-01-30. Created: 2023-01-30. Last updated: 2024-06-11. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Search in DiVA

By author/editor
Kleyko, Denis; Karunaratne, Geethan; Sebastian, Abu
By organisation
Data Science
In the same journal
IEEE Transactions on Neural Networks and Learning Systems
