Cellular Automata Can Reduce Memory Requirements of Collective-State Computing
Kleyko, Denis. RISE Research Institutes of Sweden, Digital Systems, Data Science; University of California at Berkeley, USA. ORCID iD: 0000-0002-6032-6155
Frady, E. Paxon. University of California at Berkeley, USA; Intel Labs, USA.
Sommer, Friedrich T. University of California at Berkeley, USA; Intel Labs, USA.
2022 (English). In: IEEE Transactions on Neural Networks and Learning Systems, ISSN 2162-237X, E-ISSN 2162-2388, Vol. 33, no. 6, p. 2701-2713. Article in journal (Refereed). Published.
Abstract [en]

Various nonclassical approaches to distributed information processing, such as neural networks, reservoir computing (RC), vector symbolic architectures (VSAs), and others, employ the principle of collective-state computing. In this type of computing, the variables relevant to a computation are superimposed into a single high-dimensional state vector, the collective state. The variable encoding uses a fixed set of random patterns, which has to be stored and kept available during the computation. In this article, we show that an elementary cellular automaton with rule 90 (CA90) enables a space-time tradeoff for collective-state computing models that use random dense binary representations: memory requirements can be traded off against computation by running CA90. We investigate the randomization behavior of CA90, in particular the relation between the length of the randomization period and the size of the grid, and how CA90 preserves similarity in the presence of initialization noise. Based on these analyses, we discuss how to optimize a collective-state computing model in which CA90 expands representations on the fly from short seed patterns, rather than storing the full set of random patterns. The CA90 expansion is applied and tested in concrete scenarios using RC and VSAs. Our experimental results show that collective-state computing with CA90 expansion performs on par with traditional collective-state models, in which random patterns are generated initially by a pseudorandom number generator and then stored in a large memory.
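The record contains no code, but the expansion scheme the abstract describes is simple to sketch. The following is a minimal, illustrative Python example, not the authors' implementation: rule 90 updates every cell of a circular grid to the XOR of its two neighbors, and a short stored seed is expanded on the fly into a longer pseudorandom pattern by concatenating successive automaton states. All function names and parameter choices (ca90_step, ca90_expand, the 64-bit seed, n_steps=8) are assumptions made for illustration.

import numpy as np

def ca90_step(state):
    # One rule-90 update on a circular grid: each cell becomes the
    # XOR of its left and right neighbors (periodic boundary).
    return np.roll(state, 1) ^ np.roll(state, -1)

def ca90_expand(seed, n_steps):
    # Expand a short binary seed into a longer pattern by concatenating
    # successive CA90 states, instead of storing a full random pattern.
    states = [seed]
    for _ in range(n_steps - 1):
        states.append(ca90_step(states[-1]))
    return np.concatenate(states)

rng = np.random.default_rng(0)
seed = rng.integers(0, 2, size=64, dtype=np.uint8)  # short seed kept in memory
pattern = ca90_expand(seed, n_steps=8)              # 512-bit expanded pattern

# Similarity under initialization noise: flip a few seed bits and compare
# the expanded patterns via normalized Hamming distance.
noisy_seed = seed.copy()
noisy_seed[:2] ^= 1
noisy_pattern = ca90_expand(noisy_seed, n_steps=8)
print((pattern != noisy_pattern).mean())

In a collective-state model, the expanded pattern would stand in for one entry of the stored random codebook: only the short seed needs to be kept in memory, and the remaining bits are regenerated on demand.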

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers Inc., 2022. Vol. 33, no. 6, p. 2701-2713.
Keywords [en]
Cellular automata (CA), rule 90, collective-state computing, distributed representations, hyperdimensional computing, vector symbolic architectures (VSAs), reservoir computing (RC), random number generation, memory management, computational modeling, decoding, neurons, reservoirs, reservoir management, task analysis, memory architecture, network architecture, random processes.
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:ri:diva-57075
DOI: 10.1109/TNNLS.2021.3119543
Scopus ID: 2-s2.0-85118596577
OAI: oai:DiVA.org:ri-57075
DiVA, id: diva2:1614215
Available from: 2021-11-24. Created: 2021-11-24. Last updated: 2023-12-12. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Kleyko, Denis
