Recursive Binding for Similarity-Preserving Hypervector Representations of Sequences
Luleå University of Technology, Sweden; International Research and Training Center for Information Technologies and Systems, Ukraine.
RISE Research Institutes of Sweden, Digital Systems, Data Science; University of California at Berkeley, USA. ORCID iD: 0000-0002-6032-6155
2022 (English). In: 2022 International Joint Conference on Neural Networks (IJCNN), 2022. Conference paper, Published paper (Refereed).
Abstract [en]

Hyperdimensional computing (HDC), also known as vector symbolic architectures (VSA), is a computing framework used within artificial intelligence and cognitive computing that operates with distributed vector representations of large fixed dimensionality. A critical step in designing HDC/VSA solutions is obtaining such representations from the input data. Here, we focus on sequences, a widespread data type, and propose a transformation to distributed representations that both preserves the similarity of identical sequence elements at nearby positions and is equivariant with respect to sequence shift. These properties are enabled by forming representations of sequence positions using recursive binding as well as superposition operations. The proposed transformation was experimentally investigated with symbolic strings used for modeling human perception of word similarity. The obtained results are on a par with more sophisticated approaches from the literature. The proposed transformation was designed for the HDC/VSA model known as Fourier Holographic Reduced Representations; however, it can be adapted to some other HDC/VSA models.
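
For intuition, a minimal sketch of the operations the abstract names follows in Python/NumPy; it is not the paper's implementation. In FHRR, hypervectors are complex phasor vectors, binding is elementwise complex multiplication, and superposition is addition; position hypervectors are formed by recursively binding a fixed vector z with itself. The narrow phase spread of z, and all names and parameters below, are illustrative assumptions standing in for the paper's exact similarity-preserving construction.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality

def random_phasor(d, spread=np.pi):
    """Random FHRR hypervector: unit-magnitude complex components (phasors)."""
    return np.exp(1j * rng.uniform(-spread, spread, d))

def bind(a, b):
    """FHRR binding: elementwise complex multiplication (adds phases)."""
    return a * b

def similarity(a, b):
    """Cosine similarity (real part) between complex hypervectors."""
    return float(np.real(np.vdot(a, b)) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Position hypervectors via recursive binding: p_0 = identity, p_k = p_{k-1} * z.
# Binding an encoded sequence with z shifts every position by one, which is the
# source of shift equivariance. Drawing z's phases from a narrow interval keeps
# p_k close to p_{k+1}, so identical symbols at nearby positions stay similar;
# this narrow spread is an assumed stand-in for the paper's construction.
z = random_phasor(D, spread=0.1)

def encode(seq, item_memory):
    """Superposition of symbol-position bindings: s = sum_k item(seq[k]) * p_k."""
    p = np.ones(D, dtype=complex)   # identity element for binding
    s = np.zeros(D, dtype=complex)
    for sym in seq:
        s += bind(item_memory[sym], p)
        p = bind(p, z)              # advance to the next position recursively
    return s

item_memory = {c: random_phasor(D) for c in "abcdefgh"}
print(similarity(encode("abcd", item_memory), encode("abdc", item_memory)))  # high
print(similarity(encode("abcd", item_memory), encode("efgh", item_memory)))  # near 0
```

Under these assumptions, strings differing by a transposition of adjacent symbols ("abcd" vs. "abdc") score much higher than strings with disjoint symbols, which is the similarity behavior the abstract describes.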

Place, publisher, year, edition, pages
2022.
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:ri:diva-62511
DOI: 10.1109/IJCNN55064.2022.9892462
OAI: oai:DiVA.org:ri-62511
DiVA, id: diva2:1730427
Conference
2022 International Joint Conference on Neural Networks (IJCNN)
Available from: 2023-01-24. Created: 2023-01-24. Last updated: 2023-12-12. Bibliographically approved.

Open Access in DiVA
No full text in DiVA

Other links
Publisher's full text

Authority records
Kleyko, Denis