Encoding sequential information in semantic space models: Comparing holographic reduced representation and random permutation
2015 (English). In: Computational Intelligence and Neuroscience, ISSN 1687-5265, E-ISSN 1687-5273, Vol. 2015, article id 986574
Article in journal (Refereed) Published
Abstract [en]
Circular convolution and random permutation have each been proposed as neurally plausible binding operators capable of encoding sequential information in semantic memory. We perform several controlled comparisons of circular convolution and random permutation as means of encoding paired associates as well as encoding sequential information. Random permutations outperformed convolution with respect to the number of paired associates that can be reliably stored in a single memory trace. Performance was equal on semantic tasks when using a small corpus, but random permutations were ultimately capable of achieving superior performance due to their higher scalability to large corpora. Finally, "noisy" permutations in which units are mapped to other units arbitrarily (no one-to-one mapping) perform nearly as well as true permutations. These findings increase the neurological plausibility of random permutations and highlight their utility in vector space models of semantics.
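The two binding operators compared in the abstract can be sketched in a few lines. The following is a minimal illustration (not the paper's implementation): circular convolution binds two random vectors and circular correlation approximately inverts it, while the permutation scheme encodes order by permuting one vector before superposition and decodes by applying the inverse permutation. The vocabulary, dimensionality, and word choices are arbitrary assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 1024  # vector dimensionality (assumed for the demo)

def cconv(a, b):
    """Circular convolution (HRR binding), computed via FFT."""
    return np.fft.irfft(np.fft.rfft(a) * np.fft.rfft(b), n=len(a))

def ccorr(a, b):
    """Circular correlation: approximate inverse of cconv, used for decoding."""
    return np.fft.irfft(np.conj(np.fft.rfft(a)) * np.fft.rfft(b), n=len(a))

# Random environment vectors for a toy vocabulary (i.i.d. Gaussian, ~unit norm)
vocab = {w: rng.normal(0.0, 1.0 / np.sqrt(d), d) for w in ("dog", "chases", "cat")}

# --- HRR: bind a pair with circular convolution, probe with correlation
trace_hrr = cconv(vocab["dog"], vocab["chases"])
decoded_hrr = ccorr(vocab["dog"], trace_hrr)  # noisy reconstruction of "chases"
best_hrr = max(vocab, key=lambda w: vocab[w] @ decoded_hrr)

# --- Random permutation: encode order by permuting the successor vector
perm = rng.permutation(d)
inv = np.argsort(perm)  # inverse permutation
trace_rp = vocab["dog"] + vocab["chases"][perm]  # "chases" follows "dog"
decoded_rp = trace_rp[inv]  # undoing the permutation leaves "chases" dominant
best_rp = max(vocab, key=lambda w: vocab[w] @ decoded_rp)

print(best_hrr, best_rp)
```

In both cases decoding is cleaned up by comparing the noisy result against the vocabulary and taking the nearest neighbor, which is the standard retrieval step for both operators.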
Place, publisher, year, edition, pages
Hindawi Limited, 2015. Vol. 2015, article id 986574
Keywords [en]
Convolution, Vector spaces, Binding operator, Circular convolutions, Holographic reduced representations, One-to-one mappings, Random permutations, Semantic memory, Sequential information, Vector space models, Encoding (symbols), human, information retrieval, linguistics, natural language processing, semantics, space flight, Humans, Information Storage and Retrieval, Space Simulation, Vocabulary
National Category
Engineering and Technology
Identifiers
URN: urn:nbn:se:ri:diva-43188
DOI: 10.1155/2015/986574
Scopus ID: 2-s2.0-84928485033
OAI: oai:DiVA.org:ri-43188
DiVA, id: diva2:1387053
Available from: 2020-01-20 Created: 2020-01-20 Last updated: 2020-01-23 Bibliographically approved