Cross-encoded meta embedding towards transfer learning
2020 (English). In: ESANN 2020 - Proceedings, 28th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, ESANN (i6doc.com), 2020, p. 631-636. Conference paper, Published paper (Refereed).
Abstract [en]
In this paper we generate word meta-embeddings from existing embeddings using cross-encoding. Previous approaches can only use words that exist in every source embedding, while the architecture presented here drops this requirement. We demonstrate the method using two pre-trained embeddings, namely GloVe and FastText. Furthermore, we propose additional improvements to the training process of the meta-embedding. Results on six standard word-similarity tests show that the trained meta-embedding outperforms the original embeddings. Moreover, this performance can be increased further with the proposed improvements, yielding results competitive with those reported earlier.
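The record gives no implementation details, so the sketch below is only a rough illustration of the cross-encoding idea summarised in the abstract, not the authors' architecture. All names, layer sizes, dimensions, and the PyTorch framing are assumptions; here the meta-embedding is simply taken as the average of the two encoded views.

import torch
import torch.nn as nn

# Assumed dimensions for illustration only.
GLOVE_DIM, FASTTEXT_DIM, META_DIM = 300, 300, 300

class CrossEncoder(nn.Module):
    """Each source embedding is encoded into a shared 'meta' space, and each
    meta vector is decoded into the *other* source's space (cross-encoding)."""
    def __init__(self):
        super().__init__()
        self.enc_glove = nn.Linear(GLOVE_DIM, META_DIM)           # GloVe -> meta
        self.enc_fasttext = nn.Linear(FASTTEXT_DIM, META_DIM)     # FastText -> meta
        self.dec_to_fasttext = nn.Linear(META_DIM, FASTTEXT_DIM)  # meta -> FastText
        self.dec_to_glove = nn.Linear(META_DIM, GLOVE_DIM)        # meta -> GloVe

    def forward(self, glove_vec, fasttext_vec):
        meta_g = torch.tanh(self.enc_glove(glove_vec))
        meta_f = torch.tanh(self.enc_fasttext(fasttext_vec))
        # Cross reconstructions: the GloVe view predicts FastText and vice versa.
        return self.dec_to_fasttext(meta_g), self.dec_to_glove(meta_f), meta_g, meta_f

def cross_encoding_loss(model, glove_vec, fasttext_vec):
    # Reconstruction error of each source from the other source's meta vector.
    pred_f, pred_g, _, _ = model(glove_vec, fasttext_vec)
    mse = nn.functional.mse_loss
    return mse(pred_f, fasttext_vec) + mse(pred_g, glove_vec)

def meta_embedding(model, glove_vec, fasttext_vec):
    # One simple choice for the final meta-embedding: average the encoded views.
    _, _, meta_g, meta_f = model(glove_vec, fasttext_vec)
    return 0.5 * (meta_g + meta_f)

In such a setup, words missing from one source could still contribute through the encoder of the source that does contain them, which is one plausible reading of how the requirement that words exist in every source embedding can be dropped.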
Place, publisher, year, edition, pages
ESANN (i6doc.com), 2020, p. 631-636.
Keywords [en]
Embeddings, Intelligent computing, Learning systems, Transfer learning, Competitive performance, Standard tests, Training process, Word similarity, Neural networks
National Category
Natural Sciences
Identifiers
URN: urn:nbn:se:ri:diva-51966
Scopus ID: 2-s2.0-85099006558
ISBN: 9782875870742 (print)
OAI: oai:DiVA.org:ri-51966
DiVA, id: diva2:1520448
Conference
28th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2020), 2-4 October 2020
Available from: 2021-01-20. Created: 2021-01-20. Last updated: 2023-06-07. Bibliographically approved.