Density Encoding Enables Resource-Efficient Randomly Connected Neural Networks
2021 (English) In: IEEE Transactions on Neural Networks and Learning Systems, ISSN 2162-237X, E-ISSN 2162-2388, Vol. 32, no. 8, pp. 3777-3783, article id 9174774. Article in journal (Refereed) Published
Abstract [en]
The deployment of machine learning algorithms on resource-constrained edge devices is an important challenge from both theoretical and applied points of view. In this brief, we focus on resource-efficient randomly connected neural networks known as random vector functional link (RVFL) networks, since their simple design and extremely fast training time make them very attractive for solving many applied classification tasks. We propose to represent input features via the density-based encoding known in the area of stochastic computing, and to use the operations of binding and bundling from the area of hyperdimensional computing to obtain the activations of the hidden neurons. Using a collection of 121 real-world data sets from the UCI machine learning repository, we empirically show that the proposed approach achieves higher average accuracy than the conventional RVFL. We also demonstrate that the readout matrix can be represented using only integers in a limited range with minimal loss in accuracy. In this case, the proposed approach operates only on small n-bit integers, which results in a computationally efficient architecture. Finally, through hardware field-programmable gate array (FPGA) implementations, we show that such an approach consumes approximately 11 times less energy than the conventional RVFL.
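The encoding and hidden-layer scheme described in the abstract can be illustrated in a few lines of NumPy. This is a minimal sketch, not the authors' implementation: it assumes features normalized to [0, 1], a thermometer-style density encoding into bipolar vectors, elementwise multiplication as binding, summation as bundling, and a hypothetical clipping threshold `kappa` for keeping activations in a small integer range. All function names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def density_encode(x, n):
    """Density-based (thermometer-style) encoding: an n-dimensional
    bipolar vector whose share of +1 components is proportional to x,
    which is assumed to be normalized to [0, 1]."""
    k = int(round(x * n))      # number of +1 components
    v = -np.ones(n)
    v[:k] = 1.0
    return v

def hidden_activations(features, keys, kappa=3):
    """Bind each encoded feature with its random key vector
    (elementwise product of bipolar vectors), bundle by summation,
    and clip the result to the integer range [-kappa, kappa]."""
    n = keys.shape[1]
    bound = [density_encode(x, n) * keys[i] for i, x in enumerate(features)]
    h = np.sum(bound, axis=0)
    return np.clip(h, -kappa, kappa)

# Hypothetical example: 4 features, 10-dimensional hypervectors.
keys = rng.choice([-1.0, 1.0], size=(4, 10))  # random bipolar key vectors
h = hidden_activations([0.2, 0.5, 0.8, 1.0], keys)
print(h)  # integer-valued activations bounded by kappa
```

Because binding and bundling here involve only sign flips, additions, and clipping, the resulting activations stay in a small integer range, which is what makes the integer readout and the energy-efficient FPGA implementation mentioned in the abstract possible.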
Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers Inc., 2021. Vol. 32, no. 8, pp. 3777-3783, article id 9174774
Keywords [en]
Density-based encoding, hyperdimensional computing, random vector functional link (RVFL) networks, Encoding (symbols), Field programmable gate arrays (FPGA), Learning algorithms, Machine learning, Network coding, Stochastic systems, Classification tasks, Computationally efficient, Field-programmable gate array implementations, Functional links, Hidden neurons, Resource-efficient, Stochastic computing, UCI machine learning repository, Neural networks
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:ri:diva-55946
DOI: 10.1109/TNNLS.2020.3015971
Scopus ID: 2-s2.0-85112022593
OAI: oai:DiVA.org:ri-55946
DiVA id: diva2:1587095
Note
Funding details: Defense Advanced Research Projects Agency, DARPA; Funding details: Horizon 2020 Framework Programme, H2020, 839179; Funding details: Vetenskapsrådet, VR, 2015-04677; Funding text 1: Manuscript received February 10, 2020; revised July 2, 2020; accepted August 8, 2020. Date of publication August 24, 2020; date of current version August 4, 2021. This work was supported in part by the Swedish Research Council under Grant 2015-04677. The work of Denis Kleyko was supported in part by the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Skłodowska-Curie Individual Fellowship Grant Agreement 839179 and in part by the DARPA’s VIP Program under Super-HD Project. (Corresponding author: Denis Kleyko.) Denis Kleyko is with the Redwood Center for Theoretical Neuroscience, University of California at Berkeley, Berkeley, CA 94720 USA, and also with the Intelligent Systems Lab, Research Institutes of Sweden, 164 40 Kista, Sweden (e-mail: denis.kleyko@ri.se).
Available from: 2021-08-23 Created: 2021-08-23 Last updated: 2023-12-12 Bibliographically approved