FaCT-LSTM: Fast and Compact Ternary Architecture for LSTM Recurrent Neural Networks
2022 (English). In: IEEE Design & Test, ISSN 2168-2356, E-ISSN 2168-2364, Vol. 39, no. 3, p. 45-53. Article in journal (Refereed). Published.
Abstract [en]
This article proposes a Fast and Compact Ternary LSTM (FaCT-LSTM), which bridges the accuracy gap between full-precision and quantized neural networks.
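As background for the abstract's claim: ternary networks replace full-precision weights with values from {-α, 0, +α}, so multiplications reduce to sign flips and additions. The sketch below illustrates a generic magnitude-threshold ternarization heuristic (in the style of Ternary Weight Networks); it is an assumption for illustration, not necessarily the quantization scheme used by FaCT-LSTM, and the function name and threshold factor are hypothetical.

```python
import numpy as np

def ternarize(w, delta_factor=0.7):
    """Quantize a weight array to {-alpha, 0, +alpha}.

    Generic illustration: weights with magnitude below a threshold
    delta are zeroed; the rest are mapped to +/- alpha, where alpha
    is the mean magnitude of the surviving weights.
    """
    delta = delta_factor * np.mean(np.abs(w))   # magnitude threshold
    mask = np.abs(w) > delta                    # weights that survive
    alpha = np.abs(w[mask]).mean() if mask.any() else 0.0
    return alpha * np.sign(w) * mask

# Example: small weights collapse to 0, the rest to +/- alpha
w = np.array([0.9, -0.05, 0.4, -0.8, 0.02])
print(ternarize(w))
```

Storing each weight in 2 bits instead of 32 is what yields the "compact" part of the trade-off; the accuracy gap the article addresses comes from the information lost in this mapping.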
Place, publisher, year, edition, pages
IEEE Computer Society, 2022. Vol. 39, no. 3, p. 45-53.
Keywords [en]
ECG, EMG, Long Short-Term Memory (LSTM), Quantization, Wearable Devices, Convolutional neural networks, Digital arithmetic, Embedded systems, Memory architecture, Network architecture, Computation costs, Computation intensives, Computational costs, Floating point operations, Health care application, Precision signals, Proposed architectures
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
URN: urn:nbn:se:ri:diva-59770
DOI: 10.1109/MDAT.2021.3070245
Scopus ID: 2-s2.0-85103775592
OAI: oai:DiVA.org:ri-59770
DiVA, id: diva2:1682143
2022-07-08 / 2022-07-08 / 2023-04-05. Bibliographically approved.