Memory-augmented neural networks enhance a neural network with an external key-value (KV) memory whose complexity is typically dominated by the number of support vectors in the key memory. We propose a generalized KV memory that decouples its dimension from the number of support vectors by introducing a free parameter that can arbitrarily add redundancy to, or remove it from, the key memory representation. In effect, it provides an additional degree of freedom to flexibly control the tradeoff between robustness and the resources required to store and compute the generalized KV memory. This is particularly useful for realizing the key memory on in-memory computing hardware, which exploits nonideal but extremely efficient nonvolatile memory devices for dense storage and computation. Experimental results show that adapting this parameter on demand effectively mitigates up to 44% of device nonidealities, at equal accuracy and number of devices, without any need for neural network retraining.
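The decoupling described above can be illustrated with a minimal sketch. This is not the paper's exact construction: the random-projection mapping, the soft-attention readout, and all names (`build_memory`, `retrieve`, `n`, `k`, `d`) are illustrative assumptions; the only point made is that the key-memory dimension `d` becomes a free parameter independent of the number of support vectors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: n support vectors with original key dimension k.
n, k = 5, 16
keys = rng.standard_normal((n, k))   # support keys
values = np.eye(n)                   # one value (here: a one-hot label) per key

def build_memory(keys, d, rng):
    """Map k-dimensional keys into a d-dimensional key memory.
    d is the free parameter: d > k adds redundancy (robustness to
    device nonidealities), d < k removes it (fewer devices, less storage)."""
    proj = rng.standard_normal((keys.shape[1], d)) / np.sqrt(d)
    return keys @ proj, proj

def retrieve(query, key_mem, proj, values):
    """Soft attention over the key memory, then a weighted sum of values."""
    sims = key_mem @ (query @ proj)  # similarities in the d-dimensional space
    w = np.exp(sims - sims.max())
    w /= w.sum()
    return w @ values

# The same interface works for any d; only the robustness/storage tradeoff changes.
for d in (8, 64):
    key_mem, proj = build_memory(keys, d, rng)
    out = retrieve(keys[0], key_mem, proj, values)
    print(f"d={d}: retrieved distribution over values = {np.round(out, 2)}")
```

Here the number of memory devices needed scales with `d` rather than with the number of support vectors, so `d` can be tuned on demand to trade robustness against storage and compute cost.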
The work of Denis Kleyko was supported in part by the European Union's Horizon 2020 Programme through the Marie Skłodowska-Curie Individual Fellowship under Grant 839179, in part by the Defense Advanced Research Projects Agency's (DARPA's) Artificial Intelligence Exploration (AIE) HyDDENN Project Program, and in part by the Air Force Office of Scientific Research (AFOSR) under Grant FA9550-19-1-0241. The work of Geethan Karunaratne and Abu Sebastian was supported in part by the European Research Council (ERC) through the European Union's Horizon 2020 Research and Innovation Program under Grant 682675. The work of Jan M. Rabaey was supported in part by DARPA's AIE HyDDENN Project Program.