TY - JOUR
T1 - In situ Parallel Training of Analog Neural Network Using Electrochemical Random-Access Memory
AU - Li, Yiyang
AU - Xiao, T. Patrick
AU - Bennett, Christopher H.
AU - Isele, Erik
AU - Melianas, Armantas
AU - Tao, Hanbo
AU - Marinella, Matthew J.
AU - Salleo, Alberto
AU - Fuller, Elliot J.
AU - Talin, A. Alec
N1 - Publisher Copyright:
© 2021 Li, Xiao, Bennett, Isele, Melianas, Tao, Marinella, Salleo, Fuller and Talin.
PY - 2021/4/8
Y1 - 2021/4/8
AB - In-memory computing based on non-volatile resistive memory can significantly improve the energy efficiency of artificial neural networks. However, accurate in situ training has been challenging due to the nonlinear and stochastic switching of the resistive memory elements. One promising analog memory is the electrochemical random-access memory (ECRAM), also known as the redox transistor. Its low write currents and linear switching properties across hundreds of analog states enable accurate and massively parallel updates of a full crossbar array, which yield rapid and energy-efficient training. While simulations predict that ECRAM-based neural networks achieve high training accuracy at significantly higher energy efficiency than digital implementations, these predictions have not been verified experimentally. In this work, we train a 3 × 3 array of ECRAM devices that learns to discriminate several elementary logic gates (AND, OR, NAND). We record the evolution of the network’s synaptic weights during parallel in situ (on-line) training with outer product updates. Owing to the linear and reproducible switching characteristics of the devices, our crossbar simulations not only accurately reproduce the number of epochs to convergence, but also quantitatively capture the evolution of the weights in individual devices. This first demonstration of in situ parallel training, together with the strong agreement between experiment and simulation, marks a significant advance toward scaling ECRAM to larger crossbar arrays for artificial neural network accelerators, which could enable orders-of-magnitude improvements in the energy efficiency of deep neural networks.
KW - ECRAM
KW - analog memory
KW - in-memory computing
KW - on-line training
KW - organic electrochemical transistor
KW - outer product update
UR - http://www.scopus.com/inward/record.url?scp=85104587799&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85104587799&partnerID=8YFLogxK
DO - 10.3389/fnins.2021.636127
M3 - Article
AN - SCOPUS:85104587799
SN - 1662-4548
VL - 15
JO - Frontiers in Neuroscience
JF - Frontiers in Neuroscience
M1 - 636127
ER -