TY - JOUR
T1 - A domain wall-magnetic tunnel junction artificial synapse with notched geometry for accurate and efficient training of deep neural networks
AU - Liu, Samuel
AU - Xiao, T. Patrick
AU - Cui, Can
AU - Incorvia, Jean Anne C.
AU - Bennett, Christopher H.
AU - Marinella, Matthew J.
N1 - Publisher Copyright:
© 2021 Author(s).
PY - 2021/5/17
Y1 - 2021/5/17
AB - Inspired by the parallelism and efficiency of the brain, several candidates for artificial synapse devices have been developed for neuromorphic computing, yet a nonlinear and asymmetric synaptic response curve precludes their use for backpropagation, the foundation of modern supervised learning. Spintronic devices - which benefit from high endurance, low power consumption, low latency, and CMOS compatibility - are a promising technology for memory, and domain-wall magnetic tunnel junction (DW-MTJ) devices have been shown to implement synaptic functions such as long-term potentiation and spike-timing-dependent plasticity. In this work, we propose a notched DW-MTJ synapse as a candidate for supervised learning. Using micromagnetic simulations at room temperature, we show that notched synapses ensure the non-volatility of the synaptic weight and allow for highly linear, symmetric, and reproducible weight updates using either spin transfer torque (STT) or spin-orbit torque (SOT) mechanisms of DW propagation. We use lookup tables constructed from micromagnetic simulations to model the training of neural networks built with DW-MTJ synapses on both the MNIST and Fashion-MNIST image classification tasks. Accounting for thermal noise and realistic process variations, the DW-MTJ devices achieve classification accuracy close to ideal floating-point updates using both STT and SOT devices at room temperature and at 400 K. Our work establishes the basis for a magnetic artificial synapse that can eventually lead to hardware neural networks with fully spintronic matrix operations implementing machine learning.
UR - http://www.scopus.com/inward/record.url?scp=85106591596&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85106591596&partnerID=8YFLogxK
U2 - 10.1063/5.0046032
DO - 10.1063/5.0046032
M3 - Article
AN - SCOPUS:85106591596
SN - 0003-6951
VL - 118
JO - Applied Physics Letters
JF - Applied Physics Letters
IS - 20
M1 - 202405
ER -