Analysis and mitigation of parasitic resistance effects for analog in-memory neural network acceleration

T. Patrick Xiao, Ben Feinberg, Jacob N. Rohan, Christopher H. Bennett, Sapan Agarwal, Matthew J. Marinella

Research output: Contribution to journal › Article › peer-review

Abstract

To support the increasing demands for efficient deep neural network processing, accelerators based on analog in-memory computation of matrix multiplication have recently gained significant attention for reducing the energy consumption of neural network inference. However, analog processing within memory arrays must contend with the issue of parasitic voltage drops across the metal interconnects, which distort the results of the computation and limit the array size. This work analyzes how parasitic resistance affects the end-to-end inference accuracy of state-of-the-art convolutional neural networks, and comprehensively studies how various design decisions at the device, circuit, architecture, and algorithm levels affect the system's sensitivity to parasitic resistance effects. A set of guidelines is provided for designing analog accelerator hardware that is intrinsically robust to parasitic resistance, without any explicit compensation or re-training of the network parameters.
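To make the mechanism concrete, the following is a minimal sketch (not the authors' simulator) of how parasitic wire resistance distorts an analog dot product in a single crossbar column: each bit-line segment adds an IR drop, so the current reaching the readout deviates from the ideal sum of cell currents. All values here (cell conductance range, input voltage range, per-segment wire resistance) are illustrative assumptions, not parameters from the article.

```python
import numpy as np

def column_current(V, G, r_wire):
    """Output current of one crossbar column with parasitic bit-line resistance.

    V      : input voltages applied to each row (volts)
    G      : programmed cell conductances of the column (siemens)
    r_wire : parasitic resistance of each bit-line segment (ohms)

    The bit line is modeled as a chain of nodes, one per cell, joined by
    r_wire segments and terminated in a virtual ground at the readout end.
    Node voltages follow from Kirchhoff's current law (nodal analysis).
    """
    n = len(G)
    g_w = 1.0 / r_wire
    A = np.zeros((n, n))
    b = G * V                    # current each cell would inject at 0 V
    for i in range(n):
        A[i, i] = G[i]
        if i > 0:                # wire segment to the previous node
            A[i, i] += g_w
            A[i, i - 1] = -g_w
        if i < n - 1:            # wire segment to the next node
            A[i, i] += g_w
            A[i, i + 1] = -g_w
        else:                    # last segment into the virtual ground
            A[i, i] += g_w
    u = np.linalg.solve(A, b)    # node voltages along the parasitic bit line
    return g_w * u[-1]           # current delivered to the readout

rng = np.random.default_rng(0)
n_rows = 256
V = rng.uniform(0.0, 0.5, n_rows)          # assumed input voltage range
G = rng.uniform(1e-6, 1e-5, n_rows)        # assumed cell conductance range (S)
ideal = float(V @ G)                       # dot product with no wire resistance
actual = column_current(V, G, r_wire=1.0)  # 1 ohm per segment (assumption)
print(f"relative error from parasitic resistance: {1 - actual / ideal:.1%}")
```

The relative error grows with array size, wire resistance, and cell conductance, which is why the abstract frames parasitic resistance as a limit on array size and a target for device-, circuit-, and algorithm-level mitigation.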

Original language: English (US)
Article number: 114004
Journal: Semiconductor Science and Technology
Volume: 36
Issue number: 11
DOIs
State: Published - Nov 2021
Externally published: Yes

Keywords

  • convolutional neural networks
  • in-memory computing
  • machine learning
  • neural network inference
  • neuromorphic computing
  • parasitic resistance
  • sensitivity analysis

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Condensed Matter Physics
  • Electrical and Electronic Engineering
  • Materials Chemistry
