Handling Stuck-at-Fault Defects Using Matrix Transformation for Robust Inference of DNNs

Baogang Zhang, Necati Uysal, Deliang Fan, Rickard Ewetz

Research output: Contribution to journal › Article › peer-review

18 Scopus citations


Matrix-vector multiplication is the dominant computational workload in the inference phase of deep neural networks (DNNs). Memristor crossbar arrays (MCAs) can efficiently perform matrix-vector multiplication in the analog domain. A key challenge is that memristor devices may suffer from stuck-at-fault defects, which can severely degrade classification accuracy. Earlier studies have shown that the accuracy loss can be recovered by utilizing additional hardware or hardware-aware training. In this article, we propose a framework that handles stuck-at-faults using matrix transformations, called the MT framework. The framework is based on a cost metric that captures the negative impact of the stuck-at-fault defects. The cost metric is minimized by applying matrix transformations T, where a transformation T changes a weight matrix W into a new weight matrix W' = T(W). In particular, a row flipping transformation, a permutation transformation, and a value range transformation are proposed. The row flipping transformation translates stuck-off (stuck-on) faults into stuck-on (stuck-off) faults. The permutation transformation maps small (large) weights to memristors stuck-off (stuck-on). The value range transformation reduces the magnitude of the smallest and largest elements in the weight matrices, so that the stuck-at-faults introduce smaller errors. The experimental results demonstrate that the MT framework is capable of recovering 99% of the accuracy loss on both the MNIST and CIFAR-10 datasets without utilizing hardware-aware training. The accuracy improvements come at the expense of an 8.19× and 9.23× overhead in power and area, respectively. Nevertheless, the overhead can be reduced by up to 50% by leveraging hardware-aware training.
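The interplay between the cost metric and the row flipping transformation can be sketched in code. This is a minimal illustration, not the paper's implementation: it assumes weights are mapped linearly onto conductances in [0, w_max], a stuck-on cell reads as w_max and a stuck-off cell as 0, and a flipped row is stored in complementary form (w_max − w, undone at the output stage) so that the two fault types swap roles. All function and variable names are illustrative.

```python
import numpy as np

def fault_cost(W, on_mask, off_mask, flipped, w_max=1.0):
    """Total absolute weight error caused by stuck-at cells.

    In a normal row, a stuck-on cell reads as w_max and a stuck-off
    cell as 0. In a flipped (complementary) row the roles swap:
    a stuck-off cell effectively reads as w_max, and vice versa."""
    cost = 0.0
    for i, row in enumerate(W):
        if flipped[i]:
            cost += np.abs(row[off_mask[i]] - w_max).sum()  # off acts as on
            cost += np.abs(row[on_mask[i]]).sum()           # on acts as off
        else:
            cost += np.abs(row[on_mask[i]] - w_max).sum()
            cost += np.abs(row[off_mask[i]]).sum()
    return cost

def choose_row_flips(W, on_mask, off_mask, w_max=1.0):
    """Flip each row iff flipping lowers that row's stuck-at cost."""
    flipped = np.zeros(W.shape[0], dtype=bool)
    for i, row in enumerate(W):
        keep = np.abs(row[on_mask[i]] - w_max).sum() + np.abs(row[off_mask[i]]).sum()
        flip = np.abs(row[off_mask[i]] - w_max).sum() + np.abs(row[on_mask[i]]).sum()
        flipped[i] = flip < keep
    return flipped
```

For example, a row whose stuck-off cell should hold a large weight (say 0.9) is flipped, since the complementary encoding makes that cell read close to the intended value; a row with no faults is left unchanged. The permutation and value range transformations would be applied on top of the same cost metric.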

Original language: English (US)
Article number: 8852740
Pages (from-to): 2448-2460
Number of pages: 13
Journal: IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems
Issue number: 10
State: Published - Oct 2020


Keywords

  • Analog computing
  • deep neural networks (DNNs)
  • memristors
  • stuck-at-faults
  • transformations

ASJC Scopus subject areas

  • Software
  • Computer Graphics and Computer-Aided Design
  • Electrical and Electronic Engineering

