Abstract
Random telegraph noise (RTN) has recently become an important reliability issue in nanoscale circuits. This study proposes a simulation framework to evaluate the timing performance of digital circuits under the impact of RTN at the 16 nm technology node. Two fast algorithms with linear time complexity are proposed: statistical critical path analysis and normal distribution-based analysis. The simulation results reveal that the RTN-induced circuit delay degradation and variation both exceed 20%, and the maximum degradation and variation can exceed 30%. The effect of power supply tuning and gate sizing techniques on mitigating RTN is also investigated.
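The normal distribution-based analysis mentioned in the abstract can be illustrated with a minimal sketch. This is an assumption about the general technique, not the paper's actual implementation, and the gate values are invented for illustration: if each gate's RTN-induced delay shift is modeled as an independent normal random variable, the critical-path delay distribution follows from a single linear-time pass that sums means and variances.

```python
import math

# Hypothetical critical-path gate data (illustrative values, not from the paper):
# (nominal_delay_ps, rtn_mean_shift_ps, rtn_sigma_ps)
critical_path = [
    (12.0, 1.8, 0.9),
    (15.0, 2.4, 1.1),
    (10.0, 1.5, 0.7),
    (14.0, 2.1, 1.0),
]

def path_delay_distribution(gates):
    """One linear pass over the path: the sum of independent normal
    variables is normal, so the path mean is the sum of per-gate means
    and the path variance is the sum of per-gate variances."""
    mean = sum(d + m for d, m, _ in gates)
    var = sum(s * s for _, _, s in gates)
    return mean, math.sqrt(var)

nominal = sum(d for d, _, _ in critical_path)
mean, sigma = path_delay_distribution(critical_path)
degradation = (mean - nominal) / nominal  # fractional RTN-induced delay increase
print(f"nominal={nominal:.1f} ps, mean={mean:.1f} ps, "
      f"sigma={sigma:.2f} ps, degradation={degradation:.1%}")
```

Because only sums are accumulated, the cost is O(n) in the number of gates, which matches the linear time complexity claimed in the abstract.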
Original language | English (US) |
---|---|
Pages (from-to) | 273-282 |
Number of pages | 10 |
Journal | IET Circuits, Devices and Systems |
Volume | 7 |
Issue number | 5 |
DOIs | |
State | Published - 2013 |
ASJC Scopus subject areas
- Control and Systems Engineering
- Electrical and Electronic Engineering