Improving the Retention Power of Embedded Memories

Noa Edri
Engineering Building 1103, Room 329
Faculty of Engineering, Bar-Ilan University

Advances in technology have been accompanied by greater power consumption, an issue facing every System-on-Chip (SoC) design. Today, any complete solution for reducing SoC power must also decrease the power consumed by the embedded memories in the system. The high leakage current of advanced technology nodes makes memory retention power the dominant component of total memory power. This thesis seeks to lower the standby power of two types of embedded memory.
The first type of embedded memory, known as the Gain Cell embedded DRAM (GC-eDRAM), is a newer technology that is still in the research phase. These memory cells have many advantages, including high density, low bitcell leakage, logic compatibility, and suitability for 2-port memories. However, they suffer from major drawbacks: limited data retention times (RTs) and a large spread of RT across an array, which degrades energy efficiency due to refresh cycles. There is a lack of analytical and statistical RT models for GC-eDRAM that could reveal the limiters and circuit parameters responsible for the large observed RT spreads. In this thesis, we derive the first comprehensive analytical model for the statistical distribution of the per-cell retention time of 2T-bitcell GC-eDRAMs, which is found to follow a log-normal distribution. The high accuracy of the proposed retention time model is verified by extensive Monte Carlo circuit simulations and silicon measurements of a 0.18 μm test chip. The insights gained from the retention time model will assist technology developers and circuit designers in achieving better GC-eDRAMs with longer RTs and sharper RT distributions.
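A log-normal per-cell RT distribution has a practical consequence: since the refresh period must accommodate the weakest cells in the array, a wide spread forces refresh far below the median RT. The minimal sketch below illustrates this with entirely hypothetical distribution parameters (median RT and spread are placeholders, not values from the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical log-normal parameters for per-cell retention time (ms);
# real values depend on the technology node and bitcell design.
mu, sigma = np.log(2.0), 0.5              # median 2 ms, moderate spread
rt = rng.lognormal(mean=mu, sigma=sigma, size=100_000)

# If RT is log-normal, ln(RT) is normal: recover mu and sigma directly.
mu_hat, sigma_hat = np.log(rt).mean(), np.log(rt).std()
print(f"median RT ~ {np.exp(mu_hat):.2f} ms, spread sigma ~ {sigma_hat:.2f}")

# The refresh period must cover the weakest cells, e.g. a 1e-4 quantile,
# which for a wide log-normal lies far below the median RT.
refresh_ms = np.quantile(rt, 1e-4)
print(f"refresh period (1e-4 quantile) ~ {refresh_ms:.2f} ms")
```

The gap between the median RT and the worst-case quantile is exactly why sharpening the RT distribution, and not only lengthening the average RT, improves refresh energy.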
In contrast, the second type of memory studied here is the most commonly used memory: the 6T SRAM. Lowering the SRAM standby supply voltage (VDD) is one of the most effective ways to decrease memory power consumption. However, VDD reduction also lowers the noise margins of the bitcell, resulting in degraded stability. In this thesis, we propose a system methodology to determine the optimal data retention voltage, combining bitcell simulation with an on-chip test. The proposed solution significantly reduces the risks involved in low-voltage standby operation while maintaining simple implementation and test efficiency for maximum power savings.
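The idea of searching for the lowest safe standby VDD can be sketched as a simple bisection: find the lowest supply at which a stability metric remains non-negative, then add a guard band. The hold-margin function below is a placeholder stand-in (the 0.45 V retention voltage and the guard band are assumed values for illustration); a real flow would query SPICE simulations and the on-chip test described in the thesis:

```python
def hold_margin_mv(vdd: float) -> float:
    # Placeholder stability model: hold margin shrinks roughly linearly
    # as VDD drops and vanishes near a hypothetical 0.45 V retention voltage.
    return 300.0 * (vdd - 0.45)

def min_retention_vdd(lo: float = 0.2, hi: float = 1.2,
                      tol: float = 1e-3) -> float:
    # Bisection: lowest VDD at which the hold margin is still non-negative.
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if hold_margin_mv(mid) >= 0:
            hi = mid        # still stable: try lower
        else:
            lo = mid        # unstable: must stay higher
    return hi

GUARD_BAND_V = 0.05         # assumed safety margin for variation and noise
standby_vdd = min_retention_vdd() + GUARD_BAND_V
print(f"standby VDD ~ {standby_vdd:.3f} V")
```

Bisection applies here because bitcell stability is monotone in VDD: once the margin is positive at some supply, it stays positive at every higher supply.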
* M.Sc. thesis