When I read that, I wondered how long we will keep using DRAM in mobile devices. If Moore's law holds, we will easily be able to put 10 GB of DRAM in a phone in ten years. I think having 1 GB of static RAM might be preferable.
If 1 GB of static RAM were even remotely cheap enough and dense enough, our processors would have more than a few megs of combined L1-3 cache, especially since high-end processors already cost hundreds of dollars.
CPUs have only a few megs of L1-3 cache because any more would slow them down. Making the caches larger increases the distance a signal has to travel, which means more latency. At the clock speeds of modern processors, this matters.
The best option would be fully realising the NUMA architecture and giving each core a dedicated stack of SRAM or DRAM on the order of 1 GB (these would have to be off-die, though).
Yes, that is the thing about SRAM. It is not 10x more expensive than DRAM. Oh no. No, no, no, no, no. If it were to ever fall to only 10x more, it would be like the 2nd coming of Memory.
On modern CPUs, half or more of the silicon is used to provide 4-16 MB of L3 cache. A CPU die is not much smaller than a DRAM chip, and a 1 GB chip of DRAM costs less than $10 these days, judging by the prices of 16 GB, 16-chip DRAM sticks.
- CPU caches are wired far more elaborately than plain SRAM memory modules would need to be. A cache may be shared between cores, and n-way associativity surely requires extra silicon, too.
- if the ratio is way more than 10x, why, then, do I find zillions of references stating that (a) DRAM needs one transistor per bit, and (b) SRAM can be built with six transistors per bit?
However, RAM remains one of the largest power consumers, and so reducing memory usage also reduces power usage, which improves battery life.
Tom's hardware has a great article on this here: http://www.tomshardware.com/reviews/lovo-ddr3-power,2650-2.h...
Note in particular the figures at the bottom of the article page.