
IBM presents its fastest eDRAM technology

IBM today pulled the wraps off its fastest system-on-a-chip eDRAM technology. The new technology was presented at the International Solid-State Circuits Conference, or ISSCC, with engineers boasting the fastest eDRAM access times ever recorded.

The new eDRAM technology takes advantage of IBM’s silicon-on-insulator (SOI) technology for low power and high performance. In its current applications, the new eDRAM technology is implemented in 65nm SOI designs to improve the memory performance of on-processor memory controllers.

IBM suggests its eDRAM technology achieves the same performance as conventional SRAM while only occupying one-third the space and using one-fifth the power. The eDRAM technology is also ready for IBM’s upcoming 45nm fab technologies.

“As semiconductor components have reached the atomic scale, design innovation at the chip-level has replaced materials science as a key factor in continuing Moore’s Law. Today’s announcement further demonstrates IBM’s leadership in this critical area of microprocessor design innovation,” said Dr. Subramanian Iyer, IBM’s Distinguished Engineer and director of 45nm technology development.

The specifications for IBM’s new eDRAM technology are as follows:
  • Cell size: 0.126 µm²
  • Power supply: 1 V
  • Availability: 98.7%
  • Tile: 1K row × 16 col × 146 (2 Mb)
  • AC power: 76 mW
  • Standby (keep-alive) power: 42 mW
  • Random cycle time: 2 ns
  • Latency: 1.5 ns
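A back-of-the-envelope sketch of what these figures imply for throughput. The assumption that each 2 ns random cycle transfers one 146-bit slice of the tile is ours, inferred from the "1K row × 16 col × 146" organization; the actual macro interface may differ.

```python
# Rough throughput figures derived from the published eDRAM specs above.
# Assumption (ours): one 146-bit access completes per 2 ns random cycle.

CYCLE_TIME_NS = 2.0      # random cycle time from the spec list
BITS_PER_ACCESS = 146    # assumed data width per access

accesses_per_sec = 1e9 / CYCLE_TIME_NS              # random accesses per second
peak_gbits = accesses_per_sec * BITS_PER_ACCESS / 1e9

print(f"Random accesses/s: {accesses_per_sec / 1e6:.0f} million")
print(f"Peak per-tile bandwidth: {peak_gbits:.1f} Gbit/s (~{peak_gbits / 8:.1f} GB/s)")
```

Under those assumptions a single 2 Mb tile sustains half a billion random accesses per second, which is the headline advantage over off-chip DRAM.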
IBM’s eDRAM technology has yet to make its way into PCs. However, it is found in many gaming consoles. Nintendo has taken advantage of eDRAM since the GameCube with its Flipper graphics processor. The latest Nintendo Wii also integrates eDRAM in its Hollywood GPU. The Xenos GPU found in Microsoft’s Xbox 360 also features eDRAM.

Coincidentally, all three console GPUs were designed by ATI Technologies, now part of AMD.

Comments

wtf
By fxnick on 2/14/2007 10:17:05 PM , Rating: 2
So if five-year-old consoles are using this, why aren't PCs?

RE: wtf
By Assimilator87 on 2/14/2007 11:30:32 PM , Rating: 2
I'm wondering that too. The Xbox 360 gets free AA because of the massive bandwidth of its eDRAM, so I don't see why ATi hasn't integrated it into R600, which is basically an evolution of Xenos technology.
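A quick check on the "free AA" claim above, using the Xbox 360's 10 MB of eDRAM. The 32-bit color and 32-bit depth/stencil per sample are our assumptions for the sketch; the real Xenos pipeline differs in detail.

```python
# Does a 720p framebuffer fit in the Xbox 360's 10 MB of eDRAM?
# Assumption (ours): 4 bytes color + 4 bytes depth/stencil per sample.

EDRAM_MB = 10
BYTES_PER_SAMPLE = 4 + 4  # color + depth/stencil

def framebuffer_mb(width, height, samples=1):
    """Framebuffer footprint in MB for a given resolution and MSAA sample count."""
    return width * height * samples * BYTES_PER_SAMPLE / 2**20

print(f"720p, no AA: {framebuffer_mb(1280, 720):.1f} MB")    # fits in 10 MB
print(f"720p, 2xAA:  {framebuffer_mb(1280, 720, 2):.1f} MB")  # exceeds 10 MB, needs tiling
```

So a plain 720p framebuffer fits comfortably, while multisampled buffers already overflow the eDRAM and have to be rendered in tiles.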

RE: wtf
By kamel5547 on 2/15/2007 12:06:25 AM , Rating: 2
Good question. It may just be a matter of shifting a larger market: the console market is 100% proprietary, with no concern for past or future compatibility. It's much easier to get console companies to adopt a technology than it is to turn the PC industry.

RE: wtf
By defter on 2/15/2007 1:31:05 AM , Rating: 2
Because consoles run at much lower resolutions, they can fit the framebuffer in a small eDRAM.

Memory requirements at, e.g., 1600x1200 are much higher, and such a huge eDRAM chip is not economically feasible.
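The scaling argument above is simple arithmetic. A minimal sketch, assuming 32-bit color plus 32-bit depth per pixel (our assumption):

```python
# How framebuffer size grows with resolution, from console to PC displays.
# Assumption (ours): 4 bytes color + 4 bytes depth per pixel.

BYTES_PER_PIXEL = 4 + 4  # color + depth

def framebuffer_mb(width, height):
    """Single-sampled framebuffer footprint in MB."""
    return width * height * BYTES_PER_PIXEL / 2**20

for w, h in [(640, 480), (1280, 720), (1600, 1200)]:
    print(f"{w}x{h}: {framebuffer_mb(w, h):.1f} MB")
```

A 1600x1200 framebuffer needs roughly six times the eDRAM of a standard-definition console framebuffer, which is where the economics break down.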

RE: wtf
By tarv on 2/15/2007 6:04:57 PM , Rating: 2
Correct me if I'm wrong, but isn't the 360 outputting games at 1920x1080? If it is, eDRAM seems like a good fit.

RE: wtf
By Cincybeck on 2/15/2007 9:48:49 PM , Rating: 1
Um.. Did you read the article?

"eDRAM technology achieves the same performance as conventional SRAM while only occupying one-third the space"

It's one-third the size of its main competitor because DRAM is denser than SRAM; it's the performance that has been lacking until recently. You can have three times the amount of usable cache on the chip.

I would say this is a more plausible reason, that and maybe patent licensing costs:
"However, the difference in manufacturing processes make on-die integration difficult, so several dies have to be packaged in one chip, raising costs. The latest developments overcome this limitation by using standard CMOS process to manufacture eDRAM, as in 1T-SRAM." -- Wikipedia

RE: wtf
By scrapsma54 on 3/1/2007 7:23:43 PM , Rating: 2
The fact that the eDRAM logic on the 360 removes the performance hit of HDR+AA and AF is a major advantage over the PS3. Plus, eDRAM may some day replace DDR.

The next generation processor
By giantpandaman2 on 2/14/2007 9:19:09 PM , Rating: 2
Am I the only one wondering if the AMD chip two generations from now is going to be a game console/computer/media appliance hybrid? AMD has the on-chip memory controller, access to IBM's eDRAM tech, and ATI's experience making eDRAM-compatible video processors. Honestly, this makes me care less about what Barcelona might be and quite a bit more about what might come immediately after.

Then again, maybe I'm just being overly optimistic.

RE: The next generation processor
By codeThug on 2/14/2007 9:31:03 PM , Rating: 2

Not at all. Sounds logical to me.

RE: The next generation processor
By Doormat on 2/15/2007 1:14:14 AM , Rating: 2
IBM did say they'd have chips with this tech in 2008, so maybe AMD, with ATI's knowledge of eDRAM, could have something arrive in late '08/early '09 at the 45nm node. Imagine 16MB or 32MB of L3 cache on an eight-core chip that has CPU and GPU cores.

micron southbridge cache
By ehovland on 2/15/2007 1:22:15 PM , Rating: 2
When I first read this press release, I thought of how this was roughly the same idea that Micron came up with. Their idea was to shove a whole ton of DRAM into the southbridge to provide a 16MB L3 cache. It was interesting then, and if DRAM had remained at the prices of that era, we probably would have seen that concept productized. Instead we see IBM getting a clue through the console work they have done and figuring out that the massive amount of DRAM should be right there next to the processors.

The added help of the consoles working this stuff out on GPUs before it comes to PCs is incredibly interesting. This is because not only does this help GPUs but it also gives a big leg up to multi-cores as well since the DRAM is fast to whoever accesses it. Hats off to IBM for pushing this concept along. I will sit and wait now for my octo-core opterons with 64MB of on die DRAM!
