


Could this mean mainstream GDDR5?

Korean DRAM giant Samsung has been making a lot of technology announcements recently. It touted its high-density 4Gb DDR3 chips last month and showed off its advanced 40nm process for producing 2Gb DDR3, which it plans to introduce by the end of the year.

One area where it hasn't been as successful as it would like is the GDDR5 market, which serves high-end video cards. Most GDDR5-equipped graphics cards have been using Qimonda's chips, which are both faster and cheaper than GDDR4 memory.

However, Qimonda's insolvency presents a unique market opportunity for Samsung to take the lead. It has put its latest 7Gb/s GDDR5 design into mass production on its recently introduced 50nm-class process, the same process it has been using to produce more cost-effective DDR3 DRAM.
 
Samsung says its GDDR5 will provide up to 28GB/s of bandwidth per chip, more than doubling GDDR4's 12.8GB/s. It will be available in 32Mb x32 or 64Mb x16 configurations, each amounting to a 1Gb density.
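
For reference, the per-chip figure follows directly from the 7Gb/s per-pin rate and the 32-bit interface of the x32 configuration. The quick sketch below is purely illustrative arithmetic; the 3.2Gb/s GDDR4 rate is inferred from the article's 12.8GB/s figure and is not a Samsung specification.

```python
# Rough per-chip bandwidth arithmetic (illustrative only, not from any datasheet).

def chip_bandwidth_gbs(per_pin_gbps: float, io_width_bits: int) -> float:
    """Per-chip bandwidth in GB/s = per-pin rate (Gb/s) * I/O width (bits) / 8 bits per byte."""
    return per_pin_gbps * io_width_bits / 8

gddr5 = chip_bandwidth_gbs(7.0, 32)   # Samsung's 7Gb/s part in x32 mode -> 28.0 GB/s
gddr4 = chip_bandwidth_gbs(3.2, 32)   # assumed 3.2Gb/s GDDR4 part in x32 mode -> 12.8 GB/s

print(f"GDDR5: {gddr5} GB/s, GDDR4: {gddr4} GB/s")  # GDDR5: 28.0 GB/s, GDDR4: 12.8 GB/s
```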

Both Advanced Micro Devices and NVIDIA are currently working on 40nm die shrinks of their most powerful GPUs. AMD is planning to shift lower-speed GDDR5 downstream to its mainstream graphics cards, which currently use GDDR3 and DDR3 memory. It will then use the latest high-spec GDDR5 chips for its newest and most powerful video cards.

Samsung noted that it was able to tweak its 50nm process enough to increase production efficiency by 100 percent over its 60nm process. If true, that should cut prices enough for GDDR5 to finally move into mainstream video cards.

Meanwhile, Samsung stated that it plans to extend 50nm production across its entire graphics memory product line, and it expects GDDR5 memory chips to capture more than 50 percent of the high-end PC graphics market by 2010.



Comments

ATI and GDDR4 !
By geoffreymac on 2/17/2009 10:13:25 AM , Rating: -1
What are you guys talking about?

ATI worked closely to create GDDR4 memory and has been using it for a while now. The R520 chip supported it and the 4870 is using it! And there were large speed improvements over GDDR3! Seems that nVidia skipped it due to the close ties between ATI and the GDDR4 standard.

It seems they are just moving on to a newer technology that is faster.

http://www.dailytech.com/article.aspx?newsid=3446
http://www.xbitlabs.com/news/memory/display/200602...




RE: ATI and GDDR4 !
By VaultDweller on 2/17/2009 10:27:40 AM , Rating: 3
The 4870 uses GDDR5, not GDDR4. Some 4670s use GDDR4, though.


RE: ATI and GDDR4 !
By JonnyDough on 2/18/2009 3:14:15 AM , Rating: 2

RE: ATI and GDDR4 !
By LRonaldHubbs on 2/17/2009 11:55:08 AM , Rating: 2
Agreed, ATi certainly did not skip GDDR4. The X1950XTX, 2900XT, 3870, 3870X2, and 4670 all used GDDR4.


RE: ATI and GDDR4 !
By V3ctorPT on 2/17/2009 1:45:26 PM , Rating: 2
The HD2900XT used GDDR3... unfortunately I had one... :D Bought an HD4870 that is GDDR5.

The HD3870 was GDDR4 and had GDDR3 variants.

The X1950XTX was the first card ever to have GDDR4.


RE: ATI and GDDR4 !
By LRonaldHubbs on 2/21/2009 6:18:09 PM , Rating: 2
ATi designed the 2900XT to support both GDDR3 and GDDR4. Which memory individual board partners chose to use on their cards is another story, but like I said, ATi did not skip GDDR4.

http://ati.amd.com/products/radeonhd2900/specs.htm...
http://www.sapphiretech.com/us/products/products_o...

Here's one that did use GDDR4:
http://www.pcstats.com/articleview.cfm?articleID=2...


"What would I do? I'd shut it down and give the money back to the shareholders." -- Michael Dell, after being asked what to do with Apple Computer in 1997
