
Could this mean mainstream GDDR5?

Korean DRAM giant Samsung has been making a lot of technology announcements recently. It touted its high-density 4Gb DDR3 chips last month and showed off an advanced 40nm process for producing 2Gb DDR3, which it plans to introduce by the end of the year.

One area where it hasn't been as successful as it would like is the GDDR5 market, which serves high-end video cards. Most GDDR5-equipped graphics cards have been using Qimonda's chips, which are both faster and cheaper than GDDR4 memory.

However, Qimonda's insolvency presents a unique market opportunity for Samsung to take the lead. It has put its latest 7Gb/s GDDR5 design into mass production on its recently introduced 50nm-class process, which it has also been using to produce more cost-effective DDR3 DRAM.
 
Samsung says its GDDR5 will provide up to 28GB/s of bandwidth, more than double GDDR4's 12.8GB/s. It will be available in 32Mbx32 or 64Mbx16 configurations, each yielding a 1Gb density.
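
As a quick sanity check on those figures (assuming the 28GB/s number describes a single chip in its 32-bit configuration): 7Gb/s per pin x 32 pins = 224Gb/s, and 224 ÷ 8 = 28GB/s per chip. By the same math, the 12.8GB/s GDDR4 baseline corresponds to a 3.2Gb/s pin speed over a 32-bit interface.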

Both Advanced Micro Devices and NVIDIA are currently working on 40nm die shrinks of their most powerful GPUs. AMD is planning to shift lower-speed GDDR5 downstream to its mainstream graphics cards, which currently use GDDR3 and DDR3 memory. It will then use the latest high-spec GDDR5 chips for its newest and most powerful video cards.

Samsung noted that it was able to tune its 50nm-class process enough to double its production efficiency compared to its 60nm-class process. If true, this should cut prices low enough for GDDR5 to finally move into mainstream video cards.
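
For rough context (an illustrative calculation, not Samsung's own math): a linear shrink from 60nm to 50nm yields about (60/50)^2 ≈ 1.44 times as many dies per wafer from geometry alone, so a full 100 percent gain implies additional improvements in yield or die design beyond the shrink itself.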

Samsung also stated that it plans to extend 50nm production across its entire graphics memory product line, and that it expects GDDR5 memory chips to capture more than 50 percent of the high-end PC graphics memory market by 2010.



Comments



GDDR4?
By R0B0Ninja on 2/17/2009 9:33:13 AM , Rating: 2
quote:
AMD is planning to shift lower-speed GDDR5 downstream to its mainstream graphics cards, which currently use GDDR3 and DDR3 memory. It will then use the latest high-spec GDDR5 chips for its newest and most powerful video cards.


So is AMD skipping GDDR4 in order to get a bigger slice of the juicy GDDR5 pie?




RE: GDDR4?
By omnicronx on 2/17/2009 9:50:45 AM , Rating: 2
You say that as though it has never happened before. Both NVIDIA and AMD essentially skipped GDDR2 and went straight to GDDR3 (mainly because of latency issues). Sometimes the change in tech isn't warranted; GDDR4 wasn't exactly cost-effective either.


RE: GDDR4?
By geokilla on 2/17/2009 9:53:49 AM , Rating: 4
NVIDIA and AMD skipped GDDR4 because it wasn't that much better than GDDR3, despite costing a lot more.


RE: GDDR4?
By anonymo on 2/17/2009 9:54:24 AM , Rating: 2
I thought it had been common knowledge for over a year now that most companies would do everything possible to skip GDDR4 and go straight to GDDR5.

Is anyone using GDDR4 for anything or even planning on it?


RE: GDDR4?
By GaryJohnson on 2/17/2009 4:42:05 PM , Rating: 2
RE: GDDR4?
By nafhan on 2/17/2009 10:16:25 AM , Rating: 2
Actually, I think AMD was the only company that used GDDR4. Granted, they didn't use it on very many cards. Specifically, I remember the 2600XT and the original 3870 using GDDR4. NVIDIA was able to get the same speed/bandwidth with cheaper GDDR3. I've noticed that most of the 3870s still available are using GDDR3 at this point.


RE: GDDR4?
By xenos123 on 2/17/2009 4:02:22 PM , Rating: 2
Yeah, my HD 3870 has GDDR4. The card seems to use a lot less power than my bro's GDDR3 HD 4850. His is faster, but really I haven't played any games where the difference in speed matters or can really be used.


Game performance/visuals?
By andrewkfromaz on 2/17/2009 11:33:17 AM , Rating: 1
Someone help me out: how will this enable game developers to make cooler, faster, and/or better-looking games?

Congratulations to Samsung, btw. Anytime a die shrink or other process change nails it, it's a great thing for the manufacturer. They've gotta be busting out the champagne over there.




RE: Game performance/visuals?
By TheFace on 2/17/2009 4:10:38 PM , Rating: 2
GDDR5 essentially has double the data rate of GDDR3; simply put, it's like quad-data-rate RAM. That may be an oversimplification, but you get the idea. It's the reason the 4870 is so much faster than the 4850: they're the same chip, one just has the faster RAM.
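
To put rough numbers on it (using the reference memory clocks as I recall them, so treat these as approximate):

HD 4850: 993MHz GDDR3 x 2 transfers per clock x 256-bit bus ÷ 8 ≈ 63.6GB/s
HD 4870: 900MHz GDDR5 x 4 transfers per clock x 256-bit bus ÷ 8 = 115.2GB/s

Same RV770 chip underneath, but nearly double the memory bandwidth.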


Not buying anything until GDDR6
By LTG on 2/17/2009 12:29:08 PM , Rating: 2
GDDR6 will own everything.




By GaryJohnson on 2/17/2009 4:45:09 PM , Rating: 1
Except GDDR7... and GDDR8... and GDDR-ad-infinitum.


Samsung FTW
By JonnyDough on 2/18/2009 3:11:42 AM , Rating: 2
As a proud owner of two Samsung LCD monitors I am not ashamed to say that I might be just a bit of a fanboy.




ATI and GDDR4 !
By geoffreymac on 2/17/09, Rating: -1
RE: ATI and GDDR4 !
By VaultDweller on 2/17/2009 10:27:40 AM , Rating: 3
The 4870 uses GDDR5, not GDDR4. Some 4670s use GDDR4, though.


RE: ATI and GDDR4 !
By JonnyDough on 2/18/2009 3:14:15 AM , Rating: 2
RE: ATI and GDDR4 !
By LRonaldHubbs on 2/17/2009 11:55:08 AM , Rating: 2
Agreed, ATi certainly did not skip GDDR4. The X1950XTX, 2900XT, 3870, 3870X2, and 4670 all used GDDR4.


RE: ATI and GDDR4 !
By V3ctorPT on 2/17/2009 1:45:26 PM , Rating: 2
The HD 2900XT used GDDR3... unfortunately I had one... :D I bought an HD 4870, which is GDDR5.

The HD 3870 was GDDR4 and had GDDR3 variants.

The X1950XTX was the first card ever to have GDDR4.


RE: ATI and GDDR4 !
By LRonaldHubbs on 2/21/2009 6:18:09 PM , Rating: 2
ATi designed the 2900XT to support both GDDR3 & 4. What specific board partners chose to use on their cards is another story, but like I said, ATi did not skip GDDR4.

http://ati.amd.com/products/radeonhd2900/specs.htm...
http://www.sapphiretech.com/us/products/products_o...

Here's one that did use GDDR4:
http://www.pcstats.com/articleview.cfm?articleID=2...

