


NVIDIA's upcoming Summer 2008 lineup gets some additional details

Later this week NVIDIA will enact an embargo on its upcoming next-generation graphics core, codenamed D10U.  The launch schedule of this processor, verified by DailyTech, claims the GPU will make its debut as two separate graphics cards, currently named GeForce GTX 280 (D10U-30) and GeForce GTX 260 (D10U-20). 

The GTX 280 enables all features of the D10U processor; the GTX 260 version will consist of a significantly cut-down version of the same GPU.  The D10U-30 will enable all 240 unified stream processors designed into the processor.  NVIDIA documentation claims these second-generation unified shaders perform 50 percent better than the shaders found on the D9 cards released earlier this year.

The main difference between the two new GeForce GTX variants revolves around the number of shaders and memory bus width.  Most importantly, NVIDIA disables 48 stream processors on the GTX 260. GTX 280 ships with a 512-bit memory bus capable of supporting 1GB GDDR3 memory; the GTX 260 alternative has a 448-bit bus with support for 896MB.  
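The arithmetic behind those two configurations is simple: memory capacity scales with bus width because each 32-bit slice of the bus gets its own memory chip, and peak bandwidth scales the same way. A rough sketch of the math follows; the 64MB-per-chip and 2200 MT/s figures are placeholder assumptions, since the documentation described here lists neither chip density nor memory clocks.

# Back-of-the-envelope check on the GTX 280 / GTX 260 memory figures above.
# chip_mb and data_rate_mtps are placeholder assumptions, not NVIDIA specs.

def gddr3_config(bus_width_bits, chip_mb=64, data_rate_mtps=2200):
    """Each 32-bit GDDR3 chip adds its slice of bus width and capacity."""
    chips = bus_width_bits // 32                        # one chip per 32-bit slice
    capacity_mb = chips * chip_mb                       # capacity tracks bus width
    bandwidth_gbps = (bus_width_bits / 8) * data_rate_mtps / 1000
    return chips, capacity_mb, bandwidth_gbps

print(gddr3_config(512))  # (16, 1024, 140.8) -> 16 chips, 1GB
print(gddr3_config(448))  # (14, 896, 123.2)  -> 14 chips, 896MB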

GTX 280 and 260 offer virtually all of the same features as the GeForce 9800 GTX: PCIe 2.0, OpenGL 2.1, SLI and PureVideo HD.  The company also claims both cards will support two SLI risers for 3-way SLI support.

Unlike the upcoming AMD Radeon 4000 series, currently scheduled to launch in early June, the D10U chipset does not support DirectX extensions above 10.0.  Next-generation Radeon will also ship with GDDR5 while the June GeForce refresh is confined to just GDDR3.

The GTX series is NVIDIA's first attempt at incorporating the PhysX stream engine into the D10U shader engine.  The press decks currently do not shed a lot of information on this support, and the company will likely not elaborate on this before the June 18 launch date.

After NVIDIA purchased PhysX developer AGEIA in February 2008, the company announced all CUDA-enabled processors would support PhysX.  NVIDIA has not delivered on this promise yet, though D10U will support CUDA, and therefore PhysX, right out of the gate.

NVIDIA's documentation does not list an estimated street price for the new cards.


Comments



No DX10.1?
By L33tMasta on 5/20/2008 4:18:17 PM , Rating: 5
Are you kidding? Why doesn't it support DX10.1?




RE: No DX10.1?
By robert5c on 5/20/2008 4:28:20 PM , Rating: 1
what do you need 10.1 now for?


RE: No DX10.1?
By L33tMasta on 5/20/2008 4:29:38 PM , Rating: 5
What do you NOT need it for? it's the newest API thus the hardware devs need to support it.


RE: No DX10.1?
By dsx724 on 5/20/2008 4:37:34 PM , Rating: 5
DirectX 10.1 introduces a new shader model as well as more flexibility. Flexibility costs performance, which is why Nvidia is clinging to DX10. ATI has far superior hardware in terms of programmability, and Nvidia knows it can't match ATI's performance if it upgrades its GPUs to comply with the 10.1 spec.


RE: No DX10.1?
By L33tMasta on 5/20/2008 4:39:01 PM , Rating: 2
It also introduces "free" 4xAA with no performance hit.


RE: No DX10.1?
By dubldwn on 5/20/2008 4:56:21 PM , Rating: 4
Free or required?


RE: No DX10.1?
By leexgx on 5/20/2008 5:38:52 PM , Rating: 3
Required to support 4x hardware AA, not that the game has to use it, though.


RE: No DX10.1?
By thartist on 5/21/2008 2:18:19 PM , Rating: 2
Yes of course, as if AA would now be magically invisible to horsepower requirements.


RE: No DX10.1?
By MrPoletski on 5/21/2008 10:55:28 PM , Rating: 2
Tile based renderers can do MSAA for virtually no performance cost.


RE: No DX10.1?
By gochichi on 5/27/2008 4:09:23 PM , Rating: 2
It is certainly possible to design hardware that would take care of AA with no performance degradation. In fact, I am surprised an unswitchable 4XAA hasn't been released long ago.

It would just be a matter of making the hardware just do anti-aliasing without asking drivers, games, or user settings. It is similar to saying that you can have a printer connected to your computer without performance degradation (you can), but more hardware is necessary (the printer). It is also similar to 3D acceleration in general... this hardware could be called "AA-accelerator" and it could certainly create a scenario where turning AA off would offer absolutely no performance benefit.

I look forward to the day where AA is considered at the hardware level to the degree that turning it off offers no benefit, as it stands I always err towards turning it off because any degree of added smoothness is always welcome. Even if it's a change of 60fps to 90 fps I still disable AA, and this is a failure of the hardware. The more specific the hardware, the more performance you'll get from it. Currently, the GPU is handling AA though some generalized processor, and it would be best if AA was a separate process done by its specific processor (or portion thereof).


RE: No DX10.1?
By StevoLincolnite on 5/21/2008 12:46:21 AM , Rating: 5
There is also the issue of future compatibility. This is the exact problem ATI had with the Radeon X850 series: they were SM2 capable, not SM3 like nVidia's cards. Then games like Bioshock were released which required SM3 hardware, and despite the X850 being more than enough to run the game at fairly good settings, it failed because it didn't tick all the checkboxes. So can we expect the same from the DirectX 10 cards in the future?


RE: No DX10.1?
By Jedi2155 on 5/21/2008 2:52:16 AM , Rating: 4
It is also the same issue Nvidia had with their DX8 cards when Battlefield 2 came out. All of Nvidia's DX8 cards didn't support it while ATI still had support with their DX8.1 based R200 and their more advanced Pixel Shader 1.4 while Nvidia only did up to 1.1 with the Geforce 3 and Geforce 4 Ti.

So yes...there is a concern for DX10.1 if developers choose to support it. Of course we don't really know till a big game hits....


RE: No DX10.1?
By StevoLincolnite on 5/21/2008 8:41:18 AM , Rating: 3
The GeForce 4 Ti series were DirectX 8.1 cards; however, they were still limited to SM 1.3, not 1.1 as you claim, and still below the 1.4 level that was required.

The GeForce 4 MX440 was a big killer in the end. It was faster than the GeForce FX 5200, yet it could not do any form of pixel shader work outside of the fixed-function arena, so even though people got Oblivion to run on the GeForce 3 Ti200 and GeForce FX 5200, the GeForce 4 MX440 had no hope thanks to it being a GeForce 2 on steroids (a DirectX 7 class part, where Oblivion requires DirectX 8 class hardware).
Heck, even id Software said that the GeForce 4 MX440 would hold back game graphics because of its limited feature set.
Yet because of this, people with a GeForce 256 released in 1999 could play games such as Half-Life 2, which came out in 2004 - five years of gaming, thanks probably to the GeForce 4 MX and its huge market share.

ATI aren't innocent either, with the Radeon 9200 series, although not on as large of a scale as the MX440 was.
Thank goodness nVidia and ATI stopped that practice a long time ago.


RE: No DX10.1?
By PhantomRogue on 5/21/08, Rating: -1
RE: No DX10.1?
By omnicronx on 5/21/08, Rating: -1
RE: No DX10.1?
By afkrotch on 5/21/08, Rating: -1
RE: No DX10.1?
By fikimiki on 5/21/08, Rating: -1
RE: No DX10.1?
By afkrotch on 5/21/08, Rating: -1
RE: No DX10.1?
By DigitalFreak on 5/21/2008 3:55:43 PM , Rating: 2
Well, this sheds more light on the removal of DX10.1 from Assassin's Creed.


RE: No DX10.1?
By overzealot on 5/22/2008 10:02:54 AM , Rating: 2
They said they would remove it in the first patch because it was BROKEN!
They didn't, by the way. They just repaired it.
Funnily enough, it does have improved performance under 10.1 and AA on surfaces that didn't previously. Google it.


RE: No DX10.1?
By Mitch101 on 5/22/2008 11:33:19 AM , Rating: 3
I googled it and here is what I found.

http://enthusiast.hardocp.com/article.html?art=MTQ...
Lo and behold, when we enable 4X MSAA, we can clearly see that performance is better with Service Pack 1 installed than without. It is, in fact, right at 20% faster on average.

From the above graph, it is obvious that AMD's ATI Radeon HD 3870 enjoys a considerable performance boost with Windows Vista Service Pack 1. In fact, the Radeon HD 3870 averaged about 34% faster with Vista SP1 at 1600x1200 with 2X MSAA.

http://enthusiast.hardocp.com/article.html?art=MTQ...
Clearly, Anisotropic Filtering is not working on AMD hardware in Assassin's Creed using the in-game settings. Aside from that, the only other problems immediately apparent are the Shadow Artifacting which happens equally on NVIDIA and AMD hardware. There is a difference in contrast and perhaps HDR tone, but it is debatable which one looks better. Other than the above mentioned things, all the post processing effects and pixel shader qualities look exactly the same between NVIDIA and AMD hardware.


RE: No DX10.1?
By PICBoy on 5/22/2008 1:42:47 PM , Rating: 2
quote:
They have no need to make the change... Why bother going back to rework it, when you essentially have a chip that has support for nothing out there.


The answer is rather simple: MARKETING and FUD. Those two combined are the main reasons why AMD made their hardware DX 10.1 compatible. It doesn't matter if there aren't DX10.1 games announced right now. Maybe there will be in the future, but you cannot deny the fact that some people are going to prefer AMD's hardware just because they support it.

If it was so simple to change the hardware to DX10.1, then NVIDIA would have done it with their GeForce 9000 series, because they know how important this is for marketing purposes alone.


RE: No DX10.1?
By PlasmaBomb on 5/20/2008 4:38:11 PM , Rating: 4
It gives a nice boost in some games, provided Nvidia doesn't help the developer fix "graphical glitches"...


RE: No DX10.1?
By Kougar on 5/20/2008 5:06:24 PM , Rating: 2
Heh, NVIDIA's "The Way It's Meant to be Played" game Company of Heroes had its sequel/expansion, CoH: Opposing Fronts, launch before NVIDIA finally addressed a dozen small but very problematic "graphical glitches", including with CoH's performance benchmark.

It happened repeatedly and got worse, to the point I thought my NVIDIA GPU had gone bad, since it happened even under different OS installs. I didn't learn otherwise until I read the driver release notes for 162 or 169 or something; I don't remember anymore which specific driver release listed the plethora of fixes.


RE: No DX10.1?
By Shawn5961 on 5/20/2008 4:41:10 PM , Rating: 2
While I agree, that they really should start with DX10.1 support, I don't think it's a requirement at the moment. Yes, it would be nice if the usage of it started sooner rather than later, but enough developers haven't even switched to DX10 yet.

Personally, I'm strictly an Nvidia user. I'm not a fan of newer ATI products at all. But I think the whole Geforce 9xxx series has just been a joke. It's more or less Geforce 8xxx-2 than anything. The lack of DX10.1 support and GDDR4 is going to make them go the same way the 8xxx series goes when they do come out with DX10.1/GDDR4 cards.

At least one thing's for certain. If they come out with the 10xxx series over the 9xxx series as fast as 9xxx came out after 8xxx, it shouldn't take too long.


RE: No DX10.1?
By L33tMasta on 5/20/2008 4:42:54 PM , Rating: 2
I completely agree. I've stuck with my 8800GTX but it looks like this 9900GTX will be worth the upgrade simply in power even if it lacks DX10.1 support.


RE: No DX10.1?
By Shawn5961 on 5/20/2008 4:48:49 PM , Rating: 2
I'm the same way. I've kept my 8800GTS when the first wave of 9xxx cards came out because the performance boost was mostly minimal. But we'll see what the benchmarks of the new cards look like, and if it's as much of a boost as it should be, it'll be upgrade time for me.


RE: No DX10.1?
By ajfink on 5/20/2008 10:57:25 PM , Rating: 5
GDDR4 isn't all it's cracked up to be, hence its sparse use (and total neglect by Nvidia). GDDR5, on the other hand, is something else.

RV770 (HD 48X0 series) will have predominantly GDDR3 and GDDR5, if I recall correctly.


RE: No DX10.1?
By bfellow on 5/23/2008 1:27:51 PM , Rating: 2
Even ATI is going away from GDDR4, so why crack at Nvidia for not doing GDDR4?


RE: No DX10.1?
By ImSpartacus on 5/20/2008 4:45:23 PM , Rating: 2
It doesn't really matter. It is a few small changes that mostly give devs more control and set better standards for DX10.

http://www.extremetech.com/article2/0,1558,2168429...

DX10.1 is incremental at best, unnecessary at worst. If given the choice, I would obviously get a DX10.1 card over a DX10 one if all other things are equal, but it isn't a dealbreaker.


RE: No DX10.1?
By Mitch101 on 5/20/2008 5:39:47 PM , Rating: 1
I would think Microsoft would have more to say about DX10.1 being cut off at the knees by NVIDIA getting game development companies not to support DX10.1, because of the advantages DX10.1 can provide for ATI-based cards, even in Futuremark. But I guess Microsoft is a wuss when it comes to what is best for Vista benchmarks.

I guess NVIDIA knows better than Microsoft whether degrading Vista's DX10.1 implementation is good for Vista sales and benchmarks.

NVIDIA > BALLMER

Where are your balls, Ballmer? Does NVIDIA own them? Is DX10.1 a waste of Microsoft's time?


RE: No DX10.1?
By afkrotch on 5/21/08, Rating: -1
RE: No DX10.1?
By ImSpartacus on 5/21/2008 3:10:27 PM , Rating: 1
Yeah, DX10.1 doesn't really change much of anything or have any impact on performance.


RE: No DX10.1?
By Mitch101 on 5/22/2008 11:15:04 AM , Rating: 3
It does make a difference if you have an ATI card.
Up to 20% difference on Assassins Creed.

Nvidia owns Ballmers Balls.
http://www.freshscoop.com/modules.php?name=News&fi...

Assassin's Creed DX10.1 - Addendum
http://rage3d.com/articles/assassinscreed-addendum...

Lo and behold, when we enable 4X MSAA, we can clearly see that performance is better with Service Pack 1 installed than without. It is, in fact, right at 20% faster on average.
http://enthusiast.hardocp.com/article.html?art=MTQ...


RE: No DX10.1?
By ImSpartacus on 5/22/08, Rating: 0
RE: No DX10.1?
By Mitch101 on 5/22/2008 11:41:15 AM , Rating: 5
quote:
DX10.1 is incremental at best


Sure if you buy an NVIDIA card and believe what NVIDIA tells you.

I understand justifying and defending a purchase, and without 10.1 the NVIDIA card is faster. However, crippling game companies and benchmarks to win an advantage, because your hardware doesn't support something the competitor's does, is pretty freaking low. NVIDIA fanboys serve their masters well.

http://enthusiast.hardocp.com/article.html?art=MTQ...
performance is better with Service Pack 1 installed than without. It is, in fact, right at 20% faster on average. This is with a ATI Radeon 3870 Video Card.

the Radeon HD 3870 averaged about 34% faster with Vista SP1 at 1600x1200 with 2X MSAA.

http://enthusiast.hardocp.com/article.html?art=MTQ...
Other than the above mentioned things, all the post processing effects and pixel shader qualities look exactly the same between NVIDIA and AMD hardware.

Tell me again how DX10.1 is incremental at best?


RE: No DX10.1?
By Totally on 5/20/2008 7:21:46 PM , Rating: 2
RE: No DX10.1?
By Spoelie on 5/21/2008 3:50:48 AM , Rating: 4
http://rage3d.com/articles/assassinscreed/
http://techreport.com/discussions.x/14707

In short: better performance, better anti-aliasing
"[..] demonstrated some instances of higher quality antialiasing—some edges were touched that otherwise would not be—with DX10.1 [..]"


RE: No DX10.1?
By mcnabney on 5/20/2008 4:30:25 PM , Rating: 5
Because they thought that ATI/AMD was going to continue to suck and didn't think too hard about the new generation. Surprise, it looks like there will be competition this summer. Good for me. Good for you. Good for the consumer. And best of all, good for PC gaming.


RE: No DX10.1?
By Etsp on 5/20/2008 4:33:43 PM , Rating: 2
It's too late for competition to occur this summer... if ATI is supporting it and Nvidia is not, ATI is going to have the advantage, and Nvidia will need to catch up to compete, which in the GPU world is a 6-month cycle. So it will be a competitive year, not just the summer =D.


RE: No DX10.1?
By allnighter on 5/20/2008 4:43:36 PM , Rating: 3
Competition? What kind of competition/catching up are we talking about here?
In features?
Clearly ATi will have the lead by virtue of nV not supporting DX10.1, which is an extension of the current API, hardly a brand new one.
How much good will it do and how much edge will it bring to ATi? - That would be on that other side of the coin, where, again, the competition will be a little one-sided, since ATi will continue to play catch-up in what matters more to the masses - performance.


RE: No DX10.1?
By Shawn5961 on 5/20/2008 4:45:26 PM , Rating: 2
The competition that's going to be going on is going to be between two different markets, as these things usually are. AMD (ATI) will be targeting more of the budget and mainstream users as it has as of late. The ATI cards are great cards for someone looking for a decent video card that will last a year or two, at a decent price. Nvidia on the other hand has always been more for the enthusiast. People who don't mind spending $500+ to get the best performing cards, and then spend another $500+ when the next card comes out.


RE: No DX10.1?
By RamarC on 5/20/2008 5:10:57 PM , Rating: 2
competition: ati's new lineup should debut by the end of june. the top card should slightly outperform a 9800gtx.


RE: No DX10.1?
By Strunf on 5/20/08, Rating: -1
RE: No DX10.1?
By ok630 on 5/20/08, Rating: -1
RE: No DX10.1?
By L33tMasta on 5/20/2008 11:15:45 PM , Rating: 2
Wow. He was only telling the truth. Intel's line is still better than anything AMD has come up with so far.


RE: No DX10.1?
By Ringold on 5/21/2008 12:28:38 AM , Rating: 5
Someone needs a hug.

And counseling.

http://www.google.com/search?q=counseling


RE: No DX10.1?
By Belard on 5/21/2008 1:14:05 AM , Rating: 1
Heeeeeeey now.

I'm an AMDer, but we gotta face facts. AMD has not made the fastest chips since C2D came out. This is somewhat AMD's fault, but also Intel's and its back-room deals with companies like Dell. For the past few years, with AMD selling CPUs that murdered the Pentium 4/D CPUs, people were still buying the "intel inside" crap. You'd have Intel fanboys blowing $1000 on the Pentium EE (Extreme Edition), which was on par with a $200 AMD64 3200/3500 CPU. Imagine if more people had bought AMD-powered computers, which would have helped with a bigger and better R&D budget and talent.

At least AMD CPUs are only slower, rather than horribly slow and hot like the P4s. The cost of CPU + mobo is still cheaper than an Intel setup. But if I wanted to build the fastest thing with money I don't have, it'd be an Intel CPU. Luckily, a $110 AMD 5600 is more than fast enough for me.

PS: With such anger... what are you like with someone to really hate? ;)


RE: No DX10.1?
By pow1983 on 5/21/2008 5:16:58 AM , Rating: 4
As far as im concerned the CEO needs to be replaced. It's the only way AMD can move forward.


RE: No DX10.1?
By psychobriggsy on 5/21/2008 10:26:45 AM , Rating: 4
I think that AMD are doing the right thing by concentrating on selling the overall package. Their integrated graphics are a great sell for the mass marketplace, and their discrete graphics are good value. Their CPUs are good enough. Their chipsets run cool and are reliable, even if they're not best in breed in every aspect.

The marketing is poor. Even good ideas, like the GAME! concept, come across poorly, and don't matter anyway because the potential end users never know about it. They need to fix this. Get some software houses to agree to code to the GAME! specifications and put splash screens in the games, etc.


RE: No DX10.1?
By afkrotch on 5/21/08, Rating: 0
RE: No DX10.1?
By evident on 5/21/08, Rating: -1
RE: No DX10.1?
By timmiser on 5/21/2008 4:05:49 AM , Rating: 1
This loser posts the same message in every one of his posts. If ever someone should be banned from DT, this is the one.


RE: No DX10.1?
By BrownJohn on 5/22/2008 2:03:52 PM , Rating: 1
Yeah, I looked up his history to see if you were right, and he has posted this in almost all of his comments:

By user OK360:
quote:
Die painfully okay? Prefearbly by getting crushed to death in a garbage compactor, by getting your face cut to ribbons with a pocketknife, your head cracked open with a baseball bat, your stomach sliced open and your entrails spilled out, and your eyeballs ripped out of their sockets. Fucking bitch


Has DailyTech ever banned someone?


RE: No DX10.1?
By winterspan on 5/21/2008 5:59:59 AM , Rating: 5
woah... See, Jack Thompson was RIGHT! Gamers are violent, crazed uber-nerds..


RE: No DX10.1?
By robinthakur on 5/21/2008 12:27:13 PM , Rating: 1
Spot who just came from the Scream 4 auditions...

This is possibly the most disturbing fanboi post i've seen on the net, at least regarding hardware!! The worst thing is that he's probably only 8!


RE: No DX10.1?
By PWNettle on 5/21/2008 4:34:51 PM , Rating: 1
Wow, so much for thinking that semi-intellectual and semi-mature people posted here.

Lighten up, Francis!


RE: No DX10.1?
By DingieM on 5/21/2008 8:48:13 AM , Rating: 1
Was there an expectancy that Phenom would be faster?


RE: No DX10.1?
By theapparition on 5/20/2008 10:57:23 PM , Rating: 2
Correction.......
ATI's new lineup will most likely be announced in June, without a single product to sell. 3 months later, you may actually be able to pre-order one.

I really hope ATI gets their act together here, because their previous releases have been the definition of a paper launch.


RE: No DX10.1?
By Spoelie on 5/21/2008 3:54:54 AM , Rating: 2
Have you totally missed the 8800GT launch versus HD3870/50??


RE: No DX10.1?
By theapparition on 5/21/2008 7:41:05 AM , Rating: 2
Nope,
One time doesn't make a trend. You have to admit that Nvidia has pretty much consistently met launch dates where ATI has slipped or "paper launched". ATI did good on the last round, but remember that they were already 6 months late on R600, and the 3870/50 is just a mild respin.


RE: No DX10.1?
By 4wardtristan on 5/20/2008 7:35:11 PM , Rating: 2
consumer sees "intel" badge on front

consumer sees "nvidia" badge on front

consumer buys pc

that's pretty much how it rolls, unfortunately


RE: No DX10.1?
By Elementalism on 5/20/2008 8:39:04 PM , Rating: 3
Well, first let's see what the real-world performance is between these two cards. If, let's say, the ATI card is much slower than the Nvidia card, then the technical details will mean very little. Can you name a single game being developed right now that will even take advantage of 10.1 in the next 8 months?


RE: No DX10.1?
By ajfink on 5/24/2008 11:41:42 AM , Rating: 2
Since it's becoming more widespread, I can see further support for 10.1 being nurtured along. It doesn't require massive programming changes to implement, and can add a good bit of quality. Seeing as a lot of graphics cards this year and next will have 10.1 enabled, it makes sense for them to at least get a toe in the water.


RE: No DX10.1?
By pauldovi on 5/20/2008 4:39:25 PM , Rating: 5
Because now it can release a rebranded version of all these cards for 25% more with official DX10.1 support.


RE: No DX10.1?
By Elementalism on 5/20/2008 8:35:00 PM , Rating: 2
I'd venture a guess the cost of silicon for the performance gain at this time is not worth it. No purpose in complicating your design if it wont add anything we will see in quantity over its lifetime.


RE: No DX10.1?
By DeMagH on 5/21/2008 4:25:53 AM , Rating: 5
Guys, please before spreading this non-sense about DX 10.1, do some research or even ASK before declaring it useless.

DX 10.1 is important, it provides quality with LOWER performance hit and it was proven useful through assassin's creed patch in this link: http://enthusiast.hardocp.com/article.html?art=MTQ...

quote:
There should be no doubt that Assassin's Creed contains DirectX 10.1 technology, and that that technology provides AMD's video cards with a solid performance boost.


Also, it was rumored that Nvidia's new generation die will hold around 1 to 1.5 billion transistors, and I REALLY DOUBT that supporting DX 10.1 is as costly.

Nvidia's card was also rumored to consume around 240W, which is expected with this 512-bit memory interface (if I remember correctly this caused some power leakage issues with the previous ATi generation, the 2900 series) plus a massive number of 65nm transistors.

To sum things up, it looks like Nvidia is providing a pure performance boost, a muscle-flexing show-off card. I DOUBT they'll be making GOOD use of the 512-bit memory interface, which provides an unbelievable increase in memory bandwidth, but we'll have to wait and see.

On the other hand, ATi is providing:
- New crossfirex fix enabling GPUs to share memory
- 7.1 audio support through hardware HD decoding
- integration of physics capabilities in mid-range cards
- 55nm 6 months ago
- dx 10.1 6 months ago

IMO, ATi is providing something new, something fresh as always. Maybe it will not hold the performance crown, but it will be holding the development one. As always, we'll have to wait and see.


RE: No DX10.1?
By DingieM on 5/21/2008 8:51:48 AM , Rating: 2
Even the firm S3 is getting on the DX10.1/SM4.1 bandwagon.


RE: No DX10.1?
By Elementalism on 5/21/2008 9:27:33 AM , Rating: 3
Where did I declare it useless? I simply said my opinion is that, at this time, Nvidia didn't feel the silicon necessary to make DX10.1 happen was worth it. Silicon real estate is expensive. Why add complexity to your design if only a small % of your users will ever use it and see a benefit from it? By the time 10.1 is useful we will probably be 1-2 generations down the road, and nobody will remember nor care that this GPU didn't support it.

The other issue is WinXP install base. It is simply huge and DX10 is irrelevant in that space.

And quite frankly, if it doesn't hold the performance crown, how can it be worth it? About all it provides is a test platform for devs who will use it for games coming out in 18-24 months.


RE: No DX10.1?
By DeMagH on 5/21/2008 5:03:26 PM , Rating: 2
quote:
I'd venture a guess the cost of silicon for the performance gain at this time is not worth it . No purpose in complicating your design if it wont add anything we will see in quantity over its lifetime.


Well, the parts in bold are what made me think you were declaring it useless.

Also, holding the performance crown in games is not everything.
- ATi has been dominating the HTPC world ever since the HD series was introduced, from the HD 2400 Pro to the HD 3870 X2.
- ATi allowed CrossFireX on Intel platforms, or ANY platform.
- ATi added Folding@home capabilities with their latest drivers and cards, etc.


RE: No DX10.1?
By larson0699 on 5/22/2008 12:02:00 PM , Rating: 1
quote:
512-bit memory interface "if i remember correctly this caused some power leaks within the previous ATi generation, 2900 series"
For real man, ATI's wasn't a 512-bit bus, but two 256-bit rings.

The HD 2x00 series was shit anyway, power hungry and not even close to competitive. If you bought one, you effed up awfully bad.

Now quit quoting yourself and state some hard facts.


RE: No DX10.1?
By drafz on 5/22/2008 4:36:46 PM , Rating: 2

quote:
On the other hand, ATi is providing:
- New crossfirex fix enabling GPUs to share memory

Is it better or worse? I mean, we don't care. We just want a 70-90% performance boost in a dual-GPU context, nothing more. I don't want to know how they reach it.

quote:
- integration of physics capabilities in mid-range cards

Huh? Where? When? How much?
Three years ago they already talked about it. Since then, I have never seen a single game using it.

quote:
- 55nm 6 months ago

What is it for? For me, I mean.

quote:
- dx 10.1 6 months ago

Good point here, nothing to say.

Actually, the greatest thing about my GF8800 is the fan noise. Since I unplugged my X1900XT, my ears have come back to normal.


RE: No DX10.1?
By DeMagH on 5/24/2008 8:59:08 AM , Rating: 3
About new crossfire:
well, at the very least we should expect a tiny boost over current crossfirex card available and better compatibility with low performing games in crossfireX mode.

About physx:
well, even if it turns out to be useless, when it is included in a useful package, who cares?! You still get the option available to everybody for a decent price. Also, I think it is about time this technology was made useful. Nvidia bought Ageia, ATi integrated physics capabilities within their mid-range cards; what we need is 3 good game titles supporting this technology within 3-6 months to join the currently available titles.

P.S:
performance may vary from dual to quad core though based on reviews comparison a few months ago, so if you are reading reviews about physics dropping gaming performance, it was probably running on a dual core processor, if it increased performance, it is probably a quad core processor.

about 55nm:
Manufacturing process technology; the unit is the nanometer (10^-9, i.e. one billionth of a meter). It represents roughly the distance between the source and drain within a transistor. The lower the number, the smaller the transistor, the cheaper it is to produce in mass production, the lower the power it consumes, and finally the higher the frequencies it can reach.

Nvidia is still @ 65nm, ATi was @ 55nm 6 months ago.
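A rough rule of thumb for why the smaller process matters for cost: die area scales roughly with the square of the feature size, so an ideal full shrink of the same design from 65nm to 55nm takes about 72% of the area, meaning more dies per wafer. This is a back-of-the-envelope sketch; real shrinks never scale quite this perfectly.

# Ideal-shrink approximation, not data for any actual GPU die.
area_ratio = (55 / 65) ** 2
print(f"55nm die area vs 65nm: {area_ratio:.0%} of the original")  # ~72%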


RE: No DX10.1?
By drafz on 5/24/2008 11:09:15 AM , Rating: 1
" about 55nm:
manufacturing processing technology, UNIT == NanoMeter "10^(-9)" (i.e: One billionth of a meter) it represents the distance between the source and drain within a transistor. The lower the number, the smaller the transistor, the cheaper it is to be produced in mass production, the lower power it consumes and finally the higher frequencies it can reach."

Okay, so why aren't ATI's GPUs as fast as NVIDIA's?


GDDR3? ARE YOU JOKIN
By GhandiInstinct on 5/20/2008 5:10:12 PM , Rating: 2
My 1950XPro has GDDR4 for AGP!!!

What is going on? Why are we getting hotter slower cards because a company doesn't want to transition memory chips?

GDDR5 is out now for dice sake.




RE: GDDR3? ARE YOU JOKIN
By xeizo on 5/20/2008 5:19:50 PM , Rating: 2
HD4870 _will_ have GDDR5, and 1GB of it, according to official AMD launch-documents leaking to a German website.

1GB GDDR5 @ 3870MHz over a 256-bit bus is ~128GB/s of bandwidth. It should be enough.
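For what it's worth, the quoted bandwidth is easy to check: multiply the effective transfer rate by the bus width in bytes. A quick sketch (3870 MT/s is the figure quoted above; the 4000 MT/s line just shows where the rounder ~128GB/s number would come from):

# Peak bandwidth = effective rate (MT/s) x bus width (bytes), in GB/s.
def bandwidth_gbps(effective_mtps, bus_bits):
    return effective_mtps * (bus_bits / 8) / 1000

print(bandwidth_gbps(3870, 256))  # 123.84
print(bandwidth_gbps(4000, 256))  # 128.0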


RE: GDDR3? ARE YOU JOKIN
By DeMagH on 5/21/2008 4:35:01 AM , Rating: 2
i read somewhere if ATi reached 1.5Ghz @ core clock speeds the 256-bit would be a limiting factor, but i guess that won't be the case anytime soon.

Unless some hardcore overclocker brought his liquid nitrogen to break a world record or something.


RE: GDDR3? ARE YOU JOKIN
By AmazighQ on 5/20/2008 5:21:44 PM , Rating: 1
I think they made a mistake in the news post. As I read on the interwebz, ATI will use GDDR5 that runs at 3.92GHz and skip GDDR4, so I'm wondering where DailyTech gets its info.
Great, you got GDDR4 on your AGP card, but where is the bandwidth?


RE: GDDR3? ARE YOU JOKIN
By xeizo on 5/20/2008 5:30:44 PM , Rating: 2
Here is one reference to the ATI-document:

http://www.sweclockers.com/nyhet/7734-amd_bekrafta...


RE: GDDR3? ARE YOU JOKIN
By AmazighQ on 5/21/2008 12:16:00 PM , Rating: 2
dont see them using it for the RV770 cards
they also could have used Gddr4 instead of Gddr3
Xeizo where is the Gddr4 on that link
only see Gddr3 and gddr5


RE: GDDR3? ARE YOU JOKIN
By KernD on 5/20/2008 6:16:56 PM , Rating: 5
They can't be skipping gddr4 if they already have been using it for some time...


RE: GDDR3? ARE YOU JOKIN
By B3an on 5/20/2008 6:12:10 PM , Rating: 1
I wondered how long it would be before some uneducated idiot stated that.

The new NV cards have 512-Bit memory interfaces. So they will have more than enough memory bandwidth. In comparison the 9800GTX/GX2 have a pathetic 256-Bit memory interface.


RE: GDDR3? ARE YOU JOKIN
By KernD on 5/20/2008 6:33:57 PM , Rating: 4
While I agree that GDDR4 for the new card is a bit risky, the 9800GTX reigns at the top with GDDR3 while the competition uses GDDR4, so don't give me that BS.

What really matters is the bandwidth and latency, and in graphics, latency has always been hidden by the very long pipeline.

You have to keep in mind the cost of the card relative to its performance. If the wider bus (256 vs 512 bits) cost difference is less than the memory (GDDR4 vs GDDR3) cost difference and you still have enough bandwidth, it's a good thing for the chip and card maker.

Quite frankly, why the hell do you think you know better than some guys that do this for a living? The NVidia guys seem to know where they're going. And if this ends up being a bottleneck for these cards, then good, we'll finally have some real competition in the high end.


RE: GDDR3? ARE YOU JOKIN
By panfist on 5/20/2008 7:52:35 PM , Rating: 5
quote:
Quite frankly, why the hell do you think you know better than some guys that does this for a living?


Because they are in business, and everyone in business does one thing for a living: try to suck every last dollar out of your wallet. Everything else they do is secondary to that principle.


RE: GDDR3? ARE YOU JOKIN
By KernD on 5/20/2008 11:36:52 PM , Rating: 2
That's exactly my point; it's about efficiency. If it costs less for the same result, then it is not only good for the company but also for its customers. But are you in the GPU business? You know what they are trying to do, like all businesses, but do you frankly know how to get there better than NVidia or ATI experts?


RE: GDDR3? ARE YOU JOKIN
By B3an on 5/20/2008 11:24:14 PM , Rating: 2
Not another idiot...

As already pointed out, Nvidia is a BUSINESS; they most probably gave the GX2 a 256-bit bus to save money on the PCB, as it's then a lot less complex.

I actually have a 9800GX2, so I DO know what I'm talking about. The 9800GTX/GX2 are very bandwidth limited with the 256-bit bus. I can cripple this thing at 1920x1200 on certain games with enough AA/AF; it will get to a point where things become a slideshow because of the memory issues. Turn things down one step and everything's fine.
Forums have picked up on this when a lot of reviews fail to mention it ... suspicious.

Anyway, next time, STFU fanboi and stop acting like you know anything.


RE: GDDR3? ARE YOU JOKIN
By KernD on 5/21/2008 12:04:51 AM , Rating: 1
Only an idiot would try to play any recent game at such a resolution with very high AA/AF settings. You just think that the small group of customers you belong to matters more than the rest; they force you to buy 2 cards to reach those settings, and this way normal people can pay less to play at 1600x1200 without any AA/AF.
Lots of reviewers failed to mention this? They probably forgot about you, whine some more next time.
Suspicious!?! There goes the worst kind of idiot in the world, the conspiracy theorist. NVidia must have paid key reviewers to hush it all up!!!

They made a business decision and you're here crying over it... get a life.

And I'm no fanboi, I just stated the name of the company that is in the current discussion. I'm pro-competition, and sadly there hasn't been much of that at the high end since the GF8800GTX came out.

Oh, and by the way, owning a card doesn't mean you know anything about their job and how to do it right.

You sound like an ass, but I'll give you this: you're amusing.


RE: GDDR3? ARE YOU JOKIN
By Noya on 5/21/2008 12:42:54 AM , Rating: 2
quote:
You sound like an ass, but I'll give you this: you're amusing.


And so do you.

quote:
Suspicious!?! There goes the worst kind of idiot in the world, the conspiracy theorist. NVidia must have paid key reviewers to hush it all up!!!


Yes, paid. Just like every other company that sends out review samples. "If you say this or point out that, you don't get the inside scoop on our next release, or we pull ads, etc., etc."

It's all politics idiot. Just like the guy that got fired for blasting that game recently (Kane & Lynch?) while the biggest banner ad on the site was for said game.


RE: GDDR3? ARE YOU JOKIN
By KernD on 5/21/2008 8:19:34 AM , Rating: 2
Fine, prove it.
Oh, and just so you know, rumors are no proof.


RE: GDDR3? ARE YOU JOKIN
By B3an on 5/21/2008 2:10:19 PM , Rating: 2
I don't know if you realise, but people buy cards like the GX2 for these high resolutions. So when a card can't run them well with AA/AF, it's a big deal.

1920x1200 was just an example. I actually have a 2560x1600 monitor, and games like Crysis, even with no AA, will still be a slideshow because of the 256-bit memory bus. For example, even my 8800GTX runs Crysis better at that res because it has more usable vRAM and a 384-bit bus.


RE: GDDR3? ARE YOU JOKIN
By KernD on 5/21/2008 6:40:24 PM , Rating: 2
Why don't you look at the card's reviews and see what frame rate you should get at whatever resolutions they show? And if yours is not in there, extrapolate!

You pointed to the one game we all know won't run at high res on any card. Just look at all the benchmarks with that game; couldn't you predict what kind of frame rate you would get at your huge computer screen's resolution? I'm pretty certain there are some graphics settings you can tweak to make it run fine at your preferred resolution.

Why should the latest high end graphics card be able to play that game, or any other game, at a good frame rate at any arbitrary resolution you choose?

I'm a game developer myself, it's my job, and PC graphics is my specialization, and I can assure you that any developer out there could max out any card at 640x480 in no time if they wanted to. But as was mentioned earlier, it's a business; they don't care about the 0.1% of gamers who have such screens, they care about making the game good enough for the majority.

Now the guys who make Crysis always seem to like showing that they could make things look even better if we had more power in our computers, future-proofing their game in a way. They don't do that so it can be played like that right now. So stop your ranting and wait for the next-gen card; anyway, you seem to have enough money to buy a top-end card every time there is a refresh.


RE: GDDR3? ARE YOU JOKIN
By B3an on 5/22/2008 5:52:02 AM , Rating: 2
I already know all this, and none of that had much to do with what i was saying. My point was the poor design of the GX2 with it's limited 256bit bus, which is one of the main reasons Crysis is not more playable at high resolutions and/or with AA. Obviously even if the GX2 had a 384 or 512bit bus it would still never run Crysis at 2560x1600, but performance at higher resolutions with that game, and many others, when using AA/AF would be increased a lot.

A really good example is ET: Quake Wars, i could play that game at 2560x1600 on my GTX, but on my GX2 i cannot because of the memory issues. The Card easily has the GPU power, a lot more than the GTX, but the 256bit bus lets it all down. This point goes back to my original comment about this not being mentioned in reviews, when the reviewers have even tested with 2560x1600 monitors.

There have been countless examples of this on Forums with many games. The GX2 could really do with a 384bit or higher bus, which was my original point in earlier posts. It's a step down from the 8800GTX in that respect, and that card is getting on a bit now in graphics cards terms.


RE: GDDR3? ARE YOU JOKIN
By B3an on 5/22/2008 6:30:58 AM , Rating: 2
I'd also like to point out, KernD, before you mention it again... I know that NV gave the 9800GTX/9800GX2 a 256-bit bus as a business decision; they must have known it would cripple performance at high res and with AA/AF. But that doesn't make it any less of a retarded thing to do.
Because at the end of the day these cards are more for people with 24" monitors and larger, and a lot more of these people exist than you seem to think. For someone with a 1680x1050 monitor, an 8800GTX would easily do the job.

It's also retarded of you to assume in the first place that I thought I knew better than the NV engineers, when I never said anything like that. This whole argument has basically come from you assuming I meant things I clearly didn't.


RE: GDDR3? ARE YOU JOKIN
By KernD on 5/22/2008 8:34:40 AM , Rating: 2
Yep, they knew it would at 2560x1600 and they told themselves, screw that one customer, we won't screw the 100 other just for him.


RE: GDDR3? ARE YOU JOKIN
By FITCamaro on 5/21/2008 6:37:52 AM , Rating: 4
Using AA at 1920x1200 is kinda pointless though. Any jaggies that exist are going to be so small you'll likely not notice them. In my mind it's not worth enabling at those resolutions. Even at 1680x1050 I never notice a lot of jagged lines.


RE: GDDR3? ARE YOU JOKIN
By KernD on 5/21/2008 8:25:53 AM , Rating: 2
Yep, AF has far more impact than AA, at those resolution, and it doesn't take the max AF settings to improve the visual.

Nor does it require 16 AA to make a difference at low resolution.


RE: GDDR3? ARE YOU JOKIN
By B3an on 5/21/2008 2:19:23 PM , Rating: 1
I've heard this so many times from people like yourself ... who clearly don't even own a 1920x1200 monitor.

If the monitor was something like 17" with a 1920x1200 res, then ok, it would be hard to notice jaggies. But monitors with that res are usually around 24".

A 24" monitor's screen pixels are roughly the same size as those of a 19" 1280x960 monitor.

Even on my 30" 2560x1600 monitor I have no trouble seeing jaggies, and I don't have super amazing eyesight or anything.


RE: GDDR3? ARE YOU JOKIN
By Elementalism on 5/20/2008 8:40:29 PM , Rating: 2
And what has that GDDR4 done for you exactly?


RE: GDDR3? ARE YOU JOKIN
By KernD on 5/21/2008 12:07:56 AM , Rating: 2
Probably gave him a hard-on.
He's a tech geek after all.


RE: GDDR3? ARE YOU JOKIN
By GhandiInstinct on 5/21/2008 1:14:03 PM , Rating: 2


Extreme Performance
By AggressorPrime on 5/20/2008 6:53:13 PM , Rating: 1
I don't know why you are complaining.
1. DX10.1 adds no quality to graphics. It only forces 4x AA. Even Crytek won't add DX10.1 to their Crysis game.
2. GDDR3 is an excellent choice over GDDR5. DDR prices are on the rise in 2008. By investing in newer DDR, not only do you have to pay more, you also have to deal with more problems that have not been overcome yet. Also, there is the problem of low supply of newer DDR technologies.

By using a 512-bit memory controller, nVidia can have the power of the Radeon HD 4800's memory without high cost memory. It is eaiser to make a supper chip than to depend on super memory.

And then you have to look at the power: 50% more performance per clock and 240 shaders. That is the same as 360 G92 shader processors. And it is supposed to be more powerful than two GeForce 9800 GX2 in Quad SLI. With Quad SLI needing 394W, you even have better performance/power. Just imagine the power of three of these in Tri SLI. That is Crysis ownage.
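The shader equivalence above is just arithmetic on NVIDIA's own 50% claim; a quick sketch, taking that marketing figure at face value rather than as a measured result:

# 240 second-generation shaders, each claimed to be 50% faster than a G92
# shader, would be roughly equivalent to this many G92-class shader processors.
# This mirrors the poster's reasoning; it is not a benchmark.
gtx280_shaders = 240
claimed_speedup = 1.5
print(gtx280_shaders * claimed_speedup)  # 360.0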




RE: Extreme Performance
By baddog121390 on 5/20/2008 7:38:43 PM , Rating: 2
1. DX10.1 doesn't force 4x AA. It specifies that all DX10.1 cards have to support 4xAA, not that it always has to be enabled. The main benefit it has is adding access to the depth buffer so you can perform faster anti-aliasing in a deferred renderer (see the sketch after this list).

2. Using a 512bit memory controller over a 256bit one makes the die larger, and more expensive to produce. The nvidia card will have cheaper memory, but a more expensive die. The ATI card will have more expensive memory, but a cheaper die.

3. Good luck having three 200watt+ tdp video cards in your case. I would have a phone handy to call the fire department in case your computer starts on fire. ;)
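On point 1, here is a purely conceptual sketch of why per-sample depth access helps a deferred renderer with AA: if the shader can read the individual MSAA depth samples of a pixel, it can find the geometry edges and blend only there. This is illustrative Python, not Direct3D code, and the pixel data is made up.

# Conceptual only: per-sample depth lets a deferred renderer resolve MSAA
# selectively, blending edge pixels and leaving interior pixels untouched.

def is_edge(depth_samples, threshold=0.01):
    # A pixel sits on a geometry edge if its MSAA depth samples disagree.
    return max(depth_samples) - min(depth_samples) > threshold

def resolve_pixel(color_samples, depth_samples):
    if is_edge(depth_samples):
        # Expensive path, taken only on edges: average all the samples.
        n = len(color_samples)
        return tuple(sum(c[i] for c in color_samples) / n for i in range(3))
    # Interior pixel: one sample is representative, so keep just one.
    return color_samples[0]

# Hypothetical 4xMSAA data: an interior pixel and an edge pixel.
interior = ([(1.0, 0.0, 0.0)] * 4, [0.500, 0.500, 0.501, 0.500])
edge = ([(1.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.0, 0.0, 1.0)],
        [0.300, 0.300, 0.900, 0.900])

print(resolve_pixel(*interior))  # (1.0, 0.0, 0.0) - untouched
print(resolve_pixel(*edge))      # (0.5, 0.0, 0.5) - blended across the edge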


RE: Extreme Performance
By teldar on 5/20/2008 7:44:51 PM , Rating: 2
I would like to see the power dissipation of this new card. I'm thinking it's going to draw some outrageous power. Probably need supplementary power supplies for each card.


RE: Extreme Performance
By teldar on 5/20/2008 7:41:51 PM , Rating: 2
1. Actually, DX10.1 DOES add to graphics. I'm not going to summarize it all but you can have a look here http://www.tomshardware.com/reviews/amd-hd-3800-su... to see what DX 10.1 does for software.

2. How is it that GDDR3 can run as fast as GDDR5? I mean, I understand 512bit=512bit, however, if you have 1GHz DDR3, how is that comparable to 2GHz DDR5?
quote:
It is eaiser to make a supper (sp) chip than to depend on super memory.

How do you figure? If it was easier to make 'super chips' (I guess you mean processors, as memory chips are chips as well, and I would call 2+GHz memory super), why are AMD and Intel not putting out petaflop processors?

Your lack of logic and support leaves your 'argument' more full of holes than a piece of cheesecloth.


RE: Extreme Performance
By bobsmith1492 on 5/20/2008 9:39:58 PM , Rating: 2
1GHz, 512bit bus is likely faster than a 2GHz, 256bit bus because the higher clock speed will have higher latency.


RE: Extreme Performance
By AggressorPrime on 5/20/2008 9:56:18 PM , Rating: 1
Thank you for showing AMD's Radeon HD 4800's flaw.


RE: Extreme Performance
By just4U on 5/20/2008 10:58:57 PM , Rating: 2
And the flaw in the current line up of Nvidia 9X cards ...


RE: Extreme Performance
By AggressorPrime on 5/21/2008 8:09:48 AM , Rating: 2
Yep, at least the G92's. By going from 384-bit to 256-bit (and 768MB to 512MB), the GeForce lost the ability to render games like Crysis at 2560x1600 with Very High settings in SLI.

As for the Radeon HD 2900 XT, there was enough memory bandwidth, but not enough core power. These next generation GPUs should have an even balance though.


RE: Extreme Performance
By DeMagH on 5/21/2008 4:47:14 AM , Rating: 2
Well, that depends on how high the latencies we are talking about are, and how fast the GDDR5 clocks are - maybe 2.4GHz? 2.5?! There is nothing in hand right now to show the rights and wrongs of your statement, so we'll have to wait and see.

Also, I know that a 512-bit memory interface is very costly from the design/implementation point of view, plus the power requirements point of view

"at least that's what ATi's previous experience with the 2900 series told us"


RE: Extreme Performance
By suddeath on 5/21/2008 6:44:15 AM , Rating: 2
New Radeon 4870 (RV770XT) is going to have 1GB GDDR5 1935MHz (DDR 3870MHz), but they've reached 2GHz (4GHz) easily (all on Qimonda memory chips).


RE: Extreme Performance
By Spoelie on 5/21/2008 4:48:21 AM , Rating: 4
how so?
if you have 1ghz @CAS4 you have to wait 4 nanoseconds for your data
if you have 2ghz @CAS7 you have to wait 3.5 nanoseconds for your data

It's not because the latency number is higher that actual latency is higher.

Also, depending on the memory controller configuration, the 512bit bus has a higher potential for redundancy.
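Spelling out that latency arithmetic in one place (a tiny sketch; the clock/CAS pairs are the ones from the example above, not the specs of any particular memory chip):

# CAS latency is counted in clock cycles, so the real wait in nanoseconds is
# cycles divided by clock. Higher CAS at a higher clock can still be faster.
def cas_latency_ns(clock_ghz, cas_cycles):
    return cas_cycles / clock_ghz

print(cas_latency_ns(1.0, 4))  # 4.0 ns
print(cas_latency_ns(2.0, 7))  # 3.5 ns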


RE: Extreme Performance
By teldar on 5/27/2008 7:07:01 PM , Rating: 2
Exactly.

I think people forget that latency is in terms of cycles, not ns.

I say give me a low power GDDR5 256 bit bus that costs $20 to make rather than a moderate power GDDR3 512 bit bus that costs an arm and a leg to make.


Why release?
By Mikescool on 5/20/2008 5:39:11 PM , Rating: 2
There aren't a lot of new high-powered PC games coming out, as developers seem to be shifting their attention to console gaming. I think they should hold off releasing new graphics cards since there aren't any new PC games coming out anyway.




RE: Why release?
By Inkjammer on 5/20/2008 6:48:26 PM , Rating: 5
We shouldn't have to wait for PC gaming to die further before we try to improve the playing field. The faster the graphics envelope is pushed on the PC front the better. As graphics cards improve, so do they on both the high and low end, better opening the window for gaming.

Case in point: prior to the 8800GT and the 9600GT there really wasn't a good high performance low to mid-range card for years. Thanks to innovations at the top of the spectrum we got lots of cards that rock on the low end. And not just budget cards, but $150 - $200 cards that hold their own even in the performance areas.


RE: Why release?
By walk2k on 5/20/08, Rating: -1
RE: Why release?
By lagitup on 5/20/2008 8:30:57 PM , Rating: 3
quote:
no self respecting gamer uses Vista 10% slower!!

Vista sucks...its so slow I hate it.
/sarcasm.
http://www.extremetech.com/article2/0,2845,2304031...
Do research before running your mouth. Vista SP1 vs XP SP2 = more or less identical performance.
quote:
agreed. like all the people crying it doesn't support DX10

No, they're wondering why nVidia made the decision to not go with DX10.1.


RE: Why release?
By just4U on 5/20/2008 11:19:10 PM , Rating: 3
quote:
no self respecting gamer uses Vista 10% slower!!


I don't want to ignore statements like this because they spread a completely untrue rumour which will be taken as factual unless people start to correct it.

I use it and install it to all the "gaming" systems I build... It works and works well! Some of you might still be gunshy with it but you will find more and more of us are giving it the thumbs up.


RE: Why release?
By AlmostExAMD on 5/21/2008 5:09:40 AM , Rating: 3
"nobody cares. nothing uses it and dx10 requires vista anyway and no self respecting gamer uses Vista 10% slower!!"

OMG, 10% slower when I am already getting fps rates 3 times higher than my eyes actually require for smoothness, LOL.
Chances are if you're using Vista you probably have a very capable machine for games anyway, like myself. If you don't, then why bother switching over to Vista when you know it's pointless and will run slow on old PCs?
No self respecting gamer uses Vista? Are you serious?
If you're a hardcore gamer you would be well ahead in both software and hardware for running your games compared to the average Joe Bloggs out there, not living in the past on previous hardware/software.
I would like to know what games are 10% slower on Vista, as so far I am unable to tell the difference, with frame rates reaching 90+ in some games in high definition. I think it just comes down to your hardware; you get what you pay for, like most things in life. Vista bashing is just a lame excuse. I don't notice any performance loss in games (so far), but then again I am constantly upgrading my hobby.


RE: Why release?
By rdeegvainl on 5/21/2008 11:31:13 AM , Rating: 2
Actually, yesterday I was playing AOC. I noticed my FPS was at about 60-70, so I wanted to see what features I could up, I noticed it was running in DX9 mode by default, I switched it to DX10, and the difference was astounding, and my frame rates were still in the 40's. The only time I have hitching is from my hard drive, but with another on the way and soon to be set up in a raid, that will be fixed.


RE: Why release?
By Sungpooz on 5/24/2008 3:52:04 AM , Rating: 1
http://www.google.com/search?hl=en&q=download+vist...

There you go.

Seems we have to hand-feed another retard.


So when is something really new coming out
By michal1980 on 5/20/2008 5:51:50 PM , Rating: 2
seems like all we are getting is re spins of the 8800gts chips lately.

BORING.




RE: So when is something really new coming out
By Staples on 5/20/2008 6:47:25 PM , Rating: 2
I will tell you what else is boring, Intels price cuts.

I want a 45nm quad core Q9450 and I don't want to pay $400 for it.


RE: So when is something really new coming out
By Obsoleet on 5/20/2008 8:00:23 PM , Rating: 2
Keep waiting. Why would they price cut such a desirable chip right now? Because it's in demand?

I got mine a long time ago for $360. Enjoy (maybe) saving $50 under what I paid as Bloomfield gets closer and closer..

Either you want it or you don't. It's not going to be $150 soon, nor leave the $300 range, so I hope guys like you enjoy your waiting.


By just4U on 5/20/2008 11:02:24 PM , Rating: 2
Yep, with the price war virtually over ... Intel has no reason what so ever to cut prices. AMD is not competing with them on the higher end, so it's just the lower/mid range chips that enjoy excellent pricing.


By shabby on 5/20/2008 6:56:36 PM , Rating: 2
If by respin you mean doubling the sp's/memory bus then im all for it.


By AggressorPrime on 5/21/2008 1:32:18 PM , Rating: 2
It is a refresh in the aspect that there is no new Shader Model technology. It is next-gen in the aspect that we are seeing a single GPU perform as fast as 4 GeForce 9 GPUs. It is a next-gen performance product.


Misleading title..
By imaheadcase on 5/21/2008 2:08:25 AM , Rating: 4
The title says next-gen... while it's not next-gen, it's just a refresh, as stated in the article itself.

Two different things. Next-gen would be a new graphics chip itself, but the chip being used IS a refresh.




RE: Misleading title..
By tkSteveFOX on 5/21/2008 3:45:06 AM , Rating: 1
First of all, you guys can't expect a graphics company to come up with a brand new GPU every 6 months. Second of all, sometimes it's better to redevelop a microarchitecture than to do a new one. The physics part is not necessary at this time, because even without it the difference isn't that big, and the GPU would still have to enlist some shader units to do the physics, which will result in lower performance.

DX10.1 isn't a great hype, but it brings some nice features and improves the shader code. As for DX10.1 titles, we had one just recently: Assassin's Creed. But because it carries the "The way it's not meant to be played" logo, DX10.1 has magically vanished in the newest patch for the title. Bye bye, 20% performance for the HD3xxx series with AA enabled.

A word on GPU architecture: a GPU with a larger memory bus is more costly to produce than another one with more memory. I agree on the latency, though. The fact that the RV770XT memory frequency can go beyond 4GHz (effective) is awesome. ATI has made some nice improvements in the new GPU: they've clocked the shader units higher than the core frequency and improved the architecture of its SUs as well. This 50% difference in the shaders from Nvidia is due to the fact that they've clocked them higher (and believe me, it's not 50%); the result will be a card that's huge and very hot, and will draw more power.

ATI's strategy is working out nicely; they're conquering the mid-range and low-end market first. A GPU company's profits come from the low-end and mid-range cards; the high-end ones are more expensive to produce and harder to sell. Let's face it, only 3-4% of all gamers play their titles at 1900x1200 with 8xAA and 16x AF (because frankly most of them don't know what AA really is, or AF for that matter). A mid-range card will play all your games just fine at a resolution of 1600x1200 with 2-4xAA enabled, which is more than anyone needs. I'm currently on an HD3850 with 256MB of GDDR3 memory and I play Crysis with most settings high, a few on medium, at a resolution of 1280x768 with 2xAA enabled; the graphics are just fine for a 19-inch widescreen LCD. If you have a 9600GT the result will be the same.

Let's face facts: 90% of people who buy PCs don't know what they're buying, they just know two words, "Pentium" and "GeForce". I can't believe how many CPUs Intel sells to these kinds of customers, and how many GPUs Nvidia sells. When you go to your average Joe's house and ask him what his PC is, he says "Pentium" or "GeForce"; he doesn't even consider that the Pentium line has been dead for 2 years now. That really gets my goat. I think that 30-40% of Intel and Nvidia sales are due to this fact and this fact alone.


RE: Misleading title..
By tkSteveFOX on 5/21/2008 3:47:58 AM , Rating: 1
The GLIDE API was great once; it's too bad that MS sank it. May the Voodoo legacy rest in peace. If GLIDE had been developed over time, I'm sure that by now we would have graphics twice as good and VGAs twice as cheap.


RE: Misleading title..
By DeMagH on 5/21/2008 5:18:13 AM , Rating: 2
Not that i disagree with you at any point, i find all your points valid and strong, but i have the following comments:

- Please do some spacing next time within the text, it felt like reading an old journal going through your post.

- 9600GT should be compared to HD 3870 not 3850


RE: Misleading title..
By imaheadcase on 5/21/2008 9:21:07 AM , Rating: 2
That has nothing to do with what I said, tkSteveFOX. You might as well make a different topic. Take a basic writing class, too.


RE: Misleading title..
By TennesseeTony on 5/21/2008 7:56:43 PM , Rating: 2
You can spell, now learn about "Paragraphs."


10.1 - oh well
By FXi on 5/20/2008 8:37:45 PM , Rating: 2
Frankly, it's faster, and that's a good thing, and the physics might be interesting. But this still feels like a larger 8800, really. It's much like the 7800-to-7900 step, only with more shaders and a wider memory bus. It's a refresh product with some tweaks. The 9xxx series is a joke; everyone agrees on that. What's hilarious is that NVIDIA is going to "introduce" the 9xxxM mobile chips as if they were something new, while the 280/260 are going to be on the desktop market. Don't you think the laptop market will laugh?

I "might" trade the 10.1 for the physics. I might not. Normally a new card is a must-buy for me, but this time, if I don't really need the power, I may just pass until the real 10.1 deal comes out in Spring '09.

I mean, how many things need more power than the 8800? Not much. They both have 10.0, so no improvement there. Physics? Maybe. Spring '09 would probably see 10.1 and 45nm chips, plus all those 240 shaders. I guess I'm just not convinced this is a must-have. And the 9xxx mobiles are a joke, just like the desktop parts.

Not really seeing much of interest here.




RE: 10.1 - oh well
By PrinceGaz on 5/20/2008 9:28:08 PM , Rating: 2
quote:
I mean, how many things need more power than the 8800? Not much. They both have 10.0, so no improvement there. Physics? Maybe.


The 8800 series should support PhysX if NVIDIA gets around to adding it to the driver, as it uses a CUDA-compliant GPU.
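As a rough illustration of that point, the sketch below enumerates CUDA devices and reports their compute capability using the standard CUDA runtime API (cudaGetDeviceCount / cudaGetDeviceProperties); any card that shows up here is a candidate for a GPU PhysX path once the driver exposes it. The file name is hypothetical, and this is only a minimal example, not NVIDIA's actual PhysX detection code.

// cuda_device_probe.cpp (hypothetical file name) -- build with nvcc or link against cudart.
// Lists CUDA-capable GPUs and their compute capability.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess || count == 0) {
        std::printf("No CUDA-capable GPU found (%s)\n", cudaGetErrorString(err));
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        // G80/G92-class parts such as the GeForce 8800 report compute capability 1.x
        std::printf("Device %d: %s, compute capability %d.%d\n",
                    i, prop.name, prop.major, prop.minor);
    }
    return 0;
}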


RE: 10.1 - oh well
By just4U on 5/20/2008 11:47:38 PM , Rating: 2
It's what the 9800GTX should have been... well, better late than never, I guess. Some of the people who held onto their original 8800GTXs might actually pull the trigger on this one. The extra bandwidth will be nice for high-end monitors.


RE: 10.1 - oh well
By Hlafordlaes on 5/21/2008 4:25:36 AM , Rating: 2
The 9600GT, single or in SLI, is not a bad value:
http://techreport.com/articles.x/14686

Not all of the 9xxx series is a joke.


RE: 10.1 - oh well
By FITCamaro on 5/21/2008 6:41:30 AM , Rating: 2
I'll just have to stick with the dual 8800GTS 512s I just ordered. Should be more than enough for the next 2-3 years. DX10.1 would be nice, but I rarely ever enable AA anyway.


Enact an embargo?
By vgermax on 5/20/2008 7:22:11 PM , Rating: 2
Presumably they mean will lift an embargo...

As always with this type of thing, speculation will only take you so far; real-world performance tends to matter more than theoretical limits. As for support of a point release being significant, it didn't play out that way with DirectX 8 vs. 8.1, and since that release was much more significant, it would be difficult to argue otherwise in this instance.




RE: Enact an embargo?
By KristopherKubicki (blog) on 5/20/2008 7:41:44 PM , Rating: 2
No, later this week NVIDIA will host an Editor's Day, putting websites under an embargo. Not DailyTech, though.


RE: Enact an embargo?
By CyborgTMT on 5/20/2008 10:53:03 PM , Rating: 3
At which point Kris will break into Derek's office at 2 AM and run a few benchmarks.... eeerrr... I mean get his stapler back.

http://www.virtualstapler.com/office_space/images/...


hmm
By jay401 on 5/21/2008 8:06:48 AM , Rating: 3
So basically we can expect:

GDDR3
DX10.0
and
$TooMuch

I think I'll keep my 8800GT, thanks. It already offers the first two and doesn't ruin it with the third item. ;P
Then again, I'm not exactly playing anything that causes it to struggle, so there's no reason to "upgrade" (in quotes because 10.0 and GDDR3 aren't exactly an upgrade when I already have them).




RE: hmm
By Chris123234 on 5/21/2008 1:55:41 PM , Rating: 2
Hahaha, holy shit, a lot of you are ignorant.

This new card will absolutely and utterly destroy your 8800GT. "GDDR3" and "DX10.0" on their own say nothing about what a card can and cannot do.

As for the rest of you,

DX10.1 gave a big boost over DX10.0 when using AA. Those of you saying it adds features but not performance are morons.

GDDR4 is not used because it is not very good memory and only offered a clock-speed advantage. Now that GDDR5 is out, GDDR4 might as well not exist.

ATi's single-chip card is meant for the mid-to-high end, while its dual-chip (X2) card is meant for enthusiast-level performance. You won't see any sort of GX2-style card built from these new NVIDIA processors until they shrink them sometime in the distant future. They just use too much power, put out too much heat, and are way too big.

To the guy who said it would be too costly to add the silicon for DX10.1 at this time: no, it would be impossible.

tkSteveFOX, they didn't make this card in six months. These companies work on two or more architectures at a time, for years each; it's not one guy doing all the work. Also, learn to type some paragraphs, Jesus Christ.
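For readers wondering how a game can use 10.1 where it exists without abandoning 10.0 hardware, the Windows-only sketch below tries to create a Direct3D 10.1 device first and falls back to 10.0, using the standard d3d10_1.h header and the D3D10CreateDevice1 call. It is only a minimal illustration of the probing pattern under those assumptions, not how any particular title actually handles it; the file name is hypothetical.

// d3d_feature_probe.cpp (hypothetical file name) -- Windows only; link against d3d10_1.lib.
// Try a Direct3D 10.1 hardware device first, then fall back to 10.0.
#include <cstdio>
#include <d3d10_1.h>

int main() {
    const D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1,   // preferred: exposes the 10.1 additions
        D3D10_FEATURE_LEVEL_10_0,   // fallback for hardware without 10.1 support
    };
    for (D3D10_FEATURE_LEVEL1 level : levels) {
        ID3D10Device1* device = nullptr;
        HRESULT hr = D3D10CreateDevice1(nullptr, D3D10_DRIVER_TYPE_HARDWARE, nullptr,
                                        0, level, D3D10_1_SDK_VERSION, &device);
        if (SUCCEEDED(hr)) {
            std::printf("Created device at feature level 0x%04x\n", (unsigned)level);
            device->Release();
            return 0;
        }
    }
    std::printf("Neither a 10.1 nor a 10.0 hardware device is available\n");
    return 1;
}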


RE: hmm
By ZaethDekar on 5/21/2008 2:29:45 PM , Rating: 2
jay401,

You are slightly confused about the benefits of the new cards. For one, if you go with AMD in July it will be GDDR5 and DX10.1.

If you go with NVIDIA, you will get higher clocks, more bandwidth, and an overall better experience.

However, just like anything else, if it ain't broke, don't fix it. If your card works fine for you, that's good. Honestly, I am using an XFX GeForce 8600 GT XXX overclocked to GTS speeds, and it's all I need: I can play Crysis (low-to-high custom settings), the few mods I play for HL2, and GTA SAMP, and that's all I need.

I think I will be upgrading depending on how the AMD boards with onboard ATI graphics turn out. I planned on getting a 780G board, but I may wait; we shall see.


AMD using GDDR5, not GDDR4
By Kaldskryke on 5/20/2008 6:03:06 PM , Rating: 2
Every leak and rumor says AMD's HD4870 XT will use GDDR5, not GDDR4. Since AMD is likely sticking with a 256 bit memory bus, GDDR5 will give RV770 the bandwidth it needs to compete. GDDR4 is just too slow.




RE: AMD using GDDR5, not GDDR4
By Kaleid on 5/21/2008 12:05:40 AM , Rating: 2
Yes, and a 256-bit bus will hopefully mean a much cheaper card than NVIDIA's 512-bit solution.

Both will have enough memory bandwidth; the question is whether the cores are fast enough.


RE: AMD using GDDR5, not GDDR4
By DeMagH on 5/21/2008 5:00:36 AM , Rating: 2
I think both cards will end up targeting different markets. NVIDIA's card might be compared to the 4870X2, but, at least that's what I believe, NVIDIA's single-GPU solution will obviously be faster than ATi's single-GPU part.

Though it will be interesting to see whose approach to increasing memory bandwidth works out more efficiently. I vote ATi.


I've waited this long...
By Polynikes on 5/21/2008 8:29:49 AM , Rating: 2
I'm still using an x1950XTX Crossfire setup. I've really wanted to jump to a single GPU, especially when the 8800GT came out, but I've held on... Now the 9900 series is looking pretty good, but I think I'll wait until Nehalem and do a full upgrade of everything. :)




RE: I've waited this long...
By Ananke on 5/21/2008 12:49:39 PM , Rating: 3
From a business point of view, everything is just a matter of cost. AMD is getting well positioned in the consumer market, especially during an economic slowdown. A retail consumer sees two things:
1. An AMD-based computer that costs $500 vs. an Intel-based one for $600. Guess which one they will buy :)
OR
2. AMD vs. Intel, both for $600, but the retailer's margin is $100 on the AMD machine vs. $35 on the Intel one. Again, guess which one will be sold :)

This is the market. Money always talks; the rest, the hardware tech geeks with bottomless pockets, are a mere 0.5% of the market. Negligible.

The average consumer doesn't think about specifications when buying a computer; if the computer cannot run a game, the consumer will simply not play that game. Nobody cares about DirectX 10 vs. 10.1 in the light of a dollar difference.


don't care
By cmdrdredd on 5/21/2008 5:57:17 PM , Rating: 1
Until there is a GOOD game that actually requires me to upgrade my card, I won't do it. I got an HD2900XT cheap and won't change it out, because I don't see any games (no, Crysis was a piece of shit, so don't waste time mentioning it) that make me say "WOW, I gotta play that, so I need a new card".

I don't care about these wonder gizmos NVIDIA pushes out. And AMD pushing GDDR5? Please... why do I need that, exactly? Besides running games at 238912890 fps (yes, I exaggerated)?




RE: don't care
By tkSteveFOX on 5/22/2008 5:09:58 AM , Rating: 2
I'm very much aware that they're researching a year or two in advance, and that a big team does it. Please don't mock me. I'm just saying that the average Joes say "Pentium" and "GeForce", and because of them companies like AMD suffer. They should make a law banning GPU or CPU companies from sponsoring games. "The way it's meant to be played" means: we just paid your game studio a couple of million bucks, so make this game run faster on our GPUs and disable the DX10.1 support, because we don't have it yet and "there's no need for it".

Don't these guys understand? There's no need for really high-end chips anymore unless you're doing 3D development work. Stick to FireGL and Quadro, give people GPUs at $200 max, and refresh them every year. And I agree, give the PC more games to justify the six-month refresh cycle; until that happens we don't need $600 GPUs and $1,000 CPUs.


RE: don't care
By drafz on 5/22/2008 4:38:20 PM , Rating: 2
Ah, Crysis... I see it more as a globally moddable game engine than as a single game.

Armed Assault's 1.14 final patch has been released.
It's a good ATI benchmark :thumbup: . Give it a try!


just like the G3 iPhone
By FingerMeElmo87 on 5/20/2008 4:35:44 PM , Rating: 2
Finally, NVIDIA spilled the beans on their worst-kept secret. Too bad all of this information was already available on the web weeks beforehand.




RE: just like the G3 iPhone
By Lonyo on 5/20/2008 6:23:14 PM , Rating: 2
There's been a lot of information about the upcoming cards, and not all of it was the same.
Some of it was bound to be right; now we just know which parts were.


Hmm
By Ilfirin on 5/20/2008 11:16:08 PM , Rating: 2
"NVIDIA has not delivered on this promise yet, though D10U will support CUDA, and therefore PhysX, right out of the gate."

They have, however, made some recent advances on this front:
http://developer.nvidia.com/object/physx.html
http://developer.nvidia.com/object/apex_dev.html




RE: Hmm
By Belard on 5/21/2008 1:23:11 AM , Rating: 2
I'm also hoping that, between AMD's plans and some developers' plans, DX won't matter anymore. With CUDA on NVIDIA, and ATI's own CUDA equivalent, a game designer can say "Eat me, M$" and hit the hardware directly. Then games on Windows XP would look just as good as, or better than, on Vista.

That's one of the reasons Voodoo did well with GLIDE games compared to early DX games on ATI and TNT cards (GLIDE being the graphics API of the day, since DX wasn't much of anything in the '90s).


That's really cool.
By gochichi on 5/27/2008 3:50:06 PM , Rating: 2
I have waited too long to jump in and still have an 8600 GT as my highest-end graphics card (I own several computers), so I am really looking forward to upgrading; this is excellent news. I hope both ATI and NVIDIA release excellent products in June.

I am definitely not interested in the high end, and that has everything to do with power consumption rather than price per se; otherwise I'd buy an 8800 GTS 512 in a heartbeat. I was most interested in a 9600GT (because of its energy consumption), and hopefully AMD/ATI can release a product that is right around that level of power draw and even better in performance.

I tend to default to ATI, all other things being equal, but ATI just hasn't offered anything worth buying in a while. I also use Ubuntu Linux sometimes, and ATI is just not competitive under Linux. I may forgo upgrading my graphics card altogether and get a Blu-ray ROM drive instead (both come out of the same entertainment budget, so it's not as unrelated as it may appear).



Related Articles
NVIDIA, AMD Launch New Video Cards
April 1, 2008, 10:12 PM
GeForce 8 To Get Software PhysX Engine
February 15, 2008, 10:33 AM
Update: NVIDIA to Acquire AGEIA
February 4, 2008, 5:31 PM












