

The AMD "PingPong" demo highlights the use of DirectX 10.1's new global illumination engine  (Source: AMD)
ATI Radeon HD 3800 bets big on next-generation DirectX

On the eve of NVIDIA's GeForce 8800 GT launch, a memo was sent out to technology journalists containing details of AMD's upcoming RV670 graphics processor, officially dubbed the ATI Radeon HD 3800.

Yet rather than attacking the processing power or thermal envelope of NVIDIA's new 65nm GPU, the AMD document focuses on DirectX 10.1 support.  AMD details many of the features of this new abstraction layer, including additional anti-aliasing patterns and a new global illumination engine -- all of which are supported by the RV670 graphics processor.
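By way of illustration, an application can already probe which multisample modes a Direct3D 10-class device exposes before enabling them. The sketch below is a hypothetical example, not code taken from AMD's whitepaper; the helper name SupportsMsaa and the choice of render target format are assumptions for illustration only.

#include <d3d10_1.h>

// Hypothetical helper: ask an existing Direct3D 10 device whether it supports
// a given multisample (MSAA) count on a standard 32-bit render target.
// The device is assumed to have been created elsewhere.
bool SupportsMsaa(ID3D10Device* pDevice, UINT sampleCount)
{
    UINT qualityLevels = 0;
    HRESULT hr = pDevice->CheckMultisampleQualityLevels(
        DXGI_FORMAT_R8G8B8A8_UNORM,  // render target format
        sampleCount,                 // e.g. 4 or 8 samples per pixel
        &qualityLevels);
    // A non-zero quality level count means the mode is usable on this device.
    return SUCCEEDED(hr) && qualityLevels > 0;
}

DirectX 10.1 goes further, requiring 4x MSAA support and giving shaders access to the individual samples, but a query like the one above works the same way on both 10.0 and 10.1 hardware.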

RV670, slated to launch next month simultaneously with AMD's upcoming Phenom desktop processor, is still very hush-hush even so close to launch time.  AMD corporate roadmaps previously indicated that RV670 is, for the most part, an optical shrink of the lackluster 80nm R600 graphics processor.

AMD's newest document describes RV670 as follows: "The new ATI Radeon HD 3800 series of GPUs are the first to be designed for DirectX 10.1, as well as other cutting edge technologies, including PCI Express 2.0, Unified Video Decoder (UVD), hardware accelerated tessellation, and power efficient 55nm transistor design."

A copy of the DirectX 10.1 whitepaper is still available at PCPerspective.

NVIDIA declined to launch a next-generation graphics processor in 2007, instead opting for the 65nm optical shrink of the G80 architecture -- dubbed G92.  While G92 features fewer unified shaders than the GeForce 8800 GTX or GeForce 8800 Ultra, the majority of the architecture remains wholly intact. 

However, what is clear is that NVIDIA and AMD will both benefit from the reduced process node.  Smaller nodes mean lower leakage and thermal envelopes -- and in turn quieter and more robust cooling and packaging. 

Banking on DirectX 10.1 to sell the Radeon HD 3800 is certainly not without its critics.  DirectX 10, while clearly the future of game development, has received slow adoption from the developer community even with hardware availability in its second year now.  BioShock, Crysis, Hellgate: London and Unreal Tournament 3 are the only big-ticket titles of 2007 that utilize DirectX 10.

DirectX 10.1 is expected to launch with Windows Vista Service Pack 1 and is backwards compatible with the existing DirectX 10 layer.
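Because the new runtime extends rather than replaces DirectX 10, an application can ask for the 10.1 feature level first and quietly fall back to 10.0 on existing hardware. The sketch below is a hypothetical illustration of that pattern; the CreateBestDevice helper and its structure are assumptions, not code from AMD or Microsoft.

#include <d3d10_1.h>

// Hypothetical helper: create a Direct3D 10.1 device if the hardware and
// driver support it, otherwise fall back to the 10.0 feature level.
ID3D10Device1* CreateBestDevice()
{
    const D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1,   // Radeon HD 3800-class hardware
        D3D10_FEATURE_LEVEL_10_0    // existing DirectX 10 parts
    };

    for (int i = 0; i < 2; ++i)
    {
        ID3D10Device1* pDevice = NULL;
        HRESULT hr = D3D10CreateDevice1(
            NULL,                        // default adapter
            D3D10_DRIVER_TYPE_HARDWARE,  // hardware acceleration
            NULL,                        // no software rasterizer module
            0,                           // no creation flags
            levels[i],                   // requested feature level
            D3D10_1_SDK_VERSION,
            &pDevice);
        if (SUCCEEDED(hr))
            return pDevice;              // highest feature level available
    }
    return NULL;                         // no DirectX 10-class device present
}

The same ID3D10Device1 interface is returned either way; code written against DirectX 10 continues to run, and 10.1-only features are simply unavailable at the lower feature level.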


Comments



Show me the Goods - I dare Nvidia & AMD
By cheetah2k on 10/29/2007 4:41:02 AM , Rating: 3
I don't really care what these guys are doing unless they can provide me with a video card that does more than an average of 40fps at 1024x768 in Crysis.

I mean, WTF guys... My goddamn 8800GTXs in SLI are useless in a real DX10 situation, and now DX10.1?

Sure, optimisations are required to make Crysis more playable, but right now both AMD and Nvidia (especially Nvidia) should be thinking real hard about all the hype they filled the press with, touting blazing speeds in DX10...

:-/ Shocking.....




RE: Show me the Goods - I dare Nvidia & AMD
By killerroach on 10/29/2007 9:04:31 AM , Rating: 2
Speed was never really a marketing point of DirectX 10... lower overhead and better memory management, yes; speed... not so much. Most of that lower overhead, in turn, gets consumed when the far more complex shaders of DirectX 10's Shader Model 4.0 come into play.

Simply put, a large portion of what DX10 does can be done in DX9... it's just that doing it exactly the way DX10 does (with the same levels of precision and complexity) would completely choke DX9 hardware.


RE: Show me the Goods - I dare Nvidia & AMD
By Mitch101 on 10/29/2007 9:41:37 AM , Rating: 2
Agreed, speed is being phased out; it's all going to be about the eye candy.

It's like the physics cards from Ageia: they weren't made to create higher frame rates, they are there to add elements of realism to the gameplay. The extra details created by the physics card give the graphics card more to draw, thus slowing it down a bit, but ask anyone playing a game with one and they love the eye candy and elements a PPU brings to the game, even if the developers haven't added that much over the existing game. Necessary? Not yet, but it certainly adds to the gameplay.

I look forward to fragging a buddy with a car tire or engine block when I launch a rocket at a car.


RE: Show me the Goods - I dare Nvidia & AMD
By murphyslabrat on 10/29/2007 10:15:13 AM , Rating: 5
You can already frag a buddy via engine block or tire, only it's done with the gravity gun in HL2:Deathmatch.

Valve needs to add multiplayer to Portal. You could have portaling deathmatches, races, CTF matches, etc., all with the awesomeness of the portal gun.


RE: Show me the Goods - I dare Nvidia & AMD
By Bluestealth on 10/29/2007 7:16:49 PM , Rating: 2
You can do similar things in Bioshock with telekinesis or whatever they "want" to call it. When I accidentally threw a corpse into the head of a slicer and watched him dropped to the ground... I had a disturbing moment... in which I thought... that was awesome. I have since learned to use it better, mainly because I realized it wasn't just a gimmick to pickup items.


By Bluestealth on 10/29/2007 8:50:43 PM , Rating: 2
Errr.... dropped = drop... Firefox can only save you from spelling not grammar mistakes :)

I also should really take advantage of this "preview" feature.


RE: Show me the Goods - I dare Nvidia & AMD
By mindless1 on 10/29/2007 1:44:46 PM , Rating: 2
It's been "about the eye candy" ever since the first Geforce and Radeon cards were released. Otherwise, merely producing ~ 50FPS at a typical gamer's monitor resolution isn't a tough target to hit.


By Mitch101 on 10/29/2007 2:43:38 PM , Rating: 3
I agreed with that until widescreen LCDs started replacing CRTs; then FPS was back for a while.

Right now, 1680x1050 on a 22" widescreen is the sweet spot for most of the people I know looking for the most eye candy per buck, and it looks like the 8800GT is going to make that mainstream. Now we have to see what ATI is going to drop in the next couple of weeks.

With a SOYO 24" 1920x1080 monitors dropping into the $300.00 range recently it wont be long before that becomes the next standard resolution. Maybe this time next year?

What no one tells you about a nice big widescreen monitor is that you need some GPU power to push all those pixels at its native resolution at a decent framerate in today's games (1920x1080 is roughly 18 percent more pixels than 1680x1050). But it looks like mainstream graphics cards (the 8800GT and the HD 38XX) are going to help the 22" crowd out at native resolution in next-gen games.


RE: Show me the Goods - I dare Nvidia & AMD
By wallijonn on 10/29/07, Rating: -1
By killerroach on 10/29/2007 4:02:37 PM , Rating: 3
Have you been using Vista lately? I have... and the video card drivers have improved to the point where Vista and XP are basically on par with each other in DX9 gaming, with a few titles (Oblivion a notable example) that are faster in Vista. Most of the performance penalties from Vista's launch were due to the drivers from both ATI and nVidia being absolute garbage, with the remainder being stuff that Microsoft has fixed through a series of compatibility, performance, and stability patches for Vista.

You may want to actually have some experience with what you're talking about before you make claims like that. I'm no Microsoft fan, but Vista is at a point where, except in certain situations (mostly if you have obscure hardware or rely on a lot of third-party networking software that hasn't been updated in a while), there's little reason to recommend XP over Vista anymore.


RE: Show me the Goods - I dare Nvidia & AMD
By Tedtalker1 on 10/29/07, Rating: 0
RE: Show me the Goods - I dare Nvidia & AMD
By Dharl on 10/29/07, Rating: 0
By NoSoftwarePatents on 10/29/2007 4:13:12 PM , Rating: 1
"God" is not exclusively a swear word, assuming that's what you mean. It's a multi-purpose word that can be used for many things.


ATI/AMD brings out the heavy guns
By AOforever1 on 10/29/2007 1:51:35 AM , Rating: 3
I was just about to order the 8800GT tomorrow. Nice move, AMD; good to see the market is moving forward.




RE: ATI/AMD brings out the heavy guns
By hrah20 on 10/29/2007 3:59:16 AM , Rating: 1
I would wait a little longer, into 2008, to hear something about the G90, or better yet the G100, but if you're in a hurry, go for it!!!


RE: ATI/AMD brings out the heavy guns
By wordsworm on 10/29/07, Rating: 0
RE: ATI/AMD brings out the heavy guns
By ZmaxDP on 10/29/2007 3:49:22 PM , Rating: 2
You know, if you really want to analyze "multi-core", graphics cards are way ahead of the CPUs. AMD's TOTL card has 360 "stream processors" and Nvidia's has maybe 120? I can't remember that one for sure. Granted, it isn't like 3 complete cores on one die, but both AMD and Intel are transitioning away from that approach anyway. Their chips share cache and FSB (Intel) or HyperTransport bandwidth and memory access (AMD), and are likely to continue sharing more and more as time goes on. The main difference is that the multi-purpose nature of the CPU prevents it from having as much shared hardware as the GPU. Point being, we've got 360-core GPUs out there, so condition 2 is way past being met...


Leaked....
By Regs on 10/29/2007 2:18:15 AM , Rating: 3
I'm getting sick of the word. It should be as familiar in the GPU world by now as "stream processor" or "shading unit".




RE: Leaked....
By JackBeQuick on 10/29/2007 2:24:50 AM , Rating: 1
DT practically said it: AMD sent them the doc to sow confusion around the 8800 GT launch. "Leaked," LOL.


RE: Leaked....
By Iger on 10/29/2007 4:56:16 AM , Rating: 1
That was my first thought too... And it seems to me this has become the new everyday tactic for AMD - so annoying... (oh yes, my CPU is AMD and my video card is ATI)


fixed R600 issues?
By MGSsancho on 10/29/2007 5:03:45 AM , Rating: 2
I hope they fixed the hardware resolve? http://www.beyond3d.com/content/reviews/16/16
I mean, architecturally it's great, but a few things are lacking. Also, if they get UVD fully working it would be awsomastastic.
Here are some filtering throughput benchmarks of the R600: http://www.beyond3d.com/content/reviews/16/13

Note, that's the 2900XT being tested, nothing about the R700. I'm just posting it so you can see where AMD lagged a bit. These issues were raised back in February (rumors, so kill me), but AMD had no more time to stall on this. Oh well, I honestly hope it works out great. Oh, oh, oh, and I'm requesting triple play!




RE: fixed R600 issues?
By MGSsancho on 10/29/2007 5:05:10 AM , Rating: 2
I forgot UVD was added in the 2400 and 2600. Sorry, I'm an idiot.


<no subject>
By Scabies on 10/29/2007 2:56:33 AM , Rating: 3
quote:
"The new ATI Radeon HD 3800 series of GPUs are the first to be designed for DirectX 10.1, as well as other cutting edge technologies, including PCI Express 2.0, Unified Video Decoder (UVD),...."


Which I find interesting, as the cover story for the 2900 series lacking the UVD was (paraphrased) "consumers out to buy the 2900 will have a system powerful enough to not require a dedicated HD-Video decoder. The benefit of the UVD can be found in the 2600 and 2400 lineup."
Is the 3000 series not overkill enough to merit a similar omission?




Ridiculous
By oddity21 on 10/29/2007 6:09:30 AM , Rating: 1
The DirectX 10 titles we have right now are either...

a) ...no different from their DX9 counterparts, or sporting extremely minor differences nobody would actually notice in gameplay...
b) ...run horribly at 'enthusiast' resolutions...

...and they still want to push DX10.1? This is totally ridiculous. The document's focus on this irrelevant feature scares me because it's possible that the 3800 cards won't really improve DX10 performance.




RE: Ridiculous
By Mitch101 on 10/29/2007 9:32:13 AM , Rating: 1
Marketing: any possible advantage one company might have over another company's product, they promote it like it's been sent from God himself, and if you don't get it then "you'll get cancer and die."

It's the SmartShader 2.0 vs. 3.0 marketing war all over again.


.
By semo on 10/29/2007 8:43:22 AM , Rating: 2
quote:
Smaller nodes mean lower leakage and thermal envelopes
I thought smaller nodes meant higher leakage, since the dielectric loses its effectiveness. Wasn't that the reason why Prescotts were so hot?




Whitepaper Link
By Mitch101 on 10/29/2007 9:57:03 AM , Rating: 2
Whitepaper from Team ATI.

http://www.teamati.com/DirectX%2010_1%20White%20Pa...

It's actually a decent read with good explanations. If the 8800GT and ATI's offering stay within a few frames of each other, then I might just get the ATI card this round, especially if they are priced about the same.




Shader based AA
By scrapsma54 on 10/31/2007 9:30:50 AM , Rating: 2
Does anyone agree this is a huge leap in AA performance? Gather4 also seems to bring better performance.




I wonder why...
By Hieyeck on 10/31/2007 6:04:56 PM , Rating: 2
quote:
DirectX 10, while clearly the future of game development, has received slow adoption from the developer community even with hardware availability in its second year now.

*cough*vista*cough*

What MS needs to do is address the gamer and performance niche and present us with a stripped-down Vista. No fancy eye candy or "accessories", just a good ol' NT desktop with DX10/10.1 support. Hell, if they did this, they could blow Linux out of the water.

I wonder if Server 2008 has DX10 support...




dumb
By Gul Westfale on 10/29/2007 2:02:10 PM , Rating: 1
This is kinda dumb, and for two reasons:

- "leaked": this wasn't leaked, it was sent to journalists by AMD. the word leaked implies that dailytech somehow got the scoop from an unnamed insider, when it was really just a press release. then again, most leaks are calculated releases of information that the involved party then tries to pass off as accidental in order to generate more buzz... i don't think many people that frequent this site regularly would still fall for such BS.

- "superiority of DX10.1": well obviously they would emphasize that. first, it looks like they will be first to market and thus they have a marketing advantage (if not exactly technological advantage, since the move from DX10 to DX10.1 hasn't pulled anyone off their chair yet) over nvidia; and second, this allows them to tell their existing customers to upgrade from previous ATI products to newer DX10.1 hardware.... don't we see the exact same thing going each time someone releases a new graphics chip?

Couldn't you just have said "ATI sends out press release, says new card is [insert tired old marketing term here]"?




Awesome hardware...
By elpresidente2075 on 10/29/07, Rating: -1
RE: Awesome hardware...
By ChronoReverse on 10/29/2007 2:06:28 AM , Rating: 5
If you haven't noticed, ATI drivers were actually better than Nvidia's on Windows for a large portion of the past year. I wouldn't give either an advantage in terms of drivers under Windows right now.


RE: Awesome hardware...
By elpresidente2075 on 10/29/2007 11:45:50 AM , Rating: 2
Perhaps you should understand that Windows isn't the only operating system that people use. I actually run Windows MCE, SUSE 10.3x64 and Ubuntu 7.10x64 on my machine right now.

Now, don't get me wrong, in no way do I say that ATI sucks, but their software has been quite lackluster. It's getting better, but it's still got a way to go.


RE: Awesome hardware...
By mindless1 on 10/29/2007 1:48:40 PM , Rating: 2
Funny, most people didn't notice that. Calling it even and giving neither an advantage is arbitrary and almost always wrong. The odds that the perceived quality is actually equal are maybe 1 in 100, even if you dismiss differences under 1%, yet people keep trying to claim it over and over and it's never true.

What's more true is that certain games may have better support for either GPU and driver at any moment in time, or even permanently so.


RE: Awesome hardware...
By AOforever1 on 10/29/2007 2:06:37 AM , Rating: 4
When was the last time you tried ATI drivers?
I don't know about you, but ATI has improved greatly from the ancient days.


RE: Awesome hardware...
By Samus on 10/29/07, Rating: 0
RE: Awesome hardware...
By Lonyo on 10/29/2007 5:43:04 AM , Rating: 4
I wanted temp monitoring in my nVidia drivers, so I had to install the nTune thing. That resulted in my driver control panel crashing every other time I clicked on something, and still didn't display the temperature in the driver control panel.
THEN I found out there's another program which you run to get temperatures, which wasn't indicated at all anywhere in the control panel (which still crashes every other time I click on something).

nVidia drivers have been getting worse, while ATI's improve.


RE: Awesome hardware...
By Serlant on 10/29/2007 8:40:50 AM , Rating: 2
I'm sorry, but how long did it take NVIDIA to get its 8800 series drivers sorted AFTER the product launch? Saying they are where they are now because of their drivers is bull****.


RE: Awesome hardware...
By StevoLincolnite on 10/29/2007 2:25:04 PM , Rating: 2
Alright, for one, S3 and Matrox are still alive and kicking; Matrox is doing small updates to their Parhelia GPU, and S3 is still doing updates to their Chrome series of GPUs.

And you really haven't tried ATI's drivers lately, have you?
They are stable, they run great, and they also have another advantage for the tweakers out there, called "The Omega Drivers". I managed to get my Radeon 9550 from 8000 3DMarks in 3DMark 2001 to about 10k by tweaking the Omega drivers, and I gained similar results with my X1650 Pro (although not as much).

And back when the TNT was the norm, nVidia actually bought a company that made 3D cards for them, and which also modified nVidia's own drivers to obtain a staggering 10% performance boost. That's when nVidia's drivers really kicked off.
(I can't remember the name of the company, but I was around when they acquired them).

If you want complete crap, try to find nVidia's Riva 128 drivers and run them in Windows 98, and have fun.

Seriously, people this day and age should be happy that drivers are no longer such a pain to install, with constant glitches in every game, random blue screens of death, Windows hangs, artifacts, incompatibilities, etc.


RE: Awesome hardware...
By elpresidente2075 on 10/29/2007 11:41:25 AM , Rating: 2
I have an ATI based laptop with the latest ATI Catalyst Driver suite, installed last week.

While I will agree that the drivers are getting better, they're still not up to snuff. Forcing me to download/install the .NET framework 2.0, installing several processes, and having those processes communicate through TCP/IP is not what I call good driver architecture.

Don't even get me started on the Linux drivers, or the fact they want you to install Steam (great program, btw) and some MMO whose name I cannot recall right now.

For those talking about Ntune, there's a program called RivaTuner that is much better and also more stable, not to mention it works with both ATI & Nvidia cards.

Regardless, the ATI drivers have a very long way before they're acceptable, IMO.


RE: Awesome hardware...
By StevoLincolnite on 10/29/2007 2:28:33 PM , Rating: 4
Grab the Omegas; it's a laptop Mobility Radeon's wet dream.
No issues, no driver modding, lots of tweakable buttons and settings, overclocking tools, latency tools; it even boosts performance, it's stable and fast, and you can access it via the system tray thingy, or the old-fashioned way, by right-clicking the desktop >>> Properties >>> Settings >>> Advanced.


RE: Awesome hardware...
By Hieyeck on 10/31/2007 6:11:44 PM , Rating: 1
Conversely, download only the drivers and not the entire Catalyst suite. Your inability to take the time to properly browse a website is not ATI/AMD's problem.


RE: Awesome hardware...
By mindless1 on 10/29/2007 1:52:08 PM , Rating: 2
Yes, ATI's drivers are much better than they used to be, but that's more like an admission that they were bad at drivers than any kind of comparison of their final state at any given moment.

Do I even need to mention CCC? What a steaming pile.


RE: Awesome hardware...
By Dribble on 10/29/2007 2:09:29 PM , Rating: 2
Look at the reviews of the X2900s - performance is all over the place. E.g. BioShock:
DX9: approx 8800GTX performance
DX10: 2/3 of 8800GTX performance
Xfire: well, at first this was slower than no Xfire; now it adds a bit.

That's about their most competitive game. If they could get it working across the board like they managed in DX9 BioShock, then I would agree they have good drivers. Until then, you've got to say drivers are holding them back.


same ole bS
By toyota on 10/29/07, Rating: -1
RE: same ole bS
By ZavyZavy on 10/29/2007 4:15:58 AM , Rating: 3
Even if that were the case, there is nothing wrong with getting the consumer to wait for your product. It encourages competition eventually.
In some cases it works well: PS2 Tekken Tag Tournament screenshots "leaked" right before the Dreamcast launch.
And in some cases it doesn't: the PS3 FF7 technical demo "leaked" right after the Xbox 360 launch.


RE: same ole bS
By ksherman on 10/29/2007 7:58:40 AM , Rating: 2
Hmm, I didn't hear anything about that FF7 demo... Just looked it up on youztubez; for the first 1:26 of the 1:30 video I was like WTF, they just put the PS CD in a PS3, who cares... Then Cloud jumped off the train. Admittedly, that was pretty sweet, but I like my blocky Cloud ;) I half expected it to just be a button-mashing fighting game, not an epic RPG.


RE: same ole bS
By mindless1 on 10/29/2007 2:11:22 PM , Rating: 1
If you find nothing wrong with misleading advertising, you are probably part of the minority. There most definitely is something wrong with "getting a consumer to wait" as a deliberate act instead of just supplying clear and accurate information.

The PingPong demo is a good example: a deliberate choice to contrast turning off a specific feature instead of contrasting the best effort possible using DX10.1 vs. DX9.

Deliberate deceit is not something I like to reward by buying a product. Your criteria may be different, but it's not as though eliminating this aspect of marketing would be anything but positive for consumers. I find the same fault with nVidia, Intel, et al. on occasion, so it's not as though I'm picking on ATI; that's just what this news article is about, so it is timely. Whoever is falling behind does it.

Equally so, we will continue to point out such factors from all manufacturers, because it matters!


RE: same ole bS
By Lightning III on 10/29/2007 8:11:12 AM , Rating: 1
Yeah, that's why NVIDIA moved up the launch of the G92, so it wouldn't come out with the 670 slapping it around like a redheaded stepchild.


"We basically took a look at this situation and said, this is bullshit." -- Newegg Chief Legal Officer Lee Cheng's take on patent troll Soverain

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki