



[Images: ATI DirectX 11 demo rendered on next-generation hardware, shown without tessellation and with tessellation]
Features like tessellation and physics may finally challenge graphics cards

Video games have reached a plateau in recent years. Games still look good, but there hasn't been the leap toward photo-realism that many have been seeking. The hardware is powerful enough, so what is the problem? The finger can be pointed squarely at DirectX.

DirectX is probably the best known collection of Application Programming Interfaces (APIs) out there. Originally developed to transition game development from MS-DOS to Windows 95, it allows game developers to easily program for a wide variety of hardware.

The development of DirectX over the last 14 years has paralleled the development of Windows. Updates came frequently from version 5.0 through DirectX 9.0, with gamers eagerly downloading each one in order to unlock higher performance or better features. That was because dozens of video card companies were developing hardware at the turn of the century. With only three major players today, most performance boosts have come from the GPU manufacturers themselves.

Nonetheless, Microsoft still understands the importance of driving the baseline for graphics technology. Every major release of Windows since Windows 95 has come with at least a DirectX update. Windows XP came with DirectX 8.1, XP Service Pack 2 included DirectX 9.0c, Vista launched with DirectX 10, and Windows 7 will launch with DirectX 11 on October 22.

Windows Vista introduced the Windows Display Driver Model, which allowed new features such as virtualized video memory and scheduling of concurrent graphics contexts. Since DirectX 10 was so closely integrated with Vista, it could not easily be used with older versions of Windows. Along with poor consumer satisfaction with Vista, this led to a dearth of game developers programming for DirectX 10, despite its many features. Game developers target the largest number of consumers possible, and none of them wanted to program exclusively for DirectX 10 when few customers had the hardware and OS for it.

Microsoft has learned its lesson, and the situation is very different with DirectX 11. It is essentially a superset of DirectX 10.1, itself a superset of DirectX 10. This means that game developers will be able to design their games for DirectX 11, but the Direct3D 11 runtime will scale back graphics features that are not supported by the hardware. This also means that Windows Vista users will be able to install DirectX 11, which means a bigger market for game developers.
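
To make this concrete, here is a minimal sketch of how a game might create a Direct3D 11 device that falls back gracefully to older hardware; the function name and error handling are illustrative, but the calls are the real Direct3D 11 entry points:

    #include <d3d11.h>
    #pragma comment(lib, "d3d11.lib")

    // Ask the runtime for the best feature level the installed GPU supports.
    // The array is walked in order, so DirectX 11 hardware gets the full
    // feature set while DirectX 10/10.1 cards still get a working device.
    HRESULT CreateDeviceWithFallback(ID3D11Device** device,
                                     ID3D11DeviceContext** context)
    {
        const D3D_FEATURE_LEVEL levels[] = {
            D3D_FEATURE_LEVEL_11_0,  // full DirectX 11 hardware
            D3D_FEATURE_LEVEL_10_1,  // DirectX 10.1 cards
            D3D_FEATURE_LEVEL_10_0,  // DirectX 10 cards
        };
        D3D_FEATURE_LEVEL granted;   // reports the level actually obtained
        return D3D11CreateDevice(
            NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0,
            levels, sizeof(levels) / sizeof(levels[0]),
            D3D11_SDK_VERSION, device, &granted, context);
    }

A game can then check the granted level at startup and switch off features such as tessellation when running on older cards.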

There are several key features in DirectX 11 that will make graphics on the screen look closer to reality. Tessellation is used to increase the polygon count in an image; the more polygons, the more realistic the image will look. Gone will be the days of blocky-looking characters, as polygon counts increase significantly with the next generation of DirectX 11 hardware.
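
For the curious, a minimal sketch of what wiring up the new tessellation stages looks like on the Direct3D 11 API; the shader objects are assumed to have been compiled and created elsewhere:

    // The hull and domain shader stages are new in Direct3D 11 and sit
    // between the vertex and pixel shaders; together with the fixed-function
    // tessellator they subdivide coarse patches into many small triangles.
    void BindTessellationPipeline(ID3D11DeviceContext* ctx,
                                  ID3D11VertexShader* vs,
                                  ID3D11HullShader*   hs,
                                  ID3D11DomainShader* ds,
                                  ID3D11PixelShader*  ps)
    {
        ctx->VSSetShader(vs, NULL, 0);
        ctx->HSSetShader(hs, NULL, 0);  // decides how finely to subdivide
        ctx->DSSetShader(ds, NULL, 0);  // positions the generated vertices
        ctx->PSSetShader(ps, NULL, 0);
        // Tessellation consumes control-point patches instead of triangles.
        ctx->IASetPrimitiveTopology(
            D3D11_PRIMITIVE_TOPOLOGY_3_CONTROL_POINT_PATCHLIST);
    }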

Multithreaded rendering will allow Direct3D work to be spread across multiple CPU cores. Most games today make real use of only two cores, but multithreaded rendering will finally make gaming on triple- and quad-core CPUs worth the cost. A faster processing pipeline and better scaling are only some of the benefits.
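
A rough sketch of the mechanism behind this, Direct3D 11's deferred contexts (the function names here are illustrative): worker threads record commands into command lists, which the main thread replays cheaply on the immediate context.

    // Runs on a worker thread: record draw calls without touching the GPU.
    void RecordOnWorkerThread(ID3D11Device* device,
                              ID3D11CommandList** outList)
    {
        ID3D11DeviceContext* deferred = NULL;
        device->CreateDeferredContext(0, &deferred);

        // ... issue state changes and draw calls on 'deferred' here,
        //     exactly as one would on the immediate context ...

        deferred->FinishCommandList(FALSE, outList); // package the work
        deferred->Release();
    }

    // Runs on the main thread: submit the pre-recorded work to the GPU.
    void SubmitOnMainThread(ID3D11DeviceContext* immediate,
                            ID3D11CommandList* list)
    {
        immediate->ExecuteCommandList(list, FALSE);
        list->Release();
    }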

DirectCompute opens up the GPU's shader cores and pipeline for general-purpose computation. It allows for non-proprietary physics implementations, which some open-source physics projects are looking to take advantage of. Video transcoding will also take a significant leap thanks to access to the many processors on a modern GPU.
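
As an illustration, a minimal sketch of kicking off a DirectCompute job; the compute shader is assumed to be compiled from HLSL (profile cs_5_0 on DirectX 11 hardware) with a [numthreads(64,1,1)] attribute, and its output buffer view created elsewhere:

    // Bind a compute shader plus the buffer it writes, then launch enough
    // thread groups to cover one GPU thread per element.
    void RunComputeJob(ID3D11DeviceContext* ctx,
                       ID3D11ComputeShader* cs,
                       ID3D11UnorderedAccessView* output,
                       UINT elementCount)
    {
        ctx->CSSetShader(cs, NULL, 0);
        ctx->CSSetUnorderedAccessViews(0, 1, &output, NULL);
        ctx->Dispatch((elementCount + 63) / 64, 1, 1); // 64 threads/group
    }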

Many improvements to image quality and realism were already made available in DirectX 10 with Shader Model 4.0, and Shader Model 5.0 in DirectX 11 will bring even more. New block-compression texture formats, BC6H for high-dynamic-range images and BC7 for standard textures, will bring higher image quality as well.
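
As a small sketch of the new formats in use, this is how a texture stored in BC7 might be described; the dimensions are arbitrary and the compressed pixel data (produced by an offline compressor) is omitted:

    // Describe an immutable 1024x1024 texture stored in BC7, one of the
    // block-compression formats introduced alongside Direct3D 11.
    D3D11_TEXTURE2D_DESC desc = {0};
    desc.Width            = 1024;
    desc.Height           = 1024;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.Format           = DXGI_FORMAT_BC7_UNORM; // BC6H covers HDR images
    desc.SampleDesc.Count = 1;
    desc.Usage            = D3D11_USAGE_IMMUTABLE;
    desc.BindFlags        = D3D11_BIND_SHADER_RESOURCE;
    // device->CreateTexture2D(&desc, &compressedData, &texture);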

Game developers are very excited about the possibilities that DirectX 11 hardware brings. There are over three dozen DirectX 11 titles currently in development, and some of them will be available for the Windows 7 launch.

Finally, graphics cards will have something to challenge them.



Comments



Not that I doubt it...
By therealnickdanger on 9/16/2009 9:09:32 AM , Rating: 5
Buuuuut:

quote:
Games Will Leap Towards Photo-Realism With DirectX 1

-Microsoft, 1995

quote:
Games Will Leap Towards Photo-Realism With DirectX 2

-Microsoft, 1996

quote:
Games Will Leap Towards Photo-Realism With DirectX 3

-Microsoft, 1996 (again)

quote:
Games Will Leap Towards Photo-Realism With DirectX 4, wait oops, nevermind.

-Microsoft, never

quote:
Games Will Leap Towards Photo-Realism With DirectX 5

-Microsoft, 1997

quote:
Games Will Leap Towards Photo-Realism With DirectX 6

-Microsoft, 1998

quote:
Games Will Leap Towards Photo-Realism With DirectX 7

-Microsoft, 1999

quote:
Games Will Leap Towards Photo-Realism With DirectX 8

-Microsoft, 2000

quote:
Games Will Leap Towards Photo-Realism With DirectX 9

-Microsoft, 2002

quote:
Games Will Leap Towards Photo-Realism With DirectX 10

-Microsoft, 2006

quote:
Games Will Leap Towards Photo-Realism With DirectX 11

-Microsoft, 2009




RE: Not that I doubt it...
By Bateluer on 9/16/2009 9:15:58 AM , Rating: 5
And technically, they'd be right on each account. With each release of DirectX, games have gotten more and more realistic looking. We're getting closer, but console oriented development is slowing the process down.


RE: Not that I doubt it...
By ksherman on 9/16/2009 9:30:33 AM , Rating: 5
Boooo consoles! Holding back our beautiful, tessellated graphics for years.


RE: Not that I doubt it...
By StevoLincolnite on 9/16/2009 10:01:01 AM , Rating: 5
The Xbox 360 can "Tessellate", so technically that's one console that hasn't been holding back tessellated games, although I'm pretty sure the Xbox 360's implementation would be incompatible with Direct X 11's version.

However, the issue is that nVidia and Intel don't support tessellation at the moment, whereas ATI has for some time; even going back to the Radeon 9xxx series, they had a form of tessellation capability called "TruForm", which was available many, many years ago. (The Radeon 9xxx series had TruForm 2.)

The big issue is that ATI only controls a fairly small piece of the pie; it honestly doesn't make sense from a developer standpoint to design a game's graphics capabilities around a fairly small piece of that pie.

Plus nVidia has been slow to move its behemoth butt from Direct X 10 to Direct X 10.1 across their entire line-up, and has been strong-arming developers not to support Direct X 10.1. (Assassin's Creed, anyone?)


RE: Not that I doubt it...
By bobvodka on 9/16/2009 10:09:30 AM , Rating: 1
You are correct that the XBox360's tessellator isn't the same as the DX11 version.

Nor is AMD's; their tessellator (which took them ages to expose even as a GL extension) is only one third of the pipeline added with D3D11, as it lacks exposure of the Domain and Hull shaders.


RE: Not that I doubt it...
By omnicronx on 9/16/2009 10:31:47 AM , Rating: 5
quote:
The Xbox 360 can "Tessellate", so technically that's one console that hasn't been holding back tessellated games, although I'm pretty sure the Xbox 360's implementation would be incompatible with Direct X 11's version.
They won't be directly compatible, but word on the street is that it will be very easy to port. Remember, previously it was pretty much an ATI/MS implementation of tessellation, and as such it was not a standard (ATI introduced tessellation a while back, and MS used it with the R500 in the 360). That being said, the underlying code should be very similar (don't fix what ain't broke), so down the road they very well could be compatible, as this could easily be handled by a good compiler.

Consoles have not been holding back tessellation, if anything they have been pushing it. The use of tessellation on the 360 will most likely persuade developers to use it once DX11 becomes available. As you stated, it is manufacturers like Nvidia and Intel holding it back, as a standard means absolutely nothing if it is not present in a large portion of hardware out there.


RE: Not that I doubt it...
By B3an on 9/16/2009 1:54:53 PM , Rating: 1
Even if that's true, the 360 does not have the power to use tessellation anywhere near the extent the upcoming DX11 cards will.
A ported 360 game that uses tessellation would still look average by PC standards.


RE: Not that I doubt it...
By omnicronx on 9/16/2009 2:19:28 PM , Rating: 2
That's not really the point here though; obviously the R500, which is based on 4-year-old technology, is not going to perform like the DX11 cards of today. The point is that porting from one to the other could be very easy. The DX11 approach is a superset of AMD's/MS's implementation, so just as DX10.1 cards will be able to run DX11 games minus whatever features they lack, the same could potentially be done with tessellation. Of course this all assumes the legwork is put in to accomplish it, but it would make sense from a development perspective, especially for cross-platform games for the 360 and PC. Tessellation is also fully hardware-scalable, meaning less complex models could be generated on lower-end hardware.


RE: Not that I doubt it...
By MrPoletski on 9/18/2009 10:02:59 AM , Rating: 2
ATi had truform in its graphics cards years ago, before the xbox360 was out...

but is everybody forgetting that the Xbox360 uses an ATi graphics chip?


RE: Not that I doubt it...
By afkrotch on 9/21/2009 2:07:40 AM , Rating: 2
The problem is games get built for the lowest common denominator. Look at it nowadays: games are built on consoles and later ported to PCs, giving PC gamers a crappy-looking game that is a terrible port and consumes a lot of resources just to look the same.

Then when a PC game that actually pushes PC hardware comes out, it's a crappy game and everyone hates how it will actually use the latest hardware and require more.

Ppl complained when Doom 3 came out. Then Crysis. "OMG, my system can't handle it. blah blah blah."


RE: Not that I doubt it...
By dragunover on 9/16/2009 9:17:12 PM , Rating: 2
I could have sworn STALKER: Clear Sky and Shadow of Chernobyl ran best on Nvidia (had the slogan and everything) until ATi came out with the 4800 series... then the first magically had DX10.1 and ran incredibly better on the 4800s, as well as logo'ing ATi.


RE: Not that I doubt it...
By DrizztVD on 9/17/2009 3:09:28 AM , Rating: 2
quote:
Boooo consoles! Holding back our beautiful, tessellated graphics for years.
quote:
The Xbox 360 can "Tessellate", so technically that's one console that hasn't been holding back tessellated games,
quote:
Consoles have not been holding back tessellation, if anything they have been pushing it. The use of tessellation on the 360 will most likely persuade developers to use it once DX11 becomes available. As you stated, it is manufacturers like Nvidia and Intel holding it back, as a standard means absolutely nothing if it is not present in a large portion of hardware out there.


Consoles have definitely been holding back advancements in PC game graphics. Go compare the Dunia engine and CryEngine 3 to CryEngine 2 (that's Far Cry 2, Crysis 2, and Crysis respectively). There are large differences.

Most notable are draw distances, which totally suck on Dunia and CryEngine 3. Those happen to be the cross-platform engines, for consoles and PC.

The problem is not support for advanced techniques. It's the relative lack of calculations per second. You cannot compare graphics on an Xbox 360 with something like a Core i7 at 4GHz with a GTX 295; the latter runs circles around the former, they're not even in the same category.

I do not think this is something to worry about, because while PC graphics advance faster, console graphics inevitably improve with newer generations. It's just sad that developers like Crytek don't program their newest engines for PCs anymore.


RE: Not that I doubt it...
By afkrotch on 9/21/2009 2:02:08 AM , Rating: 2
quote:
Plus nVidia has been slow to move it's behemoth butt from Direct X 10 and move to Direct X 10.1 across there entire line-up, and strong arming developers not to support Direct X 10.1. (Assassins creed anyone?)


Who cares? No one is even bothering with DX10. Why bother with 10.1?

I went to Vista for a short while to see the difference between 9 and 10. Didn't see any, so went back to XP. Just because it's a higher number doesn't mean it's going to be automatically better. I have not seen a single DX10 game that provides a huge quality difference, or any difference for that matter, over its DX9 variant. All I've seen is the same quality with more processing power sucked away.


RE: Not that I doubt it...
By MrBlastman on 9/16/2009 11:14:46 AM , Rating: 5
Is it just me, or is anyone else _quite_ content right now with where graphics are at--on the PC? If I were able to influence game development, I would have it take a further pause on graphics (who else has been happy to _not_ have to worry about constantly upgrading for a while?) and focus on revamping and improving gameplay.

The graphics that we have right now are darned spectacular. They are far from looking crummy like they did back in the Direct X 5 and 7 days. As is right now, games like Team Fortress 2 (art style all the way), Crysis, UT 3 engine, DCS: Black Shark/Rise of Flight, Supreme Commander etc., I could go on, all look great.

What has lagged behind in this recent generation of games is the gameplay itself. It is the number one reason I have avoided consoles this round completely. The gameplay has become dumbed-down (Even TF 2 is dumbed down), more simplistic and less skill based than in the past. If they took some of those dollars and put it towards having fun with the GREAT graphics we have now, I'd be extremely content.

But, I grew up using an Atari 2600 and before that, pinball. My level of contentment perhaps is far different from the younger generations' due to me dealing with the crummiest of crummy graphics (and I'm still quite happy playing Joust, River Raid, Star Raiders, Solaris, Castlevania, Samurai Shodown, Street Fighter, etc.). So really, Direct X 11 doesn't excite me that much--not yet. The tessellation looks neat and the multi-core optimizations are great; they are even better if they apply backwards to DX 9 games. However, I don't look forward to another graphics arms race.


RE: Not that I doubt it...
By Yaos on 9/16/2009 11:41:56 AM , Rating: 4
You're right, because this is 1980 and the people that make the graphics and the people that make the gameplay are the same.


RE: Not that I doubt it...
By MrBlastman on 9/16/2009 12:07:38 PM , Rating: 5
However, unlike in 1980, more money nowadays is being spent on graphics than on the designers/producers who create the gameplay. ;)

Sandbox games are all over the place, yet none of them stick out in my mind like Ultima 6/7 do (they were sandbox games) with the depth, plot and story that they had. It is typically just one dull, pretty game after the next.


RE: Not that I doubt it...
By T2k on 9/16/2009 1:00:57 PM , Rating: 1
I have no idea what you are talking about. Most games I play (Bioshock, the STALKER series, the Fallout series) have excellent story and AI, paired with high-end graphics nothing non-PC can touch... bad luck/taste there perhaps...?


RE: Not that I doubt it...
By MrBlastman on 9/16/2009 1:59:42 PM , Rating: 2
I was specifically talking about sandbox games there. Modern ones would be GTA Series etc., Fallout, Morrowind, Oblivion etc. Bioshock is less sandbox and more System Shock/Ultima Underworld (well, more linear than Underworld), but, I'll admit, Bioshock was a lot of fun, though not really innovative. If anything it was more restrictive than the System Shocks/Underworlds of the past.

Morrowind was excellent and a lot of fun, though the main plot was kinda weak. Oblivion is kinda, well, blah story-wise but fun just to walk around and hack things. Fallout 3--this has been a lot of fun so far and a definite improvement over Oblivion.

The thing is, none of them really make you care about the characters or the world you're in. In Ultima 6 for instance, I spent a whole afternoon in one crummy little town trying to save a guy from going to the gallows and actually cared about everyone in the town. It had that depth and writing that really sucked you in. That is what is missing. We get more plastic characters or things that are less surprising. There are gems here and there (Mass Effect had excellent character interaction, though the gameplay could have been beefed up quite a bit), but it is rare they go the extra mile like back in the day.

Perhaps it is because back in the day all we had was good story and gameplay and graphics could not be a crutch like they can be now. I know this--I go back and play classics still to this day and am immediately sucked in far more quickly than a lot of the current stuff. I suppose it is like having a chicken leg. One is just well, a chicken leg of an average chicken and the other one is one that is from a fat, well fed, stuffed chicken that is then fried and battered. The average one looks good, being rotisserie and all, but the fried one looks really bad for you but it is oh so good. I'd lump some (they weren't all good) of the older games into the fried chicken category.


RE: Not that I doubt it...
By ClownPuncher on 9/16/2009 2:15:42 PM , Rating: 3
The classics all had "cutting edge" graphics when they were released too. There are many newer games with interesting stories and characters, check almost any Bioware game for an example. KOTOR, NWN2, Mass Effect. I agree that there are many "Michael Bay" type games out too that completely suck, but check into games like Wizardry 8 if you want some real fun.


RE: Not that I doubt it...
By The0ne on 9/16/2009 2:19:27 PM , Rating: 1
You're not going to convince any "serious" or "hardcore" gamers with the old games that HAD cutting edge tech/graphics/sound. :)


RE: Not that I doubt it...
By MrBlastman on 9/16/2009 2:26:06 PM , Rating: 2
Wizardry 8 is one of the greatest RPG's ever made. :) I loved every minute of it. I wouldn't say it was cutting edge graphically though when it came out due to the delay of its release (I remember sitting in college in 1998 looking at a preview of it on the internet, only to relish it years later). Graphics aside, it was a masterpiece of action, strategy, plot (every character spoke what they said! amazing!) and flexibility.


RE: Not that I doubt it...
By ClownPuncher on 9/16/2009 3:28:02 PM , Rating: 2
If they would release the source on W8, the mod community might very well revive that type of gaming.


RE: Not that I doubt it...
By Josh7289 on 10/13/2009 2:01:27 PM , Rating: 2
It's irrelevant that the classics had "cutting edge" graphics when they were released, too. The fact is still that much much much more time and money is put into graphics today than it ever was in the past (in total, and percentage-wise).

Game development in general is much more high risk than it was in the past. When there's so much money being poured into each title, developers/publishers want to minimize the risk of losing it all by not making any sales. The trend seems to be to do this by pumping as much time and money into graphics as they can, while making very conservative decisions on gameplay (thereby minimizing the time and money that has to go into gameplay development and/or avoiding the risk of trying something new).

It's easier to make a sale by showing a person impressive graphics than it is to make a sale by trying to explain to them that your game has really good/innovative/original gameplay and (if applicable) a really well-written story/characters.


RE: Not that I doubt it...
By Spoelie on 9/17/2009 5:54:39 AM , Rating: 2
Have you tried The Witcher? That has some very good character interactions.


RE: Not that I doubt it...
By MrBlastman on 9/17/2009 1:20:55 PM , Rating: 2
Yeah I have, actually. The only thing that is weak in it is the combat--it is too one clicky. It wants to be a semi-arcade hack n' slash RPG while at the same time not being one. I like the game, it is just different.


RE: Not that I doubt it...
By Phoenix7 on 9/19/2009 6:34:19 AM , Rating: 2
RE: Not that I doubt it...
By piroroadkill on 9/22/2009 9:45:39 AM , Rating: 2
Shenmue was incredible for the time, and honestly, Shenmue and Shenmue II for Dreamcast still stand out as amazing looking games with a lot of rich detail.

For a quick example of the animation detail, stand near a curb, with one foot on, one foot off. In almost every game, ever, you'll be floating off the side of the curb or with one foot sunk into the curb. Not so in Shenmue, where the character animates and bends his knee to accommodate the step.


RE: Not that I doubt it...
By Pirks on 9/16/09, Rating: 0
RE: Not that I doubt it...
By MrBlastman on 9/16/2009 2:51:53 PM , Rating: 5
Hi Pirks :)

quote:
Come on guys ignore Blastman, he has not a SLIGHTEST clue about modern PC gaming which is VERY FAR from simplistic.


You wouldn't either seeing how you're attached to your Mac from the hip. :P


RE: Not that I doubt it...
By Pirks on 9/16/09, Rating: 0
RE: Not that I doubt it...
By MrBlastman on 9/16/2009 3:05:51 PM , Rating: 4
Oh. I'm sorry. I suppose that Apple fanbow(oi) aura of colors coming from your eyes blinded my ability to make a cognitive reply temporarily. ;) Of course you don't own a Mac.

Pirks to Apple is nothing like a Fish and Water... o_O


RE: Not that I doubt it...
By Pirks on 9/16/09, Rating: 0
RE: Not that I doubt it...
By FITCamaro on 9/16/2009 12:35:16 PM , Rating: 2
Well said. The focus needs to be on game play and story. Not more of the "shiny!" mentality.

It can look as pretty as it wants. Pretty doesn't make a good game for longer than it takes for the graphics to become the standard.


RE: Not that I doubt it...
By kroker on 9/17/2009 8:22:58 AM , Rating: 2
I'll probably get rated down for saying this, but I don't think you are right. I do understand where you're coming from, personally I had more fun with NES (I still play them with emulators!) and DOS/Win 95 games than with all of today's gazillion-polygon HDR lit games running on 3GHz quad-cores with 4GB of RAM and accelerated by a teraFLOP-capable monster of a GPU. But I think you need to look at the bigger picture. Gameplay and us having fun with games, while it would be great and all, is not important in the grand scheme of things... In the end, the important thing is that the tech is advancing (no matter what that tech is about). If devs would put more effort in the gameplay, graphics would be neglected (see Wii), and the need for powerful hardware and innovating technologies for simulating reality (physics, ray-tracing, AI etc) would decrease... Games just need to be fun enough to drive the sales of new hardware.

And to everyone who says that current graphics are spectacular, I can just say you don't know what you're talking about. Even Crysis, with its advanced graphics, looks fake, painfully polygonal if you look closely, and far from realistic. Its graphics are pretty and detailed compared to older games, I'll give you that, but not realistic, though they may look so at first glance. I thought it was great at first, but after playing it casually for a year, now I find it primitive and painful to watch even at the highest details (yes, you've read that right!). I won't even mention the physics engine, which I think is terrible. People have said that the graphics are good enough for a long time now, and if we had stopped there, today we wouldn't have games like Crysis. Personally I am sick to my stomach of the polygonal look all 3D games have; I can't wait for DX11 tessellation to become commonplace. Computing technology is advancing, and games are the driving force (because of the money consumers spend on hardware). But it will be worth the wait for us gamers too: with more powerful hardware we will finally have real-time ray-tracing.

I can also say the same for pr0n and pirated media - they drove the need for larger bandwidth requirements and now broadband is commonplace. I'm sorry, but the damage caused to the music & movie industry (a bunch of whiny, arrogant and untalented clowns calling themselves "artists", which are waaaaaaaaay overpaid for the service that they provide to humanity anyway), is insignificant by comparison.

Technology is what really matters, and in order to advance, it needs our consumer money for the massive R&D costs. We are depleting the world's resources (because of our lifestyles), and we can't afford to stand still and relax; we need to move as fast as possible.


RE: Not that I doubt it...
By Mitch101 on 9/16/2009 9:31:17 AM , Rating: 1
I read that before and made the same point: developers will develop games for the most widespread audience to maximize sales.

Basically, a large majority of games will continue to be written with the Wii in mind, with some eye candy thrown in for the more powerful consoles (360/PS3), up to DX11 eye candy on the PC, but they won't be developed full-blown with DX11 in mind because sales would be limited. Considering how much money needs to go into a blockbuster game today, this shouldn't be surprising.

Sadly, we will most likely only see a few DX11 titles, additionally funded by the graphics companies trying to promote their new products, over the next year or so until DX11 graphics cards become popular. And with PC games taking a back seat to consoles, sadly it's going to be bleak for a while.

The good news is that DX11 is near, and Microsoft has most likely already begun developing the next-gen console around it.


RE: Not that I doubt it...
By The0ne on 9/18/2009 1:58:53 PM , Rating: 2
That was in part mostly true of the previous console generation. With the start of the PS3, 360 and Wii, many companies wanted/signed to develop for just one platform. There are many reasons for this, cost being one of the main ones, and because they "believed" that one platform was going to really outdo the other two. Because the PS/PS2 was so popular, even with the huge amount of crap games, they could at least recover some cost. That's not to say it was bad judgment on the developers' part; no one could have predicted it.

A few years after the console releases, we're just now starting to see companies shifting to the idea you just stated. Due to the popularity of the Wii and its control scheme, even Sony and MS have jumped on the boat to try to take some of that market share. Who in their right mind wouldn't? We're talking about a single game that sold MILLIONS in the first week alone. Adding insult to injury, it's a rather simple game to begin with; you can't help but scratch your head and say, boy, we're screwed, or boy, we need to change.


RE: Not that I doubt it...
By TheBaker on 9/19/2009 4:12:14 AM , Rating: 2
quote:
With each release of DirectX, games have gotten more and more realistic looking.


Promo developer 1: Hey, man those skyscrapers and trucks look awesome! And that giant robot looks like it could be totally real!

Promo developer 2: Do you think we should try to make the dude's hair look more realistic?

Promo developer 1: Why would anyone be looking at his hair?


RE: Not that I doubt it...
By mmntech on 9/16/2009 9:29:57 AM , Rating: 2
Yah, I remember when they released the DirectX 10 "screenshots" for Microsoft Flight Simulator X and we all said how amazing they looked. In fact they did look photo-real. Then it turned out they were done by an artist rather than being actual screenshots. The DX9 and DX10 versions of FSX are virtually identical.


RE: Not that I doubt it...
By scrapsma54 on 9/16/2009 9:49:35 AM , Rating: 2
Hopefully this time around there will be much more of a difference between DX11 and DX9.


RE: Not that I doubt it...
By mmntech on 9/16/2009 10:47:14 AM , Rating: 2
The big thing is supposed to be GPGPU. OpenCL has a big head start though, especially now that it's been integrated into most current Macs. Now all we need is software that actually supports it.


RE: Not that I doubt it...
By IcePickFreak on 9/16/2009 10:13:40 AM , Rating: 2
Time to pull out the ol' hype machine and a can of spray paint to change the 10 to 11.

The advancement is always good, but I can't be the only one tired of marketing monkeys trying to justify their overpaid wage with a bunch of BS & hype.


RE: Not that I doubt it...
By dragunover on 9/16/2009 9:25:52 PM , Rating: 2
Hype machine?
Your post makes no sense; you're saying 10 = 11 because there are no changes, but advancements don't count as changes?

Well hell, Pentium 4s are really the same as Core i7s then; it's all just advancements with BS & hype.
...............


RE: Not that I doubt it...
By The0ne on 9/16/2009 2:21:42 PM , Rating: 2
I don't think I could have said it better myself :) I remember the PS3 and their beautiful demo graphics of various games. Come time for release and you get standard, not-so-realistic graphics. Of course, anyone in their right mind would have stayed away from the claims. And then DX10 on the PC, urgh... yeah, that was the biggest joke of all for me. I could probably go on various tech/game forums and poke fun at all those believers.


RE: Not that I doubt it...
By descendency on 9/16/2009 2:31:12 PM , Rating: 1
This comment deserves a 7.


RE: Not that I doubt it...
By Belard on 9/16/2009 2:39:41 PM , Rating: 3
Nvidia said that about their NEW GeForce3 card, which came out around when the Final Fantasy movie did.

They claimed the GF3 could render 3D like the FF movie in real time... something like that... it's been many years.


RE: Not that I doubt it...
By Funksultan on 9/17/2009 8:24:10 AM , Rating: 5
W/E.

Clever post, although I think you were more interested in looking for attention than making an accurate point. Those are NOT quotes from Microsoft when D/X was released. (Yeah, they were REALLY thinking that was gonna bring photo-realistic games to WINDOWS 3.1.)

Look ma, ANYONE can hit the quote button!

quote:
Wow, I have the journalistic integrity of Jason Mick.

- therealnickdanger, 2009


By therealnickdanger on 9/18/2009 9:19:38 AM , Rating: 2
... really?

There's this thing called "tongue-in-cheek", consider it for a moment. Since you didn't get the obvious point I was making, I'll just explain it to you:

In the context of graphics, with each new API or hardware advancement, promises are made regarding "graphics realism"; promises that rarely reach fruition. Leaps and/or bounds are sometimes made with each new version, yet it never reaches that holy grail of "photo-realism". The promises are sometimes made by the innovator, sometimes by a developer, but the fact remains that we never quite reach the goal. It's not a criticism; I'm just poking fun at the history of this promise.

Understand now, buddy?

LMAO


RE: Not that I doubt it...
By DOOA on 9/18/2009 8:39:00 PM , Rating: 1
I have yet to see a post about why we are getting DX11 at all. Truth is Microsoft is an out-for-profit company. If stifling innovation is best, they do it. If innovation is best they do that. If announcing pure BS as the truth creates sales, they do that. This is unfortunately normal for many companies.

DX11 could most likely run just fine under Win 98, and probably even DOS, if MS did not want to use it to push Win7. DX11 would be unnecessary if Microsoft went with OpenGL. But they need to do something to get power users and gamers, because each Windows evolution has less power and poorer game play. Win 7 is worse than Vista, which is worse than XP for power users. It is not impossible, but they certainly emphasize cute over functional more and more. Win 7 does seem a tad faster than Vista in games, but still not as fast as XP. Win 7 has no real feature set for power users (except IT administration) or gamers that makes up for its increase in resources.


RE: Not that I doubt it...
By monkeyman1140 on 9/21/2009 4:39:40 PM , Rating: 2
Which begs the question....

"Where is my Duke Nukem Forever in DX11???"


RE: Not that I doubt it...
By Regs on 9/22/2009 6:14:34 PM , Rating: 2
Microsoft can just STFU. Single-threaded applications are still dominant, and software developers are never going to develop for only 2% of the market. It's simple economics. Just take a look at Jurassic Park: photo-realistic dinosaurs, and to tell you the truth I think the original looked better than the sequels, and that was over 10 years ago.

Thanks to MS for making it so difficult to develop for the PC in the first place, and for their craptacular release of Vista. MS derailed 8 years of progress.


Is this the improvement we have been waiting for?
By gplracer on 9/16/2009 9:14:15 AM , Rating: 2
I remember DX10 being touted as the next big thing. Unfortunately, this never really came true. Many games show very little improvement when run in DX10 instead of DX9, and the hit on performance is too great.




By FITCamaro on 9/16/2009 9:21:25 AM , Rating: 2
You apparently didn't read the article. The reason DX10 didn't really bring greater visuals was that developers didn't want to invest in DX10 features when a large number of people wouldn't be able to see them due to still being on XP. Yes, first-gen DX10 hardware wasn't really powerful enough anyway, but that is no longer the case. So many games were written primarily for DX9, with DX10 features added if you had the hardware.


RE: Is this the improvement we have been waiting for?
By tfk11 on 9/16/2009 11:26:31 AM , Rating: 2
Despite the content of the article the fact remains that the failure of DX10 was due to Microsoft's many claims about DX10 simply not being true.

There were a few games that implemented DX10 from an early stage but they failed to exhibit the vast majority of the enhancements that MS claimed DX10 would provide.

Until an actual game shows some tangible superiority over anything already done with DX9 I wouldn't put much value in Microsoft's claims about DX11.


By monomer on 9/16/2009 2:41:40 PM , Rating: 2
Part of the problem with DX10 was that it was completely neutered before it was released. If I remember correctly, tessellators were supposed to be part of DX10, which is why all ATI cards have had the ability since the HD 2000 series, but Nvidia wasn't able to implement it in time, so MS cut the feature at their behest.


By mindless1 on 9/16/2009 2:17:43 PM , Rating: 2
True but most gamers don't have these modern high performance video cards, developers still have to design for the lowest common denominator and guesstimate how much further development cost is warranted to wow reviewers.

I suspect that since integrated video has gotten ever faster and more feature rich, even fewer will invest in mid to high end video cards in the future, and fewer will want to spend more money on games to support the inherent development cost increases for higher realism (though I couldn't begin to put a dollar figure on these increases).


By MrPoletski on 9/18/2009 10:05:41 AM , Rating: 2
That, and a number of the features in DirectX 10 don't improve visuals; they just allow things to be done more efficiently.

On top of that, WinXP DirectX 9 drivers were pretty damn mature and tweaked to oblivion when Vista came out... the DirectX 10 drivers, using a brand spanking new driver model nobody had coded for before, were... umm... not.


By Spivonious on 9/16/2009 9:25:42 AM , Rating: 4
Right because if it's not 60+ fps it's not playable.

</sarcasm>


RE: Is this the improvement we have been waiting for?
By HrilL on 9/16/2009 11:35:55 AM , Rating: 3
Actually, I don't enjoy playing a game with less FPS. I couldn't care less about graphics quality; I want FPS. When you play games competitively, the more FPS you get, the easier it is to make sure your shots hit. Games that look amazing and get 30fps on the newest cards are almost not worth playing. That is why older games are the most popular competitive games: everyone is on a more level playing field, and that makes for a much better game than playing against people with more money than you who have better hardware, putting you at a major disadvantage. 100 FPS is what is needed for good gameplay, imo.


By omnicronx on 9/16/2009 1:37:29 PM , Rating: 2
While your eyes can perceive more than 60fps (when motion is involved), if you are not using a CRT monitor I really don't see the benefit.

In fact, with LCD monitors 60+ FPS can be a bad thing. As most LCDs run at 60Hz, while the image is updated faster in the GPU's framebuffer, it usually results in tearing/artifacts while switching frames (not to mention the extra frames are discarded). This is why many new games turn on Vsync by default.

Just giving an example: assuming your monitor is running at 20Hz with a game at 20fps, the screen is refreshed 20 times a second, meaning each full frame is shown for 0.05 seconds. If the monitor is 20Hz with a game at 40fps, a new frame is rendered every 0.025 seconds, but you still only see one full frame every 0.05 seconds.

You are talking about fractions of a second of difference; it's just not possible for this to give you any perceivable advantage, as your eyes don't process the information that quickly. Going beyond the spec of your monitor just puts extra stress on your graphics card for no reason.

What's more, no console game runs at more than 60fps right now (heck, EA Sports games of 2007 and some of 2008 only ran at 30fps; I never would have known if I didn't look up the specs); they just assumed that most people would have 60Hz displays.

People get set in their ways and start to imagine things that are not there (probably because it was once true with CRT displays); i.e., chances are it's all in your head. I'm not saying there are not certain situations (a select few have very sensitive eyes), but the vast majority of gamers out there who claim it to be true are in fact incorrect.


By omnicronx on 9/16/2009 2:01:00 PM , Rating: 3
I want to make it clear that your eyes can see far more than 60fps; it has already been proven that your eyes can register flashes of light as brief as 1/250 of a second and beyond, but that does not mean your eyes can somehow get past the limitation of the screen you are using.


By The0ne on 9/18/2009 1:37:11 PM , Rating: 2
A good point that most users don't seem to understand when talking or referring to FPS. There are limitations on the screen you're using.


By mindless1 on 9/16/2009 2:22:34 PM , Rating: 2
BUT, when people speak of FPS they generally mean an average FPS; often you aim for more than 50 FPS average just to keep the game's minimum FPS from dropping low enough to cause stuttering in the more complex action scenes.


RE: Is this the improvement we have been waiting for?
By HrilL on 9/16/2009 2:48:06 PM , Rating: 2
I haven't switched away from my CRT. Why should I, when the new monitors don't offer the refresh rates I want? I use 120Hz and 100Hz for most games. While I agree that for most people's LCDs, which will only do 60Hz, doing more than 60fps is pointless, for me it's not at all. And when playing competitive games, every little thing counts. Why do you think there are no major competitions for first-person shooters over the internet? Sure, there is CEVO, but all the major tournaments are on LAN, because that eliminates the latency differences that are common on the internet. While 50ms doesn't seem like a lot of time, it is enough to give you an advantage or disadvantage. Say I have 70ms and the person I am playing only has 20ms. If we both clicked at the same time and were both aiming dead-on at each other's heads for a one-shot kill, I would lose every time because his packets get to the server first.


By omnicronx on 9/16/2009 3:36:12 PM , Rating: 2
Not a valid comparison in the slightest. I'll try to dumb it down as much as possible: imagine watching TV on a CRT (which displays half a frame every 1/60th of a second) vs. an LCD (which displays a full frame every 1/30th of a second). To your eyes the actual picture you see (flickering aside) should be the same every 1/30th of a second; you either see two half frames making up one full frame every 1/30th of a second, or one full frame every 1/30th of a second. Either way, every 1/30th of a second, your eyes see exactly the same thing.

At 60Hz and 120fps you are talking at most an 8ms difference (0.008 seconds) between each half frame, compared to 16ms for each full frame, and while the server may be able to perceive a 10ms difference (simply because the command from the PC with the lower latency reaches it first), your eyes cannot. After each 1/60th of a second, you see exactly the same information on the screen. With latency, one PC is literally sending and receiving the information faster.


By Triple Omega on 9/16/2009 12:36:15 PM , Rating: 3
I'm sorry, exactly where did he mention 60fps? A performance hit has nothing to do with 60fps at all! It could be going from a just-playable 40fps to a just-not-playable 33fps, and then you're not happy.

Damn, I'm still stunned that you got that 60fps number out of his post. Even more so that at least two other people who rated you up to 4 didn't see that his post had nothing to do with 60fps. Where do you people learn to read and interpret?


By mindless1 on 9/16/2009 2:25:16 PM , Rating: 2
It's classic internet arguing, you imply someone meant something they didn't write so you have something to argue about that suggests they are wrong.


By omnicronx on 9/16/2009 3:47:45 PM , Rating: 2
It's just a classic argument that most gamers will try to make. It goes back to the Quake/HL days: 60fps (60Hz output was the norm for games at the time) was considered the minimum, and people would always keep up with the latest and greatest to push their FPS to the max.

I found the comment bang on; just like the OP stated, many other gamers would not dare sacrifice their 60+ FPS for DX10's eye candy. They 'needed' it for gameplay, even though this has been proven incorrect for the large majority of gamers, time and time again.


By fatedtodie on 9/16/2009 10:03:48 AM , Rating: 3
You are right, we should wait for the game developers to decide what technology should be innovated and what shouldn't...

Unfortunately, if we did wait for the "blessed ones" to decide our fate, we would not have dual-core processors (for those that remember Mr. Carmack whining about how we stopped the MHz race and how pissed he was that he had to change his programming to fit multi-core), and other wonderful innovations.

Or how about we do it the way it is going: they release new DirectX versions, and the programmers get to look like idiots when they don't use them.


By kroker on 9/17/2009 9:17:08 AM , Rating: 2
Every technology is touted as the next big thing - it's called marketing. Some then succeed, some fail, and some are somewhere in-between (like DX10).

Just because something is not a huge leap forward, it doesn't mean it's not an important step towards that leap.


what if
By lenardo on 9/16/2009 9:30:23 AM , Rating: 1
What if I do NOT want photo realism in my games?

To date, the better the graphics, the worse the game usually performed.

What I WANT is a good, fun GAME to play. If it happens to look really good, great, but there is a reason a game like World of Warcraft sold better than the "realistic" MMOs, one of which is that it performed better graphically on more computers. At release, EQ2 performed like CRAP on even the best computers, while on the same computer WoW was doing 60 frames per second...

Another problem with photo-realistic games is that they cost more due to the increased artwork needed. There is a REASON why more 3rd-party devs are losing money: the costs to make a game have skyrocketed, and most of that cost is increased graphical fidelity. I remember a comment about one of the more recent car racing games: 8 years ago it took one artist about 24 hours to do a car body, 4 years ago it took ~2 weeks, and last year it took one artist a month. The reason is the increased polygon count to improve the fidelity of the graphics on screen.

Now take that to the other aspects of all games (leaves, trees, ground, grass, buildings, NPCs, etc.) and you have an explosion in the costs associated with making a game.

The PS3, 360 and Wii are good examples of this as well; most devs, when interviewed, said that a quality game on the Wii costs about a third to half as much as the same game would cost on the PS3 or 360, and the reason is all down to the graphics: the Wii does 480p, while the 360 and PS3 do 720p...

Give me good games first; the graphics don't matter as much.




RE: what if
By safcman84 on 9/16/2009 10:07:58 AM , Rating: 2
The price of 360 and PS3 games is due to the license fees that game devs need to pay MS and Sony. They don't have to pay a thing to develop on PC, and I am pretty sure they can develop for free on the Wii as well.


RE: what if
By randomname on 9/16/2009 11:43:14 AM , Rating: 2
quote:
i am pretty sure they can develop for free on the wii as well.


Despite your beliefs, Nintendo is making a huge profit, not only from hardware sales, or first party titles. There are other reasons why Wii games don't cost that much to develop. (People don't expect them to look as good as on PS3 or Xbox 360, because they can't.) In exchange for the fee (or more precisely, a cut from every game sold) console developers pay, they get a stable platform, and only one hardware configuration to design for. And access to a lot more active gamers than with the PC.


RE: what if
By mindless1 on 9/16/2009 2:27:57 PM , Rating: 2
Let's not forget a lot less piracy than with PC gamers... = higher return on investment.


RE: what if
By The0ne on 9/18/2009 1:46:35 PM , Rating: 2
I'm pretty sure the casual gamer has no clue whether or not they should care that Wii games cannot look any better. Shrug; show me a first-time parent gamer, grandma, or grandpa that does. The millions of consumers that bought Wii Fit or Wii Sports probably have almost zero clue, or rather interest, in how high-quality the graphics are.

Your other points are well taken.


RE: what if
By epobirs on 9/16/2009 2:31:04 PM , Rating: 2
quote:
The price of 360 and PS3 games is due to the license fees that game devs need to pay MS and Sony. They don't have to pay a thing to develop on PC, and I am pretty sure they can develop for free on the Wii as well.

Whoa, somebody needs a history lesson.

Nintendo INVENTED the controlled publishing model for third parties on consoles. The crash in the previous generation was largely driven by the mountains of crap games and the lack of direct benefit for the console makers when a third party had a hit.

What made the NES a money machine was not just Nintendo's own games but also that it generated huge royalty fees if a Capcom or a Namco had a big hit seller on a Nintendo console. Nintendo also reserved the right to control game content and deny production to games that were so poor and/or buggy as to be intolerable. Nintendo controlled cartridge production and collected its payments up front.

This is a big reason why selling games by download is so appealing for small companies. The capital outlay before anything goes on sale is greatly reduced, making it more feasible to produce niche titles for limited audiences.


RE: what if
By The0ne on 9/18/2009 1:49:42 PM , Rating: 2
The Nintendo Gold Seal of approval ultimately failed. Personally, I think it's because Nintendo got fed up with, like you said, the "mountains of crap games" and said fck it, we're not going to change these idiots.


RE: what if
By mixpix on 9/16/2009 10:07:48 AM , Rating: 2
Nintendo's next console will be HD, so production values should go up, and we'll hopefully see games somewhat close to the current 360/PS3, at least.


RE: what if
By Luticus on 9/16/2009 10:33:31 AM , Rating: 2
I’m sure that all sounded good in your head and I respect your opinion and all but I feel your argument is horribly flawed.

First, you are not accounting for the cost of things like SDKs; part of the reason Sony's system is more expensive to develop for is that Nintendo's SDK was much cheaper, last time I looked. On top of that, the Cell processor in the PS3 is a pain in the butt to program for, or so I've read. While part of it probably IS the increase in workload to make larger, more expansive and prettier games, there are other contributing factors as well.

Also, you are completely ignoring the fact that as graphics rendering technology gets better, so do the programs and tools necessary to create such art. You're talking like they are still using the same bum-ugly programs and tools that they were 15 years ago to make 1080p-quality games. I assure you that they are not.

While I agree that as games have gotten bigger and better they've gotten more expensive to produce, saying that better graphics are the direct reason games are more expensive to produce is incorrect, as there are other contributing factors that play a much larger role in a game's production costs. Most of the reason game companies are having trouble, in my opinion, is that they are focusing way too much on pretty and not enough on story, artistic style, and making the game FUN. A lot of games these days are turning into 90% CGI-rendered scenes where you're just sitting there watching and 10% actual gameplay. I feel more like I'm watching an interactive movie (cast your votes: A if you want Batman to beat the Joker, B if you want the Joker to dip Batman in acid...). While I'm tapping B like crazy in this hypothetical scenario, I am secretly wishing I could actually PLAY the game instead of watching a movie that I only occasionally get a say in.


RE: what if
By shredder1970 on 9/16/2009 10:43:30 AM , Rating: 2
quote:
While I agree that as games have gotten bigger and better they've gotten more expensive to produce, saying that better graphics are the direct reason games are more expensive to produce is incorrect... Most of the reason game companies are having trouble, in my opinion, is that they are focusing way too much on pretty and not enough on story, artistic style, and making the game FUN.

Give Savage 2 a try. This title is from a small indie developer and shows what can be done without overblown production costs. The game is free to play unless you want access to replays, the player ladder, and player stats. If you do want those features, it is a ONE TIME $9.95 fee.

www.s2games.com

Some gameplay vids (streamed live at times; check the forum):
www.justin.tv/savage2

For me, this is one of the most addictive games I have played. Download it and give it a try. After all, it is FREE.

I am not associated with s2 games in any way. Just a fan.


RE: what if
By Luticus on 9/16/2009 4:12:43 PM , Rating: 2
Will do man, and I appreciate the tip.


RE: what if
By torahtrance on 9/17/2009 12:07:28 PM , Rating: 2
That was probably the best comment I've gotten out of many threads :) -- recommended 'actually fun' games.

I too, at 26 and a gamer all my life, feel as if games have gone downhill. I remember thinking in 2000, man, in 5 years gaming quality will be insane! But only the gfx went insane; I have not finished a game in a long time. I think the major flaw, if it can be summed up in one word, is

----->FLUIDITY<--------

Most games feel like you are controlling a robot. For example, I feel it's a programming error when there is a rock in my way and I cannot go over it; in real life it's no problem. Artificial problems combined with non-fluid-feeling gameplay = quickly boring.

We need games where the only problem is your skills.

Diablo, for example, had extremely fluid gameplay; I think this is a key point I have not seen touched on much anywhere else. The same goes for Quake and other quality FPS games: you felt in it, as if it just flows. Without that magic flow, a lot of people are missing a key component no one has really focused on, IMO.



DX11, my ass...
By Glabrezu on 9/16/2009 9:21:20 AM , Rating: 5
RE: DX11, my ass...
By BladeVenom on 9/16/2009 9:33:04 AM , Rating: 3
So the "DirectX 11" example was done with Maya mental ray in March 2005. LOL!


RE: DX11, my ass...
By Glabrezu on 9/16/2009 9:40:20 AM , Rating: 4
Yup. For those, who didn't see the original comparison...
http://imgur.com/AWbnM.jpg


RE: DX11, my ass...
By Natfly on 9/16/2009 11:15:38 AM , Rating: 2
Wait, was that image originally attached to the article?


RE: DX11, my ass...
By Glabrezu on 9/16/2009 12:22:06 PM , Rating: 4
Yes, that image was part of the article when I posted my reply.


RE: DX11, my ass...
By bhieb on 9/16/2009 9:48:50 AM , Rating: 5
Busted! Give this man a 6 for editing yet another DT article.


perfect
By MadMan007 on 9/16/2009 9:21:39 AM , Rating: 3
Just in time for me to have stopped caring very much about games. I guess that will save me some money; I'm thinking there's a good chance the next computer I build, in late 2010/early 2011, will use integrated graphics.

Now if someone can come out with a game that isn't the same old thing in a new wrapper gameplay-wise I may become interested again.




RE: perfect
By Reclaimer77 on 9/16/09, Rating: -1
RE: perfect
By werfu on 9/16/09, Rating: -1
RE: perfect
By rdeegvainl on 9/16/2009 10:06:28 AM , Rating: 4
You really sound like you don't know what you are talking about. Nvidia 8 series were DX10.


RE: perfect
By StevoLincolnite on 9/16/2009 10:15:38 AM , Rating: 2
quote:
Microsoft hyped Vista because they new it would be a bad sale. Vista breaks too much things and haven't been fine tuned enough.


I guess you were too young to experience Windows ME...

*Hears a crowd yell Blasphemy! for uttering those words...*

quote:
Now Windows 7 brings in much needed tweaks and improvement over Vista and most Vista and XP users will be happy to switch over 7.


We don't know that for sure until Windows 7 is firmly in our laps and fully out the manufacturing door; however, I installed Windows 7 RC1 on my neighbors' 5 computers that used to be a mix of Vista/XP rigs, and they love Windows 7. Most of the issues they had are gone, things are easier to find, and it's faster than Vista as well!

quote:
Now, DX10 was a Vista feature and as other features, it breaks over DX9.


It was a feature for Vista simply because of the massive amount of changes made to the operating system over Windows XP, namely in the driver department, memory management, etc. Hence, having it on Windows XP would more than likely have ended up as a bloated API with dropped features to accommodate Windows XP.

quote:
Now DX11 is coming to Vista and 7 and the hardware will be ready to use it to its full capacity.


You still need to buy new hardware to get the new Direct X 11 features, developers still need to release their games, and expect several driver releases before most issues are worked out as well.

It will be a while before we are all running Direct X 11 games.

quote:
Remember that the 9 series of NVidia were just 8 series card with some little things added to make them DX10 compatible, nothing ground breaking.


The 9 series had the same feature set as the 8 series, that being Direct X 10; the 9 series was mainly die shrinks, a few silicon re-spins, and some changes to the memory controllers and shader configurations, etc.

The 7 series was the series that didn't support Direct X 10.


RE: perfect
By omnicronx on 9/16/2009 11:48:22 AM , Rating: 2
quote:
We don't know that for sure until Windows 7 is firmly in our laps and fully out the manufacturing door
7 hit RTM in mid-July, over a month ago. I've been running the RTM build since build 7600 hit the web a while back.
quote:
You still need to buy new hardware to accommodate the new Direct X 11 features, and developers still need to release there games, and expect several driver releases before most issues are worked out as well.
As the article states, DX11 is a superset of DX10.1 (itself a superset of DX10), meaning DX11 games will run perfectly fine on DX10.1/10 hardware. As such it is not the same situation as DX10 vs 9, where you needed a DX10 card and a DX10-capable OS to play DX10 games. This way, developers can program for the cutting edge without leaving people on older hardware behind. The decision to do this was a direct result of the Vista/DX10 fiasco.
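
Roughly, that's the new "feature level" mechanism at work. A minimal sketch of how an engine might ask the runtime for the best level the installed card supports - this is the standard D3D11CreateDevice call, but the function name and variables are my own, and error handling is omitted:

#include <d3d11.h>

// Create a device on the best feature level available, falling back
// from full DX11 hardware to DX10.1 and DX10 cards.
HRESULT CreateBestDevice(ID3D11Device** device,
                         ID3D11DeviceContext** context,
                         D3D_FEATURE_LEVEL* granted)
{
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0,  // real DX11 card
        D3D_FEATURE_LEVEL_10_1,  // DX10.1 card on the DX11 runtime
        D3D_FEATURE_LEVEL_10_0,  // plain DX10 card
    };
    // One call, one code path; the runtime reports back the highest
    // level the installed hardware actually supports.
    return D3D11CreateDevice(
        NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0,
        wanted, sizeof(wanted) / sizeof(wanted[0]), D3D11_SDK_VERSION,
        device, granted, context);
}

The engine then checks 'granted' and disables features (tessellation, say) that the card can't do, instead of needing a separate DX9-style code path.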

I agree with your other comments though.


RE: perfect
By omnicronx on 9/16/2009 10:16:13 AM , Rating: 1
Ya, DX10 was never used for anything, damn MS and its hyped-up products... *looks at the Xbox 360 and what is essentially a DX10 GPU* oh wait...

DX10 would have been great if it were not limited to Vista; as the article mentions, it made absolutely no sense to code for DX10 when only a small percentage of users could take advantage of it.

Taking a look at the 360, you can see what DX10 was supposed to be. Sure, MS 'attempted' to push Vista with DX10, but to say it was merely 'hype' is completely incorrect.


RE: perfect
By bobvodka on 9/16/2009 10:24:15 AM , Rating: 2
The 360 isn't really a 'mostly D3D10' GPU; it's more advanced than D3D9 yet missing various D3D10 features, which places it somewhere between the two.


RE: perfect
By omnicronx on 9/16/2009 10:59:36 AM , Rating: 2
The 360 GPU is very loosely based on DX9c; it can easily be extended (and is) with most of the DX10 functionality, though it does not follow the spec precisely (it is based on DX9, after all). When push comes to shove, the 360 can produce graphics very much on par with DX10. Consoles are far more programmable and can be tailored to certain needs, performing tasks that are just not possible on the PC, where hardware differs from system to system.

The 360 has unified shader technology, among many other things present in DX10 (the two big missing features are geometry shaders and SM4.0 capability), and even then it is possible on a console to perform certain tasks (such as geometry shading) in different ways. No, it's not DX10 compliant, but it does showcase what DX10 could have been on the PC had it not been exclusive to Vista.

This was really my point: saying that DX10 was nothing but vapourware to push Vista is completely incorrect. It would have been great had it been widely deployed and used.


RE: perfect
By bobvodka on 9/16/2009 11:26:45 AM , Rating: 2
Unified shader technology wasn't a requirement for D3D10 anyway; yes, the two major vendors happened to go that route, but you could have produced a D3D10-compliant chip with separate hardware for vertex, geometry and fragment/pixel shaders with ease.

And yes, the console can produce things which look on par with D3D10 hardware; D3D10 just made certain things easier.

I do agree with your final point; D3D10 wasn't vapourware - it is a MUCH nicer API to work with from a programmer's point of view. Where it suffered was from D3D9 graphics engines being ported to it without the significant work required to really take advantage of it. A properly written D3D10 graphics system is going to use less CPU than a D3D9 app.

(That in fact highlights a typical misunderstanding by the gaming community at large: D3D10 was never about 'more fps'; it was about 'more things at the same fps and a lower CPU cost'. However, as with any article on technical issues, something gets lost in the translation from what we, the programmers, understand to what is told to the general public. I suspect multi-threaded rendering will suffer the same fate this time out.)
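
(For the curious: the DX11 answer to that draw-call cost is multithreaded rendering via deferred contexts - worker threads record command lists, the main thread replays them. A rough sketch; 'indexCount' is a placeholder, and the state setup and error handling are omitted:)

#include <d3d11.h>

// Worker thread: record draw calls into a deferred context.
// Nothing here touches the GPU; it just builds a command list.
ID3D11CommandList* RecordWork(ID3D11Device* device, UINT indexCount)
{
    ID3D11DeviceContext* deferred = NULL;
    device->CreateDeferredContext(0, &deferred);

    deferred->IASetPrimitiveTopology(
        D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
    deferred->DrawIndexed(indexCount, 0, 0);

    ID3D11CommandList* cmdList = NULL;
    deferred->FinishCommandList(FALSE, &cmdList);
    deferred->Release();
    return cmdList;
}

// Main thread: replay recorded lists on the immediate context.
void Submit(ID3D11DeviceContext* immediate, ID3D11CommandList* cmdList)
{
    immediate->ExecuteCommandList(cmdList, TRUE);
    cmdList->Release();
}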


RE: perfect
By The0ne on 9/18/2009 1:40:01 PM , Rating: 2
I am curious, though: where are all the DX10-loving fanboys? I would think they wouldn't stand for this kind of talk. The years of frustration must be wearing them down, maybe.


RE: perfect
By Reclaimer77 on 9/16/2009 1:07:11 PM , Rating: 2
/shrug

Tomato toMATo.


RE: perfect
By MadMan007 on 9/16/2009 11:22:21 AM , Rating: 2
Thanks for sharing but I'm not sure what that has to do with my comment.


Waiting
By FITCamaro on 9/16/2009 9:06:01 AM , Rating: 2
For the people who will say "DX11 should be on Windows XP!".




RE: Waiting
By goku on 9/16/2009 9:19:44 AM , Rating: 2
DX11 should be on Windows XP! Maybe at least on XP X64!


RE: Waiting
By danielgarciaromero on 9/16/2009 11:52:28 AM , Rating: 3
Windows sucks, bring back DOS


RE: Waiting
By TSS on 9/16/2009 6:24:10 PM , Rating: 2
DOS sucks, bring back punched cards.


RE: Waiting
By geekman1024 on 9/16/2009 11:46:29 PM , Rating: 2
Punched cards suck! Knots rulez!


RE: Waiting
By FDisk City on 9/16/2009 9:22:39 AM , Rating: 3
I'm waiting for Windows Mojave to release before I switch.


RE: Waiting
By Mitch101 on 9/16/2009 9:48:50 AM , Rating: 2
XP is dead. I collect spores, molds, and fungus. Windows 7!


RE: Waiting
By HotFoot on 9/16/2009 10:34:08 AM , Rating: 1
For those out there who haven't yet figured out the definition of forum troll, I think Fit just provided a great example. Keep waiting under the bridge for a goat dumber than you, Fit.


they have it wrong
By glennc on 9/17/2009 11:24:27 PM , Rating: 2
DirectX is the reason graphics haven't gone photo-realistic. Go MS and your BS again. DX is holding back graphics. Three years from now it will be a memory; it is an entirely short-term, flawed solution to graphics, as are current graphics cards.




RE: they have it wrong
By Lerianis on 9/18/2009 2:25:52 AM , Rating: 2
Uh, what in the world are you blathering about? DirectX is NOT holding back things from going photo-realistic in the slightest.

In fact, DirectX being a STANDARD that everyone (who is smart) builds against might actually get us there faster.


RE: they have it wrong
By monkeyman1140 on 9/21/2009 4:43:19 PM , Rating: 2
He might be referring to the idea of real time raytracing replacing GPUs completely.


RE: they have it wrong
By Headfoot on 9/21/2009 11:31:50 PM , Rating: 2
Not possible.

Ray Tracing is "graphics" and a GPU is a graphics processing unit. GPUs will just accelerate raytracing *evenentually* instead of vector graphics.


Tessellation
By danielgarciaromero on 9/16/09, Rating: 0
RE: Tessellation
By bobvodka on 9/16/2009 10:22:04 AM , Rating: 2
This isn't the same as ATI's initial tessellation, nor is it the same as the implementation in current cards.

There are shader stages which give artists more control over what happens. This is considerably different from anything around now.

As for WDDM, it was never about raising frame rates; it was about dropping CPU usage for drawing. Due to some... interesting... choices made when D3D9 happened, draw call costs are pretty high; the new driver model fixed this, as well as bringing a host of other improvements, mostly for driver writers.

In Win7, something you will see with recent drivers is a decrease in memory usage, as WDDM 1.1 doesn't require copies of window surfaces to be kept in main memory.


RE: Tessellation
By danielgarciaromero on 9/16/2009 11:49:04 AM , Rating: 2
Interesting, thanks for clarifying.


RE: Tessellation
By T2k on 9/16/2009 3:30:13 PM , Rating: 2
You're confusing their old - Radeon 8500 - fixed HW tessellator with pure shader-based tessellation.


oh joy
By mydogfarted on 9/16/09, Rating: 0
RE: oh joy
By T2k on 9/16/2009 1:02:47 PM , Rating: 2
Rrrriiiiiight. Because you were [b]forced[/b] to upgrade, but now you aren't, yet your gfx is improving.


RE: oh joy
By mydogfarted on 9/16/2009 2:43:46 PM , Rating: 2
My point was stated in the "incoherent" branch below:
quote:
And to get any significant benefit from new DX versions 5.0-9.0, you had to get new hardware that supported the new features. Like shaders.)


Back when I gave a damn about PC gaming, I was upgrading my video card/RAM/CPU about once a year to handle the latest big game. After upgrading to a 6800-series NVIDIA card (am I showing the age of my PC?) to play HL2 and Far Cry, dropping several hundred dollars on the card, I gave up. For what I paid for the card, I could have bought a brand new PS2 at the time. The price of the games is the same as the console version's, but consoles have a longer useful life before an upgrade is "needed" - i.e., before the next-gen console gets released.


RE: oh joy
By mindless1 on 9/16/2009 2:33:19 PM , Rating: 2
OK, but if you insist on upgrading that much, then you end up with newer parts having more of their lifespan ahead of them (fewer system failures), and let's not forget performance increases in most things you do on a PC besides gaming.

I do agree to some extent, though; years ago I used to upgrade quite frequently (minimum once a year), but these days not so much.

When my gaming box had a video card failure, I pulled an old 7600GT card out of retirement, put it in the system, and haven't felt a terrible urge to replace it yet, since I haven't been doing much (demanding) gaming recently.


tessellation?
By kylebilenki on 9/16/2009 1:35:51 PM , Rating: 2
I'm fuzzy on the whole tessellation thing... currently objects in a 3D space are already tessellated into triangles. How is this new tessellation feature different? Are we talking about using different shapes?




RE: tessellation?
By mindless1 on 9/16/2009 2:42:04 PM , Rating: 2
Looks like more triangles - a more granular mesh, coming yet again closer to realistic detail levels and shapes than prior DX versions.

Some keep thinking "leap" but I like the word walk more. Developers aren't making games as photorealistic as they could on DX10, didn't on DX9, etc, either so what is possible and what actually happens aren't necessarily the same, aren't necessarily a realized leap.


RE: tessellation?
By bobvodka on 9/17/2009 8:35:40 AM , Rating: 2
Graphics, as we do them today, will always be made up of triangles, simply because of their coplanar nature.

3D objects are indeed tessellated; however, this is normally done at the art stage and as such requires artist time. The tessellation support in the new cards will allow extra detail to be synthesised either from a second source, such as a texture, or via maths, adding more triangles to the scene/model to increase the detail.

Effectively it's the next stage on from normal mapping: where normal mapping faked the extra detail, this allows the creation of it.
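
(In API terms that's two new programmable stages - the hull and domain shaders - wrapped around a fixed-function tessellator. A rough sketch of the D3D11 setup; 'hsBlob'/'dsBlob' stand in for compiled shader bytecode, and error handling is omitted:)

#include <d3d11.h>

// Bind the new tessellation stages and draw patches instead of
// plain triangles.
void DrawTessellated(ID3D11Device* device, ID3D11DeviceContext* context,
                     ID3DBlob* hsBlob, ID3DBlob* dsBlob, UINT indexCount)
{
    ID3D11HullShader*   hs = NULL;
    ID3D11DomainShader* ds = NULL;
    device->CreateHullShader(hsBlob->GetBufferPointer(),
                             hsBlob->GetBufferSize(), NULL, &hs);
    device->CreateDomainShader(dsBlob->GetBufferPointer(),
                               dsBlob->GetBufferSize(), NULL, &ds);

    // Triangles go in as 3-point control patches: the hull shader says
    // how finely to subdivide each patch, the hardware tessellator
    // generates the new triangles, and the domain shader places the
    // new vertices (e.g. by sampling a displacement map).
    context->IASetPrimitiveTopology(
        D3D11_PRIMITIVE_TOPOLOGY_3_CONTROL_POINT_PATCHLIST);
    context->HSSetShader(hs, NULL, 0);
    context->DSSetShader(ds, NULL, 0);
    context->DrawIndexed(indexCount, 0, 0);
}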


Incoherent
By randomname on 9/16/2009 11:16:59 AM , Rating: 2
"Microsoft has learned its lesson, and the situation is very different with DirectX 11. It is essentially a superset of DirectX 10.1, itself a superset of DirectX 10. This means that game developers will be able to design their games for DirectX 11, but the Direct3D 11 runtime will scale back graphics features that are not supported by the hardware. This also means that Windows Vista users will be able to install DirectX 11, which means a bigger market for game developers."

No doubt you will again have to wait years for big games where the main development effort has been geared towards DX11 (if last-minute additions of irrelevant eye candy aren't counted), and no doubt almost all new games will have a DX9 mode for years as well (to support older cards and XP). The change in driver model had to be done at some point, I guess, and there would be no sense in changing it again for DX11 for no apparent reason. Again, I don't see what lesson has been learned, and by whom.

(And to get any significant benefit from new DX versions 5.0-9.0, you had to get new hardware that supported the new features. Like shaders.)

Is this more of a criticism of Vista than of DX10? But then again, Vista will support DX11.




RE: Incoherent
By T2k on 9/16/2009 4:07:01 PM , Rating: 1
quote:
This also means that Windows Vista users will be able to install DirectX 11, which means a bigger market for game developers."


You're really clueless about Vista if you think it has any significant market share... Vista was DOA; I said it back then, and I turned out to be right (I never upgraded a single machine at work - I decided to skip the entire generation, it was that bad).


RE: Incoherent
By Silver2k7 on 9/17/2009 9:06:08 AM , Rating: 2
Vista is working fine, and is better than XP.


Years..
By swaaye on 9/16/2009 4:52:41 PM , Rating: 2
It is gonna be years before DX11 cards have any kind of market share. With games not demanding more of hardware than they did in '06 or so, I don't think many people have much reason to upgrade. We're just now getting decent DX10 cards in some sort of volume, going by the Steam stats. I see those 8800GTs (and friends) and 4850s being in use for a long time.




RE: Years..
By whowantstobepopular on 9/16/2009 7:34:19 PM , Rating: 2
Add me to the list of those with 8800GTs! It still plays Crysis (the original) on Windows 7 RC, DX10, WHQL 190.62 NVIDIA drivers, all settings on Very High... with smooth 1680x1050 framerates!

The performance increases from driver updates have been enough to keep the old 8800GT chugging along nicely. Having said that, an upgrade to an HD 5870 may not be too far away...


By perhaps314 on 9/16/2009 11:20:20 PM , Rating: 2
I did some research and found the video of that woman running from the giant robot. It's over a year old; I'm not sure that's really next-generation hardware, or even really DX11. http://www.youtube.com/watch?v=7fzkHGch12c&feature...




By cactusdog on 9/20/2009 5:50:32 AM , Rating: 2
They were running that video when DX10 came out....


Sounds good
By Chaser on 9/17/2009 7:17:33 PM , Rating: 2
quote:
Microsoft has learned its lesson, and the situation is very different with DirectX 11. It is essentially a superset of DirectX 10.1, itself a superset of DirectX 10. This means that game developers will be able to design their games for DirectX 11, but the Direct3D 11 runtime will scale back graphics features that are not supported by the hardware. This also means that Windows Vista users will be able to install DirectX 11, which means a bigger market for game developers.


This is a good thing. Outside of the complaining naysayers that gripe about anything, I won't be camping out for a DX11 game, but I'm all for innovative improvements to the API. And as far as those "other" graphics APIs... Please.

The question is, why didn't Microsoft think of doing this with DX10? Epic fail on that.




RE: Sounds good
By cactusdog on 9/20/2009 5:47:28 AM , Rating: 2
Ya, if I can poont noobs with DX11 I will buy. Pew pew.


hogwash
By the goat on 9/16/2009 9:56:30 AM , Rating: 3
quote:
DirectX is probably the best known collection of Application Programming Interfaces (APIs) out there.

Depends on what you mean by "best known". The POSIX and Win32 APIs are several orders of magnitude better known among programmers than DirectX.

Best known among "regular" people? They have no idea what an API is. So they don't really know anything about DirectX other then "the game says I need to install something to make it work".




Vista?
By Belard on 9/16/2009 2:51:26 PM , Rating: 1
quote:
This also means that Windows Vista users will be able to install DirectX 11, which means a bigger market for game developers.


Er... not really. Many gamers and most users stayed AWAY from Vista; DX10 wasn't enough to make people buy the crap. Vista is at about 25% market share... it'll never go much higher than it is today.

A good portion of Vista users will get Win7. Most gamers with Vista will get Win7. Where the market changes is the WinXP users who upgrade their hardware and OS to Windows 7 and DX11.

ATI already has DX11 cards coming out in days and weeks (depending on type), so a 4670/GF9600-performance-class DX11 video card selling for about $65 will be excellent for pushing the standard. And with the new $100 AMD quads hitting the market, that can help PC gaming quite a bit.

*BUT* Microsoft doesn't support PC gaming all that much. Halo 3, Gears of War 2?

We'll see.




RE: Vista?
By Visual on 9/17/2009 7:02:29 AM , Rating: 1
Fuck Halo 3, it was a disappointment for me anyway. Not bad, just not up to the hype it got pre-launch.

But that's off-topic anyway. Exclusives are the exception, not the rule, and nowadays they don't stay exclusive forever. Halo 3 and GOW 2 may be late on the PC, but they will come to it eventually... just like Halo 2 and GOW are on the PC now.

And for a lot of other games, the PC version launches together with the console one and has a number of advantages: it is usually priced way cheaper; online multiplayer and downloadable content for it are free; and homebrew user-made content and mods are possible only on the PC version. Graphics options are sometimes better on the PC too, especially if you have some 30" monitor with double the pixels of 1080p TVs.

And you can play with the Xbox controller and a 1080p TV on the PC and get absolutely the same feeling as a console, thanks to Microsoft's "Games for Windows" program. So I wouldn't say that Microsoft doesn't support PC gaming; actually, it seems quite the opposite to me.

I wish it were this simple to use a PS3 controller with PCs. In theory it should be; for the wireless controllers it should actually be even easier, as they use standard Bluetooth and you don't need a proprietary dongle. But in reality there are significant hassles with drivers and whatnot, and even once you have it set up, games don't refer to the controller buttons with the correct names or icons... you can even find yourself in the absurd situation of playing with a PS3 controller while the game shows you Xbox 360 controller icons. A terrible decision by Sony not to try to get into PC gaming on an equal level with Microsoft, in my opinion.


bring on dx11 hw!
By riottime on 9/17/2009 12:51:51 AM , Rating: 2
I can't wait. Have you tried playing Doom 3, Quake 4, Supreme Commander, Morrowind, Oblivion, or GTA:SA with modern PC hardware? They're eyesores compared to Crysis, Anno 1404, Bionic Commando, Fallout 3, GTA IV, The Sims 3, Batman: AA, Prototype, X-Men Origins: Wolverine, and Far Cry 2, to name a few.

Granted, most of the old games I've listed have 'graphics' enhancement packs to make them look more tolerable by today's graphics standards, but boy, they don't age graphically very well on their own.

Graphics tech on the PC moves at an alarming rate. A year from now, the games I've listed will look terrible compared to the games released that year. Then they'll have 'graphics' enhancement packs released for them as well. :P

I don't know about you guys, but when I load up a game and the graphics aren't good, I usually stop playing. It just turns me off the game. Who cares about the story if you don't play long enough because of the bad graphics?




I do NOT want photo realism.
By MFK on 9/20/2009 3:32:59 PM , Rating: 2
It's fairly common knowledge that people do not really like photo-realism. I do not want the boring, dull colours of everyday life. I want a picture where the colours jump out at you, with fake contrasts and unrealistic brightness.

IMHO, photo-realism is overrated...




Old demo -- please update photo
By Headfoot on 9/21/2009 11:36:31 PM , Rating: 2
The picture of Ruby running from the robot is old news; that was for the last-generation ATI stuff.

Their new demo is the one involving the occlusion shadowing on the tank and showing the AI pathing; a quick Google should bring it up. Search "AMD Evergreen" and you should find it.




"Folks that want porn can buy an Android phone." -- Steve Jobs

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki