


Industry authority and rocket scientist John Carmack shares his views on the latest software and technology

When John Carmack speaks, the industry tends to listen. While it can be argued that his influence on the gaming industry today isn’t as big as it was when nearly every 3D shooter used one of his Quake engines, he is still regarded as part of the heart that keeps PC gaming alive. He continues to influence gaming hardware too, especially in the area of graphics. In fact, NVIDIA and ATI consult with John Carmack on design decisions when engineering new GPUs.

Carmack and id Software were recognized last week with two Technology Emmy Awards from the National Academy of Television Arts & Sciences for the areas “pioneering development work in 3D game engines” and “technological leadership in rendering breakthroughs with the Quake technology.”

At CES, Game Informer magazine sat down with John Carmack and Todd Hollenshead of id Software to discuss many facets of the game industry as it applies to both PCs and consoles. Right away, Carmack confirms that he is working on a new engine for a completely new franchise not based on any of the company’s currently existing intellectual properties. Carmack said that Quake Wars, which is based on an upgraded Doom 3 engine, will not be a DX10 game.

On the topic of DX10, Carmack said that there’s nothing at the moment motivating him to move to the new API just yet for Quake Wars, citing that he’s quite satisfied with DX9 and the Xbox 360. “DX9 is really quite a good API level … Microsoft has done a very, very good job of sensibly evolving it at each step--they’re not worried about breaking backwards compatibility--and it’s a pretty clean API,” he said. “I especially like the work I’m doing on the 360, and it’s probably the best graphics API as far as a sensibly designed thing that I’ve worked with.”

Gamers often look to Carmack to tell the fortunes of PC gaming hardware. His opinions on hardware can sway hardcore gamers to purchase one hardware choice over another. Those in awe of the potential offered by DX10 may want to hold off on that shiny graphics card purchase, as Carmack says there isn’t a huge need for new hardware just yet; current hardware is more than adequate. “All the high-end video cards right now--video cards across the board--are great nowadays,” he said. “Personally, I wouldn’t jump at something like DX10 right now. I would let things settle out a little bit and wait until there’s a really strong need for it.”

Those wishing to take the plunge into DX10 will also have to do so while upgrading to Windows Vista. Carmack, however, isn’t all that excited about upgrading to the new OS: “We only have a couple of people running Vista at our company. It’s again, one of those things that there is no strong pull for us to go there. If anything, it’s going to be reluctantly like, ‘Well, a lot of the market is there, so we’ll move to Vista.’”

Carmack then said that he’s quite satisfied with Windows XP, going as far as to say that Microsoft is ‘artificially’ forcing gamers to move to Windows Vista for DX10. “Nothing is going to help a new game by going to a new operating system. There were some clear wins going from Windows 95 to Windows XP for games, but there really aren’t any for Vista. They’re artificially doing that by tying DX10 so close to it, which is really nothing about the OS ... They’re really grasping at straws for reasons to upgrade the operating system. I suspect I could run XP for a great many more years without having a problem with it,” he said.

The conversation then turned to multi-core gaming systems. Carmack has expressed his dislike for multi-cores, but with the two high-powered new generation consoles both making use of multiple cores, it may be something he just has to deal with. He says of the Xbox 360: “Microsoft has made some pretty nice tools that show you what you can make on the Xbox 360 [with the multi-cores] … but the fundamental problem is that it’s still hard to do. If you want to utilize all of that unused performance, it’s going to become more of a risk to you and bring pain and suffering to the programming side,” he laments. “So we’re dealing with it, but it’s an aspect of the landscape that obviously would have been better if we would have been able to get more gigahertz in a processor core. But life didn’t turn out like that, and we have to just take the best advantage with it.”

As far as the PlayStation 3 goes, Carmack isn’t thrilled at the lack of developer support in comparison to what he’s received from Microsoft. Nevertheless, he plans to support Sony’s console with his next generation engine and games. “We’ve got our PlayStation 3 dev kits, and we’ve got our code compiling on it. I do intend to do a simultaneous release on it. But the honest truth is that Microsoft dev tools are so much better than Sony’s,” he comments. “I think the decision to use an asymmetric CPU by Sony was a wrong one. There are aspects that could make it a winning decision, but they’re not helpful to the developers … It’s not like the PlayStation 3 is a piece of junk or anything. I was not a fan of the PlayStation 2 and the way its architecture was set up. With the PlayStation 3, it’s not even that it’s ugly--they just took a design decision that wasn’t the best from a development standpoint.”

Finally, the console wheel spins to the company from Kyoto, which Carmack says that id Software has never “been that tight with.” He does express his respect of Nintendo’s courage to take a different direction with input methods in controlling games, but his current and next generation of game technology is not targeted at the Wii.






Carmack speaks sense...again.
By Viditor on 1/15/2007 8:34:42 PM , Rating: 4
Considering both the cost and most especially the power requirements of DX10 cards (300W each???), Carmack is spot-on when he says to avoid them for a while...
Besides, there are so many changes happening in the graphics field before DX10 is mainstream (Next-gen Physx, Fusion, etc...) that early adoption seems like a big mistake...




RE: Carmack speaks sense...again.
By cochy on 1/15/2007 8:51:43 PM , Rating: 2
Well, unfortunately for those building brand new PCs or upgrading vid cards and looking for the best performance: why wouldn't such a person go with a brand new DX10 card when that is clearly where the industry will be moving? Personally I like making purchases that last me at least 3-4 years.

It would have been nice if he were a bit more enthusiastic about multi-core gaming. Is it really that difficult from a programming point of view to design games that take great advantage of multi-core?

Sony should listen to what he has to say and start supporting developers better or all that shiny hardware will go to waste.


RE: Carmack speaks sense...again.
By Viditor on 1/15/2007 9:04:42 PM , Rating: 2
quote:
Is it really that difficult from a programming point of view to design games that take great advantage of multi-core?


In a word, yes...
Writing parallel code is VERY hard and much more time consuming.
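A minimal sketch of why (names and figures are ours, not from the thread): two threads bump a shared counter, and only the lock guarantees that no increment is lost. Forgetting it, or taking it in the wrong order elsewhere, is the kind of bug that makes parallel code hard.

```python
import threading

counter = 0
lock = threading.Lock()

def work(n):
    global counter
    for _ in range(n):
        with lock:          # without this lock, concurrent increments can be lost
            counter += 1

threads = [threading.Thread(target=work, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# counter == 200_000 only because every increment was serialized by the lock
```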


RE: Carmack speaks sense...again.
By cochy on 1/15/2007 9:28:10 PM , Rating: 2
Well then I think it's a good time to start developing game engines and programming languages that introduce a layer of abstraction here to make this job easier for the programmer. Soon everyone will have multi-cores; might as well take advantage of them.


RE: Carmack speaks sense...again.
By UsernameX on 1/15/2007 11:23:27 PM , Rating: 2
quote:
In a word, yes... Writing parallel code is VERY hard and much more time consuming.


This is exactly why programmers need to start chipping away at the iceberg. Multi-core processing is MUCH more powerful than our current single-core processors today. Once we find an efficient means of utilizing this free resource... the games we play today will pale in comparison to the games of tomorrow.


RE: Carmack speaks sense...again.
By FITCamaro on 1/16/2007 6:57:45 AM , Rating: 2
Exactly. I don't know why a guy who's supposed to be an industry leader would make the comment that he's unhappy with multi-cores. Just because it's easier doesn't mean it's better. Seems to me like he's only thinking about making his job easier.

He's spot on with Vista though. Other than DX10 and the enhanced security, there's really nothing pulling people to Vista. Microsoft could have easily made XP have 3D Windows since 3D Desktop already achieves this.


RE: Carmack speaks sense...again.
By masher2 (blog) on 1/16/2007 7:47:04 AM , Rating: 2
> "Exactly. I don't know why a guy who's supposed to be an industry leader would make the comment that he's unhappy with multi-cores..."

Because he's human, and he can't see the forest for the trees. His entire career has been predicated upon parallel processing...the invisible kind, done transparently by the GPU. Now that parallel processing is moving into the CPU, he becomes dogmatic. A bit sad...but very human.

In any case, his opinion certainly won't slow the introduction of highly parallel CPUs. And a couple decades from now, when games are running on hundreds of cores, no one will even remember he said it.


RE: Carmack speaks sense...again.
By EODetroit on 1/24/2007 4:36:46 PM , Rating: 2
You mean just like no one remembers when Einstein said "God doesn't play dice with the universe."? And yeah I think Carmack is just as brilliant as Einstein was, just in different fields. People will certainly remember this.

But I certainly agree with the rest of what you said... Carmack has come up against something that's hard for him, and this time he doesn't want to figure it out himself. In a way though I think Young Carmack in this situation would figure out the "Theory of Everything" for multiple CPUs... Old Carmack doesn't have to, he's already got his fame and fortune.


RE: Carmack speaks sense...again.
By somata on 1/16/2007 2:43:17 PM , Rating: 2
quote:
Multi-core processing is MUCH more powerful than our current single-core processors today.


Well obviously that's the case because two processors will always have a potential advantage over one (assuming they're the same speed and architecture). Having said that, I can't believe how passionate some people are about the apparent "magical abilities" of multi-core processors. Yes more cores are nice, but they should be looked at as an addition to increased single-core performance (a la Core 2 Duo) rather than a desired replacement for it. We should all hope that single-core performance does not stagnate with the industry's recent tendency towards multi-core madness.

Aside from being more difficult on developers, Amdahl's Law sets hard limits on how much any application can be parallelized, even the so-called embarrassingly parallel ones like rendering. Going from one to two cores can significantly improve most applications; four to eight less so, and so on. In short, a 3GHz P4 is always desirable over two 1.5GHz P4s, so doubling CPU frequency is always preferable to doubling cores... if you have a choice.
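The limit the post refers to can be written out. If a fraction p of the work parallelizes perfectly across n cores, Amdahl's Law bounds the speedup at 1/((1-p)+p/n). A quick sketch (function name is ours):

```python
def amdahl_speedup(p, n):
    """Best-case speedup when a fraction p of the work
    parallelizes perfectly across n cores (Amdahl's Law)."""
    return 1.0 / ((1.0 - p) + p / n)

# Even a 90%-parallel workload tops out fast:
#   2 cores -> ~1.82x, 8 cores -> ~4.71x, and never beyond 10x
#   no matter how many cores you add.
```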


RE: Carmack speaks sense...again.
By lplatypus on 1/16/2007 5:58:54 PM , Rating: 2
quote:
In short, a 3GHz P4 is always desirable over two 1.5GHz P4s, so doubling CPU frequency is always preferrable to doubling cores

Not always... if what you are doing is inherently dual threaded, then the single core CPU has worse throughput because of the overhead of switching between threads. Also, a single core CPU can have worse latency for dealing with incoming events (eg network interrupts) because you're less likely to have an idle core when the event arrives. These are some of the marketing arguments that Sun uses for its Niagara CPUs.


RE: Carmack speaks sense...again.
By therealnickdanger on 1/17/2007 4:16:22 PM , Rating: 2
My 1.6GHz Core Duo rips MP3s and encodes video faster than my old P4 3.0GHz... Granted, it's got faster RAM too, but still, I can't think of anything my P4 does better than my CD.


RE: Carmack speaks sense...again.
By AnnihilatorX on 1/18/2007 2:05:25 PM , Rating: 2
That's more to do with the doomed P4 and NetBurst architecture


By therealnickdanger on 1/25/2007 1:01:15 PM , Rating: 2
Right, just proving that GHz don't matter...


RE: Carmack speaks sense...again.
By mindless1 on 1/25/2007 11:47:22 AM , Rating: 2
It is seldom the case (if ever?) that a dual-threaded app has equal but non-dependent threads. You can't just arbitrarily split the workload in half, thus a single 3GHz P4 would virtually always be MUCH faster than two 1.5GHz P4s, even in a dual-threaded app.


By artpearson on 2/6/2007 2:38:50 AM , Rating: 2
The idea that games are inherently serial is based on an extremely limited concept of what a game is.
The computer game buyer market today is dominated by the pinheads who think that "computer game" means fps (first person shooter) game.
There is really only one such game, in many guises. It consists of roaming around and shooting at things.
The people in this market buy the same game over and over, with different wallpaper.
Gaming taps into our primordial hunter mindset. The thing is that early hunters were smaller, weaker, and slower than their prey.
They won the hunting game by teamwork, with each other, and the hunting dogs that co-evolved with us.
The technology aspect: clubs, spears, bows and arrows was secondary to this.
War came about as a byproduct of team hunting. It was team against team.
The AI required for challenging teamwork on the part of computer player characters is where the need for parallel processing comes in.
The computer science for this is a little beyond its infancy, but is currently neglected. Read some older books like "Communicating Sequential Processes" and "Fairness" and take it from there.



RE: Carmack speaks sense...again.
By rg33 on 1/16/2007 8:20:13 AM , Rating: 2
I disagree. Games programming seems to be a strange industry: on one end they are producing graphics on COTS gfx cards that makes the simulation industry look years out of date, but on the other they are using programming architectures and designs that are far from modern.

My company (simulation) has been multi-threading since NT 4 and today we think nothing of it. Granted, it took a few very smart people to put the architecture in place, but it's not that hard.

Now, moving into modern languages and development environments it's an absolute doddle - on my last project I used C# to create a GDI graphics application that ran in one thread (on one core) to draw data that came from a network connection, the code for which was running in a separate thread (on another core). It took me about 2 days to implement the framework for this. It is not hard.

Granted, games aren't developed in C# or any other .NET language as of yet, but that could (should) change with the advent of XNA studio.



RE: Carmack speaks sense...again.
By cgrecu77 on 1/16/2007 9:34:01 AM , Rating: 2
Games are different from most applications in that most things happen as a result of the human player's actions. While additional cores could be used for some tasks, it's very difficult to use even 2 cores for processing things around the player.

What Carmack is saying is that switching to multi-cores makes it a lot harder to have increases in performance - and a lot costlier. It's not as easy as creating a framework that "knows" multi-cores and then programming as usual (as your post implies), it's something that must be done on a per-game basis. The problem here is that additional complexity = additional costs = higher price for games which are already very pricey. As an industry leader Carmack must be concerned with this because it could lead to big turmoil in the gaming world. Obviously having a 4GHz processor would be better than two of the same at 2GHz, from all perspectives...


RE: Carmack speaks sense...again.
By rg33 on 1/16/2007 10:13:35 AM , Rating: 2
quote:
It's not as easy as creating a framework that "knows" multi cores and then programming as usual (as your post implies)


That's not really what I meant, sorry if it was a little vague. What I actually meant was that if you have a good robust framework in place to handle all the thread-thread communication in a stable and reliable fashion, it makes the whole process pretty simple. You always need to decide what code you are going to run in which thread. That doesn't take too much getting used to.
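The framework idea above can be sketched in a few lines (Python rather than the C# the poster used; all names are ours): one thread produces "network" data, a queue carries it safely across the thread boundary, and a second thread consumes it. The queue is the whole framework here - neither thread touches the other's state directly.

```python
import queue
import threading

# The queue is the only channel between the two threads.
inbox = queue.Queue()

def network_thread():
    """Produce data packets, then a sentinel meaning 'no more data'."""
    for packet in range(5):
        inbox.put(packet)
    inbox.put(None)

drawn = []

def render_thread():
    """Consume packets until the sentinel arrives."""
    while True:
        packet = inbox.get()
        if packet is None:
            break
        drawn.append(packet)

t1 = threading.Thread(target=network_thread)
t2 = threading.Thread(target=render_thread)
t1.start(); t2.start()
t1.join(); t2.join()
# drawn now holds [0, 1, 2, 3, 4], received in order
```

Once a channel like this is in place, the remaining decision really is just "which code runs in which thread", as the post says.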

quote:
games are different than most applications in the fact that most of the things happen as a result of the human player actions. While additional cores could be used for some tasks, it's very difficult to use even 2 cores for processing things around the player.


It is just as much or even more so in simulation. Think of an aircraft simulator - way more information comes from the actions of the pilot, and needs to be acted upon as a result, than in any PC game. It's all about how you balance the loads; some things could be easily put into another thread, whereas others can't. For example, a good game engine should have little trouble shifting all the A.I. processing off to another thread... or physics processing, for example.


RE: Carmack speaks sense...again.
By bamacre on 1/26/2007 2:38:12 AM , Rating: 2
quote:
....= higher price for games which are already very pricey.



Totally disagree. I remember paying $40 to $50 for NES games, and that was almost 20 years ago. I think I paid $40 for Far Cry, $50 for HL2, $50 for COD2, $40 for FEAR, you get the picture.

Assuming high quality, I'd be happy to pay more for games. There are simply not enough games out there to keep me satisfied, I seem to always be waiting for some game to come out, with nothing really to tide me over. Looking back, I would have paid $75 to $100 for Far Cry, Vice City, and maybe a few other games.

If higher prices for games is what it will take to get more [b]good[/b] games on the market, so be it. I buy maybe 3 games per year because I only like shooters. That flat out sucks. I want more good games, and I am willing to pay for them.


By Justin Case on 1/19/2007 7:14:05 PM , Rating: 2
In two words, yes... and no.

It is hard to make the game's renderer and simulator (the part that manages the 3D space and object interactions) benefit much from multiple cores (at least multiple CPU cores - most renderers can and do benefit from multiple GPUs). But "making a game take advantage of" multiple cores does not necessarily mean "get more frames per second".

It's not particularly hard, for example, to offload the AI to separate threads, letting games use much more complex AI on multi-CPU (or multi-core) systems. The game runs at the same speed, but suddenly you can do all sorts of tricks (learning AI, etc.) that would be impossible on a single core (without killing your fps).

Of course, since the AI in most games is incredibly primitive anyway, there is still the problem of writing decent AI, but that's a separate issue. It is possible (and not very hard) to "take advantage of" multiple CPU cores as long as that "advantage" means "do new stuff" and not "do the same stuff faster".
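The offloading described above can be sketched with a worker pool (names and numbers are ours, purely illustrative): the game loop kicks off an expensive AI decision, keeps rendering without waiting, and picks up the result when it needs it.

```python
from concurrent.futures import ThreadPoolExecutor

def plan_move(enemy_position, player_position):
    """Stand-in for an expensive AI search: chase the player."""
    return 1 if player_position > enemy_position else -1

pool = ThreadPoolExecutor(max_workers=1)

# Game loop: submit the AI job, then keep going instead of blocking.
future = pool.submit(plan_move, enemy_position=3, player_position=7)
# ... render the current frame here, AI runs concurrently ...
move = future.result()   # collect the decision when the next frame needs it
pool.shutdown()
```

Nothing about the renderer changes; the AI simply gets its own core's worth of time, which is exactly the "do new stuff" advantage the post describes.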


By Locutus465 on 1/24/2007 11:02:29 PM , Rating: 2
True, but you could take the approach Valve took and write your own framework to deal with the complexities, simplifying the task. Perhaps the solution won't provide absolutely the most optimal code, but it'll be hella better on today's multi-core systems than what we're seeing now.


RE: Carmack speaks sense...again.
By deeznuts on 1/15/2007 10:57:44 PM , Rating: 2
[quote]Sony should listen to what he has to say and start supporting developers better or all that shiny hardware will go to waste. [/quote]
Agreed. I am not a programmer so I have no idea about all this developer speak, but I just saw some PS3 fans going rabid over news rapidmind is developing an SDK for the PS3 that makes it much easier. I have no idea if this is related or not, is it?


RE: Carmack speaks sense...again.
By Chaser on 1/16/2007 9:09:45 AM , Rating: 2
I wasn't sure about the comparisons of the PS3/360 SDKs but if that's true then it's a good thing.


By NoSoftwarePatents on 1/16/2007 3:23:26 PM , Rating: 2
Yeah, you should have seen how the Sony PS/3 fanboyz reacted on various forums. Complete and total disrespect of John Carmack...not that any of them know a thing about programming. They just can't stand someone criticizing their beloved Sony Playstation 3 that most of them don't have anyway.

If you had seen those forums, you'd get a sad picture just how much ignorance video game rabid fans have about architectures.


RE: Carmack speaks sense...again.
By Marlowe on 1/15/2007 8:59:02 PM , Rating: 2
It's not really logical to avoid DX10 cards just because they have DX10 compatibility, now is it? Just like any new generation of graphics cards, the 8800GTX and GTS are great cards that give a super experience in all our games. They also, at least the GTS, don't consume that much power, and at the very least they are really silent.

What about the upcoming 8600Ultra and GT cards? Should one avoid those as well just because they are DX10 compatible?


RE: Carmack speaks sense...again.
By Viditor on 1/15/2007 9:27:06 PM , Rating: 2
quote:
It's not really logical to avoid DX10 cards just because they have DX10 compatibility


Fair point...

quote:
They also, at least the GTS, don't consume that much power, and at the very least they are really silent


The 8800GTX consumes 170W on average, and ~200W under load. As the capabilities increase and the DX10s add more performance, this will go up...
The reason that they are still that low right now is that nobody would buy them if their PSU couldn't run them (max power on the PCIe bus is 75W, and an extra Molex connector adds 150W, making total max with current systems 225W per card). However, what kind of PSU would you need if you used 2 of them in SLI? 340W-400W is a LOT of power to draw for video alone!
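The budget in the comment can be written out explicitly (all figures are the commenter's, not a spec we have verified):

```python
PCIE_SLOT_W = 75    # max draw through the PCIe slot itself
MOLEX_W = 150       # one auxiliary Molex power connector

per_card_ceiling = PCIE_SLOT_W + MOLEX_W   # 225 W ceiling per card
sli_average = 2 * 170                      # two cards at the quoted average: 340 W
sli_load = 2 * 200                         # two cards under load: 400 W
```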


RE: Carmack speaks sense...again.
By Ringold on 1/16/2007 12:33:15 AM , Rating: 2
Except that it *could* go down.. like processors have. I guess not in the short-run though, unless they've already revised the cores..

After seeing two 8800GTX in SLI on a high-end rig being pictured at CES running off what was it, a 600 or 650 watt psu, I've realized all this horror story business over needing 800 or 1000 watt psu's is mere talk and not fact.

My next build will be either a Seasonic S12+ 650watt or an M12 600 or 700 watt, depending on price differences. I don't need even that much; I'm buying them for the extra amps on the 12v and quality for overclocking.

Next card will also be a DX10 card; why, when the mid-range parts come out soon, get DX9? Hell, get DX10 cards and be happy with the much-improved DX9 performance.


RE: Carmack speaks sense...again.
By Chaser on 1/16/2007 9:16:51 AM , Rating: 2
Don't presume that. Another popular "hardware" site has DX9 games benched on 8800s and the frame rates are actually slower on the DX10 cards compared to the DX9 cards that cost 2/3s less.

It doesn't take a game programming guru to see the common sense in waiting DX10 out until it matures. I'm sure a couple of games will hit the shelves with a fancy sticker stating "supports Direct 10 acceleration" or something.

Buying a DX10 card for now is nothing more than "I got it" grins for a while.


RE: Carmack speaks sense...again.
By walk2k on 1/23/2007 5:12:04 PM , Rating: 2
I don't know where you're getting your power requirements but I hooked up my PC to a Kill-a-Watt meter, this measures actual usage directly from the power plug. It never got any higher than about 175 watts (that's a TOTAL for all components) and was more like around 150 watts. That's WHILE playing Quake 4. Sitting idle at the desktop it was drawing about 90 watts.

My system is an AMD X2 4400+, 2 gigs of DDR-400, and a GeForce 7900GT 256MB. Not tippy-top of the line, but hardly obsolete... So unless these new Core Duo and GF8800 use 4x as much power or something... those numbers are blown WAY out of proportion.

I had a 350w power supply before, and thought I needed to upgrade when I got this new PC, so I got a 500w. WOW what a waste that was, my 350w would barely have been half-taxed! I don't think you'd need more than 350-400 watts unless you had a Dual-SLI setup. People with 800-1000 watt power supplies have to be idiots...


RE: Carmack speaks sense...again.
By fic2 on 1/16/2007 6:18:30 PM , Rating: 2
quote:
It's not really logical to avoid DX10 cards


This is not what he said. Carmack said, "I [he] wouldn’t jump at something like DX10 right now". Meaning there isn't a reason to buy a video card just because it is DX10.


RE: Carmack speaks sense...again.
By timmiser on 1/18/2007 2:28:33 PM , Rating: 2
I think the point is that if you are building/upgrading a rig today, there is no point in avoiding a DX10 card now.


RE: Carmack speaks sense...again.
By slickr on 1/15/07, Rating: -1
By darkavatar on 1/16/2007 1:05:32 PM , Rating: 2
Carmack says something => tons of people follow his advice => whatever Carmack said turned out to be true.

Not personal, but yeah, very logical indeed.

I agree that binding DX10 with Vista looks more like another reason to make you upgrade,
but I'm reserving my opinion on his take on multi-core.


By VooDooAddict on 1/16/2007 5:17:08 PM , Rating: 2
If you upgrade there isn't very much reason not to go dual core.

Yes dual core won't improve your gaming frame rates much ... but general system use is far improved.


RE: Carmack speaks sense...again.
By Regs on 1/16/2007 3:18:33 AM , Rating: 2
No comments on how bad Vista is? I don't know anybody around here that uses it or even noticed it hit retail. It's a bomb. This OS was in development for 7 years, which I think is at least 3 years premature.


RE: Carmack speaks sense...again.
By JimFear on 1/17/2007 10:12:06 AM , Rating: 2
Er, you won't have noticed it in retail yet because it's not released; the most you'll get is "Vista Capable" or "Vista Ready" stickers on new machines, maybe even a pre-order here or there. I'm using it at the moment with our MSDN licence and it's not too bad. I won't even consider trying to game on this because I'm using a low-end Quadro (Aero works, though).

I for one will not be using Vista for a LONG time. DX9 will suffice, and until there is worthy content for DX10 I won't bother doing the switch; even then you cannot use more than dual core, apparently. Looking in the EULA shows..."You may use the software on up to two processors on that device at one time."...and yes that is the Ultimate version. I'm not sure if it's just the wording of it, but whether they specifically mean TWO processors as opposed to TWO cores is a different matter altogether.

Look at 2a. at the bottom of the 1st page here if my word isn't enough...
http://download.microsoft.com/documents/useterms/W...

Those of you who go for Core 2 Quad or AMD 4x4 are sh*t out of luck :)

LONG LIVE XP AND DX9!!!


RE: Carmack speaks sense...again.
By Hawkido on 1/18/2007 10:03:25 AM , Rating: 2
I can tell you that the MS CPU limit is based on sockets, not on cores. They made a statement to that effect once hyper-threading came out (which emulates dual cores), then revised it for 2 physical cores on one chip once that was realized. They felt it would further multi-threading and thus further MS-style servers being used for businesses, instead of a Sun variant.

Here is a link to an article http://www.eweek.com/article2/0,1759,1679473,00.as...

Some versions (Pro on NT, 2k, and XP) of MS desktop OS's are limited to 2 sockets max, the others are 1 socket. Server edition is a 4 socket max, while Advanced Server can have 8 sockets and also can have more than 4 gig of RAM (16 gig I believe, important to know for MS Exchange Enterprise). Then you have the Datacenter edition, which I don't know for sure how much it can handle, something unreal (16 sockets and 64 gig of RAM?).



RE: Carmack speaks sense...again.
By darkpaw on 1/24/2007 2:10:46 PM , Rating: 1
Correct, MS counts physical cpus not cores towards its licensing. Was worried when dual cores started hitting the market that they would count cores, but thankfully they had some sense.


RE: Carmack speaks sense...again.
By del on 1/16/2007 3:24:33 PM , Rating: 1
Early adoption of any Microsoft operating system is a mistake... As far as video cards are concerned, I don't really mind the cost or the power requirements, with the exception of SLI (It's not like nVidia or ATi are going to come out with power-efficient GPU architectures any time soon =P).


Just a Graphics Programmer
By porkster on 1/15/07, Rating: 0
RE: Just a Graphics Programmer
By Martin Blank on 1/15/2007 9:19:37 PM , Rating: 2
You're right. He is a graphics programmer. He's never claimed to be the main push behind stories, and leaves that part to the couple of dozen others that work at id. However, he is an absolutely brilliant graphics programmer, and that's why people pay attention to him.


RE: Just a Graphics Programmer
By Goty on 1/15/2007 9:20:18 PM , Rating: 2
Either you mean Doom3 or you're on crack.


RE: Just a Graphics Programmer
By gramboh on 1/15/2007 9:25:26 PM , Rating: 2
Haha, are you insane? Quake 3 was a huge success and is widely regarded as the best deathmatch 1vs1/2vs2 game made to date, used around the world in competitions. It was a huge online success and you will still find many servers full of gamers playing today. The only other game I see with that kind of historical popularity is Counter-Strike.

The Q3 engine was, and is, amazingly efficient and has been used in many games to date. Carmack is amazing technically, whether or not you agree with his game-content decisions. The licensing on that engine must have made a ton of money for id.

Calling Q3 a disaster = LOL.


RE: Just a Graphics Programmer
By cgrecu77 on 1/16/2007 9:36:22 AM , Rating: 2
Maybe you meant the only other shooter... I think StarCraft is still very popular online, and it's a 100-year-old game, not to mention MMOs like WoW and others...


RE: Just a Graphics Programmer
By xphile on 1/16/2007 4:55:44 AM , Rating: 5
In 1993 the game DOOM came out, and the entire gaming world just stopped and cried for months. It was not just a step up; it was like going from a stick figure to a real-life supermodel. It was simply beautiful. id Software, John Carmack, and John Romero pretty much INVENTED the 3-D gaming you enjoy today, though by the sound of it you were 4 months old at the time and sucking on your mother's nipple, because it is obvious you do not have the FAINTEST idea who he is or what importance he ALREADY holds in the history of gaming.

In the gaming world, to anybody who has lived through it and PLAYED since 1990 or earlier, John Carmack is worth six Bill Gates, three Steve Jobs, and a group of Schmidts, Pages, and Brins all by himself.

Doom or Quake may seem like crap games to you today, but like the Wright brothers to the Concorde, without them you'd be sitting at home marvelling over 2D games like Donkey Kong right now and lapping it up.


RE: Just a Graphics Programmer
By sviola on 1/16/2007 7:00:35 AM , Rating: 2
The Wright brothers did not invent the plane. The plane was invented by a Brazilian named Alberto Santos-Dumont.
The Wright brothers' aircraft cannot be considered a plane, as it lacked controlled flight and couldn't make turns.

Alberto Santos-Dumont's first official flight was in 1906, when, using the motorized 14-bis airplane, he flew around the Eiffel Tower in Paris (I think it was a 15-minute flight).

By the way, although John Carmack is a great personality in the computer gaming world and should be respected, his statements should be taken with a grain of salt, like any statement about the future (many important people in the industry have made such predictions that proved totally wrong, e.g. "640K is more than enough").


RE: Just a Graphics Programmer
By Chillin1248 on 1/16/2007 8:55:58 AM , Rating: 3
You are wrong on the "Bill Gates Quote".


quote:
"640K ought to be enough for anybody."

Often attributed to Gates in 1981. Gates has repeatedly denied ever saying it:

"I've said some stupid things and some wrong things, but not that. No one involved in computers would ever say that a certain amount of memory is enough for all time... I keep bumping into that silly quotation attributed to me that says 640K of memory is enough. There's never a citation; the quotation just floats like a rumor, repeated again and again."



http://www.wired.com/news/politics/0,1283,1484,00....
http://www.htimes.com/htimes/today/access/oldfiles...
http://groups.google.com/group/alt.folklore.comput...

-------
Chillin


RE: Just a Graphics Programmer
By TheBluePill on 1/16/2007 10:12:09 AM , Rating: 2
You are wrong on the first-flight info there, chief:

http://en.wikipedia.org/wiki/First_flying_machine

Anyway:


Carmack is a pioneer and a visionary. He may well be wrong, but he is FAR more often RIGHT when he makes a prediction. I do not believe he is so much making a prediction about the future of gaming as commenting on the current state of technology and its limitations. As developers gain experience and better APIs for working with multiple cores, the technology will be better received.

If Carmack says it's hard to program for, then you'd better bet it's a problem for the whole industry.


RE: Just a Graphics Programmer
By colek42 on 1/17/2007 2:36:25 AM , Rating: 2
It did have controls. The wings flexed and deformed (via wires) to turn the aircraft. The brothers modeled their controls on the way birds use their wings to maneuver.

"Based on observation, Wilbur concluded that birds changed the angle of the ends of their wings to make their bodies roll right or left.[12] The brothers decided this would also be a good way for a flying machine to turn—to "bank" or "lean" into the turn just like a bird—and just like a person riding a bicycle, an experience with which they were thoroughly familiar. Equally important, they hoped this method would enable recovery when the wind tilted the machine to one side (lateral balance). They puzzled over how to achieve the same effect with man-made wings and eventually discovered wing-warping when Wilbur idly twisted a long inner tube box at the bicycle shop" (Wikipedia - Wright Brothers)

P.S. If you are going to try to prove something (especially something as dumb as that), please sight your sources.


RE: Just a Graphics Programmer
By colek42 on 1/17/2007 2:38:24 AM , Rating: 2
*site -- i'm a dumbass


By PunaProgrammer chris on 1/19/2007 8:04:02 PM , Rating: 2
The funny thing is, you still spelled it wrong the 2nd time there (it's cite).


RE: Just a Graphics Programmer
By JimFear on 1/17/2007 12:06:08 PM , Rating: 2
Wolfenstein 3D came out before Doom did; Doom just brought in extra goodies :)

Quake was the granddaddy of true 3D game design, though. Wolfenstein and Doom were just the stepping stones to get there.


RE: Just a Graphics Programmer
By Pandamonium on 1/18/2007 9:49:16 PM , Rating: 2
If Carmack didn't do it then, someone else would have done it later. Some of you need to stop idolizing the man...



RE: Just a Graphics Programmer
By ViperROhb34 on 1/16/2007 7:51:41 AM , Rating: 2
He pioneered or popularised the use of many techniques in computer graphics, including binary space partitioning, surface caching, Carmack's Reverse (which he devised for Doom 3), and MegaTexture. While he was not the first to discover Carmack's Reverse, he developed it independently.

Furthermore, you say he's just "a programmer", yet this is his field. Whatever you think of what he says about any given console, he knows much more about programming than most and has countless hours of experience on PCs and consoles!



Carmack's right
By casket on 1/15/2007 10:40:29 PM , Rating: 2
I agree with Carmack. No games currently support DX10. Spending $500 on a video card doesn't make much sense right now.

I would wait for a DX10 game to come out first... and by that time these $500 video cards will be $200. Upgrading to a video card at a later date is no big deal.




RE: Carmack's right
By Axbattler on 1/15/2007 11:29:03 PM , Rating: 5
If I were to buy an 8800 card today, it wouldn't be for DirectX 10 support, but simply to be able to run games at 1920x1200.

Of course, if budget is an issue (it is for me), or if the game runs smoothly enough at your preferred resolution/detail settings, then waiting makes sense, DX10 or not.

That said, if I were building a brand-new PC today, I would think twice about getting a top-of-the-line DX9 card (e.g. X1950XT). The premium you pay for an 8800GTS is pretty much made up for by the performance alone. Support for DX10 is just a bonus.


RE: Carmack's right
By otispunkmeyer on 1/16/2007 4:51:08 AM , Rating: 2
Agreed.

If I were to build a new machine today (which I am thinking of doing, though a Mac is looking like a very attractive proposition at the moment, and all my gaming bar RTSes can be handled by my 360), I would only consider an X1950XT if I absolutely had to go penny-pinching with the build.

More likely I would budget for at least an 8800GTS, and not necessarily for the DX10 support/performance, which is as yet unproven. The image quality, DX9 speed, and high-res performance alone are easily worth the outlay for the card.


RE: Carmack's right
By Le Québécois on 1/16/2007 1:04:30 AM , Rating: 2
Not buying a DX10 video card just because no games are made for it yet is just plain stupid, if you ask me (if it's a matter of money, that's a different story).

I don't remember when or what the first DX9 game I ever played was, but I was sure glad to have my 9700 PRO long before then.

I buy next-generation cards for their power, not for whatever DX version comes with them.


RE: Carmack's right
By Xavian on 1/16/2007 4:46:42 AM , Rating: 2
I think people are under some weird assumption that because the card supports DX10, it can't run DX9 games as well as previous cards.

Of course that is ridiculous; the cards are a little more powerful in DX9 games than two 7900GTXs in SLI. I would pay to play all my games at a vsync'd 60FPS with no slowdown at 1600x1200 with all options on, which with the 8800 series is exactly what I'll get.


RE: Carmack's right
By PrinceGaz on 1/16/2007 7:42:42 AM , Rating: 2
He is not saying that there is anything wrong with DX10 cards or that they should be avoided if you need a faster graphics card. What he is actually saying is that there is currently no reason to upgrade a current high-end DX9 card, because they are capable of running everything currently available perfectly well (unless you use extreme resolutions or insist on 8xAA and higher).

I'm sure he would agree that if you do need to upgrade your graphics card today, an 8800-series card would be the ideal choice if it is within your budget. But people with the likes of a 6900GT or X1950XT are better off waiting a good few months before buying a DX10 card.


RE: Carmack's right
By PrinceGaz on 1/16/2007 7:44:05 AM , Rating: 2
I meant 7900GT, not 6900GT of course.

Can we have an edit function please?


y'all don't seem to understand something..
By Tamale on 1/16/2007 10:32:31 AM , Rating: 2
I'm seeing a lot of negativity about his comment on multi-core, and I don't think you guys understand why he's saying what he is.

Fundamentally, games are single-threaded applications.

Sure, you can put the physics, AI, sound, and graphics into separate tasks, but at the end of the day a game is a single loop waiting for user input or other activity to update the world and display it to the user. This is nothing like the task of encoding media or rendering an image... the very idea of what a video game is can only be a single processing loop at its most fundamental level: update world, display world. That's it, folks. And there's only one way to bring all these other elements into a loop that can keep up with the demands of the most intense games, and that's more instructions per second from whatever CPU is running the main game thread.

Multi-core will certainly help, but he's right: what we really need to keep moving forward is more gigahertz.
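[Editor's note: as a rough illustration of the single main loop described above, here is a minimal Python sketch. The names `read_input`, `update_world`, and `render` are hypothetical stubs, not any real engine's API; the point is that each frame's state depends on the previous frame's, so the core of the loop is inherently serial.]

```python
def read_input():
    # Stub: a real engine would poll the OS event queue here.
    return []

def update_world(state, events, dt):
    # Advance the simulation by one fixed timestep. Each frame's state
    # depends on the previous frame's, which is what makes this serial.
    state["ticks"] += 1
    state["clock"] += dt
    return state

def render(state):
    # Stub: a real engine would draw the current state here.
    pass

def run(frames, dt=1.0 / 60.0):
    # The loop described above: update world, display world, repeat.
    state = {"ticks": 0, "clock": 0.0}
    for _ in range(frames):
        events = read_input()
        state = update_world(state, events, dt)
        render(state)
    return state
```

Physics, AI, and sound all have to feed into `update_world` before `render` can run, which is why simply having more cores does not automatically shorten the loop.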




RE: y'all don't seem to understand something..
By Teletran1 on 1/16/2007 11:06:26 AM , Rating: 2
BUT

The industry has already moved in the direction of multi-core processors. It's now up to the game developers to tap the potential of these designs. Who's to say a new programmer won't come along, pull a John Carmack, and create a game that blows people away because it uses multi-core processing to its full extent, like Doom did with our hardware back in the day? I have already read examples of how some PS3 developers use a job scheduling system to keep the SPEs in the Cell busy. Why can't the same be done with all multi-core designs?


By Cogman on 1/16/2007 11:38:08 AM , Rating: 2
It can be done; the problem is just working everything out. Multi-core processing is easy to do, but hard to do efficiently, and I think that is really his main concern: in a game it is hard to use up all the processing power a computer has. Mind you, this is their job; they need to toughen up, band together, and make some sort of API that can do this, but it will be hard.

On the other hand, the rewards will be great once they are able to fully utilize multiple CPUs efficiently, because I foresee that we are bound to have more than 8-16 CPUs in one computer.


By Alexvrb on 1/16/2007 6:51:10 PM , Rating: 2
2-4 symmetric cores is bad enough as it is. As Tamale put it, games are fundamentally single-threaded. You have to put some serious additional time and money into your game to take advantage of many cores - and even then you get diminishing returns. Some tasks can only be parallelized so much.

Sony themselves said that developers would have a hard time making full use of its resources. So why build a gaming console that is inherently bad for game software? They were just too damn cocky, and basically didn't give a damn. They figured they'd walk all over the competition again.


RE: y'all don't seem to understand something..
By bigbrent88 on 1/22/2007 10:05:31 AM , Rating: 2
How exactly do CrossFire and SLI work? You're adding separate memory and GPUs together into one output. Isn't this a rough analogue of a basic dual-core CPU? One GPU must schedule the other for work and then combine the results into the output. Can someone explain this better?


By gramboh on 1/23/2007 1:21:13 PM , Rating: 2
Look up AFR and SFR; I believe those are the SLI rendering techniques, but they are not like dual core with different threads for each GPU.


By somata on 1/16/2007 3:01:03 PM , Rating: 2
Exactly!

Carmack knows the fundamental constraints of a game engine better than just about anyone. Traditional programming languages are only really designed to accommodate coarse-grained multithreading (one thread for physics, one for rendering, or some variation on that), and that is simply the wrong approach for getting good utilization out of more than a few cores, IMHO.

To get anything approaching good utilization out of something like Intel's 80-core chip, I predict you'll need a few main threads that deposit a bunch of smallish "work units" (could be AI, physics, etc.) into a task pool sorted by urgency. The remaining idle cores would just monitor the pool and fetch tasks as they arrive. Think of out-of-order execution at a much higher level. Object-oriented languages might be able to handle such an approach, but it would be awkward, and even with an ideal language it would require a very different way of thinking and likely years of refinement.

Of course, the devil's in the details, so it's quite possible that such an approach would not be very viable in practice. Either way, I can't wait to see where the industry is in 10 years.
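[Editor's note: the urgency-sorted task pool described above can be sketched in a few lines of Python. `TaskPool` and its method names are invented for illustration, not any real engine's API; worker threads block on a priority queue and drain work units in urgency order.]

```python
import itertools
import queue
import threading

class TaskPool:
    """Main threads deposit small work units (AI, physics, ...) sorted
    by urgency; idle worker threads fetch and run them as they arrive."""

    def __init__(self, workers=4):
        self._tasks = queue.PriorityQueue()
        self._seq = itertools.count()   # tie-breaker for equal urgency
        self._lock = threading.Lock()
        self.results = []
        for _ in range(workers):
            threading.Thread(target=self._worker, daemon=True).start()

    def submit(self, urgency, fn, arg):
        # Lower urgency value = more urgent = fetched first.
        self._tasks.put((urgency, next(self._seq), fn, arg))

    def drain(self):
        # Block until every deposited work unit has been executed.
        self._tasks.join()

    def _worker(self):
        while True:
            _, _, fn, arg = self._tasks.get()
            out = fn(arg)
            with self._lock:
                self.results.append(out)
            self._tasks.task_done()
```

In a real engine the workers would be native threads pinned one per core; Python threads here only illustrate the structure, since the pattern (a shared priority queue plus blocking workers) is language-independent.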


By Dactyl on 1/19/2007 6:05:27 PM , Rating: 2
quote:
fundamentally, games are single threaded applications.
No. The way games are written today is almost entirely single-threaded. But that doesn't mean a whit about what games are "fundamentally." There is no fundamental essence of how a game should be programmed.


This guy has fallen from grace
By encryptkeeper on 1/16/2007 9:15:47 AM , Rating: 4
How can someone known as an industry leader suggest that DX10 and multi-core processing may be too tough to utilize? Too bad, pal; that's the way technology is moving. No one should have been surprised by multi-core chips anyway; they were a pretty obvious way to keep increasing CPU performance. But dismissing DX10? He's out of his mind; just take a look at some scenes from Crysis and see if you agree.




RE: This guy has fallen from grace
By hellokeith on 1/16/2007 10:10:05 AM , Rating: 2
I agree. John is living in the past. This is likely why he is enjoying console graphics: the platform hardware stays stagnant for 3-5 years. Carmack is not nearly as influential as he used to be, and there are a dozen or so FPS game engines out now that do just as well as his.


By ViperROhb34 on 1/16/2007 10:55:14 AM , Rating: 3
I wouldn't say he's fallen from grace. If he had become an alcoholic, weren't in programming anymore, weren't on TV anymore, weren't asked anything about his field anymore... wasn't wealthy anymore, his wife had left him, his children had health issues... then yes.

Still, he has engines used in games, including his latest Doom 3 engine; he still develops new techniques in use today; he's still on TV. Just because there are MANY more developers these days doesn't mean he's fallen from grace.

While you sit there posting from your shack, maybe you should try summing up your life compared to his.


about the games
By thepinkpanther on 1/16/2007 7:08:36 AM , Rating: 1
I can tell you: the 8800GTS is deadly fast in games. There is simply no comparison to other players with DX9 cards in DX9 games. I have tested it against players with high-end cards and even dual high-end DX9 cards. And I am talking about ONE graphics card.

I would have waited to buy a new PC, but I had the money for it, so I stepped in and went for it.

Why not go for DX10 cards? It's the same wimpy answers each time: join the train or get off.

It's fine that John Carmack says multi-core is hard to program, but here is a newsflash: do you really think his words matter at all when EVERYBODY buying a new PC gets at least a dual core? Show me a single core that has the power to run the 8800 series... fact is, none does. It takes CPU power to run high-end graphics cards, so you can't stay on a single-core CPU. It really shows in FPS tests too; otherwise the CPU is the bottleneck.

About Vista: at the moment you can avoid it, but when the first DX10 game comes out there will be no excuse for staying on XP anymore.

Even the top DX10 game everyone talks about, the hot Far Cry follow-up Crysis, is not written entirely for DX10, but it will still blow you out of the water... just wait until it comes out.

The last argument: if you go Vista with your DX9 card, then you must run DX9L on Vista, which is 10-20 percent slower than DX9 on XP.

So if you have Vista, a DX10 card is a must, very soon.

Multi-core + Vista + DX10 GPU belong together.

Single core + XP + DX9 single or dual GPU is the other combination.




RE: about the games
By edge929 on 1/16/2007 11:29:00 AM , Rating: 2
Games do not take advantage of every core UNLESS they are programmed to do so. Name me one game currently available that does this.

*crickets chirping*

Yeah, thought so. Your shiny new GTS is an awesome card, no doubt, but it does NOT use both of your CPU cores unless the game is programmed to do so. So your 2.4GHz dual-core CPU is only using one of its 2.4GHz cores, not the full "4.8GHz". Dual-core CPUs don't work like that unless you're multitasking or using a program that specifically utilizes both cores. And by multitasking I mean compiling your C# code on one core and rendering in 3D Studio Max on the other with no slowdown.

Dual-core CPUs are still not mainstream just yet and, like Vista, won't be until they are utilized a good deal more. Single-core CPUs still rely on threading, which is pretty much a poor man's dual core: time-slice the single core between threads and it can appear to act like two.
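[Editor's note: to make the point above concrete, here is a hedged Python sketch; the function names are invented for illustration. A serial update runs on one core no matter how many exist, while a parallel version only spreads across cores because the work is explicitly split up. Python worker processes stand in for the native threads a real engine would use.]

```python
from concurrent.futures import ProcessPoolExecutor

def physics_step(chunk):
    # Stand-in for per-object physics work on a slice of game objects.
    return [x + 1 for x in chunk]

def update_serial(objects):
    # How most 2007-era games ran: one thread, one core, regardless
    # of how many cores the CPU actually has.
    return physics_step(objects)

def update_parallel(objects, workers=2):
    # The work only lands on a second core because the program
    # explicitly splits it; the OS won't do this for a single thread.
    size = max(1, len(objects) // workers)
    chunks = [objects[i:i + size] for i in range(0, len(objects), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        done = pool.map(physics_step, chunks)
    return [x for chunk in done for x in chunk]
```

Both functions compute the same result; the difference is purely in how the work is scheduled, which is exactly why a dual-core 2.4GHz chip is not a "4.8GHz" chip for an unmodified game.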


RE: about the games
By scrapsma54 on 1/17/2007 2:50:35 PM , Rating: 2
The move to DX10 would be a nice step graphically, yes, but I disagree with him on the DX9 bit. Shader Model 3.0's full potential hasn't been reached, and the Xbox 360 is a clear example of this, but many people cannot afford a high-end workstation that will last. Look at F.E.A.R.: there are still performance kinks the high-end 7900 hasn't mastered, which the 8800GTX handles gracefully.


RE: about the games
By typo101 on 1/16/2007 8:11:44 PM , Rating: 2
quote:
But When the first dx 10 game comes out there is no excuse for going xp anymore.


Thanks to MS. I agree with John that DX10 is the main reason many people will go to Vista; it has nothing to do with the rest of the OS. XP really does get the job done for me.

Another thing: if .NET 3.0 for XP includes WPF, doesn't that mean it should be possible to bring Aero to XP as well? Aero is nice, but I don't want to deal with a new MS OS before at least SP1. Ideally I'd wait until SP2.


John Carmack?
By restrada on 1/16/2007 4:31:07 PM , Rating: 2
OK, who made this guy the god of all gaming? It's obvious his opinion is highly valued, but putting a guy on a pedestal as swaying gamers' decisions on what hardware to buy is a tad much.




RE: John Carmack?
By colek42 on 1/17/2007 2:40:24 AM , Rating: 2
Mr. John Carmack did


RE: John Carmack?
By EclipsedAurora on 1/27/2007 1:00:36 PM , Rating: 2
Frankly speaking, John Carmack's game engines are well known for being hungry for processing power. I can't see any engine that came out of his studio performing well on slower hardware; from Doom to Quake, the efficiency of his engines is ultra poor. That's why his engines do well on the PC, a market where users will always upgrade their hardware. For consoles, sorry. From a programmer's point of view, if we talk about efficiency, his engines are a complete disaster. In fact, Mr. Carmack's products have never won any "best programming award" at the Tokyo Game Show.


Feeding frenzy
By piesquared on 1/16/2007 5:15:08 AM , Rating: 2
I think he's missing the point that the tech industry is moving at a blistering pace, and I wouldn't expect it to slow down (most likely because this is the first truly worldwide age, with worldwide competition). It's a big feeding frenzy that feeds off itself, and it's driving innovation.
So if there's more room to create a better visual experience using DX10, I say go hard. The faster we get to CG-quality gameplay, the better. However, it comes at the expense of game and software developers, which I'm sure is why Carmack is opposing it. Maybe we have too many small game-development companies out there, and it would be smarter to have three or four very large corporations with a lot more resources. I'm sure the software companies are having a hard time keeping up.




RE: Feeding frenzy
By sviola on 1/16/2007 7:06:45 AM , Rating: 2
I totally disagree with you. If we had only a few game companies around, we would never see innovation happen. An example of this is EA, which year after year releases the same franchises (changing only the graphics) and only rarely releases innovative games. The more companies and competition around, the more they need to innovate and hold to high quality standards, and in the end we gamers win. =)


JC
By kuyaglen on 1/16/2007 12:53:14 PM , Rating: 2
The G80 takes too much power, the PS3 is hard to code for, DX9 is easy to work with. No disrespect to the mighty mighty JC, but would you like some cheese with that whine? I wish his comments showed more perseverance. Though news of a new engine and a simultaneous platform launch is intriguing.




RE: JC
By mindless1 on 1/25/2007 11:52:03 AM , Rating: 2
Would you prefer he took the brain-dead approach some do and just claimed "it's newer, therefore let's pretend it's MUCH better"?

He is giving his perspective. Maybe you have your own that doesn't agree, but he is entitled to his, and it is no more whining than you having valid reasons for anything you don't like or don't see as a benefit.


Strange
By lagomorpha on 1/16/2007 2:09:24 AM , Rating: 2
I'm surprised to hear Carmack saying negative things about dual-core CPUs. Quake3 was one of the first games to take advantage of SMP and that application alone was the reason 2 people I knew built overclocked dual-celeron computers. If it was worth including SMP support with Quake3 at a time when only a tiny fraction of the market had the hardware why is it more difficult now? Is he referring to support for 2 cores or only the problems that arise when you try to write software for more than 2 cores?




RE: Strange
By MadAd on 1/16/07, Rating: 0
This time he is wrong!
By AlmostExAMD on 1/16/07, Rating: 0
RE: This time he is wrong!
By slickr on 1/16/2007 9:13:20 AM , Rating: 1
It's not all about graphics, man; it's about gameplay and fun. You certainly are wrong. Although DX10 will bring better graphics and let developers give games some interesting special effects, the thing is that DX10 is mostly about bringing more performance to DX10 graphics cards.
If you take the fact that only 0.001% of gamers have DX10 cards (the 8800 series) and the other 99.999% have DX9 cards, his logic makes sense. Considering that in order to move to DX10 you need to buy Windows Vista, and considering there are only 3-4 games that will support DX10, you'll see that upgrading for DX10 is not worth it at this stage. Of course everyone will upgrade at some point; the thing here is that it's too early.
Games that will run on DX10 are:
Crysis
Halo 2
Alan Wake


RE: This time he is wrong!
By Chaser on 1/16/2007 9:25:51 AM , Rating: 1
Good grief! All these screenshot and press-release geniuses. Why don't you give Carmack a call and explain to him how vastly superior the demo or screenshot you saw is compared to his "shit" games?

Better yet, use your new API and write something yourself that will land you that cushy job in Beijing.



Sell out
By AppaYipYip on 1/16/07, Rating: 0
RE: Sell out
By Alexvrb on 1/16/2007 7:03:59 PM , Rating: 2
How is he in Microsoft's pocket? MS has been pimping Vista and DX10 as the ultimate gaming platform. He just got through saying that there's no reason to rush out and jump onto the DX10 and Vista bandwagon yet, if you've still got perfectly good WinXP software and DX9 hardware. Does that sound like a MS shill to you? Why?

He gave MS props where they deserved it (better support, better tools, simpler design). He'd shoot them down too if their design decisions sucked. For all his faults he's his own man, and only a true fanboy couldn't see that.


LOL
By iwod on 1/17/2007 5:58:14 AM , Rating: 2
To those who keep bashing John: have you actually listened to what he said, apart from just reading the DailyTech article?

People from Valve, Epic, and now id Software (John) have all mentioned that DX10 isn't all that different from DX9. DX10 removes some of the limits set in DX9 and adds some flexibility in programming. They go on to mention that most of the brilliant screenshots and rendering you see for DX10 games are possible in DX9 as well, and Oblivion is a good example of what DX9 games are capable of. They can still push for more.
So it is not that DX10 is no good; it is just that DX9 still has some time to go.

And about multi-core: John never said he had much of a problem with 2-4 threads/cores, so pointing to Crysis, Far Cry, or even games like Alan Wake as counter-examples misses his point entirely. Using 2 threads/cores is not much of a problem, and with a quad core you can have one for physics, one for graphics, one for sound/misc, and one for the game itself. The problem, he said, is what happens when you get 16 cores or even more: it will be hard to fully use them, at least in the current state of programming.

And as he mentioned at QuakeCon, programming is getting harder and more complicated, with tools that don't scale with the task at hand. It is not as if you can add 10 more programmers and the work will magically speed up by a factor of 10.

So he is not against DX10 or multi-core. He is merely pointing out a problem for the industry, and somehow someone has to figure it out.




RE: LOL
By slickr on 1/21/07, Rating: 0
Isn't Doom 3 based on OpenGL?
By Smoza on 1/21/2007 10:53:16 PM , Rating: 2
quote:
On the topic of DX10, Carmack said that there’s nothing at the moment motivating him to move to the new API just yet for Quake Wars, citing that he’s quite satisfied with DX9 and the Xbox 360. “DX9 is really quite a good API level … Microsoft has done a very, very good job of sensibly evolving it at each step--they’re not worried about breaking backwards compatibility--and it’s a pretty clean API,” he said. “I especially like the work I’m doing on the 360, and it’s probably the best graphics API as far as a sensibly designed thing that I’ve worked with.”


This paragraph makes me think Quake Wars is DX9, but I thought the Doom 3 engine was OpenGL? Do I have it wrong, or is there something I'm not understanding?




On multicore
By RMSe17 on 1/22/2007 12:13:44 PM , Rating: 2
An article on multi-core gaming; a pretty good read.

http://www.anandtech.com/tradeshows/showdoc.aspx?i...



