Dedicated video card won't be needed for graphics on Windows 7

Intel and AMD are hard at work developing CPUs that also integrate graphics processing. If Microsoft has its way, any CPU will be able to render Direct3D 10 and Direct3D 10.1 graphics as long as it is running Windows 7.

Microsoft calls the portion of Windows 7 that enables this CPU-based rendering WARP. The goals of WARP include replacing the need for custom software rasterizers and enabling rendering when no Direct3D hardware or video card is available, among others.

With WARP, Windows will be able to continue rendering graphics if the video card fails, and it can also kick in when the video card runs out of memory or hangs. WARP supports all Direct3D 10 and 10.1 features along with the full precision requirements of both specifications, and Direct3D 11 will be supported as well. Optional texture formats, such as multi-sample render targets and sampling from float surfaces, are supported, as are anti-aliasing up to 8x MSAA and anisotropic filtering.

Minimum specifications for WARP10 are the same as the minimum specs for Windows Vista: an 800MHz CPU and 512MB of RAM. Microsoft is targeting WARP at casual gamers, existing non-gaming applications, and advanced rendering games. The software evenly distributes rendering work across all available CPU cores.
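Applications opt into WARP explicitly when they create a Direct3D device. As a rough, hedged sketch (not sample code from Microsoft's documentation), requesting the WARP10 rasterizer through the Direct3D 10.1 API would look something like this:

```cpp
// Hedged sketch: create a Direct3D 10.1 device backed by the WARP software
// rasterizer instead of a hardware GPU driver. Error handling is minimal.
#include <d3d10_1.h>

ID3D10Device1* CreateWarpDevice()
{
    ID3D10Device1* device = nullptr;

    HRESULT hr = D3D10CreateDevice1(
        nullptr,                    // no specific DXGI adapter
        D3D10_DRIVER_TYPE_WARP,     // select the CPU-based (WARP10) rasterizer
        nullptr,                    // no external software rasterizer module
        0,                          // no creation flags
        D3D10_FEATURE_LEVEL_10_1,   // WARP is fully Direct3D 10.1 compliant
        D3D10_1_SDK_VERSION,
        &device);

    return SUCCEEDED(hr) ? device : nullptr;
}
```

From there the device is used exactly like a hardware one; the application does not need to know it is rendering on the CPU.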

Graphics performance is nowhere near the level of a discrete video card. However, Microsoft says that typical WARP performance on a Penryn-based 3GHz quad-core processor beats Intel integrated graphics in many benchmarks.

Microsoft says that a Penryn quad-core CPU running Crysis at 800 x 600 with all quality settings at their lowest averaged 5.69 fps. Intel's integrated graphics averaged 5.17 fps at the same settings, while a low-end NVIDIA GeForce 8500 GT averaged 41.99 fps.

Intel's upcoming Larrabee takes a similar approach, rendering graphics in software on many x86-derived cores, much as WARP promises to do on the CPU.



Comments

Interesting...
By Spivonious on 12/1/2008 2:44:13 PM , Rating: 2
I wonder if we'll see a resurgence in software renderers, or if this is just a precursor to dedicated on-die graphics processing.




RE: Interesting...
By MrDiSante on 12/1/2008 2:51:45 PM , Rating: 3
I think this is more of a favour to Intel, which has traditionally been pretty bad with drivers. The abstraction layer necessary to make Larrabee run like a native GPU has to be very well written and performance tuned. If they can dump most of that on the OS and ship a bare-bones driver, it could greatly improve performance, stability, responsiveness (in terms of driver updates), etc.


RE: Interesting...
By Mitch101 on 12/1/2008 3:19:09 PM , Rating: 4
Bad video drivers are one of the main reasons Vista's launch was so bad. So being able to keep the system running even if the video drivers crash is a plus.

No losing data because of a bad video driver or an overclocked video card.

I wonder if we will wind up being able to update/roll back the video drivers without rebooting?


RE: Interesting...
By nosfe on 12/1/08, Rating: 0
RE: Interesting...
By Griswold on 12/1/2008 4:16:31 PM , Rating: 5
That's nonsense. Vista already just restarts the driver if it starts to act up. I haven't seen many bluescreens due to the video driver (compared to XP). There is still room for improvement, but it's dramatically better than it used to be with previous Windows versions. Plus, you can always fall back to the out-of-the-box driver that comes with Windows. I had to do that at the launch of Vista thanks to NVIDIA's total failure, and had no issues keeping a system productive this way - just the eye candy wasn't there.

WARP is really useful for game development, not for rendering the Windows desktop - you don't need D3D for that.

The real purpose of this is to give game developers the ability to test their render engines without depending on potentially defective drivers in early development stages. With this technique it would be easy to discover flaws in your own code rather than figuring out if it is the driver.


RE: Interesting...
By Mitch101 on 12/1/2008 4:40:41 PM , Rating: 2
quote:
The real purpose of this is to give game developers the ability to test their render engines without depending on potentially defective drivers in early development stages. With this technique it would be easy to discover flaws in your own code rather than figuring out if it is the driver.


That's a great point: if the video drivers go belly up during development, the system keeps running and you can possibly pull dump information to figure out what went wrong.


RE: Interesting...
By michael2k on 12/1/2008 5:56:04 PM , Rating: 3
quote:
WARP is really useful for game development, not for rendering the Windows desktop - you don't need D3D for that.
You do if you want Aero...

I think this will prevent another Vista Capable/Ready fiasco. Instead all systems will be Windows 7 Capable, just some will be more capable (3D acceleration) than others.


RE: Interesting...
By murphyslabrat on 12/2/08, Rating: -1
RE: Interesting...
By Clauzii on 12/1/2008 8:30:20 PM , Rating: 3
I can also see a software-driven GPU taking over when real drivers for one's hardware are lacking. That way the OS wouldn't go haywire because of driver issues or a broken graphics card.

Although going from 60 to 6 FPS in-game would suck..


RE: Interesting...
By inighthawki on 12/1/2008 4:18:31 PM , Rating: 5
I don't know about NVIDIA or Intel drivers, but ATI's drivers install just fine and require no rebooting on Vista. A system of installing drivers without rebooting already exists ;)


RE: Interesting...
By Mitch101 on 12/1/2008 4:56:27 PM , Rating: 1
I was avoiding naming any companies. Didn't want to start a fan war. God knows ATI had its issues with drivers in the early days and NVIDIA is probably responsible for how good ATI drivers are today.


RE: Interesting...
By inighthawki on 12/1/2008 5:00:05 PM , Rating: 2
Oh, I'm not trying to point fingers or favor anyone; I was just pointing out that ATI's drivers CAN install and update without rebooting. I've never used NVIDIA or Intel video drivers on Vista, so I can't vouch for them.


RE: Interesting...
By FITCamaro on 12/2/2008 8:38:56 AM , Rating: 2
I have to reboot after updating the driver with my Vista desktop running 8800GTS 512s in SLI.


RE: Interesting...
By omnicronx on 12/2/2008 11:17:21 AM , Rating: 2
SLI is driver-based; the chance that anything more than the base WDDM driver gets installed without a reboot is very small, thus the need for a reboot.


RE: Interesting...
By bobobeastie on 12/1/2008 8:30:03 PM , Rating: 2
Where are you getting these magical driver updates? You can choose not to restart, but you aren't using the new drivers at that point.


RE: Interesting...
By OCedHrt on 12/1/2008 8:41:55 PM , Rating: 2
ATI drivers have been able to install w/o reboot for the past few releases. This does not include CCC. You can exit CCC first, then restart it, and you will not need a restart at all. The installer won't even ask for one.


RE: Interesting...
By omnicronx on 12/2/08, Rating: 0
RE: Interesting...
By Spivonious on 12/2/2008 12:46:26 PM , Rating: 2
You may not think it's possible, but it is. The AMD driver installer doesn't even ask for a reboot. You can notice the system switch over to the generic VGA driver, and then switch back to the new driver. The only reason for a reboot would be if a file was in use when it tried to update it.


RE: Interesting...
By inighthawki on 12/1/2008 9:32:23 PM , Rating: 2
Where? Go to amd.com and download the driver ;). Every version I know of since Vista's release has never required a reboot. You install, it simply says "Success, click Done to finish installation", and voilà!


RE: Interesting...
By JonnyBlaze on 12/1/2008 10:11:31 PM , Rating: 4
after your screen flashes a few times you're using the new driver


RE: Interesting...
By Sulphademus on 12/2/2008 8:48:58 AM , Rating: 2
Ditto. Updated to ATI 8.11 this weekend; the system didn't even ask for a reboot.

I forget the exact number, but at the Vista launch event I went to a while ago, MSFT said 80%+ of XP crashes (based on error-reporting data) were caused by video drivers. One of the things they changed was isolating the video driver so that Vista can restart it without bringing down the whole system.


RE: Interesting...
By omnicronx on 12/2/2008 10:31:44 AM , Rating: 1
quote:
One of the things they changed was isolating the video driver so that Vista can restart it without bringing down the whole system.
They moved much of the video driver into user mode for security and stability reasons; it has nothing to do with not having to restart. You didn't have to restart with XP either - it loads the base driver and completes all the other settings after a reboot. While Vista does have more flexibility, it still requires a reboot for changes to anything beyond the base WDDM driver.


RE: Interesting...
By segerstein on 12/1/2008 4:59:48 PM , Rating: 2
Larrabee is an interesting technology. A new way of interfacing with it is certainly going to be needed.

DirectX is used to create a standardized interface on top of proprietary architectures that change radically between generations. On the other hand, x86/x64 are well-known architectures that are directly targeted by software developers (using compilers).

A JVM-like virtual machine (or MS CIL/CLR) is going to be needed to compile and optimize on the fly for the specific Larrabee architecture.


RE: Interesting...
By Sc4freak on 12/1/2008 5:28:41 PM , Rating: 2
That would be redundant. That's pretty much what DirectX does already - DirectX does not interface with the graphics card directly. Instead, you send commands through DirectX (e.g. using shaders) and it's the driver's job to compile and optimise the commands for the target architecture.

This allows for maximum performance, because drivers for different graphics cards will obviously optimise for the specific architecture they were designed for. NVIDIA, for example, would write their drivers to compile and optimise shaders to best make use of their hardware architecture, and likewise Intel will have to do the same for Larrabee.

But I don't know how well they'll do, considering their current drivers...
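To make that two-stage split concrete, here is a hedged D3D10 sketch (illustrative only, not code from the article or SDK): the runtime compiles HLSL into hardware-neutral bytecode, and whichever driver is installed - NVIDIA, AMD, Intel/Larrabee, or WARP - turns that bytecode into its own instruction set when the shader object is created.

```cpp
// Hedged sketch of the DirectX shader pipeline described above.
#include <d3d10.h>
#include <d3d10shader.h>

ID3D10PixelShader* BuildPixelShader(ID3D10Device* device,
                                    const char* hlslSource, size_t sourceLen)
{
    ID3D10Blob* bytecode = nullptr;
    ID3D10Blob* errors   = nullptr;

    // Stage 1: the D3D runtime compiles HLSL to hardware-neutral bytecode.
    if (FAILED(D3D10CompileShader(hlslSource, sourceLen, "inline.hlsl",
                                  nullptr, nullptr, "main", "ps_4_0", 0,
                                  &bytecode, &errors)))
    {
        if (errors) errors->Release();
        return nullptr;
    }

    // Stage 2: the installed driver optimises that bytecode for its own
    // architecture inside CreatePixelShader().
    ID3D10PixelShader* shader = nullptr;
    device->CreatePixelShader(bytecode->GetBufferPointer(),
                              bytecode->GetBufferSize(), &shader);
    bytecode->Release();
    return shader;
}
```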


RE: Interesting...
By Solandri on 12/1/2008 3:00:09 PM , Rating: 2
This seems like a natural response to multicore CPUs. 3D graphics is an embarrassingly parallel problem, so GPUs are just highly parallelized processors. You can think of them as having thousands of cores (each core can do a very limited set of calculations relative to a CPU core), all solving a different piece of the 3D problem.

As CPUs transition to multiple cores, serial tasks will leave lots of cores sitting unused most of the time. So if you don't have a dedicated video card, it makes sense to use those cores to tackle the parallel task of rendering 3D video. It won't be as effective as a dedicated GPU, but that's why this is geared towards systems without a dedicated GPU.
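A toy sketch of that idea (purely illustrative - this is not how WARP is actually implemented) would hand each CPU core an independent band of framebuffer rows to shade:

```cpp
// Illustrative only: split an embarrassingly parallel shading pass evenly
// across all available CPU cores, one band of rows per worker thread.
#include <algorithm>
#include <cstdint>
#include <functional>
#include <thread>
#include <vector>

static void ShadeRows(std::vector<uint32_t>& fb, int width, int first, int last)
{
    for (int y = first; y < last; ++y)
        for (int x = 0; x < width; ++x)
            fb[y * width + x] = 0xFF000000u | uint32_t(x ^ y);  // dummy "shading"
}

void RenderFrame(std::vector<uint32_t>& fb, int width, int height)
{
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    int rowsPerCore = (height + int(cores) - 1) / int(cores);

    std::vector<std::thread> workers;
    for (unsigned c = 0; c < cores; ++c)
    {
        int first = int(c) * rowsPerCore;
        int last  = std::min(height, first + rowsPerCore);
        if (first < last)
            workers.emplace_back(ShadeRows, std::ref(fb), width, first, last);
    }
    for (auto& w : workers)
        w.join();  // each core rendered an independent band of rows
}
```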


RE: Interesting...
By mmntech on 12/1/2008 6:37:58 PM , Rating: 2
This is basically what IBM had envisioned the Cell becoming: a computer-on-a-chip platform with each of the eight SPEs handling a different function. NVIDIA has also done something similar with the 9400M chipset. This sort of evolution was inevitable, since CPUs have gotten significantly more powerful in recent years. For home users, anything above a dual core is going to waste anyway, so it makes sense to put it to use. While it doesn't replace a proper GPU, it has the advantage of making computers thinner and lighter than they already are. It could also offer some power savings with laptops and netbooks.


What's the benefit for the end user?
By epyon96 on 12/1/2008 3:16:41 PM , Rating: 3
Someone please point out the benefit of WARP when all it does is improve framerates from 5.17 fps to 5.69 fps. The incremental improvement is not noticeable to the user for all intents and purposes. And this is on a high-end 3.0GHz quad core.

What's the point? Unless the performance improves at least five-fold, to about 25-30 fps, there's no point to WARP. Currently, Intel's integrated graphics are crap, but they seem to run most of Vista's graphics sufficiently.




By Ryanman on 12/1/2008 3:20:00 PM , Rating: 3
Hopefully, when a video card fails or hangs, the CPU can step in and stop a full-on crash - enough for you to save your game and close out the process instead of restarting. This has the potential to reduce BSODs if they pull it off right.


By walk2k on 12/1/2008 4:23:18 PM , Rating: 1
Uh yeah, so a $1000 CPU can run graphics at 5 fps...

Versus a $60 video card (with, say, a $160 CPU) at 40 fps?

PASS.

I mean, yeah, Crysis is pretty demanding, but jesus that's horrible. Can it even get 30fps in ANY semi-current game?


RE: What's the benefit for the end user?
By omnicronx on 12/1/2008 5:23:09 PM , Rating: 2
This is not meant for gaming! Get this through your heads! It gives all machines the ability to make use of basic DX10 runtime features (such as, but not limited to, Aero). It does not take a rocket scientist to figure out how this will help the low-end sector: you will be able to have an Eee running Windows 7 with Aero, all on a tiny Atom core with no discrete graphics. Apple watch out! Windows is treading in your waters!


By walk2k on 12/1/2008 7:14:36 PM , Rating: 2
"Microsoft is targeting WARP at casual gamers, existing non-gaming applications, and advanced rendering games."

Yes, it is meant for gaming. Just LOUSY gaming! :)


RE: What's the benefit for the end user?
By michael2k on 12/1/2008 7:41:57 PM , Rating: 2
What this means, literally, is that Windows 7 can now run on a cellphone, assuming the UI can be tweaked to adjust to the much lower resolution screen.


RE: What's the benefit for the end user?
By psychobriggsy on 12/2/2008 8:13:08 AM , Rating: 2
Except that high-end cellphone graphics processors are already more powerful than this on-the-CPU thing (which is very neat, but irrelevant), and I don't want to imagine the battery life of a cellphone with a quad-core 3GHz CPU in it (never mind the first degree burns). I guess that with the lower screen resolution it might get 12fps though.

And in 10 years time, when cellphones will have quad-core 3GHz (equivalent) CPUs, mobile graphics will have progressed a hundred-fold as well.


By omnicronx on 12/2/2008 9:14:56 AM , Rating: 2
quote:
Except that high-end cellphone graphics processors are already more powerful than this on-the-CPU thing
I think you are confused; most smartphones already have a one-chip solution.

For example, the HTC Touch has the Qualcomm MSM 7200, which is an all-in-one solution.

That being said, who knows what will be more efficient. Chances are you are correct, though; we are now just starting to see smartphones take advantage of their on-chip 3D solutions, which would probably be more power-friendly than using general processing to do the grunt work.


By michael2k on 12/2/2008 12:47:20 PM , Rating: 2
It's not about power, it's about DX10 compatibility. Are you saying all high end cell phone GPUs are DX10 compliant?

With WARP10 they don't have to be. They will do their thing, and whatever feature the OS needs that isn't present can now be filled in by the CPU.


By spread on 12/1/2008 6:41:40 PM , Rating: 2
Good point, but this is not meant to replace a dedicated graphics card. It's for developers, and it looks like it's setting a foundation for CPU+GPU fusion.


By PrinceGaz on 12/1/2008 7:32:31 PM , Rating: 2
The point is that any old graphics will be able to run Vista's (or rather Windows 7's) graphics sufficiently, regardless of the video-adapter in the system. You won't even need to rely on Intel integrated graphics supporting the full Aeroglass features, a decent CPU will be able to do the job instead.

A non-techie friend of mine who doesn't play games recently bought a desktop PC for home-use. The processor is a Core 2 Duo at 2.6GHz - not bad at all, but the graphics are provided by the Intel GMA 3100 (not to be confused with the somewhat better but still pretty dreadful GMA X3100).

How fast is the GMA 3100? As much as 3DMark is disliked as a reliable benchmark, it can at least give a ballpark figure most people can identify with, so here goes.
3DMark2001- 6773 - an old GeForce 3 would easily beat that with his 2.6GHz Core 2 Duo (if one could be fitted)
3DMark03- 1804 (CPU 924)
3DMark05- 750 (CPU 6658)
3DMark06- 284 (SM2.0 130, SM3.0 not supported, CPU 2060)

The fact is that as CPUs become ever more powerful and multi-cored, yet stay very cheap, manufacturers will put the minimum-spec graphics into systems they can get away with, because the CPU is what salesmen use as the measure of "how fast it is" for non-geeks. It's hard for them to push faster graphics to non-gamers who don't think they'll need it, but saying it has a faster, bigger, whatever processor is a much easier sell. Using that processor to do the graphics makes sense (in a twisted sort of way), as in many systems it will actually be faster at it than whatever graphics adapter the computer has (and will probably be the only reason the average non-gamer needs a fast CPU in the first place).

It is a totally twisted approach to computer design, but market demand could lead to CPU graphics dominating the low-end, with the graphics adapters in them being little more than part of the mobo chipset which outputs the frame-buffer to the monitor. Rather like PCs used to be a long long time ago, when 3D acceleration was unheard of, and only high-end cards could accelerate certain 2D operations. Such systems will be useless for playing new games on, but will be fine for the Windows desktop which is all that will matter to many customers.


The article answered the most important question
By amanojaku on 12/1/2008 3:25:01 PM , Rating: 5
quote:
Microsoft says that a Penryn quad-core CPU running Crysis at 800 x 600 with all quality settings at their lowest averaged 5.69 fps.

Yes, it CAN run Crysis. Badly.




By Motoman on 12/1/2008 3:56:48 PM , Rating: 3
I don't see the point here. So what, in 15 years there will be a CPU that with WARP can run Crysis at 30FPS? WTF?

Is there anyone in the world who thinks this is valuable at all in anything remotely resembling the near-term? Especially when the cost of a dedicated GPU (or decent IGP chipset) is darn-near negligible for a system builder?

I call shenanigans. This is just Microsoft pandering to Intel's ego in the same way they did with the Vista labeling. Would it kill Intel to just put out some video processing capabilities that don't suck abysmally?


By inighthawki on 12/1/2008 4:20:53 PM , Rating: 3
You're missing the point. This will allow things like laptops to do without a video card by just letting the CPU run DirectX 10 applications, plus it's just an introduction. 5fps isn't much now, but in 10 years, when software is optimized and CPUs are several times better with many more cores, we might be talking an additional 20 fps in some of your favorite games on higher-end graphics.


By foolsgambit11 on 12/1/2008 6:09:09 PM , Rating: 2
Doubtful. The problem is that game demands seem to ramp up as fast as the growth in performance of graphics cards will allow. And while it makes sense to add more and more 'processors' to GPUs, the costs of adding fully-x86-and-extensions-capable cores to a CPU are way higher than adding SPs to a GPU, for example. Especially when, much of the time, those CPU cores would be going under-used. Additionally, you see diminishing returns, I'm sure. Going to an 8-core CPU won't get you 10fps. Future games will demand even more - additionally reducing any gains from adding CPU cores. Not to mention, when gaming, there are plenty of tasks that already can be offloaded to the CPU. Leave the specialized tasks to the specialized hardware.

This doesn't make sense as a performance enhancer. It makes sense for non-gamers, who in the future won't need even integrated graphics to run the Aero interface or its successor. It makes sense as a 'spare tire' to get you home when you blow a tire (GPU crash, that is). That's what this is about.


By JKflipflop98 on 12/2/2008 7:48:17 AM , Rating: 2
Geez you guys are thick. It's pretty friggin obvious what this is for, but all anyone can look at is 5 FPS in stupid Crysis.

Stop looking at my finger and look at what it's pointing to.


By psychobriggsy on 12/2/2008 8:19:19 AM , Rating: 2
So the laptops will run a 25W CPU at full blast rendering, instead of slipping the CPU into a 5-10W lower power state and letting the 5W integrated graphics take over?

Or will they use a $1000 CPU + no graphics over a $200 CPU and $30 graphics? The 8400M can probably get 30fps at 800x600 in Crysis (not bothering to check) anyway.

Oh wait, your last comment looks like sarcasm.


By Mitch101 on 12/1/2008 4:38:26 PM , Rating: 2
I didn't get it at first either, but I believe it all comes down to stability. If the video card drivers crash, like at the launch of Vista, everyone blames Microsoft and Vista for the crash, not the video card company for the shoddy early-release drivers. So you're in the middle of a game and blam, instant reboot. You're quickly on the boards telling everyone how Vista bites, meanwhile the issue is the video drivers. With the GUI using so much more 3D, this is almost necessary. If Microsoft does this, then when the video drivers go south the OS stays up. Sure, it comes to a crawl, but not losing your spreadsheet because of bad video drivers is good news.

I think this is great news, but I wouldn't expect any miracle framerates from it.

It's like when the physics cards came out: everyone expected miracle framerates, when how is having more stuff drawn on the screen going to produce higher frame rates?


By HinderedHindsight on 12/2/2008 9:42:01 AM , Rating: 2
quote:
If the video card drivers crash, like at the launch of Vista, everyone blames Microsoft and Vista for the crash, not the video card company for the shoddy early-release drivers.


This is making the huge assumption they are even implementing such a feature. So far, we just have an announcement for software based DX10. This is also assuming that Windows 7 is being built in a way that it could tolerate such a failure without a complete crash.

This is also assuming that it will be implemented in all versions of Windows 7- how much do you want to bet there will be a "Basic" version which, for all intents and purposes, has the exact same feature set as Vista Premium?

So far we have a fairly useless feature. If this is the only big new feature I have to look forward to, I'd rather stick with Vista and purchase a video card upgrade, because you know even an upgrade to Windows 7 will be expensive.


Useless.
By William Gaatjes on 12/1/2008 3:16:16 PM , Rating: 2
A dedicated core will always be better than a general CPU core at these kinds of tasks. A highly optimized core like a GPU, with its own memory, is much more efficient.

quote:
With WARP, Windows will be able to continue rendering graphics if the video card fails, and it can also kick in when the video card runs out of memory or hangs. WARP supports all Direct3D 10 and 10.1 features along with the full precision requirements of both specifications, and Direct3D 11 will be supported as well. Optional texture formats, such as multi-sample render targets and sampling from float surfaces, are supported, as are anti-aliasing up to 8x MSAA and anisotropic filtering.


I do understand why they want this: Aero needs Direct3D, and Windows 7 is just an improved version of Vista. As I see it, the implementation of programming a GPU like a CPU, with memory management, is not finished. Maybe this is a temporary measure?




RE: Useless.
By omnicronx on 12/1/2008 4:28:48 PM , Rating: 2
I think you kind of miss the point here, this is not aimed at high end gamers, it is being aimed at those with onboard graphics. It sounds pretty cool that if the onboard graphics hangs or runs out of memory, WARP kicks in and does the work the onboard card cannot handle. This would be revolutionary and would give OEMs the option of not having a dedicated video card at all while still having a more than capable system. This could vastly cut the costs of entry-level and business systems.


RE: Useless.
By walk2k on 12/1/2008 7:17:00 PM , Rating: 2
The problem with this theory is that laptops and netbooks and such aren't stuffed with 3.0Ghz Xtreme quadcore processors either.

Again, why would you pay $1000 for a cpu (or even $500) to do what a $100 cpu and a $50 integrated graphics chip can?


RE: Useless.
By michael2k on 12/1/2008 7:50:30 PM , Rating: 2
Um, this happened already. It was called Vista Capable/Ready; if Microsoft had this solution available three years ago, there would be no distinction, Aero would run on all systems, and driver crashes would not have plagued Vista's debut.

Laptops and netbooks are stuffed with inadequate GPUs (last I heard Intel had 70% marketshare), so having a CPU code fallback makes perfect sense.

The question is, who makes laptops and netbooks with GPUs? Most don't, if Intel has 70% marketshare!


RE: Useless.
By omnicronx on 12/2/2008 9:18:36 AM , Rating: 2
quote:
Um, this happened already. It was called Vista Capable/Ready; if Microsoft had this solution available three years ago, there would be no distinction, Aero would run on all systems, and driver crashes would not have plagued Vista's debut.
Bingo, Microsoft is learning from its mistakes with Vista. As for the poster above, there was never a claim that you need a quad-core 3GHz processor to use this feature; in fact, it is probably quite the opposite. Remember, this feature is designed specifically for the low end; it would be pretty much pointless if it only worked on the most high-end laptops and computers. From what I have seen, WARP will be able to work with even the most feeble setups (including the Atom, which is leaps and bounds behind current processors in terms of raw power).


RE: Useless.
By William Gaatjes on 12/3/2008 2:32:15 PM , Rating: 2
It will work, but will it be useful? Will it be more productive? I doubt it. Possible scenario: my CPU is busy with various tasks and then also has to emulate a GPU. Right.


RE: Useless.
By William Gaatjes on 12/3/2008 2:30:00 PM , Rating: 2
quote:
I think you kind of miss the point here, this is not aimed at high end gamers, it is being aimed at those with onboard graphics.


A high-power CPU but not a high-power GPU - that is a bad design decision unless you are running a server...
And a server running a 3D user environment like Aero? I don't think so.

I think WARP is more or less a stepping stone toward what DirectX 10.0 promised but never delivered. If I am right, DirectX 11 will fully support these features. This was also mentioned as a reason why DirectX 10 was not possible under Windows XP: GPU memory management like the CPU has, and GPU interruptibility, so that the GPU is under full control of the OS. I feel WARP is a step between current Windows DirectX features and maybe DirectX 11.


RE: Useless.
By Screwballl on 12/1/2008 4:38:33 PM , Rating: 2
You can crap on the lawn and paint it green, it is still crap (aka Vista).


Intel vs AMD
By JonnyDough on 12/1/2008 3:11:59 PM , Rating: 2
This could hit AMD hard if they lead in GPU sales and are being overpowered by the big boys in blue. I'm talking about Intel, not the cops. Whoever leads in CPU strength, particularly with reference to WARP, could stand to gain market share - or lose it.




RE: Intel vs AMD
By drebo on 12/1/2008 3:53:31 PM , Rating: 3
This isn't going to affect discrete GPU sales one bit.

You still don't get any hardware decode acceleration, dual-monitor support, or any of the other benefits of running a discrete GPU.

This is more of a novelty item or a way for MS to provide the Aero interface to everyone, regardless of GPU.


RE: Intel vs AMD
By JonnyDough on 12/2/2008 7:28:41 PM , Rating: 2
I'm not sure why you think I was referring to DISCRETE GPUs. I was talking about integrated, which is still the largest market sector of GPU sales. I think you're failing to see the future here.

Imagine an 8- or 16-core CPU (32 cores?) running a PC and its graphics on the desktop. With that kind of leftover power I'm quite certain we'll eventually see quad-monitor support. Do you honestly think that PCs are going to continue to be that limited in power? The question is "what do you want to do?", and the answer is that operating systems and programs for the average user at home are never going to match the power needed for things like modeling and the massive renders you'd find in a place of work. So my point is that eventually we'll reach a place where we have more power than we really need for most tasks, and I would say that many homes in America already have unused CPU power lying around. That's why we have FaH (folding).

This is more of a way to sell Windows 7, not necessarily "the Aero interface". That is a plain crazy notion.

You will likely get ALL the benefits of today's discrete GPUs, just not yet. A CPU is designed to do many things, but what if you took a few cores and designed them just for graphics? What then? GPU and CPU on one die. That is what MS is being told is about to happen, and that is why they are taking this next step with their OS to begin to integrate that. Wake up.


RE: Intel vs AMD
By Headfoot on 12/5/2008 9:41:57 PM , Rating: 2
AMD's integrated graphics are actually very robust compared to Intel's integrated graphics. The 780G and 790GX chipsets can actually run games at reasonable frames per second.


RE: Intel vs AMD
By bhieb on 12/1/2008 4:26:36 PM , Rating: 2
I disagree; look at the numbers. The performance is a joke even @ 3GHz. No one should or would "game" on this. It might be nice to help accelerate some flashy OS options (akin to Aero), but to say that it is in any way a competitor to a GPU is way overstating the market for something like this. It may be a competitor to integrated graphics, but even there the marginal gain compared to a crappy Intel IGP is a joke. After all, 5.17 fps vs 5.69 fps is still unplayable. A 20000% increase of 0 is still 0; the net result is you still won't game with it and it will not impact discrete or even "good" IGP offerings.


RE: Intel vs AMD
By JonnyDough on 12/2/2008 7:39:32 PM , Rating: 2
I wouldn't say it's a joke; after all that IS Crysis, this IS a quad core, and this IS Crysis. Did I mention it's Crysis? If you can pull even 5fps on a CPU that is ALSO running the game, then that's nothing at all to scoff at. It's pretty damn amazing. Give the tech a bit of time, and let CPUs be designed a bit more for graphics, and I think it'll be pretty awesome. Take an older game like StarCraft and run it and I'll bet you'd do just fine. My question is: with future tech, how little energy will my PC use for basic games?

Overstating the market? The market IS integrated GPUs. Most people barely do more than watch a video on YouTube. We won't even mention sales to businesses, which are huge.


Someone help me out here
By masher2 (blog) on 12/1/2008 5:04:57 PM , Rating: 2
quote:
With WARP, Windows will be able to continue rendering graphics if the video card fails, and it can also kick in when the video card runs out of memory or hangs.
Let's assume your video card fails. WARP kicks in, begins rendering.

Now...how does that rendered data get displayed on a monitor that's not connected to anything but your failed video card?




RE: Someone help me out here
By AssBall on 12/1/2008 5:18:34 PM , Rating: 2
I think it is more for the driver failing, not the hardware. It would be reverting to different drivers to display through the card.

Kinda like when you display something like a BIOS, you use the hardware but not the maximum 3d acceleration the hardware is capable of with drivers.


RE: Someone help me out here
By omnicronx on 12/2/2008 10:08:34 AM , Rating: 2
Masher is correct; the chances that Windows can change display drivers on the fly are very small. That being said, if a device fails due to driver issues or card failure, it may still be rendering, but it won't be outputting to your monitor.


RE: Someone help me out here
By AssBall on 12/2/2008 12:27:17 PM , Rating: 2
quote:
the chances that Windows can change display drivers on the fly are very small.


What? Isn't that what Windows does when it boots up anyway? I don't understand how WARP works, but I don't see any reason why Windows couldn't swap display drivers or why the result would be due to "chances".


RE: Someone help me out here
By emboss on 12/2/2008 10:12:48 PM , Rating: 2
The problem is that up until the main OS driver takes control, the card is operating in VGA emulation mode (which is set up by the card BIOS early in the POST sequence). Once the main driver starts, the card is operating in accelerated mode, and the emulation interface is effectively disabled. The only way to get back to VGA emulation mode is through a reset of the card or by having the driver switch the card back again.

Unless you have a motherboard that supports PCI-e hot-plug, there's no way for the OS to reset the card short of a complete reset of the computer. Additionally, with the main driver out of action, there's no way to get the driver to switch the card back to VGA mode either. So although WARP would be able to render whatever, there's no way to get the result to the card, and therefore no way to get the result onto your screen.


Not Windows 7
By Ilfirin on 12/2/2008 9:57:50 AM , Rating: 2
Small correction here.

WARP is part of the DirectX 11 SDK, which may very well come out BEFORE Windows 7 and will, regardless, be available on Vista. Thus WARP is a feature of DirectX, not Windows 7.




RE: Not Windows 7
By omnicronx on 12/2/2008 11:02:38 AM , Rating: 2
Are you sure? It's being used with Direct3D 10, which is part of DX10, not 11. In fact, Microsoft refers to it as WARP10 as it is closely tied to Direct3D 10.

This is from the download section of the latest DX SDK
quote:
Windows Advanced Rasterizer Platform (WARP)

Available in this SDK through Direct3D 11 and eventually also through Direct3D 10.1, WARP is a fast, multi-core scaling rasterizer that is fully Direct3D 10.1 compliant. Utilizing this technology is as simple as passing the D3D10_DRIVER_TYPE_WARP flag in your device creation.


Don't be surprised to see MS only release this for 7, kind of like how they left DX10 out of XP even though it has been proved that you can install it. It kind of sounds like, although it is a DX component, it may be specifically used in 7 and not Vista. Not one doc mentions Vista compatibility. I just can't find a single piece of info on the subject.

I guess we will know more on the subject when the DX11 beta is released (not just the current preview).


RE: Not Windows 7
By Ilfirin on 12/2/2008 11:12:59 AM , Rating: 2
Positive. Although it is technically still part of DX10, its official release timeframe is at least loosely tied to that of DX11, which may or may not be released earlier than, at the same time as, or after Windows 7, but will, as promised, be available on Vista.


RE: Not Windows 7
By omnicronx on 12/2/2008 11:21:30 AM , Rating: 2
quote:
as promised, be available on Vista.
Good to hear! I am glad they did not do what they did with Vista. Critics would have been quick to point out that no video API changes have been made in 7, so they would not have the same excuse they had with XP (they said WDDM drivers were required for DX10, although it was later proved this is not true).

Looks like DX11 is slated for release after Windows 7's release, though (it definitely won't make the beta, which is due in the next few weeks).


RE: Not Windows 7
By Ilfirin on 12/2/2008 11:45:34 AM , Rating: 2
Well, you'll notice that DX11 is in "beta" (err.. Technical Preview) right now - http://msdn.microsoft.com/en-us/directx/default.as...

Given how easy it is for OS release dates to slip and the demand of the graphics industry, I wouldn't completely discount the possibility of DX11 being officially out before Windows 7.

But then I've always been sceptical of Microsoft's expected launch date for Windows 7.


SLI/Crossfire
By Kary on 12/1/2008 3:28:06 PM , Rating: 2
Would it work in an SLI/Crossfire-style configuration... maybe paired with an Intel onboard graphics chip to get you 10 fps?

No more blue screens... I'll believe it when I don't see them (rebooting doesn't count, MS).

And lastly, by the time Windows 7 releases, shouldn't 8-core CPUs be out?




RE: SLI/Crossfire
By omnicronx on 12/1/2008 4:24:21 PM , Rating: 2
What would be the point? Windows can already use more than one CPU with multiple cores natively, which is the entire point of SLI/Crossfire implementations. As Windows already has access to all cores without needing software drivers to do the work, it could potentially yield some pretty good results on a multi-core system.


RE: SLI/Crossfire
By Kary on 12/1/2008 5:23:13 PM , Rating: 2
quote:
What would be the point? Windows can already use more than one CPU with multiple cores natively, which is the entire point of SLI/Crossfire implementations.


Sorry, meant with an Intel integrated GPU, for instance

Not multiple instances of WARP (because despite my high caffeine levels even I wouldn't suggest that :)


Backporting to XP
By Kary on 12/1/2008 3:56:41 PM , Rating: 2
I wonder if they could backport this to XP so that you could run older games that aren't compatible with Vista/Windows 7 through an emulator with 3d acceleration




RE: Backporting to XP
By TheSpaniard on 12/1/2008 4:50:15 PM , Rating: 2
I would rather they develop a better XP emulation into Win7...


RE: Backporting to XP
By omnicronx on 12/2/2008 10:14:35 AM , Rating: 2
Most programs should work in Vista/7 unless they are DOS-based (in which case you would not have much luck in XP anyway) or they happen to make API calls that are no longer available in Vista/7. If that is the case, no XP emulation is going to help you. So if you are expecting some kind of DirectShow emulation or similar components to get certain things working under 7, don't hold your breath.


Relevant to my interests
By piroroadkill on 12/1/2008 6:12:32 PM , Rating: 1
If this allows Aero to run software-accelerated on some screens and hardware-accelerated on others, I'm for it. My clusterfuck of a setup has old cards and a new one to support multiple monitors, so Windows XP is the only current choice.




RE: Relevant to my interests
By Makaveli on 12/1/2008 7:25:39 PM , Rating: 1
All you guys are crying about how this is shit and why bother if it's only gonna give you 5fps. It clearly states in the article that this is not meant to replace GPUs, and a few posters have also said the same thing. If you don't know how to read, I heard Hooked on Phonics is pretty good.

God, I think even Britney Spears would have understood this article.

Wankers!


RE: Relevant to my interests
By Headfoot on 12/5/2008 9:46:48 PM , Rating: 2
You're insulting this guy for reading comprehension? Seriously? ARE YOU KIDDING?
You fail at reading comprehension far more than the OP.

He wants this WARP to be able to run EXTRA MONITORS in addition to the ones that he can currently run on his GPU. This means if you have a GPU with 2 DVI slots and your motherboard has 1 DVI slot that WARP can use, then he could theoretically use (Do the math with me now, 2 + 1 is...) 3 monitors instead of 2 as it is currently.


One interesting idea
By FITCamaro on 12/2/2008 8:44:23 AM , Rating: 2
I wonder if this will pave the way for Microsoft to have a standard physics API as part of DirectX.




RE: One interesting idea
By Headfoot on 12/5/2008 9:49:30 PM , Rating: 2
DirectX 11 already has this.

They integrated a Compute Shader (essentially GPGPU for all) feature that can theoretically allow for a standard physics API. All they have to do is actually implement it.
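A hedged sketch of what that could look like from the application side (the helper below and its buffer setup are assumptions for illustration, not part of any announced physics API): compile an HLSL cs_5_0 kernel elsewhere, bind a read/write particle buffer, and dispatch it through the Direct3D 11 compute path.

```cpp
// Hedged illustration of DirectCompute-style physics: assumes csBytecode is a
// compiled cs_5_0 kernel declared with [numthreads(64,1,1)], and particleUAV
// is an unordered-access view of a particle buffer created elsewhere.
#include <d3d11.h>

void DispatchPhysicsStep(ID3D11Device* device, ID3D11DeviceContext* context,
                         const void* csBytecode, size_t csSize,
                         ID3D11UnorderedAccessView* particleUAV,
                         unsigned particleCount)
{
    ID3D11ComputeShader* cs = nullptr;
    if (FAILED(device->CreateComputeShader(csBytecode, csSize, nullptr, &cs)))
        return;

    // Bind the kernel and the read/write particle buffer, then launch one
    // thread group per 64 particles to match the kernel's thread-group size.
    context->CSSetShader(cs, nullptr, 0);
    context->CSSetUnorderedAccessViews(0, 1, &particleUAV, nullptr);
    context->Dispatch((particleCount + 63) / 64, 1, 1);

    cs->Release();
}
```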


reference rasterizer?
By Visual on 12/2/2008 3:43:46 AM , Rating: 2
How is this thing different from the age-old concept of a software rasterizer, like the reference rasterizer that shipped with the DirectX SDK?
Is there no such rasterizer any more? I remember there was in the days of DX7 and 8, but I haven't bothered with DX development for a while and have no clue if it's the same in the latest versions.

I understand there are improvements in terms of multi-core utilization and performance, but is there any other difference here?




By ET on 12/2/2008 5:19:28 AM , Rating: 2
First of all, Microsoft advertising WARP as a Windows 7 feature is a strange marketing twist (I mean, I don't understand it). It's part of Direct3D 11, which will run on Vista just fine, and the beta is already available on Vista.

The reference rasteriser continues to exist, and will not be replaced. Refrast is part of the SDK, and wasn't available to users. This new rasteriser is a user technology, and though it might help somewhat with development (when refrast is too slow), it's more intended to provide a fallback in situations when D3D10 hardware isn't available.

Bad video drivers were indeed a problem at Vista's beginning (especially NVIDIA ones), and caused a lot of blue screens. However, WARP isn't currently intended to solve that. It certainly won't switch on the fly, since Vista (and presumably Windows 7) will just try to restart the driver. If the driver causes a blue screen, having WARP won't help.

The current speed will probably improve (at least that's what Microsoft intends), but IMO it will never be fast enough for high-end games. Still, people writing casual titles will be able to switch to D3D10 (as long as they're willing to target Vista and up only, which I think is what Microsoft hopes) instead of staying with the outdated Direct3D 9 API.
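To make the fallback idea concrete, here is a hedged sketch (assuming the Direct3D 11 path that ships WARP; not production code): try a hardware driver first and only create a WARP device when no capable GPU or driver is present.

```cpp
// Hedged sketch: prefer real Direct3D hardware, fall back to the WARP
// software rasterizer only if hardware device creation fails.
#include <d3d11.h>

bool CreateDevicePreferHardware(ID3D11Device** outDevice,
                                ID3D11DeviceContext** outContext)
{
    const D3D_DRIVER_TYPE driverTypes[] = {
        D3D_DRIVER_TYPE_HARDWARE,   // real GPU + vendor driver
        D3D_DRIVER_TYPE_WARP        // CPU rasterizer fallback
    };

    for (D3D_DRIVER_TYPE type : driverTypes)
    {
        if (SUCCEEDED(D3D11CreateDevice(nullptr, type, nullptr, 0,
                                        nullptr, 0,            // default feature levels
                                        D3D11_SDK_VERSION,
                                        outDevice, nullptr, outContext)))
            return true;  // first driver type that works wins
    }
    return false;
}
```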




WARP10
By Rhonda the Sly on 12/2/2008 7:24:56 AM , Rating: 2
If WARP10 allows rendering of the entire UI in Classic, Basic, or Aero for "general use" applications on the CPU alone with decent performance (framerate on desktop as well as resource usage), I imagine this will be a pretty big hit with businesses and really low-end stuff. The Crysis benchmark was likely a simple show to catch attention. Gaming on this, aside from StarCraft, will never happen but businesses and low-end laptops/netbooks will love this. Depending on how performant/efficient it is there could be actual battery life increases and some heat reduction (no more roasted chestnuts).

If anyone is interested, Microsoft hosts the PDC session "Unlocking the GPU with Direct3D" on Channel 9; WARP10 and D3D11 are talked about there. Allison Klein, the session speaker, starts on WARP10 around 14 minutes in.

http://channel9.msdn.com/pdc2008/PC05/




Not necessary
By tallcool1 on 12/2/2008 7:39:53 AM , Rating: 2
All this comes about because Intel cannot build a decent entry-level integrated GPU, which caused a lot of issues with "Vista Compatible/Capable" stickered PCs, the lawsuit, etc...




The Picture of the Enterprise
By blazeoptimus on 12/2/2008 9:38:54 AM , Rating: 2
Hey, I know it's an artistic choice, but didn't anyone else notice that the ship is upside down? - Yes, I know technically it's not the Enterprise. If the pic is from the episode I'm thinking of, it's most likely the Columbia NX-02.




Crysis Benchmarks
By omnicronx on 12/2/2008 11:15:26 AM , Rating: 2
For all of you who want the answer to the most-asked question on DT, here you go ;)

Crysis Benchmarks @ 800x600 lowest settings

Core i7 8 core 3.0GHz-- Avg FPS: 7.36 Min FPS: 3.46 Max FPS: 15.01

AMD FX74 4 Core @ 3.0GHz-- Avg FPS: 3.43 Min FPS: 1.41 Max FPS: 5.78


http://msdn.microsoft.com/en-us/library/dd285359.a...

You won't be replacing your discrete card anytime soon ;)




Aero with WARP ?
By Tractor on 12/2/2008 10:05:53 PM , Rating: 2
Has anyone already successfully installed WARP on a Windows Vista system, and made Aero work through it ?




WARP 10 today
By DemolitionInc on 12/4/2008 10:32:35 AM , Rating: 2
I think MS is only going to fool us. Warp 10 will only be reached in the 24th century. Tom Paris will pilot a shuttle from the Voyager at this speed and experience being at every point of the universe at one point in time.

http://en.wikipedia.org/wiki/Warp_drive

:-)




OS/2 3.0 is still better than Windows Vista 2.0
By UNCjigga on 12/1/08, Rating: -1
By MrDiSante on 12/1/2008 3:11:06 PM , Rating: 3
And you are a troll. There, I said it...


By UNCjigga on 12/2/2008 11:59:08 AM , Rating: 2
Aww c'mon, no love for the OS/2 Warp reference?


By Belard on 12/1/2008 3:40:29 PM , Rating: 2
No No NO!

You got it wrong.

AmigaOS 3.0 is still better than Vista.

Vista 2.0 isn't out yet, it maybe better.


By omnicronx on 12/1/2008 4:46:47 PM , Rating: 1
7 is just as much Vista 2.0 as XP was Windows 2000 2.0. So what exactly is your point here? Microsoft has had a tick-tock system on every single Windows release:
-Windows 3.1/3.11 (which was considered a major upgrade)
-Win95/98 (once again major upgrades between releases)
-Win2k/XP (same base OS, yet major distinct changes)

The media has really spun this out of control. Anyone who knows anything about OS development knows that it goes API and behind-the-scenes changes (Vista/tick), and then visual changes and feature sets to take advantage of said changes (7/tock).

P.S. Don't count ME; it was a patchwork release because XP was off schedule and MS didn't want people thinking that Win2k was the replacement for 98.


By inighthawki on 12/1/2008 5:02:36 PM , Rating: 2
"p.s don't count ME..."

lol, i think most people have forced themselves to believe that ME never existed, with how terrible it was. I once had it crash on a fresh install by hitting the enter key :P


By Headfoot on 12/5/2008 9:51:52 PM , Rating: 2
Call me crazy but I liked ME. It had some cool features that didn't come out till later and my experience was relatively crash free.

It doesn't seem like anyone else had the same experience though...


"Vista runs on Atom ... It's just no one uses it". -- Intel CEO Paul Otellini

Related Articles
Intel Talks Details on Larrabee
August 4, 2008, 12:46 PM













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki