
The physics-intensive Cell Factor: Revolution demo will soon be "No PhysX Card Required"
NVIDIA's purchase of AGEIA leads to a PhysX-on-CUDA port

With NVIDIA's acquisition of AGEIA announced earlier this month, rumours immediately began to fly about the future of dedicated physics hardware -- and it now appears that the PhysX name will live on as a feature checkbox on some current and most future NVIDIA GPUs.

During NVIDIA's fourth-quarter financial results conference call, CEO Jen-Hsun Huang responded to several questions about the plans for technology obtained in the AGEIA purchase, revealing that the plan is to port the AGEIA PhysX engine to NVIDIA's CUDA (Compute Unified Device Architecture) C-like programming language.

"We're working toward the physics-engine-to-CUDA port as we speak. And we intend to throw a lot of resources at it," said Huang. "[PhysX on CUDA] is just going to be a software download. Every single GPU that is CUDA-enabled will be able to run the physics engine when it comes."

NVIDIA's choice to run a physics engine on a GPU runs in stark contrast to AMD's assertion in late 2007 that "GPU based physics is dead until DirectX 11." Every NVIDIA 8-series GPU is currently capable of running CUDA applications, and future GPUs will no doubt retain this feature.

The idea of using SLI for more than graphics has been raised by NVIDIA before, so it was no surprise to hear Huang endorse it again. "It might - and probably will - encourage people to buy a second GPU for their SLI slot. And for the highest-end gamer, it will encourage them to buy three GPUs." No mention was made of the upcoming "Hybrid SLI" technology showcased at CES 2008, but an onboard GPU supporting CUDA could theoretically act as a physics processor while discrete GPUs handle the rendering.

No timeframe for the release of the PhysX-on-CUDA software was specified, but with the PhysX engine available to a far larger installed base, it will no doubt encourage developers to build hardware-accelerated physics into upcoming titles.

Comments

RE: Old Video Cards
By Chris Peredun on 2/15/2008 11:49:36 AM , Rating: 5
Now the big question for me is whether I can plug in an "old" video card to do my physics while a "new" one does my graphics?

Unfortunately not - CUDA isn't supported on anything earlier than the 8-series right now, as it requires the more programmable unified shader units introduced with that generation.

Depending on the efficiency of the PhysX-on-CUDA engine though, you could pick up a low-end 8-series card just for physics and let your 7900GT continue to handle graphics.

RE: Old Video Cards
By mendocinosummit on 2/15/2008 12:16:39 PM , Rating: 2
I am surprised that they don't all have to be the same card or do they have to be?

RE: Old Video Cards
By Chris Peredun on 2/15/2008 12:20:47 PM , Rating: 2
I am surprised that they don't all have to be the same card or do they have to be?

While the current iteration of SLI requires identical cards - mainly for ease of balancing the workload (via SFR or AFR) - the upcoming Hybrid SLI "GeForce Boost" aims to remove that requirement.

Also, if the cards are not in an SLI configuration - or are in a two-card SLI with a third card sitting entirely outside the graphics loop - it would seem that only the "paired" cards would need to be identical; the third could run on its own schedule.

RE: Old Video Cards
By UppityMatt on 2/15/08, Rating: -1
RE: Old Video Cards
By Proteusza on 2/15/2008 2:47:01 PM , Rating: 2
My guess is that if you could, you would need to treat one of the cards as a 320MB card.

Why bother anyway, unless you got one of them for free.

RE: Old Video Cards
By Clauzii on 2/15/2008 10:37:46 PM , Rating: 2
The 640MB card will behave as if it only had 320MB. (Don't know if nVidia changed that with recent drivers?)

