
BFG AGEIA PhysX Accelerator
The physics battle heats up

FPS Labs is reporting that NVIDIA will soon announce its intention to acquire AGEIA. FPS Labs' Stu Grubbs has no financial terms for the deal, but notes that his confidential source expects the deal to become official sometime this week.

AMD batted around the idea of purchasing AGEIA in November 2007, but considering that the company is still recovering from its ATI acquisition, that idea was put to rest rather quickly. It should be interesting to see how AMD will respond to the news if the announcement comes this week -- especially considering that AMD has already declared GPU-based physics dead.

A successful acquisition of AGEIA would give NVIDIA the firepower to go up against Intel, which purchased physics software developer Havok in September 2007.

The last time that DailyTech covered AGEIA, its PhysX 100M mobile physics processor was sharing chassis space with dual GeForce 8700M GT graphics cards in Dell's $2,700 XPS M1730 World of Warcraft Edition notebook.

If NVIDIA has its way, NVIDIA GPUs and chipsets may have an even closer synergy with AGEIA hardware in the future.

Updated 2/4/2008
Just moments after this story went live, NVIDIA sent us the official PR announcing its acquisition of AGEIA (further details will come forth on Wednesday):
NVIDIA, the world leader in visual computing technologies and the inventor of the GPU, today announced that it has signed a definitive agreement to acquire AGEIA Technologies, Inc., the industry leader in gaming physics technology. AGEIA's PhysX software is widely adopted with more than 140 PhysX-based games shipping or in development on Sony Playstation3, Microsoft XBOX 360, Nintendo Wii and Gaming PCs. AGEIA physics software is pervasive with over 10,000 registered and active users of the PhysX SDK.

“The AGEIA team is world class, and is passionate about the same thing we are—creating the most amazing and captivating game experiences,” stated Jen-Hsun Huang, president and CEO of NVIDIA. “By combining the teams that created the world’s most pervasive GPU and physics engine brands, we can now bring GeForce®-accelerated PhysX to hundreds of millions of gamers around the world.”

“NVIDIA is the perfect fit for us. They have the world’s best parallel computing technology and are the thought leaders in GPUs and gaming. We are united by a common culture based on a passion for innovating and driving the consumer experience,” said Manju Hegde, co-founder and CEO of AGEIA.

Comments

RE: Hmm...
By roadrun777 on 2/4/2008 11:28:30 PM , Rating: 2
I think you're making a lot of assumptions here.
I suspect the reason behind these decisions has to do with the programming interface and the parallel-cores problem.

If you could virtualize several standard physics functions and tie them into the shader pipeline, you could create a programming kit that lets developers concentrate on the game experience rather than the heavy math of writing complex shader code. Effects that would otherwise take pages of shader code could then be achieved with a single line of code using a physics processor.
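The abstraction the comment describes can be sketched in a few lines; this is a minimal illustration of the idea, not the actual PhysX SDK, and the `PhysicsWorld` and `RigidBody` names here are hypothetical:

```python
# Minimal sketch of a physics-engine abstraction: the engine hides the
# integration math so the game programmer writes one call per frame.
# All class and method names are hypothetical, not the real PhysX API.

class RigidBody:
    def __init__(self, mass, position, velocity):
        self.mass = mass
        self.position = list(position)
        self.velocity = list(velocity)

class PhysicsWorld:
    GRAVITY = (0.0, -9.81, 0.0)  # m/s^2

    def __init__(self):
        self.bodies = []

    def add(self, body):
        self.bodies.append(body)

    def step(self, dt):
        # Semi-implicit Euler integration; in a hardware-accelerated
        # engine this work would be offloaded to the PPU or GPU.
        for b in self.bodies:
            for i in range(3):
                b.velocity[i] += self.GRAVITY[i] * dt
                b.position[i] += b.velocity[i] * dt

world = PhysicsWorld()
crate = RigidBody(mass=10.0, position=(0.0, 100.0, 0.0),
                  velocity=(0.0, 0.0, 0.0))
world.add(crate)

# The game loop stays a single line per frame:
for _ in range(60):
    world.step(1.0 / 60.0)
```

The point is the division of labor: the game code only calls `world.step(dt)`, while the engine (and, with a PPU, the hardware) owns the math underneath.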

RE: Hmm...
By roadrun777 on 2/4/2008 11:37:26 PM , Rating: 2
Oh, and as for it costing more to manufacture and driving up the price, I don't see that happening. The actual cost to produce these components is minuscule compared to the cost of doing business. Looking at manufacturing costs and materials alone, that $400 board costs around $25 to make (including labor). Only once you add in licensing, research and development, and the standard bureaucratic pork do you get these inflated costs to manufacture.
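Taking the commenter's figures at face value (the $25 and $400 numbers are their claim, not sourced data), the implied gross margin works out as:

```python
# Back-of-envelope margin math using the commenter's own (unsourced) figures.
retail_price = 400.0   # claimed street price of the board
bom_and_labor = 25.0   # claimed materials + labor cost

gross_margin = (retail_price - bom_and_labor) / retail_price
print(f"Implied gross margin: {gross_margin:.1%}")
```

A margin near 94% would be far above typical hardware gross margins, which is the gap the commenter attributes to licensing, R&D, and overhead rather than manufacturing.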

RE: Hmm...
By Polynikes on 2/5/2008 4:22:16 PM , Rating: 1
The actual cost to produce these components is minuscule compared to the cost of doing business.

So why do many Apple products carry such a price premium when comparable products are much cheaper?

It doesn't matter how much it costs Nvidia, it matters how much it costs me.

RE: Hmm...
By mindless1 on 2/6/2008 11:33:09 AM , Rating: 2
Then you'd be wrong. Increasing die size and power consumption is a primary differentiator between inexpensive and expensive video cards. All the additional R&D, including developing the software and drivers, carries inherent costs as well.

All these ideas about manufacturing and other business costs don't change the fact that one video card ends up costing multiple times what another does. "Driving up price" is a relative concept though, depending on who's deciding it's not worth their money.



Copyright 2016 DailyTech LLC.