
BFG AGEIA PhysX Accelerator
The physics battle heats up

FPS Labs is reporting that NVIDIA will soon announce its intention to acquire AGEIA. FPS Labs' Stu Grubbs has no financial terms for the deal, but notes that his confidential source expects the deal to become official sometime this week.

AMD batted around the idea of purchasing AGEIA in November 2007, but considering that the company is still recovering from its ATI acquisition, that idea was put to rest rather quickly. It should be interesting to see how AMD will respond to the news if the announcement comes this week -- especially considering that AMD has already declared GPU-based physics dead.

A successful acquisition of AGEIA would give NVIDIA the firepower to go up against Intel, which purchased physics software developer Havok in September.

The last time that DailyTech covered AGEIA, its PhysX 100M mobile physics processor was sharing chassis space with dual GeForce 8700M GT graphics cards in Dell's $2,700 XPS M1730 World of Warcraft Edition notebook.

If NVIDIA has its way, NVIDIA GPUs and chipsets may have an even closer synergy with AGEIA hardware in the future.

Updated 2/4/2008
Just moments after this story went live, NVIDIA sent us the official PR announcing its acquisition of AGEIA (further details will come forth on Wednesday):
NVIDIA, the world leader in visual computing technologies and the inventor of the GPU, today announced that it has signed a definitive agreement to acquire AGEIA Technologies, Inc., the industry leader in gaming physics technology. AGEIA's PhysX software is widely adopted with more than 140 PhysX-based games shipping or in development on Sony Playstation3, Microsoft XBOX 360, Nintendo Wii and Gaming PCs. AGEIA physics software is pervasive with over 10,000 registered and active users of the PhysX SDK.

“The AGEIA team is world class, and is passionate about the same thing we are—creating the most amazing and captivating game experiences,” stated Jen-Hsun Huang, president and CEO of NVIDIA. “By combining the teams that created the world’s most pervasive GPU and physics engine brands, we can now bring GeForce®-accelerated PhysX to hundreds of millions of gamers around the world.”

“NVIDIA is the perfect fit for us. They have the world’s best parallel computing technology and are the thought leaders in GPUs and gaming. We are united by a common culture based on a passion for innovating and driving the consumer experience,” said Manju Hegde, co-founder and CEO of AGEIA.

Comments

By retrospooty on 2/4/2008 5:39:26 PM , Rating: 5
"If NVIDIA has its way, NVIDIA GPUs and chipsets may have an even closer synergy with AGEIA hardware in the future."

If this means better physics are going to be integrated into the same card (and hopefully chip), then that's great. If not, forget it. Not many people will buy it as an extra card; that's the same issue Ageia faces today.

RE: Hmm...
By SirLucius on 2/4/2008 5:58:55 PM , Rating: 3
Agreed. If they just release separate physics cards under the Nvidia name, they'll run into the exact same problem Ageia is facing now. Nobody wants to buy a dedicated physics processor if no games support advanced physics, but no developers will jump on the physics bandwagon if nobody has the hardware to support it.

Hopefully Nvidia will figure out a way to integrate physics into their new GPUs without significantly driving up the costs. This way more people will have physics capable hardware without having to spend more money and take up another PCI slot in their cases. Once people have the hardware, I think we'll see more developers implement physics into games.

RE: Hmm...
By TomZ on 2/4/08, Rating: 0
RE: Hmm...
By DOSGuy on 2/5/2008 1:46:22 AM , Rating: 4
A dedicated processor will destroy a general-purpose processor in any test. CPUs can go quad-core, and graphics cards can go SLI/CrossFire, but they will always lose out to a dedicated processor. No matter how many cores CPUs end up having, someone will devote an entire die to a single task, whether it be hardware decryption for set-top boxes or floating-point math co-processors from ClearSpeed. Dedicated processors also achieve that performance at a lower cost, while drawing less energy than it would take to reach the same performance with a bunch of general-purpose processors.

Instead of adding more cores, CPU/GPU companies traditionally make their processors more competitive with dedicated processors by adding dedicated hardware to their general-purpose designs. If physics is important to Intel or AMD, they could add a "physics processing unit" or special instructions designed to speed up that particular task. VIA added dedicated encryption/decryption hardware to its CPUs, and ATI and Nvidia both added hardware encoding for media content creation to their GPUs. Nvidia can now incorporate PhysX logic into its GPUs if it chooses to. I won't predict what the plan is, but I can assure you that any dedicated chip they produce, or dedicated physics processing unit they add to their GPUs or chipsets, will blow away any general-purpose processor at the same task, no matter how many cores it has.

RE: Hmm...
By kelmon on 2/5/2008 3:16:22 AM , Rating: 2
Relatively unimportant. Sure, a dedicated processor will be faster than a general-purpose one, but I already have 2 processors in my laptop and desktops have as many as 8 now, so I'd rather make better use of existing resources than introduce something that will almost never be used.

RE: Hmm...
By mindless1 on 2/6/2008 11:23:49 AM , Rating: 2
If that were feasible they'd do it, but they have found there is not enough real-time processing power in the CPU, because it was not designed for this. Since it is a feature only gamers would make much use of, it makes more sense for the cost to be borne by video card buyers rather than everyone, as it would be if it were integral to the CPU design. I suppose we could instead argue about whether there should be more forks in processor offerings, as there are in video card offerings, but that seems too much of a niche market for a CPU manufacturer, while it is exactly what a video card manufacturer thrives on.

RE: Hmm...
By Visual on 2/5/2008 5:08:55 AM , Rating: 2
But the problem is, how dedicated is this thing really?
Look at graphics processors, for example: they are now highly programmable and are more or less becoming general purpose.
So wouldn't it be natural for the physics processor's evolution to go the same way? You'd keep trying to make it more and more universal and general, and eventually it won't be too different from a CPU or GPU.

RE: Hmm...
By qwertyz on 2/5/08, Rating: -1
RE: Hmm...
By murphyslabrat on 2/5/2008 10:53:32 AM , Rating: 5
That, my friend, was a very bold statement.

RE: Hmm...
By tastyratz on 2/6/2008 3:22:54 PM , Rating: 2
Not entirely; it was more blunt and simplified than plain bold.
It sounds like what he is saying relates to a very realistic line for Nvidia to follow. Any dedicated circuitry increases costs. There is a very strong chance that Nvidia will simply release higher-end GPUs with physics processing, or stick Ageia's chip onboard, making the video card and physics processing unit a single offering. This would likely be shed from lower-tier products to save on costs.

RE: Hmm...
By BansheeX on 2/5/2008 3:50:44 AM , Rating: 2
Nobody wants to buy dedicated physics processors if no games support advanced physics. But no developers will jump on the physics bandwagon if nobody has the hardware to support it.

Welcome to the PC model of gaming.

RE: Hmm...
By retrospooty on 2/5/2008 8:35:13 AM , Rating: 2
To an extent, maybe... With a small third-party startup like Ageia using a proprietary hardware/software package, that is true. But OpenGL in the past, and DirectX currently, prove that theory wrong.

By a large margin, the best graphics have always been on the PC, and they still are today.

RE: Hmm...
By Omega215D on 2/4/2008 5:59:09 PM , Rating: 2
Imagine nVidia integrating it into a chip on the motherboard, like SoundStorm in the southbridge; now that would be pretty cool. Maybe a dual-core GPU: one core for graphics and the other for physics. However they do it, I hope it becomes a real product before games like Alan Wake. =D

RE: Hmm...
By StevoLincolnite on 2/5/2008 8:23:31 AM , Rating: 2
Or integrating it into the GPU and having it enabled in Hybrid SLI. A boon for laptop users as well, with no additional cost to the end user, hopefully.

RE: Hmm...
By AggressorPrime on 2/4/2008 8:02:27 PM , Rating: 2
G92 Transistor Count: 754 million
PhysX Transistor Count: 125 million (~17%)

Yes, putting the PhysX PPU on the same package as the G92 would be worthwhile (like Intel's placement of GPUs on future low-end CPUs). All nVidia needs to do is move PhysX from 130nm to 65nm. Then PhysX will automatically be included with every nVidia GPU as another feature that AMD doesn't have.

Integrating the PPU elements into the GPU die may take a little more work though.

Hopefully this will mean PPUs, or at least the elements that make them up, become more powerful and more popular leading to much more realistic gameplay physics.

RE: Hmm...
By Polynikes on 2/4/2008 9:05:37 PM , Rating: 1
I'm afraid they'll make the PPU standard on all enthusiast video cards, driving up prices, when most people won't want one on a video card. Very few games use PhysX as it is.

RE: Hmm...
By retrospooty on 2/4/2008 11:11:21 PM , Rating: 4
" Very few games use PhysX as it is."

Chicken and egg... Very few games use it because very few people bought it. Developers lose time and money programming features that are used by a small percentage of the market. This is why games aren't released on Macs: the hardware has always been similar and capable, but the market is just too small to bother with and still make a profit. If it were standard on Nvidia chips, developers would use it to death.

RE: Hmm...
By roadrun777 on 2/4/2008 11:28:30 PM , Rating: 2
I think you're making a lot of assumptions here.
I suspect the reason behind these decisions has to do with the programming interface and parallel cores problem.

If you could virtualize several of the standard physics functions and tie them into the shader, you could create a programming kit that would allow programmers to concentrate on the game experience rather than the heavy math involved in writing complex shader code, for effects that could be achieved with a single line of code using a physics processor.
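A toy sketch of the kind of programming kit the comment describes, under the assumption of a hypothetical `PhysicsWorld` API (every name here is illustrative, not a real SDK): the game programmer gets a one-line `step()` call per frame, while the integration math stays hidden behind the engine the way a PPU driver would hide it.

```python
# Hypothetical physics-kit sketch; all class and method names are invented
# for illustration and do not correspond to any shipping physics SDK.

class RigidBody:
    def __init__(self, position, velocity, mass=1.0):
        self.position = list(position)
        self.velocity = list(velocity)
        self.mass = mass

class PhysicsWorld:
    GRAVITY = (0.0, -9.81)  # m/s^2, 2D for brevity

    def __init__(self):
        self.bodies = []

    def add_body(self, body):
        self.bodies.append(body)
        return body

    def step(self, dt):
        # The math hidden from the game programmer: semi-implicit Euler
        # integration applied uniformly to every body. On dedicated physics
        # hardware, this is the loop that would run in parallel.
        for b in self.bodies:
            for axis in (0, 1):
                b.velocity[axis] += self.GRAVITY[axis] * dt
                b.position[axis] += b.velocity[axis] * dt

world = PhysicsWorld()
ball = world.add_body(RigidBody(position=(0.0, 10.0), velocity=(1.0, 0.0)))

# Game code is one line per frame instead of hand-written shader math:
for _ in range(60):
    world.step(1.0 / 60.0)

print(round(ball.position[0], 2))  # ball has drifted ~1 m along x
```

The point of the sketch is the division of labor: the engine (or hardware vendor) owns the numerical method, and the game only describes bodies and calls `step()`.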

RE: Hmm...
By roadrun777 on 2/4/2008 11:37:26 PM , Rating: 2
Oh, and as far as it costing more to manufacture and driving up the price: I don't see that happening. The actual cost to produce these components is minuscule compared to the cost of doing business. If you just count manufacturing costs and materials, that $400 board costs around $25 to make (including labor). Once you add in licensing, research and development, and the standard bureaucratic pork, then, and only then, do you get these inflated costs to manufacture.

RE: Hmm...
By Polynikes on 2/5/2008 4:22:16 PM , Rating: 1
The actual cost to produce these components is minuscule to the cost of doing business.

So why do many Apple products carry such a price premium when comparable products are much cheaper?

It doesn't matter how much it costs Nvidia, it matters how much it costs me.

RE: Hmm...
By mindless1 on 2/6/2008 11:33:09 AM , Rating: 2
Then you'd be wrong. Increasing die size and power consumption is a primary differentiator between inexpensive and expensive video cards. All the additional R&D, including developing the software/driver, will have inherent costs as well.

All these ideas about manufacturing and other business costs don't change the fact that one video card ends up costing multiple times what another does. "Driving up price" is a relative concept though, depending on who's deciding it's not worth their money.

RE: Hmm...
By TimberJon on 2/6/2008 3:12:44 PM , Rating: 2
Not to mention it would simplify and reduce the cost of production for motherboard manufacturers. Perhaps a little more engineering on the architectural side, but as far as an additional slot and its waypoints go, that's just one less component.

The gaming triangle...
By Micronite on 2/4/2008 6:03:53 PM , Rating: 1
NVIDIA:
1) High-end graphics processing
2) Discrete physics processing

AMD:
1) High-end graphics processing
2) x86 general-purpose processing

Intel:
1) Graphics processing (at least working on high-end)
2) x86 general-purpose processing
3) Physics processing (Havok)

It looks to me like there are two companies missing out on a complete current-generation gaming platform. The third company just needs to get their high-end graphics platform rolling.
Just thought that was interesting. It almost seems like NVidia went after Ageia/PhysX because of Intel's deal with Havok.

RE: The gaming triangle...
By imperator3733 on 2/4/2008 6:21:11 PM , Rating: 2
This is similar to an idea I've had for a while. Since AMD makes CPUs, GPUs, and chipsets, and Intel makes CPUs, chipsets, physics software, and (soon) decent-performance GPUs, Nvidia should buy some company with an x86 license so that they have everything. I'm thinking they should get Transmeta, since they have a license (right?) and don't seem to be doing much (hence it should be relatively cheap). It might take Nvidia a few years to make a decent CPU, but once they do, there will be three companies each with all the needed components.

If that were to happen, there would be some better competition, since if one of them screws up (like AMD right now), there are still two companies competing.

RE: The gaming triangle...
By Mitch101 on 2/4/2008 7:47:03 PM , Rating: 2
I would prefer they purchase VIA. I believe VIA's C3 CPUs are a better product than Transmeta's, and they recently broke the 2GHz barrier, which gives Nvidia something decent to work with from the start. VIA might even have a patent list that could keep Intel from killing off NVIDIA so easily. It might be cool to see a double-processor-speed IPC module embedded in a CPU, if that's possible.

One thing for certain we can look forward to is better drivers for Ageia products. I trust NVIDIA's driver developers are more proficient than AGEIA's.

RE: The gaming triangle...
By dvinnen on 2/4/2008 9:24:41 PM , Rating: 2
Nvidia has a market cap of 14.4 billion while VIA has a market cap of 21.4 billion. I don't see that buyout happening.

The only companies with an x86 license still developing are Intel, AMD, and VIA. Other than that, the only ones who had one and stopped developing (and haven't been bought out or grown excessively large) are Transmeta, SiS, and UMC. Again, there's no telling who has an active license.

On a side note, according to Wikipedia, Nvidia already has a license for embedded devices acquired from ALi.

RE: The gaming triangle...
By poohbear on 2/4/2008 9:39:56 PM , Rating: 2
Why buy VIA? Nvidia bought ULi back in the day, and they were gifted chipset makers.

RE: The gaming triangle...
By Hakuryu on 2/4/2008 6:30:52 PM , Rating: 2
I don't think they went after Ageia because of what Intel has done, but because realistic physics in games is arguably the next big step in gaming.

Graphics have been getting better and better over the years, but otherwise nothing 'big' has happened to gaming hardware-wise. We expect better graphics, more precise mice, and keyboards with all kinds of gadgets, but we haven't been delighted by something unexpected the way the Wii's control scheme delighted the console market.

I've played some games with very good physics, but they lack the feeling of being in a true physical environment. You can readily identify what will use physics after a few minutes of playing: those barrels, chairs, etc.; but that post in the center of the room stays solid even if you detonate a nuclear blast on it. Perhaps with hardware physics processing, we could be playing in fully destructible environments with realistic physics.

If NVidia can merge a separate physics processing unit onto their video cards without making them extremely expensive, then I bet they will stay on top for a long time after the 8800s become old. AMD/ATI, watch out.

RE: The gaming triangle...
By Assimilator87 on 2/4/2008 7:15:55 PM , Rating: 1
I hope they don't integrate a PPU into GPUs. nVidia would have to cripple both products in order to fit them in the same space. I want the different processors to stay on separate cards so they can each be pushed much further technologically. If you're so adamantly against separate processors, ditch computers and buy a console or one of those VIA all-in-one boards.

RE: The gaming triangle...
By Targon on 2/5/2008 8:09:05 AM , Rating: 2
The issue many people have with physics at the moment is that it doesn't improve the overall game experience, which is what people want. More detail at 5 frames per second, due to the extra polygons the dedicated physics processors have been delivering, really doesn't improve the experience.

Back when the original Tomb Raider came out with 3D support, there were TWO advantages to the 3Dfx Voodoo. The first was obviously the improved graphics, but the second was improved game speed. Improved graphics alone would have been fine if the framerates were the same, and the improved speed alone would have helped, but neither by itself would have drawn in the crowds.

So, PPUs... better graphics, but no other advantages, and many disadvantages. NVIDIA may be looking to take care of those disadvantages with the purchase, but it may take them years to get to the point where the physics processing becomes an advantage.

If physics becomes a demanded feature, it will probably be added to DirectX as part of the API, in which case following the API will be very important. Even then, unless physics is good for more than just eye candy, it becomes a "who cares" feature. If physics gets used for things like letting an explosion realistically open up holes in a wall or floor, so that placing the explosion in a certain location becomes a tactic, that is another story (and this could be emulated fairly easily without dedicated physics hardware as well).

So, what will physics processing REALLY do for us that we don't get already? If a PPU were used to improve framerates at VERY VERY VERY high resolutions, that would be very useful. Or perhaps NVIDIA is looking forward to a time when holographic display technologies mean physics processing is a big factor in making the display work? It is anyone's guess at this point.

RE: The gaming triangle...
By Darkskypoet on 2/4/2008 7:21:55 PM , Rating: 2
I'd say both Nvidia and AMD have massively parallel GPGPU processing capabilities available: via Tesla, and via AMD's older but less trumpeted Stream GPGPU usage. Both of these can be (and have been) leveraged for graphics or physics. Nvidia bought Ageia for the developer tie-ins and the pre-existing support for its API; honestly, Nvidia, Intel, and AMD could all do realistic physics. The question is whose API will be used.

Much like the battle for game developers, this can be seen clearly in GPU benchmarks; much more comes down to whose code path you're utilizing than to whose hardware is marginally, theoretically superior. In other words, if a GPU/CPU/PPU has a feature set that accelerates performance, but nobody uses it, does it matter to anyone but a fanperson? (See the results in 3DMark06 versus certain game titles, or game title A versus game title B.)

Would we be better served as gamers by an industry-wide standard for physics processing, letting the various companies compete by building hardware that delivers the best performance for it? I for one don't want to stifle competition or innovation; however, I'd rather have beautiful physics in games work on any of the video cards I choose to purchase, and not only on the one that wins (but refuses to license said technology).

Perhaps that will never happen, and we'll be forced to hope and pray the next great game supports the physics processing utilized in our video card, and not another one.

Anyone not see this coming?
By JoshuaBuss on 2/4/2008 5:37:24 PM , Rating: 2
I mean really, did they ever stand a chance on their own? :)

RE: Anyone not see this coming?
By HotdogIT on 2/4/2008 7:19:25 PM , Rating: 2
Why were you rated down? Makes no sense.

Anyways, I agree. Who honestly is surprised by this?

RE: Anyone not see this coming?
By JoshuaBuss on 2/4/2008 10:14:24 PM , Rating: 2
I don't know. All my posts are getting rated down.. even when everyone agrees with me, which sucks 'cause now I have to verify I'm a real human every time I post :|

RE: Anyone not see this coming?
By SavagePotato on 2/5/2008 10:41:12 AM , Rating: 1
Just means you probably have a fan.

Welcome to the wonderful world of meaningless ratings based on popularity.

RE: Anyone not see this coming?
By mindless1 on 2/6/2008 11:43:50 AM , Rating: 2
I can't say for certain about your post, but I've posted things in the past that seemed to be down one rating point the moment they appeared on the page, before anyone could possibly have down-rated them. Whether it is a flaw in the rating system or some keyphrase parsing causing it, I don't know, but from what I vaguely recall those posts weren't full of four-letter words or anything like that.

RE: Anyone not see this coming?
By jtesoro on 2/7/2008 8:54:43 PM , Rating: 2
Same here. In another post I got auto-rated down even though I didn't use any cuss words either. The only reason I can think of is that I had more than a few exclamation points in my post (where I was trying to make a joke). When I have time later I'll probably make a test account and experiment a little. Stay tuned.

Hopes for benefits
By Xodus Maximus on 2/4/2008 5:48:57 PM , Rating: 3
Intel has Havok and has done nothing new with them, so I am really hopeful that NVidia might do something that moves the field forward. I'm sure the AGEIA PhysX API will be accelerated by Nvidia drivers in the future, but NVidia tried something similar with Cg, and that received almost no industry backing, except for some books being written about it.

This might actually kill PhysX, because AMD will make its own product, and with all three companies having competing products and no standard, games will be forced to ship their own solutions that don't take advantage of these great technologies, and eventually it will be some obscure feature in NVidia's SDK that 1% of people use...

RE: Hopes for benefits
By togaman5000 on 2/4/2008 6:07:15 PM , Rating: 4
The different standards may very well hurt overall adoption, but if nVidia takes the smart route and integrates the technology directly into their cards, then developers would have a guaranteed user base.

If it were up to me, I'd integrate the technology into all future GPUs. I'd also release a software version. Not only would any user be able to use Ageia's physics engine, regardless of GPU brand, but nVidia GPUs would be faster and include more features. Kind of like how EAX is available up to 2.0 for most licensed hardware sound companies, while Creative can use it up to 5.0: all games support it now, despite the fact that only a certain percentage of the population can use it to the fullest.
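The tiered-support idea above can be sketched as a capability check at startup: everyone gets a baseline software path, and the vendor's own hardware unlocks extra feature levels, EAX-style. All names, feature levels, and effects here are hypothetical, invented purely to illustrate the pattern.

```python
# Hypothetical tiered physics backend selection; not a real driver API.

def detect_backend(gpu_vendor):
    """Pick the best available physics backend for this system."""
    if gpu_vendor == "nvidia":
        # The vendor's own hardware path: faster, more feature levels.
        return {"backend": "hardware", "max_feature_level": 5}
    # Everyone else still gets physics, just the baseline feature set,
    # the way EAX 2.0 was broadly licensed while 5.0 stayed Creative-only.
    return {"backend": "software", "max_feature_level": 2}

def enabled_effects(caps):
    effects = ["rigid_bodies", "ragdolls"]  # baseline, feature level <= 2
    if caps["max_feature_level"] >= 5:
        # hardware-only extras
        effects += ["fluid_sim", "cloth", "destruction"]
    return effects

print(enabled_effects(detect_backend("nvidia")))
print(enabled_effects(detect_backend("amd")))
```

The design point is that developers target one API and the runtime degrades gracefully, so a universal install base and a vendor advantage can coexist.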

RE: Hopes for benefits
By Griswold on 2/5/2008 4:54:17 AM , Rating: 3
The solution will be Microsoft including physics in DirectX sooner or later. Proprietary APIs will disappear just like they did after the release of DX (in addition to OpenGL) - Glide, anyone?

In the end, it will just be a matter of who delivers the better performance.

By kilkennycat on 2/4/2008 8:04:15 PM , Rating: 2
In this thread there seems to be a pervasive thought that nVidia needs some special-purpose silicon within its GPUs for physics, and hence the speculation that this is the major reason for buying Ageia. Wrong. The next-gen nVidia silicon currently in full development is targeted at BOTH GPGPU purposes (read: physics is just one specific class of complex algorithm processing) AND graphics. The next-gen (GP)GPU silicon from nVidia will have double-precision data paths throughout, essential for high-speed complex-math computations.

nVidia is probably acquiring Ageia specifically for its extensive physics-algorithm know-how, and together with the Ageia folk will figure out how to most efficiently integrate those algorithms into its (GP)GPU silicon, maybe with some minor architectural accommodation to improve computational efficiency for this class of algorithm. Not a dedicated physics processor: the next-gen GPGPU with a dedicated library of very efficient physics-processing functions.

No doubt nVidia has big plans to use Ageia's technical expertise to address PC game physics (which is actually best handled by pairing a CPU core for the bulk physics with the GPU(s) handling the particle physics).

However, the biggest impact of the acquisition may be on the industrial and professional applications of its GPGPU creations. This is a rapidly expanding and very highly profitable area of business for nVidia, thanks in part to the powerful parallel math processing on its GPUs enabled by the CUDA toolset. An extended version of the current CUDA toolset with a powerful physics-processing library would be attractive to many areas of engineering development, where any extra time spent can represent many dollars lost in opportunity cost.

The PhysX chip in its current discrete implementation, on an antique-technology PCI board, is finally a true dead duck. R.I.P.
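The comment's claim that particle physics maps naturally onto data-parallel GPGPU hardware can be sketched in plain Python: each call to `particle_kernel` stands in for what one GPU thread would compute for one particle. This is an illustrative stand-in, not CUDA; the function and array names are invented.

```python
# Data-parallel particle update, structured the way a GPU kernel would be:
# per-particle work touches only index i, so all i can run concurrently.

def particle_kernel(i, pos_x, pos_y, vel_x, vel_y, dt, gravity=-9.81):
    """What one GPU thread would compute for particle i."""
    vel_y[i] += gravity * dt
    pos_x[i] += vel_x[i] * dt
    pos_y[i] += vel_y[i] * dt

def launch(n, pos_x, pos_y, vel_x, vel_y, dt):
    # Stand-in for a kernel launch over n threads; on real GPGPU hardware
    # these iterations run in parallel because no particle touches another.
    for i in range(n):
        particle_kernel(i, pos_x, pos_y, vel_x, vel_y, dt)

n = 4
pos_x, pos_y = [0.0] * n, [100.0] * n
vel_x, vel_y = [float(i) for i in range(n)], [0.0] * n

for _ in range(10):  # ten simulation steps of 0.1 s each
    launch(n, pos_x, pos_y, vel_x, vel_y, dt=0.1)

print(pos_x)  # each particle has drifted in proportion to its x velocity
```

Structure-of-arrays layout (`pos_x`, `pos_y`, ...) rather than a list of particle objects is deliberate: it is the layout that lets GPU threads read memory coalesced.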

By edborden on 2/4/2008 8:42:11 PM , Rating: 3
I disagree. nVidia is already doing what Ageia has struggled for years to accomplish. They have the history, contacts, money, partners, expertise, and personnel to drive the technology through the game developers themselves. That's what's important here.

Consider what has been the problem for Ageia all this time. The main issue has been getting developer support to implement the technology, right? They've had a killer product (in my opinion), but couldn't get enough developers to actually implement it to create the market that would give gamers any reason to buy it. Man, tell me that doesn't sound exactly like the type of problem nVidia has chewed up and spat out over the past few years.

They don't have to add a single thing to Ageia and they can still make a bunch of money, because they just have to sell it, which will be cake for them.

I blogged about this :

By protest the hero on 2/5/2008 4:03:01 PM , Rating: 2
What's the point of Intel buying Havok? Havok is just a software-based physics engine. We've seen what it can do, and it's mediocre. I'm fairly sick of games using the engine, as they always react identically.

I can see why nVidia would have interest in AGEIA, as they also have R&D into the hardware aspect.

They need to incorporate a PPU in every card of the particular series it debuts in, to get the user base there. They also have to do it cheaply, or else it'll get put to the side.

Nvidia has to tread lightly if they want to go anywhere but down.

RE: nvidia<3AGEIA
By BoyBawang on 2/5/2008 9:20:18 PM , Rating: 1
My next PC upgrade:

1. Asus motherboard with built-in INTEL LARRABEE
2. Twin sockets that support two AMD quad-core FUSION CPUs
3. Four PCI Express slots for four NVIDIA AGEIA GEFORCE physics game cards in SLI

:) :)

RE: nvidia<3AGEIA
By jtesoro on 2/7/2008 9:02:42 PM , Rating: 2
You forgot to put in a KillerNIC and Monster cables for "premium" performance.

By inighthawki on 2/4/2008 5:38:24 PM , Rating: 2
"but considering that the company is steal recovering from its ATI acquisition"


RE: typo?
By shabby on 2/4/08, Rating: -1
RE: typo?
By inighthawki on 2/4/2008 10:10:52 PM , Rating: 1
Not sure where you were raised, but I was trying to make a simple correction. In something like an article, it looks more professional to have no grammatical errors, so I figured it'd be nice to point out the problem so it could be fixed, not to try to make him look bad.

RE: typo?
By shabby on 2/5/2008 7:40:01 PM , Rating: 2
If you haven't noticed, almost every article here on DailyTech and even AnandTech has typos, so professionalism obviously isn't their top priority.

Not Just for Physics
By Haltech on 2/4/2008 5:48:13 PM , Rating: 2
I guarantee that one provision in the contract, if it goes through, is that Nvidia will take on Ageia's workforce. It happened with 3dfx and it will happen with Ageia.

RE: Not Just for Physics
By initialised on 2/5/2008 4:36:16 PM , Rating: 2
Hmmmm... wasn't Nvidia's acquisition of 3dfx preceded by Nvidia infringing 3dfx's IP? I would guess the buyout allows them to use it. But is physics processing going to change graphics and gaming as much as 3dfx and their technology did?

By kilkennycat on 2/4/2008 8:32:27 PM , Rating: 1
NVIDIA, the world leader in visual computing technologies and the inventor of the GPU,

Not quite. 3dfx was building GPUs a few years ahead of nVidia's founding, but the dedicated graphics co-processor IC on the Amiga (circa 1985) preceded them all: Jay Miner's brilliant architectural creation. nVidia may have coined the three-letter mnemonic (GPU) describing the function of that particular area of silicon, but that is all.

By BenSkywalker on 2/7/2008 9:42:32 AM , Rating: 2
3dfx never put a single GPU to market; they sold rasterizers. The moniker 'GPU' was coined to differentiate plain rasterizers from rasterizers that also had hardware transformation and lighting on a single die. Other companies such as 3Dlabs (GLINT) and SGI had parts with both a rasterizer and discrete hardware transform-and-lighting on board, but the NV10 was the first commercially available product that had both elements on a single chip.

By MetaDFF on 2/4/2008 5:44:37 PM , Rating: 2
Hopefully Nvidia intends to integrate Ageia's hardware directly into their GPUs so hardware based physics is more ubiquitous and more developers can begin adding increasingly more complex physical interactions into games.

In a sense, soon graphics will not matter as much and the next big user of computing power will be in more complicated physics and physical interactions.

Online Games
By AlvinCool on 2/5/2008 10:13:36 AM , Rating: 1
AGEIA's PhysX software is widely adopted with more than 140 PhysX-based games shipping or in development on Sony Playstation3, Microsoft XBOX 360, Nintendo Wii and Gaming PCs.

Wouldn't this give Nvidia the ability to have those games ported to the PC, playable online against any of these platforms, if they were to, say, sell an inexpensive AGEIA co-processor, either in a PCI slot or possibly in SLI mode alongside the current graphics card? They could include it in future cards, but being able to retrofit all previous cards might be profitable, depending on what people could get out of it.


Nvidia owns ATI
By IntelGirl on 2/4/08, Rating: -1
RE: Nvidia owns ATI
By Squilliam on 2/4/2008 8:02:15 PM , Rating: 2
They haven't bought them yet. Actually, Nvidia is in a bit of a squeeze on chipsets and IGPs: ATI has a far better IGP, and AMD is looking at a chip with integrated graphics, while Intel is also looking to chip away at Nvidia's low- to mid-range GPUs. In addition, the newest Nvidia chip looks to be quite hot and expensive, while ATI's solution, the R700, looks quite elegant: they can scale the one chip from midrange to high end, and its power usage will probably be lower too. Well, anything has to be lower than a 250W TDP!

RE: Nvidia owns ATI
By Iv645 on 2/6/08, Rating: 0
"Nowadays, security guys break the Mac every single day. Every single day, they come out with a total exploit, your machine can be taken over totally. I dare anybody to do that once a month on the Windows machine." -- Bill Gates
Related Articles

Copyright 2016 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki