
Suit seeks to invalidate license agreement allowing Intel access to NVIDIA patents

Two of the most important parts of a computer are the CPU and the GPU. For a long time, the CPU was the brains of the computer, where the complex calculations required by software were performed, while the GPU handled the calculations required for graphics. Today the lines between the two products are becoming blurred.

Intel and NVIDIA both say that their products are the future of the computer. Intel is the largest chipmaker in the world and manufactures a line of integrated GPUs -- its future processors will even have graphics cores built in. NVIDIA is the largest discrete GPU maker and is running software on its GPUs that traditionally ran only on the CPU, often with better performance. NVIDIA has also said it is eyeing entry into the x86 CPU market.

NVIDIA and Intel are currently fighting in court over a suit Intel filed alleging that the license agreement in place between the two firms does not allow NVIDIA to build chipsets compatible with Intel's new processors with integrated memory controllers, such as Nehalem.

NVIDIA maintains that the license agreement does allow it to make chipsets for the integrated memory controller processors, and that Intel is merely trying to prevent it from being competitive by casting doubt on NVIDIA products in the minds of consumers.

NVIDIA has filed its own suit against Intel, Reuters reports, and is seeking to terminate Intel's license to NVIDIA patents relating to graphics processing and 3D computing. NVIDIA says that without the license agreement, it believes Intel's line of integrated graphics processors violates NVIDIA patents.

NVIDIA says in its countersuit that Intel has manufactured the license dispute as part of a strategy to eliminate NVIDIA as a competitive threat. Intel says that the license agreement in place only allows NVIDIA to build chipsets for processors that lack an integrated memory controller.

Intel's Chuck Mulloy told Reuters, "There is a substantial disagreement between Intel and NVIDIA about their licensing rights under the agreement. We've been trying multiple times, multiple ways to find a way to settle the argument. The suit simply asks the court to interpret the agreement."

The battle between the world's largest chipmaker and the largest GPU maker is likely to heat up. Both firms believe that their products are the future of computing and that future computer systems may not need both a GPU and a CPU. NVIDIA claims that by officially denying the validity of the license agreement, Intel has breached the contract the two firms have in place.

NVIDIA said, "Having breached the contract and irreparably injured NVIDIA, Intel has lost the right to continue to enjoy the considerable benefit of its license to NVIDIA's patent portfolio."

Prior to Intel filing its suit, NVIDIA had granted Intel the right to license SLI on Intel's X58 chipset, allowing motherboards using that chipset to support both CrossFire from ATI and SLI from NVIDIA.

Comments

By DigitalFreak on 3/27/2009 12:16:38 PM , Rating: 5
Intel is merely trying to prevent it from being competitive by trying to cast doubt on NVIDIA products in the minds of consumers

NVIDIA has been doing a great job of that all on their own.

By BZDTemp on 3/28/2009 2:45:18 PM , Rating: 3
hmmm - where to start :-)

How about the repeated renaming of already-renamed products, or all the problems with Nvidia chips in laptops, or the FUD tactics Nvidia uses time and time again to try to make competitors look bad.

By RamarC on 3/27/2009 12:25:30 PM , Rating: 2
yup, after the "missteps" of the nvidia 680/780 chipsets, nvidia lost its competitive footing and they're really only a choice for diehard fans or folks who crusade against intel (but still use their chips).

By DuctTapeAvenger on 3/27/2009 12:46:03 PM , Rating: 4
Definitely. Ever since I switched to Vista (and now Windows 7), I have not run an NVidia product. The drivers drove me away. Even before that point, I used to play Supreme Commander quite frequently, and there was a problem with the 8-series cards that caused it to crash. You would usually get far enough in to get everything set for your first assault, then lose because the game crashed.

With my old 4870 and my current 4870x2, Vista, 7, and all games are perfectly smooth, and lack the autocrash feature.

By Mitch101 on 3/27/2009 2:09:16 PM , Rating: 3
It's most likely NVIDIA's original team, which released quality drivers on a frequent basis and supported alternate OSes like Linux, that got ATI to wake up and beef up its driver division. However, even if ATI has not exceeded NVIDIA in driver support for alternate OSes, it has surely caught up. The Vista/NVIDIA driver fiasco should put to rest any argument about driver quality on Windows: ATI's driver division now exceeds NVIDIA's.

By DjiSaSie on 3/27/2009 9:41:39 PM , Rating: 2
You definitely never play games, sir. Try getting a decent game in Vista and playing it on a GeForce 8/9 series card and you'll get the autocrash feature :)

I used to play motogp2008, but a crash that occurred in the middle of the game drove me away.

By cnar77 on 3/28/2009 12:35:31 AM , Rating: 2
I've used Vista for the last year on my gaming laptop and, unlike the many "user issues" that had users thinking the OS was crap as opposed to their own inability or incompetence, I have had no problems. I personally follow the instructions and use the recommended, tried-and-proven driver (no betas), and everything I've run works fine. I've wasted entire weekends playing games without issue ... not counting the response from my wife.

By overzealot on 3/30/2009 3:52:35 AM , Rating: 2
Congratulations, I'm glad you had no problems.
I've tried reinstalling Windows and changing drivers (seriously, I tried every stable driver and most of the betas).
Since buying an 8800gtx when they first came out, I've replaced EVERY OTHER PART OF MY PC.
New memory, new cpu, new motherboard, new hdd, new power supply, new dvd drive, new case.
I've wasted entire weekends gaming without fault, to be fair, but then had nights where the bloody thing would crash consistently after 1 minute of gaming.
Perhaps the fact that you blame users for these problems shows how "incompetent" you are.
Thanks for posting!

By Screwballl on 3/30/2009 1:54:40 PM , Rating: 2
Agreed... in my experience, Vista has been the problem since day 1, regardless of which GPU is used.
This is why I still use XP for 99.999% of my gaming... the same games that crash under Vista. It is not DX 8/9/10 related as some crashes come from older games like Counter Strike 1.5 and Source, others from newer DX10 games like Crysis, some from expansive games like Sins of a Solar Empire. Yet each and every one of these games works perfectly with zero crashes under XP, regardless of which GPU has been used (currently running with a nvidia 8800GTS 320MB but also tested with a 280GT and 4850).
I have tested a few of them on W7 and it is a mixed bag of results (reporting the bad results) but I will keep an eye out to see if it continues after the final release.

By msomeoneelsez on 3/28/2009 1:50:49 AM , Rating: 2
Here is the thing about Vista and nVidia... Yes, nVidia should have done a better job with their drivers in the beginning, but they did release much better drivers very quickly. People just didn't update them.

Also, most (not all) of Vista's issues are caused by the general public's lack of computer knowledge. If they would just set their systems up right in the first place... And companies like Dell, HP, etc. do not help either, whether it be the additional trial software that bogs down computers, or their manufacturing "quality" if it can be called that.

My system is custom built by me (that means parts researched and chosen by me, as well as put together by me) and I always try to keep the latest WHQL (non-beta) drivers installed directly from the manufacturers' websites. I have been on 64-bit Vista for nearly a year now and have had 2 (yes, only 2) crashes, BSODs, etc. that were not caused by my own stupidity. Even the ones caused by my own stupidity have been few and far between, and only 1 of those actually required a hard reset.

I would say that for a system as complicated as the modern computer and modern OS, that is pretty dang good. People just need to build and maintain their systems in the right way.

By jconan on 3/28/2009 1:55:08 PM , Rating: 2
Seconded. Updating drivers to a reliable build can be quite important; it may solve a bug or two if the developers looked closely at the issues in the driver release. However, Nvidia still hasn't gotten HDMI output to display properly on monitors. It works for TV displays, but for monitors they forget to check the EDID of the display...

By mindless1 on 3/27/2009 11:04:03 PM , Rating: 1
Hardly, one short period of sketchy drivers for Vista does not begin to offset years of ATI bugs, before and after that period. I'm not saying nVidia doesn't have bugs too, only that anyone who has actively supported both platforms has seen plenty of problems from both camps.

By just4U on 3/28/2009 2:05:16 AM , Rating: 2
I don't think ATI's driver division exceeds Nvidia's. They have their issues as well, but I do believe they're pretty much on par. I argue a lot with fanboys of both companies (on various forums) and, well, yeah... they stand behind their brand of choice regardless of its flaws. It's like trying to talk to a brick wall :(

Currently I don't have issues with drivers from either company.

By tallcool1 on 3/27/2009 1:56:22 PM , Rating: 2
I have a 4850 and play a lot of games on my system; I've never had any problems with it. One game I do not play, however, is WoW.

By Mitch101 on 3/27/2009 2:03:24 PM , Rating: 1
I have two computers, both with ATI cards in them now, and both play WoW. Never had a WoW lockup. Radeon X850 XT and 3870. The 3870 is just the weaker version of the 4 series and the drivers are unified, so I'm calling shenanigans on his comment.

With what, 11 million subscribers? I would bet this is the first game they test their drivers and cards with.

By Murloc on 3/27/2009 3:12:57 PM , Rating: 5
The 3870 is not a 4xxx series card.

By spread on 3/27/2009 3:58:57 PM , Rating: 4
I'd wager the driver 'hardlock' in WOW is a feature designed to protect you from yourself.

By brightstar on 3/27/2009 7:10:35 PM , Rating: 2
Too good! Got me laughing.

By Mitch101 on 3/27/2009 1:50:27 PM , Rating: 4
I lost data because of the last NVIDIA chipset I owned. To this day I am not sure they ever officially acknowledged that the NCQ issue in their drivers could cause data loss. But if you used the Microsoft drivers and turned off NCQ, you didn't have the problem, which otherwise showed up about once a month. By then I had had enough and moved on.

Then we have Bumpgate, another buried manufacturing issue that spanned a list of products from laptops to top-end video cards. Sure, you could get a high-performance product, but would it last? And would you be denied support even though it's a manufacturing defect? Let's crank up the fans with a BIOS update and hope they make it past the warranty.

Then you have the arrogance of the company shooting its mouth off at Intel, saying the CPU is dead. Last I checked, GPUs don't run without an AMD or Intel CPU, while a CPU can run very well without an NVIDIA GPU. Did they not learn from VIA's demise? That couldn't happen to NVIDIA, right?

They drove Microsoft away from choosing them for the Xbox 360 because they couldn't come to an agreement. You can make some money or no money; NVIDIA chose no money. The PS3's sales and stubbornly high console price won't help them get another contract like that again.

Might as well mention that NVIDIA played a major role in Vista's very bad launch, since NVIDIA drivers were the majority cause of OS crashes at launch. It's something they haven't been able to shake since, though Microsoft shares the blame for letting Apple make fun of them without smacking back.

You then have NVIDIA price gouging, with video card prices rising ever higher. The latest video cards nearly hit $500.00, that is, until AMD/ATI came out with the much cheaper 38xx/48xx series, at which point we all saw NVIDIA prices drop like a rock. Why couldn't they do that in the first place? It was gouging the consumer, figuring that ATI couldn't fight back. Probably a mix of arrogance in there also.

In the past NVIDIA did everything right. Now I find myself looking at NVIDIA and asking what are they doing right by consumers and big companies? I also realize it might be too late.

They burned the people who made them what they are today.

Intel is going to send their motherboard business back to the stone age, just like they did to VIA. AMD makes the best motherboards for their own CPUs, so there is no line there either, especially once they merge the CPU with the GPU, which leaves no real motherboard chipset for NVIDIA to make money on.

This leaves video cards and ATI can compete and leverage manufacturing processes learned from the CPU division.

NVIDIA is losing cash; another Bumpgate or a manufacturing delay and NVIDIA is really going to start bleeding. Without a CPU, and quite possibly squeezed out of the motherboard business, that makes them a one-trick pony. Intel can tie NVIDIA up over making chipsets for their CPUs for a long, long time.

They really need to start kissing up to the console companies too, because the next-gen designs should be well under way.

By neothe0ne on 3/27/2009 2:48:54 PM , Rating: 1
"They really need to start kissing up to the console companies too because the next gen designs should be well under way."

If the console companies care about backwards compatibility (Sony is an iffy one here, but obviously Nintendo and Microsoft care more), they have no hope. It's highly unlikely Nintendo will move away from ATI after GameCube and Wii, and highly unlikely Microsoft would choose the original Xbox library over the 360 library.

By Totally on 3/29/2009 3:52:56 PM , Rating: 3
You are kind of clueless when it comes to BC. Sony had been at the forefront until they axed it between PS3 revisions. There is no native BC between the Xboxes; Xbox games have to be recoded to run on the 360, and that privilege has only been awarded to a handful of games (Halo 2 is the only one I can think of). Nintendo, before the Wii/GameCube, only had BC on the Game Boys, plus interoperability between consoles and handhelds. Sony has had BC on all their systems, and you call them iffy?

By Silver2k7 on 3/27/2009 2:51:24 PM , Rating: 1
Nvidia is, or was, 4 times larger than AMD, at least before the Abu Dhabi deals or whatever money injections AMD got... it would be interesting to see them make a CPU or a combined CPU/GPU product.

By Dianoda on 3/27/2009 3:03:05 PM , Rating: 2
NVIDIA's history (for the past three years, at least) of terrible PR and general consumer/manufacturer/supplier/partner abuse is pretty epic, and it doesn't seem like that laundry list you provided is going to stop growing anytime soon. And who was the genius that approved the decision to call the mobile edition rebadge of the 9800GTX+ "GTX280M," as if there was any similarity between a GTX280 and a 9800GTX+?

Honestly, who the hell do they think they are? Some advice for NVIDIA management: get some PR help, cut the FUD, own up to your mistakes, quit trying to pin the blame on everyone else, and start acting like the professionals you claim to be. If you f---ed up badly enough for it to cost you your job, just do your owners a favor and get fired instead of passing the buck to someone else and continuing to piss away shareholder value. I wouldn't trust the NVIDIA top brass to manage anything more complex than a lemonade stand, and even then I'm sure they'd find a way to screw me over along with everyone else while running the operation into the ground.

By gumbi18 on 3/27/2009 10:46:24 PM , Rating: 2
I agree wholeheartedly. What PR idiot thought it would be a good idea to re-release older GPUs under newer names? Just look at the GTS 250: it's a 9800 GTX+. In fact, the cards are so similar that a 9800 GTX+ can be reflashed with the BIOS of a GTS 250 and voila, you have a "new" card.

By brightstar on 3/27/2009 7:14:47 PM , Rating: 2
Nice post. I agree with a lot of this and yes, their chipsets suck. I gave up on those after the 4 series and won't ever buy another.

By lakrids on 3/27/2009 9:24:20 PM , Rating: 2
You then have NVIDIA price gouging, with video card prices rising ever higher. The latest video cards nearly hit $500.00, that is, until AMD/ATI came out with the much cheaper 38xx/48xx series, at which point we all saw NVIDIA prices drop like a rock. Why couldn't they do that in the first place? It was gouging the consumer, figuring that ATI couldn't fight back. Probably a mix of arrogance in there also.

I agree especially with this.
Remember the GeForce 8600 GTS? Launch price: $199 to $229. That is a lot of money, frankly, and what you got for those $200 was a crippled 8800 GTX with 75% of the stream processors deactivated.

I know Nvidia is a company that needs to make money, and I don't fault them for that, but sometimes they go TOO FAR. If ATI can release the HD4850 with all 800 stream processors intact at that same $200, then why couldn't Nvidia be a little more generous with the 8600 GTS?

I want to like Nvidia, because we need at least 2 players on the field, but they are making it very difficult for me to like them.

yeah but...
By Randomblame on 3/27/2009 9:01:36 PM , Rating: 2
Have you seen the prices of X58 motherboards lately?? That's due to lack of competition. Let nvidia release a chipset; I won't buy it, but maybe X58 prices will drop a bit.

RE: yeah but...
By lakrids on 3/27/2009 9:36:36 PM , Rating: 3
That's due to lack of competition.

This is half of the truth.

The other half is SLI certification.
Motherboard makers have to send their boards to Nvidia for a "certification process" to ensure a so-called "great out-of-the-box experience".
This costs money, and ultimately the user is the one who pays for the trouble; that's one of the reasons you've seen sky-high X58 motherboard prices.
So every time you buy an X58 board, Nvidia gets a good portion of what you paid. Even though the chipset was made by Intel, and even though you might be using Crossfire instead of SLI, Nvidia profits.

I know who's gonna win
By homerdog on 3/27/2009 2:18:21 PM , Rating: 3
"A lawyer is a gentleman who rescues your estate from your enemies and keeps it for himself." - Lord Henry Peter Brougham

Boo hoo.
By djkrypplephite on 3/27/2009 12:07:37 PM , Rating: 2
So many corporate tears these days.

A lot of nvidia hating today...
By Gannon on 3/28/2009 5:56:29 AM , Rating: 1
Truth be told, without nvidia we would not have had the leaps and bounds of 3D accelerator progress. Let us not forget the whole war between 3Dfx and Nvidia, and 3Dfx's incompetent management (similar to AMD's), which bet on making its own boards and lost because nvidia developed better chips with 32-bit color, etc. Meanwhile 3Dfx was going under because of its bad acquisition of STB (if I remember correctly).

Nvidia has definitely got problems, but let us not forget it's a for-profit company; corporations are assholes because of the demands of shareholders. Intel, AMD, Microsoft, etc. have all had bullshit to spin: MS and their red-ring-of-death Xboxes, Intel with the Pentium bug and the horribly performing Pentium 4... and Intel's attempts to limit overclocking (let us not forget the hard-coded multipliers in Core 2 Duos, which everyone seems to forget was a total asshole move on Intel's part toward the OC community).

As George Carlin said: It's all bullshit folks, so don't worry about it.

"Paying an extra $500 for a computer in this environment -- same piece of hardware -- paying $500 more to get a logo on it? I think that's a more challenging proposition for the average person than it used to be." -- Steve Ballmer
Related Articles
Intel Sues NVIDIA Over Chipset Manufacturing
February 18, 2009, 11:25 AM
