
NVIDIA says Intel is trying to slow the uptake of NVIDIA platforms

Intel and NVIDIA have had agreements in place that allow NVIDIA to build chipsets that support Intel processors. The two firms recently began licensing NVIDIA technology for use on Intel motherboards as well.

Today reports are coming in that Intel has filed a suit against NVIDIA alleging that the licensing agreement in effect does not allow NVIDIA to build chipsets for Intel processors with integrated memory controllers -- including the Nehalem CPUs.

According to Bit-Tech, Intel issued a statement saying, "Intel has filed suit against NVIDIA seeking a declaratory judgment over rights associated with two agreements between the companies. The suit seeks to have the court declare that NVIDIA is not licensed to produce chipsets that are compatible with any Intel processor that has integrated memory controller functionality, such as Intel’s Nehalem microprocessors and that NVIDIA has breached the agreement with Intel by falsely claiming that it is licensed. Intel has been in discussions with NVIDIA for more than a year attempting to resolve the matter but unfortunately we were unsuccessful. As a result Intel is asking the court to resolve this dispute."

NVIDIA feels confident that its license agreement does in fact allow it to build chipsets for Intel processors with integrated memory controllers. So confident, in fact, that NVIDIA says it will not change its roadmap and will continue developing chipsets for the Intel processors in question and for future Intel processors.

NVIDIA points out that this license disagreement does not affect any of its currently shipping products and the graphics giant doesn't expect any impact to its current business regardless of the outcome.

NVIDIA has released an official statement saying, "NVIDIA believes that our bus license with Intel clearly enables us to build chipsets for Intel CPUs with integrated memory controllers. We are aggressively developing new products for Intel’s current front side bus (MCP79 and MCP89) and for Intel’s future bus, DMI."

NVIDIA maintains that the suit is nothing but an attempt by Intel to slow the adoption of NVIDIA platforms and to protect a decaying CPU business, in which the CPU has become much less relevant than the GPU inside a PC.


same old tricks
By UltraWide on 2/18/2009 12:10:48 PM , Rating: 5
intel is up to their same old tricks they used on AMD, Cyrix, VIA, you name it.

Same game for intel, just have to change the company name in their lawsuit papers from 1975.

RE: same old tricks
By Motoman on 2/18/2009 12:25:54 PM , Rating: 2
Yup...when AMD was stealing their lunch, they were more than happy to have others build stuff to support their platform. When AMD faltered, they started to go back to their old ways...with luck, maybe the new AMD stuff will level the playing field a bit again.

RE: same old tricks
By ImSpartacus on 2/18/2009 3:20:39 PM , Rating: 2
Yeah, Intel is doing too well. They are getting a touch cocky.

AMD needs to release something that competes with i7 and Intel will be more than happy to let everyone support their chipsets. Too bad AMD is busy competing with last generation hardware (albeit competing quite well). But give them time, who knows?

RE: same old tricks
By Pryde on 2/18/2009 11:09:04 PM , Rating: 4
What a lot of people seem to forget is that Intel works for their investors / shareholders. Intel and AMD only look out for their best interests. These companies are not in this to give stuff away.

RE: same old tricks
By Oregonian2 on 2/20/2009 2:52:11 AM , Rating: 1
It's also not exactly like nVidia is a sweetheart angelic company themselves.

RE: same old tricks
By walk2k on 2/18/09, Rating: -1
RE: same old tricks
By omnicronx on 2/18/2009 2:26:39 PM , Rating: 5
Can you really blame them? If you invented something then someone came along and cloned it and started selling it for $2 less, wouldn't you try to protect your business?
Not too sure what you are talking about here; Nvidia pays Intel to license the technology. Intel is claiming this license does not cover chips with integrated memory controllers, and Nvidia is claiming the opposite. Seems like a technicality to me, and it would not surprise me if Intel is looking to gain traction selling its own boards before it gives Nvidia a chance to do so.

There are no clones involved here. If Intel didn't want Nvidia making chipsets for Intel processors, then they should not have licensed the technology to them.

RE: same old tricks
By Oregonian2 on 2/20/2009 2:54:31 AM , Rating: 1
if Intel didn't want Nvidia making chipsets for Intel processors, then they should not have licensed the technology to them.

Intel is claiming that they didn't! Are you just taking nVidia's word for it that they did?

RE: same old tricks
By Ichinisan on 2/23/2009 6:49:41 PM , Rating: 2
Also, the chips/chipsets are often NOT cheaper. They target a different market segment which Intel has only just entered (ultra-high end multi-GPU), licensing nVidia's tech to do so (arbitrary as it may be). Even if Intel loses this, it may buy them time to sell to the ultra-high-end before nVidia makes a "must have" SLI board.

RE: same old tricks
By mattclary on 2/18/2009 5:17:05 PM , Rating: 2
Good lord, man!! What makes you think they are "cloning" anything?!

Personally I think this is crap. It's like Intel makes a car engine, then forces you to "license" gas pedal designs.

RE: same old tricks
By Reclaimer77 on 2/18/09, Rating: 0
RE: same old tricks
By KernD on 2/18/2009 5:41:12 PM , Rating: 3
I think you made a mistake there, Intel is the one trying to get out of the deal by claiming it is no longer a valid deal for the new processor.

RE: same old tricks
By Gravemind123 on 2/18/2009 7:16:52 PM , Rating: 1
Intel isn't trying to get out; it licensed nVidia its FSB technology to make chipsets for. Intel CPUs no longer use the FSB other than the Atom, so nVidia has no license for Core i7.

RE: same old tricks
By mars777 on 2/18/2009 8:01:14 PM , Rating: 2
And you have read their agreement, so your words are truth?

RE: same old tricks
By finalfan on 2/18/2009 9:23:24 PM , Rating: 3
There is no detail of the agreement. However, in this article,,2845,1729927... at the end of the second paragraph, it says "The ATI-Intel license was later updated to include the Pentium M and the 800-MHz processor bus. "

So it looks very likely that this kind of license is based on the FSB and has to be renewed for every new bus.

RE: same old tricks
By mattclary on 2/18/2009 5:41:50 PM , Rating: 1
Well, we can't read that agreement, but I SERIOUSLY doubt Intel slipped in a clause that read "except CPUs that have a built in memory controller". Do you really think nVidia would have signed that considering AMD has been doing that for the last 5 or 6 years?

RE: same old tricks
By Oregonian2 on 2/20/2009 2:57:50 AM , Rating: 2
Of course they would if they wanted to make and sell Intel support, and that's the only way Intel would supply the license.

RE: same old tricks
By Hawkido on 2/23/2009 2:41:39 PM , Rating: 2
I don't know how old this license is, but it is clear to me that Intel left it vague (accidentally?) and nVidia is capitalizing on it, gambling that the court will say vague means inclusive, not exclusive. If the agreement already has a vaguely inclusive feel to it, the court will find in nVidia's favor; if it has an exclusive feel, the court will find in Intel's favor. If the courts rule in Intel's favor, the vagueness will probably lead them to say let bygones be bygones, but from this point forward nVidia cannot sell any more chips until the license is updated. That may still pan out in nVidia's favor if they can make and sell enough chips before the courts shut them down.

I see a three-pronged fork in the road: one in Intel's favor (the court rules that nVidia has to pay back royalties for all chipsets sold), one in nVidia's favor (vague means inclusive), and one down the middle (vague means exclusive, but since it was vague, it's only a no-no from this point forward) that still ends up in nVidia's favor if they can garner enough market share from Intel early in the game.

The past long-standing relationship between the two companies leans in nVidia's favor. But it all comes down to the wording of the agreement.

RE: same old tricks
By Oregonian2 on 2/24/2009 10:29:53 PM , Rating: 2
What you say may be true, but note that I was only responding to the question posed:

"Do you really think nVidia would have signed that considering AMD has been doing that for the last 5 or 6 years?"

RE: same old tricks
By Samus on 2/18/2009 5:36:31 PM , Rating: 3
Intel, like AMD (before the ATI merger), has stated it wants chipset partners because its chipset capacity can't keep up with its processor output. I can remember two instances in the past few years of Intel chipset shortages. Chipset partners only increase sales of your cash cow (in Intel's case, CPUs).

The only reason they are doing this is because sales are slow these days.

RE: same old tricks
By Quijonsith on 2/22/2009 8:18:31 PM , Rating: 1
Good thing that when AMD released their 64bit processors they didn't patent integrated memory controllers.

RE: same old tricks
By SavagePotato on 2/18/2009 1:32:07 PM , Rating: 4
In this case though you have one big greedy evil company suing another big greedy evil company, not exactly an underdog.

Nvidia is no saint by any stretch; add to that their cocky dumbass CEO who goes about making grandiose statements like "we're gonna open up a can of whoopass on Intel" (funny as hell when the article came out the next week, "nvidia opens can of whoopass on self", re: the faulty notebook chip debacle).

On one hand there is the reaction to laugh at Nvidia's plight, as Intel not allowing them to make chipsets for its products would arguably be a deathblow on top of all their other problems. On the other hand, you know that paying $400 for X58 motherboards really sucks, and everyone knows how quick Intel is to fall back into its evil ways without competition.

To all those that laughed and scoffed at AMD's shortcomings over the last few years, I say: enjoy your "enthusiast" $400 mobos and $1100 processors. On that note, anybody remember those $600 GTX 280s that came out and were slashed down to size by ATI? $200 price cuts within weeks of launch. Amusing.

RE: same old tricks
By sweetsauce on 2/18/09, Rating: -1
RE: same old tricks
By SavagePotato on 2/18/2009 6:57:47 PM , Rating: 2
Take a look at the cost of those x58 boards some time. There is no mainstream and will be no mainstream for i7.

Intel fooled you and everyone like you. i7 is a rethinking of skulltrail designed to milk you dry.

When the i5s come out with their different socket and lack of triple-channel support, you might see i7 more for what it is: a disguised Skulltrail.

RE: same old tricks
By JackQW on 2/18/2009 8:10:11 PM , Rating: 3
Start linking to Skulltrail in your posts, or at least say "Google Intel Skulltrail, first hit", or something, somewhere... No one wants to accidentally run into a trail of actual skulls, or be suspected of being a serial killer.

What SavagePotato is saying is that Intel 'artificially' segregates its consumers.

The reasoning is that "if you're intelligent" and you want a mainstream Intel chipset, you buy the upcoming Core i5, at which point Intel will release its next-generation enthusiast component and you'll be relatively in the stone age, with no upgrade path, so you will have to rebuy. If you buy previous-generation enthusiast components, you'll pay too much, wind up around mainstream performance, and have no upgrade path.
Both paths will cost you current-generation enthusiast prices, and their next-generation enthusiast components cost next-generation enthusiast prices. The next-generation components 'will' have an upgrade path... assuming they have competition.
Without competition, they shaft the performance-investing customers who drive progress and change, the same customers who got Intel off its NetBurst high horse and pronounced the Pentium M the true path toward performance, and slouch into a comfortable slow-to-no-progress 100MHz bump cycle.

When Intel has sufficient competition, it's a different story; the consumer wins, there is a true mainstream segment, and performance skyrockets everywhere.

RE: same old tricks
By Reclaimer77 on 2/19/2009 5:11:23 PM , Rating: 2
Intel fooled you and everyone like you. i7 is a rethinking of skulltrail designed to milk you dry.

Yes, people were "fooled" into buying a CPU that's clearly targeted at the server market?? Umm... ok. I guess it's just a myth that the i7 is also the best performing CPU for desktop applications as well? Yeah, those poor suckers sure got fooled...

When the i5s come out with their different socket and lack of triple-channel support, you might see i7 more for what it is: a disguised Skulltrail.

You might want to actually research what you are talking about. The i5 isn't "lacking" because it will be dual channel. It will see gains in performance BECAUSE it's dual channel.

Nobody needs triple channel, and it's NOT optimal in the first place. The i7 needed to be triple channel because servers need tons of memory and the bandwidth to feed its eight cores (4 physical + 4 HyperThreaded), because server apps can actually use that many cores.

Scream Skulltrail at the top of your lungs for all I care. It doesn't change the fact that you're a stupid fanboi.

RE: same old tricks
By TA152H on 2/18/2009 9:05:32 PM , Rating: 1
I agree completely with you about Nvidia; they are a horribly obnoxious company that deserves acrimony because of their incredibly irritating CEO.

But, why would anyone buy an Nvidia chipset anyway? With all the problems they have, and the relative excellence of Intel chipsets and the stability of the platform, why take a risk on Nvidia? Not that Intel is always perfect, but at least when they are imperfect, they are still the standard, and a huge company that can better support their problems. If there's a bug with something Intel, software will find a solution to it faster than a smaller company, with significantly less market share. It's just common sense.

Of course, I'm not saying Intel has as many problems as Nvidia, I'm just pointing out that even when they do, by the sheer size of what they sell, the industry does work around it better.

Now, about expensive processors. I have this discussion with a lot of people, so I'm not picking on you. The most expensive processors are an incredible bargain in certain situations. They are WELL worth their cost. I will give an example. Years ago, I worked for a jet engine maker, and we'd use PCs for computational fluid dynamics. They were always running things in the background, and could never finish fast enough. Even then, people were making in excess of $50 per hour who were working on it. With everything else, it was easily over $60 per hour per employee, and probably closer to 50% more than that, but let's use $60 per hour because it's easy to work with, and I want to use a very conservative number and still make the point. Now, a faster computer means less time spent waiting for results. Let's be conservative and say that even if it saved an employee 10 minutes a day, that's $10 a day I save with this new computer. Again, since these beasts ran day and night, it would be much more than $10 a day, but, again, let's be conservative.

At $10 a day, within two months I've saved $600 (assuming weekends). In a year, I've saved $3650 in labor costs. Not bad for an $1100 processor as compared to a $500 processor. If you want to scratch out weekends, then it's still nearly $2600 a year. If you use an even cheaper processor to compare it to, the numbers just get higher.

Again, all my numbers are very low, and in reality, you save WAY more than 10 minutes a day, and the cost per employee is much closer to $100 per hour, and probably more. So, really, in many situations it's going to be 3x the number I gave. Also, it's more enjoyable for employees to have a fast machine than a slow one they wait forever on.
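The back-of-the-envelope math above can be sketched in a few lines. This is only a rough check; the $60/hour rate, 10 minutes saved per day, and the $1100-vs-$500 processor prices are the conservative assumptions stated in the comment, not real data:

```python
# Rough check of the labor-savings argument, using the comment's own
# conservative assumptions.
HOURLY_COST = 60            # $/hour, fully loaded labor cost
MINUTES_SAVED_PER_DAY = 10  # time saved by the faster machine

daily_savings = HOURLY_COST * MINUTES_SAVED_PER_DAY / 60  # $10/day

# Running every day of the year vs. weekdays only (~260 workdays)
yearly_all_days = daily_savings * 365   # $3650
yearly_weekdays = daily_savings * 260   # $2600

cpu_premium = 1100 - 500  # extra cost of the top CPU over a $500 one

print(f"Savings per day:        ${daily_savings:.0f}")
print(f"Savings per year (365): ${yearly_all_days:.0f}")
print(f"Savings per year (260): ${yearly_weekdays:.0f}")
print(f"CPU premium recouped in {cpu_premium / daily_savings:.0f} days")
```

At $10/day the $600 premium pays for itself in about 60 days, which matches the "within two months" figure above.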

Also keep in mind, big companies DO NOT overclock. Forget it. Don't even suggest it; you'll get mocked and laughed at like you're an idiot (I learned this with the 300 MHz Celeron 300A, which of course could easily run at 450 MHz, since Pentium IIs did, but...).

So, expensive processors make a whole lot of sense in certain situations, and not a whole lot of sense in others. It's not as simple as saying they are always a bad deal. Because, they are not.

RE: same old tricks
By SavagePotato on 2/18/2009 9:21:38 PM , Rating: 2
Comparing a business situation to a home situation is an invalid argument.

Businesses pay more for everything; they always have and always will, and they know it. Price out some servers from Sun, Dell, HP, whomever, some time and you will really see overpricing from the perspective of the home user, and even from the perspective of the business user. That doesn't magically make the Intel Extreme Edition a good thing, because it is still targeted with a specific intent, and that marketing intent is an enthusiast with too much money and too little ability.

As a home user I look for whatever will bring me the most for the least hit on my wallet. Two years ago that was the core2 e6600, which I overclocked to 3.4 ghz and use to this day.

If you want to spend money in the business environment the infrastructure is already there for you to do so, you don't need to buy overpriced home processors, you can buy overpriced server or workstation class processors with ecc ram and dual slot motherboards.

RE: same old tricks
By TA152H on 2/19/2009 12:07:19 AM , Rating: 1
Yes, I agree completely: businesses are not allowed to buy EE processors, Intel forbids it. Only someone at home can buy them, and it's always a bad idea.

The problem with FB-DIMMS makes a lot of people allergic to that platform. They are slow, and burn a lot of power, for little benefit in many situations. Choice is good.

Even for home users, they can sometimes be worth it, depending on what you're doing and how much money you have. If I have more money than I know what to do with, and there still are people in this situation, or I do work at home on a computer that I get paid for (a situation that is common), then these processors can still be worth the price.

That's the point, Intel doesn't create so many of these beasts that they expect to sell them in the mainstream. Instead, they position them as a niche product, and if you really need it, you'll pay for it. Intel certainly isn't forcing you to buy them, and they offer a nice selection at much better prices below them. But, if you need the best, you'll pay for it, and like it.

But, I'll agree, for most people, these processors don't make a lot of sense. But, they do for some, and that's all Intel expects to buy them. For me, I won't pay more than $100 for a processor, because I like to relax a little when I'm compiling. But, if you're someone that can benefit from the processor speed, and mentally get annoyed if you have to wait at all, they are well worth the money. They can easily save money. And those unusual situations are who these processors are for.

RE: same old tricks
By SavagePotato on 2/19/2009 10:42:34 AM , Rating: 2
I dislike Nvidia, but if Intel muscles everyone out of the chipset business on their cpus, everyone is royally screwed, even if they don't realize it yet.

The combination of AMD and VIA making SDRAM boards for the P3 was the only thing that saved the consumer from Intel forcing Rambus on the world.

Remember $1000 for 128MB of Rambus RAM? Good times.

Good and perhaps bad?
By sprockkets on 2/18/2009 12:00:38 PM , Rating: 3
The good: They didn't mention Ion.

The bad: Intel isn't hesitant to sue over chipsets.

RE: Good and perhaps bad?
By Drexial on 2/18/2009 12:05:29 PM , Rating: 1
Not sure if you mean good as in they have been talking about it too much, or good that it wasn't affected by this?

If it's the latter, it's because the memory controller isn't on the Atom chips. That's why they are still using the older, not-so-great 945 chipset.

RE: Good and perhaps bad?
By sprockkets on 2/18/2009 1:15:10 PM , Rating: 2
The good in that Intel isn't suing nVidia over Ion.

The bad is, well, Ion isn't out yet, so they probably will not sue until it is.

The ugly: The integrated graphics and memory controller on the Atom is primarily done to save space and power, not to enhance performance (although it should). It won't even have SATA to save power.

Just remember, for those who posted above me: an integrated GPU on the CPU will still SUCK. Don't expect a very powerful one built in, and even if it is powerful, it will fundamentally have too little memory bandwidth to work with. And even if it does have good memory to work with, you were pretty much better off with an add-in card to begin with.

RE: Good and perhaps bad?
By ChuckDriver on 2/18/2009 2:33:39 PM , Rating: 2
The ugly: The integrated graphics and memory controller on the Atom is primarily done to save space and power, not to enhance performance (although it should).

Except that since the release of the Atom, most netbooks have used the 945GSE Express chipset, with a northbridge and a southbridge occupying a lot of real estate compared to the GF9400M, made on older processes to cut cost and amortize the expense of older equipment. I think Intel would have sat on the GN40 longer had Nvidia not announced the Ion platform.

RE: Good and perhaps bad?
By jconan on 2/24/2009 1:22:00 AM , Rating: 2
Yep, that's why competition is good... similar to Microsoft and Apple (not that Apple has a large market share, but it's analogous to Intel and AMD).

RE: Good and perhaps bad?
By teckytech9 on 2/19/2009 1:24:05 AM , Rating: 2
The bad: Intel isn't hesitant to sue over chipsets.

True, but they could resort to anti-competitive behavior and force OEMs to purchase their inferior chipsets bundled with the Atom. This will inevitably delay the Ion launch date.

RE: Good and perhaps bad?
By trisct on 2/23/2009 1:26:15 PM , Rating: 2
NVidia has a license to make chipsets using Intel FSB specifications. Atom uses a bus to memory which is shared with other components (i.e., an FSB), so Intel can't complain about it. It is the switch to serial links (QPI) for memory access by downstream components that Intel thinks NVidia needs a new license for. So Ion is safe no matter what, for now.

Anti-Competitive?
By HrilL on 2/18/2009 12:36:09 PM , Rating: 2
Wouldn't this be considered anti-competitive? If Nvidia stops making Intel chipsets, then Intel would have a monopoly on chipsets for Intel CPUs and could end up facing anti-trust violations.

RE: Anti-Competitive?
By SavagePotato on 2/18/2009 1:34:40 PM , Rating: 2
Someone turn on the EU symbol (sorta like the bat symbol but... yeah).

Maybe Crystal clear will charge in to save the day and tell us how the EU will put a stop to this tomfoolery.

RE: Anti-Competitive?
By Sulphademus on 2/18/2009 2:38:30 PM , Rating: 5

Dunna nunna nunna nunna nunna nunna nunna nunna....

RE: Anti-Competitive?
By PB PM on 2/18/09, Rating: -1
RE: Anti-Competitive?
By Wieland on 2/18/2009 2:10:42 PM , Rating: 2
It would obviously be anti-competitive for Intel to prevent other companies from manufacturing chipsets for its CPUs. That would bring them one step closer to a vertical monopoly. The EU has actually ruled that Microsoft must allow other companies' software (browsers, productivity apps, etc.) to work equally well with its operating systems.

RE: Anti-Competitive?
By omnicronx on 2/18/2009 2:27:53 PM , Rating: 2
Worst comparison ever.. So much so it does not even deserve a rebuttal..

RE: Anti-Competitive?
By Bateluer on 2/18/09, Rating: 0
RE: Anti-Competitive?
By Hexxx on 2/19/2009 7:36:09 AM , Rating: 2
Then they would be like Apple with its iPhone OS, wouldn't they?

By greylica on 2/18/2009 5:53:03 PM , Rating: 1
Why only x86 or x86-64, when we have a chance in the Linux world?
Nvidia should start making processors, and they don't need to be x86-compatible if the performance is sufficient to meet the needs of Linux power users.
Some earlier processors have open standards; Nvidia could use their expertise to make processors.
More competition, and if Nvidia achieves stellar processor performance, the case is solved: goodbye Intel.

By HeelyJoe on 2/18/2009 11:08:55 PM , Rating: 2
Why would you design a processor that 90 percent of the market can't use? You honestly believe that Nvidia would, by creating a processor using a different architecture, outsell Intel?

By SavagePotato on 2/19/2009 10:48:28 AM , Rating: 2
90%+ of the desktop market.

But on the server market linux is a different story.

A different architecture that is screaming fast and Linux-usable could see popularity for server applications.

After all, where did AMD make its move when it was climbing? In the server market. Granted, AMD was usable with Windows servers, so it is not an apples-to-apples comparison.

It's doubtful Nvidia could take over the world per se, but it is not totally far-fetched that they could at least make some money with a different architecture.

By teckytech9 on 2/19/2009 1:36:56 AM , Rating: 2
They already do. Try looking up Nvidia's Tegra platform, built on their ARM-based architecture.

It's known that Nvidia plans to release MIDs (Mobile Internet Devices) with keyboards to compete in the mobile and netbook markets. Sub-$99 devices, if I read correctly, with an emphasis on visual computing, running Android, Linux, and WinCE to name a few.

By greylica on 2/19/2009 6:45:17 AM , Rating: 2
If the balance of performance, price, and energy consumption surpasses the others, the market will change a lot. Linux is not an alien among OSes anymore, and 7% of the world (including servers) is using Linux, including me.

The bright side...
By Aloonatic on 2/18/2009 11:35:56 AM , Rating: 2
I've been trying to find it, and all I can come up with is that at least lawyers are one group of people who will not need to receive a government/taxpayer backed bail out.

RE: The bright side...
By therealnickdanger on 2/18/2009 11:39:17 AM , Rating: 2
Why not? The salt marsh mouse is getting $30 million. Get it while the gettin's good, IMO.

RE: The bright side...
By djc208 on 2/18/2009 12:02:04 PM , Rating: 2
That's because they don't receive it directly; it's sucked off the side like a leech, with all the regulatory issues, lawsuits, contracts, and "oversight" that they'll instigate and propagate.

By xstylus on 2/18/2009 1:47:53 PM , Rating: 4
I find this rather humorous. Apple switches from Intel chipsets to NVidia chipsets, and now Intel is suing NVidia with a case that is grasping at straws at best.

Hey Intel, anti-competitive a bit much...?

Its just a disagreement
By MadAd on 2/18/2009 8:06:01 PM , Rating: 2
JCOAB guys, it's just a disagreement over the terms of a contract; they can't decide, so they are asking a court to decide for them.

That's what courts do: when two sides disagree, they can ask an impartial and trained third party to consider the case and agree to be bound by the outcome.

It's not like criminal charges or anything.

RE: Its just a disagreement
By itzmec on 2/19/2009 6:04:05 PM , Rating: 2
yah, civil court.

So, I buy an Intel Processor
By Chipsets on 2/19/2009 4:29:13 PM , Rating: 2
and read the data sheet. Work out what the myriad pins do and connect them to memory, a graphics card, and a hub to drive PCI ports and the hard disk.

Sound unfeasible? Welcome to part of my CV. This was in the days of the 286.

At that time a processor was a discrete device with the peripherals constructed from discrete logic devices.

The sub-components that were formerly separate units had been integrated in the 8086/8088. This left simple logic to be implemented to enable memory connection, hard disk connection, ISA bus connection(s), and serial port connections.

Look at a Dell 200 motherboard and you will see many 8/14/16/18 etc pin chips. These became integrated into North Bridge and South Bridge chips. Smaller, neater, cheaper.

The point is, Intel is saying that you cannot build a "south bridge" (I know the term isn't quite right for AMD K8/K10 and the look-alike K8/K10 Intel devices, i7/i5) for their processors. Remember, a "southbridge" is glue logic for specific controllers (SATA, graphics, comms, etc.).

This is like Microsoft saying "Thou Shalt Not Use Our APIs".

FFS, patent/copyright system: here is a dollar, take a walk to the corner store and buy a clue. While the I/O on the processor can be legitimately copyrighted (it is an implementation) for the purposes of protecting the IP relating to that chip, there can be no legitimate reason for allowing the restriction of the use of that I/O interface.

RE: So, I buy an Intel Processor
By trisct on 2/23/2009 1:30:58 PM , Rating: 2
This is the right way to look at it. The interface to Intel CPUs should be treated like a Microsoft API, and Intel ought to be forced to make reasonable license terms available just like MS. Trying to squeeze off competition because you assert copyright over the interface to your (market-dominant) product is exactly what got MS in trouble at first. Intel is doing exactly the same thing, and the courts ought to dictate terms to them.

Burning Bridges
By shabodah on 2/18/2009 12:53:18 PM , Rating: 2
Kinda ironic that the two companies that are known for burning bridges are now dissolving the one between them. Heck, nVidia was nuts to even think Intel wouldn't turn on them in a heartbeat in the first place.

Not Good
By Kougar on 2/18/2009 4:45:43 PM , Rating: 2
NVIDIA is not licensed to produce chipsets that are compatible with any Intel processor that has integrated memory controller functionality

This is not Nehalem specific, it would include future versions of Atom as well. It would be a blanket ban on NVIDIA producing further Intel chipsets.

By thebeastie on 2/18/2009 7:26:39 PM , Rating: 2
They sued AMD and the others for years and years, left them far behind, and some are still trying to catch up.
If there was ever a tech company that needs a real hard wallop from an anti-trust case, it's Intel.

It's just plain ridiculous that Intel would allow chipsets for one type of CPU and not others. Why not just ban who can make graphics cards that go into PCI Express slots, on top of that?

Shows the true colors of Intel, which are sometimes hard to see.

Via lost, so will Nvidia
By FXi on 2/18/2009 9:55:34 PM , Rating: 2
Via lost the "use the courts to settle it" method and so will Nvidia.

The only way to deal in this matter is to come to Intel and negotiate a license, as AMD has done for the x86 technology.

Honestly, I'd say the backroom negotiation on this involved making SLI available on all Intel chipsets. At that point I think Nvidia said "no way" and Intel made it clear that their chipset license was going to end with the current generation of chips. Be aware that the i7 technology is going to descend to a range of all other chips, so this is basically telling Nvidia that their ability to make Intel chipsets is over.

And honestly, their chipsets stunk and were overpriced to boot. But they did still instill some competition, which was and is good. I hope they are able to renegotiate a proper license with Intel, but also to build better chipsets in the future, since the memory controller was one of their weak points and otherwise would not be fixed. No doubt Intel is thinking of this too.

By Authentic Wong on 2/19/2009 7:07:36 AM , Rating: 2
nVidia should start designing and creating a new x86 CPU by taking over VIA's Cyrix or IDT's WinChip.

We would love to hear of an x86 Cyrix M4 processor able to match the Core i7 and Phenom II.

We also love to see there is 3 Kingdom in the PC market:
1.Intel CPU and Larrabee GPU
3.Cyrix CPU and nVidia GPU

We last time love Cyrix MII with 3dfx Voodoo Rush PCI to run 3DMark99, that's feeling is great! (nVidia got 3dfx gene)

By JonnyDough on 2/19/2009 5:23:24 PM , Rating: 2
Intel has been in discussions with NVIDIA for more than a year attempting to resolve the matter but unfortunately we were unsuccessful.

There's just something fantastically disturbing about this sentence. Hmm...

Somebody needs...
By StillPimpin on 2/18/09, Rating: -1
RE: Somebody needs...
By Motoman on 2/18/2009 12:24:46 PM , Rating: 5
The reality is, once you have "enough" CPU power, most video games don't get much better with a faster CPU - it's down to the GPU at that point. A machine with 16 i7 cores and a Radeon 7000 isn't going to play anything. On the other hand, a Sempron 3300+ with a Radeon HD4850 will probably do a decent job playing Crysis.

While making such a statement so broadly was stupid, there certainly are plenty of examples where, in gaming at least, it can clearly be demonstrated that the CPU is much less important than the GPU.

RE: Somebody needs...
By Suomynona on 2/18/2009 12:30:47 PM , Rating: 2
Most people don't play graphics-intensive games on their PCs, though. Most computers use integrated graphics, and most users are satisfied with them.

RE: Somebody needs...
By Motoman on 2/18/2009 12:37:05 PM , Rating: 2
...Granted. I'm making specific examples around gaming though, which is (as far as I know) the best example to give where the GPU is of more importance than the CPU. It's the only thing the Nvidia guy could have been thinking of when he made his comment - but instead of restricting it to that topic area, he went dumb and made it a general statement.

...for that matter, the typical user would probably be fine forever on an Athlon XP or PIII even. If all you do is email and surf the web, the primary relevance of the CPU is just that there's one there...same for the GPU. The average consumer gains little to no value from virtually any of the hottest new CPU/GPU technology.

RE: Somebody needs...
By Radnor on 2/18/2009 12:42:50 PM , Rating: 2
If so, everybody could work with a Barton or a Northwood. Hell, look at Atom!!!

That is not an excuse. You don't need a top-end GPU, but a mainstream one makes all the difference, and with GPGPU coming you will see it.

RE: Somebody needs...
By callmeroy on 2/18/2009 1:00:04 PM , Rating: 3
I don't give a rat's arse if anyone else agrees or disagrees, but Motoman (I hope I didn't screw his name up) is right on this issue. As far as gamers are concerned, I can tell you from long experience, both mucking with my own gaming PC builds and being a huge fan of all types of computer games over the many years I've been playing, that the GPU makes a huge difference in games ---- ONCE YOU HAVE A DECENT CPU. In other words, as long as you aren't moronic about it, expecting the world out of an AMD Athlon XP with 512 MB of RAM, once you have say a middle-of-the-road CPU with decent RAM, the GPU begins to deliver FAR FAR more in direct game performance returns than the CPU does. It's not even negotiable; it's night and day.

I've seen my own systems prove that case time and again -- changing nothing out but the GPU to a higher end model.

As for the "most users don't..." comment, Nvidia I'm sure is referring to its main and foundational market segment that really put its name on the map -- GAMERS. It's only after years of tremendous success in the enthusiast/gamer market that Nvidia ever started getting into on-board graphics on run-of-the-mill boxes.

RE: Somebody needs...
By Reclaimer77 on 2/18/2009 4:41:53 PM , Rating: 1
I don't give a rat's arse if anyone else agrees or disagress but Motoman (I hope i didn't screw his name up) is right on this issue -- as far as gamers are concerned, i can tell you from long experience both mucking with my own gaming PC builds and just being a huge fan of all types of computer games over the many years i've been playing the GPU makes a huge difference in games ---- ONCE YOU HAVE A DECENT CPU...

Nobody is disputing this. This is a silly fight over the context of a statement that was poorly worded by the Nvidia spokesman. Motoman's simply stating the obvious in an overbearing tone is what's sparking the debate here.


Sorry for caps, but c'mon ppl, this is silly.

RE: Somebody needs...
By JackQW on 2/18/2009 8:24:09 PM , Rating: 2
It's not even just a gaming perspective; with Vista, 3D interfaces such as Compiz Fusion, the GPGPU movement, accelerated physics, sound, image, ray-tracing, database searching... accelerated frickin' anything and everything...

The CPU is becoming so much less important than the GPU that it is now foreseeable that the entire concept of the 'CPU' will be replaced by the next-generation concept of the 'GPU', the 'Graphics' component becoming the 'Central' component.

Even Intel nods in this direction; take a look at Larrabee (first Google hit) ... it's an x86 GPGPU! It's a 100% CPU-replacement GPU!

RE: Somebody needs...
By mindless1 on 2/18/2009 8:50:13 PM , Rating: 2
Most people don't do Compiz Fusion or anything else you mentioned.

What is important in general purpose hardware is what the majority needs, otherwise Intel & nVidia effectively have so few customers remaining that they'd have to charge quite a lot more for products. That's why systems with IGPs still outsell those with any and all other video cards.

RE: Somebody needs...
By JackQW on 2/19/2009 2:21:04 AM , Rating: 2
Whoops, I don't think I elaborated on sound and image manipulation.

You're right, Joe Average just uses Notepad.

He doesn't use a spell-checking database, process tables of numerical data, brighten his poorly lit photographs, record videos of his pet, or send and receive voicemail, but most of all, he doesn't play any games.

If he does do those things, he certainly doesn't need to do them fast.

All of those things are what one would call 'embarassingly parellel' tasks, tasks a GPU would be well suited for.


The GPGPU movement is just in its infancy, but good ol' serial scalar is still far more than effective enough for Joe.

In fact, why does Joe even get a 'Core i7' or a 'Core 2'? Why doesn't he go lowest bidder and get something suited more to his needs, like a 'Pentium Pro'?

Oh, and I am Joe's arch-nemesis.
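For what it's worth, the "embarassingly parellel" point above is easy to sketch in plain Python: each pixel of a photo can be brightened independently of every other pixel, so the work splits cleanly across workers, and a GPU simply does the same thing with thousands of lanes instead of a handful of processes. The `brighten` function, the chunk size, and the pool size here are made up purely for illustration:

```python
from multiprocessing import Pool

def brighten(pixel, amount=40):
    """Per-pixel work: independent of every other pixel."""
    return min(255, pixel + amount)

def brighten_chunk(chunk):
    # Each chunk needs no communication with the other workers,
    # which is what makes the task "embarrassingly parallel".
    return [brighten(p) for p in chunk]

if __name__ == "__main__":
    image = list(range(256))                       # stand-in for grayscale pixel data
    chunks = [image[i:i + 64] for i in range(0, len(image), 64)]
    with Pool(4) as pool:                          # a GPU would use thousands of lanes
        result = [p for c in pool.map(brighten_chunk, chunks) for p in c]
    print(result[:4], result[-1])                  # prints: [40, 41, 42, 43] 255
```

The same shape covers the other examples in the comment (spell checking, table crunching, video frames): split the data, apply the same small function everywhere, gather the results.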

RE: Somebody needs...
By JackQW on 2/19/2009 2:23:06 AM , Rating: 2
Oh crap, my serial scalar spell checker didn't catch that before I hit submit. I can't wait for this GPGPU thing to kick in. I meant parallel.

RE: Somebody needs...
By Motoman on 2/18/2009 9:13:36 PM , Rating: 2
Motoman simply stating the obvious in an overbearing tone is what's sparking the debate here.

Do, please, elucidate the overbearing tone for us all. Would be highly interested to see what it is you find as "overbearing" such that it incurs your mighty wrath.

...and if I was merely stating the obvious, why in the hell were you arguing with me?

RE: Somebody needs...
By walk2k on 2/18/2009 1:11:14 PM , Rating: 2
In the market Nvidia cares about - gamers - they don't use integrated graphics. They are right about that market: the CPU isn't nearly as important as the GPU.

In most cases, games are CPU-bound only at low resolutions. Once you are talking 19" monitors or larger, they become almost entirely GPU-bound.

Intel - stick to making CPUs and chipsets.
Nvidia - stick to making GPUs.

There, problem solved.

RE: Somebody needs...
By Wieland on 2/18/2009 2:01:41 PM , Rating: 2
You're exactly right, walk2k. Keep in mind that this dispute is specifically about the Core i7 and future products. At least right now the Core i7 is an enthusiast product most valuable for - you guessed it - gaming.

RE: Somebody needs...
By mindless1 on 2/18/2009 8:45:51 PM , Rating: 2
That's not entirely true. If the CPU has enough performance to render a 2D page that the video card can output, all the performance that video card needs is the same it would need to sit idle at the Windows desktop at the same resolution.

In other words, a general CPU can still do the job of a purpose-specific processor if suitable programming is present; it's just not as efficient at it.

RE: Somebody needs...
By Reclaimer77 on 2/18/2009 12:25:04 PM , Rating: 1

I can't believe they are claiming the CPU is "less relevant" than the GPU in a PC. What the hell?

RE: Somebody needs...
By Motoman on 2/18/2009 12:34:15 PM , Rating: 4
...this is why kids all over the world have forever been frustrated by the PCs their parents buy them at Best Buy.

Uneducated consumers don't know anything about PCs...let alone GPUs and their importance for gaming. It basically works like this...

"Hi, I need a new computer"
"Here, this one has the Pentium IV 3.0GHz which makes the internet faster, and it's on sale"

They bring it home, and the kid can't play a single game, because it has Intel integrated graphics. Whether it has a P4 or an i7, Intel integrated graphics prohibit you from gaming, period. And when the kid complains, the parents just ignore him, because they just bought a "brand new computer" and therefore it must be fine.

On the other hand, that 3.0Ghz P4 with a HD4850 would be an infinitely better gaming machine than an i7 PC with Intel integrated graphics.

Not to mention that, for an awful lot of popular games, going from 2 to 4 cores isn't going to help either.

So, from a gaming perspective (only, as my examples have been), it is quite clear that the GPU is more important than the CPU.

RE: Somebody needs...
By Reclaimer77 on 2/18/09, Rating: 0
RE: Somebody needs...
By Motoman on 2/18/09, Rating: 0
RE: Somebody needs...
By StevoLincolnite on 2/18/2009 3:19:00 PM , Rating: 2
So you are denying the fact that the CPU is less important than a GPU in a gaming situation? Let's flash back to an old rig of mine...

Originally I bought it in 1999, equipped with a Pentium 3 667, 128MB of SDRAM and an ATI Rage 3D card. A few years later I upgraded the Rage to a GeForce 4 MX440 and dropped in additional RAM for a total of 768MB, and I was able to play GTA 3 fine, which was the main purpose. Then Half-Life 2, Doom 3 and Halo came around, and they all wanted processors that were 1.4GHz or higher, so I replaced the GeForce 4 MX440 with a GeForce FX 5700 LE and overclocked the core from 250MHz to 570MHz, with the memory going from 200MHz to 260MHz (520MHz effective).

Basically I was playing Half-Life 2, Far Cry and Halo at high resolutions with high image-quality settings without a hiccup.

This was on a processor that was HALF of the minimum requirements, and WITH playable frame rates. (30-40+)

However there are some games which require CPU grunt. For instance, Doom 3 was simply unplayable: 1 fps with all settings on lowest. Heck, I even changed the config file to run it at 320x240 resolution with no change in fps.

That was a game that wanted more from my CPU, which I couldn't provide.

Games which are CPU-hungry are games like Supreme Commander. It's not that CPUs or GPUs universally make a bigger difference in games; games will be more reliant on one piece of hardware or the other, and it's been that way for years and years. However, the majority of games DO see larger improvements with a faster graphics card.

But then you get a game like Crysis which just owns all your hardware equally.

I also couldn't care less about what education you may have earned; there is no proof of it on this "blog", and I only have your word to go on. You haven't seen the other poster's qualifications either, so you shouldn't be bagging him out like he is useless.

RE: Somebody needs...
By Sanity on 2/18/2009 4:28:42 PM , Rating: 2
So you are denying the fact that the CPU is less important than a GPU in a gaming situation? Let's flash back to an old rig of mine...

Actually no, he's not denying that fact. He (Motoman) was arguing that a GPU is more important in a gaming situation, and that a CPU is less important. What are you reading?

RE: Somebody needs...
By SavagePotato on 2/18/2009 8:55:47 PM , Rating: 2
Watching people preen over who knows more about building PCs is amusing.

I think it must have something to do with the fact that it has become a task so simple that any monkey that can put together Legos while looking up guides on Google can not only do it but excel at it.

Ah, to be living in 1995 again, when knowledge such as this was actually worth something and couldn't be Googled in 30 seconds.

RE: Somebody needs...
By Motoman on 2/18/2009 10:10:03 PM , Rating: 2
...I agree, to some degree. I think it would be great if everyone built themselves a PC *once*. It's not honestly that hard...and you'd learn a lot about computers along the way.

The problem is, people think of PCs the same way they think of any other appliance...what happens under the shiny cover is a mystery that they lack the desire, aptitude, and curiosity to investigate.

Hence...we get Dell and Apple. Whee.

RE: Somebody needs...
By Reclaimer77 on 2/18/09, Rating: -1
RE: Somebody needs...
By Motoman on 2/18/09, Rating: 0
RE: Somebody needs...
By Reclaimer77 on 2/18/09, Rating: 0
RE: Somebody needs...
By TSS on 2/19/2009 12:39:48 AM , Rating: 3
You both got rated down for not providing arguments and just attacking the person.

People research their computers alright, but as they have virtually zero knowledge of PCs, they rely on salesmen/friends/family for advice. This is coming from my 4 years of system operator education, including 2 internships, one at a company and one at a high school.

Hell, out of my 22 classmates, nearly all of them came to *me* for hardware advice on their rigs. Why? Because I spent a *year* researching my new rig, which I was going to spend 3000 euros on. The same way, I went to two of them for coding websites, or another guy for designing them.

I believe everybody can replace/put in a video card or CPU. There's only one slot the damn thing fits in! But not everybody knows the *capabilities* of said hardware.

The CPU isn't any more important than your GPU. The single most important consideration for any system, be it at home, at work or even as a server, is the purpose of the machine. If you have a database server, RAM is the most important component. If you build a rig for gaming, the GPU is the most important. If you work with 3D rendering, the CPU is the most important. If you are building an HTPC, noise production is most important. If you are building a notebook, the power-consumption/performance-per-watt ratio is most important.

RE: Somebody needs...
By Icelight on 2/18/2009 12:34:48 PM , Rating: 2
Well, Nvidia is pushing CUDA and GPGPU applications like mad; that's probably what they were getting at, simply in marketing double-speak. GPUs do provide a ton of extra compute power over a single traditional multi-core CPU.

Of course, you still need a CPU to drive it...and the range of applications that GPUs are especially good at is not even close to being wide.

RE: Somebody needs...
By whirabomber on 2/18/2009 1:53:05 PM , Rating: 2
Go to nVidia's website and check out the Tesla C1060, a 240-core GPU-based supercomputer on a card (actually nVidia doesn't call it a supercomputer until you gear out a PC with 3-4 C1060s). With such a device, the CPU just handles the front end for the C1060 and isn't the main processing power. 4 megaflops isn't anything to say "what gpu computing" over.

For an out-of-the-box example, check out the 1U, 960-core(?) GPU-computing-based Tesla S1070. Welcome to the new reality.

RE: Somebody needs...
By omnicronx on 2/18/2009 2:00:17 PM , Rating: 2
Nvidia's statement has nothing to do with GPU-powered processing like the Tesla. Read my post below...

RE: Somebody needs...
By Hexxx on 2/19/2009 7:57:03 AM , Rating: 2
4 megaflops isn't anything to say "what gpu computing" over.

4 Megaflops is something a 386 could provide (well, single precision FLOPS at least).

A P4 can do about 7 Gigaflops. I'm sure the word you're looking for is Teraflops.
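To put rough numbers on the comparison, theoretical peak throughput is just cores × clock × floating-point operations per cycle. The clock speeds and ops-per-cycle figures below are ballpark illustrations chosen to match the magnitudes discussed above, not vendor specifications:

```python
def peak_flops(cores, clock_hz, ops_per_cycle):
    """Theoretical peak = cores x clock x floating-point ops per cycle."""
    return cores * clock_hz * ops_per_cycle

# Ballpark, illustrative figures only:
p4 = peak_flops(1, 3.0e9, 2)             # single-core CPU: a few GFLOPS
gpu = peak_flops(240, 1.3e9, 3)          # 240-core GPU: approaching a teraflop
print(f"CPU ~ {p4 / 1e9:.0f} GFLOPS")    # prints: CPU ~ 6 GFLOPS
print(f"GPU ~ {gpu / 1e12:.2f} TFLOPS")  # prints: GPU ~ 0.94 TFLOPS
```

Even with generous rounding, the single-precision gap between a single-core CPU and a 240-core GPU is two to three orders of magnitude, which is why "teraflops" is the right unit for the card in question.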

RE: Somebody needs...
By omnicronx on 2/18/2009 1:59:19 PM , Rating: 4
You've taken this completely out of context. CPUs can be had for as little as 30 dollars, which gives you the ability to do pretty much all of your daily tasks. Nvidia is right on in saying the GPU market has become more important than the CPU market. To us? Perhaps not, but to bottom-line profit, it surely has. In today's computer world, GPUs are definitely higher-margin products than CPUs, regardless of whether we are talking integrated or add-in card.

RE: Somebody needs...
By mindless1 on 2/18/2009 8:57:50 PM , Rating: 2
To counter, IGPs add less than 30 dollars and give the ability to do pretty much all your daily tasks too.

nVidia is saying the GPU is more important because that's what they sell, not CPUs. Most people don't pay more for an entire video card (including associated costs like the PCB, memory, and other onboard electronics that nVidia doesn't profit from as a finished product) with a larger core than they do for their CPU. We could consider gamers, but since IGPs are more popular than gaming cards, that would be a minority. Granted, that minority is where the money is if they target the customers well, but then there's that pesky issue of competitive pricing.

RE: Somebody needs...
By Charwak on 2/20/2009 4:54:19 AM , Rating: 2
I do not really think that NVIDIA is saying "the GPU is more important than the CPU."
-- Almost all well-known computer architecture specialists believe that the future of computing is not multi-core CPUs, but essentially a hybrid architecture that uses the GPU's massive processing power (agreed, it's limited to floating-point calculations, but that's mostly what we do these days: stuff like video encoding and HD content) in symphony with the CPU's prowess. Intel realized this and is at a critical disadvantage because it has no such GPU product until Larrabee.

-- With NVIDIA's CUDA technology, experts around the world are able to do some really high-tech stuff with a ~200x speedup by using the GPU to do most of the computations. And when I say 200x, believe me, I am not exaggerating.

-- With the market moving more and more towards the GPU, Intel certainly needs to put the brakes on this paradigm shift.

-- The lawsuit is just a means of doing that.

Copyright 2016 DailyTech LLC.