
  (Source: Intel's Insides)
The feud between two of the hardware industry's biggest players continues to be ugly

NVIDIA and Intel, two of the electronics industry's veteran powers, have never been particularly warm toward one another, and recently the pair can't seem to stop stepping on each other's toes.  The trouble started last year when NVIDIA released its Ion integrated graphics platform, which NVIDIA CEO Jen-Hsun Huang likened to opening a "can of whoop-ass".  Intel was not happy about NVIDIA trying to shove it out of the integrated graphics market with the HD-ready Ion.

Nor was Intel's announcement of an upcoming discrete GPU greeted with open arms by NVIDIA.  NVIDIA, already struggling financially in the chipset business, was then hit with a suit from Intel alleging that NVIDIA's license agreements do not permit it to build chipsets for Nehalem processors, or for other Intel processors with an integrated memory controller.  NVIDIA fired back with a countersuit demanding termination of the licensing agreements that supply the technology Intel uses to power its integrated graphics, threatening Intel's integrated graphics offerings.

Given the pair's history, it's not very surprising that NVIDIA would choose to take a jab at Intel over its recent antitrust fine from the EU and the new charges leveled against it in New York.  NVIDIA has launched a new site called Intel's Insides, which it hosts and uses as an artistic podium to accuse Intel of illegal activity.

The site is especially critical of Intel CEO Paul Otellini.  A recent post features a cartoon with a cross-eyed Otellini denying using "bribery, coercion and kickback relations" to try to corner the market.  The site carries a rather humorous disclaimer informing readers that it "is not provided, sponsored or endorsed by Intel Corporation."

Intel has yet to respond (and likely won't).

Comments

Nvidia needs to change
By cactusdog on 11/6/2009 4:05:52 AM , Rating: 5
Nvidia is one of the worst players for locking out their tech or charging a premium for it.

They need a management overhaul and some decent PR guys to get back on track.

Everybody hates Nvidia so it might be time for them to look within and change the way they operate and make some decent products at fair prices.

RE: Nvidia needs to change
By Dingmatt on 11/6/09, Rating: -1
RE: Nvidia needs to change
By PeaJay on 11/6/2009 7:56:42 AM , Rating: 2
Yeah, and I'll correct YOUR little GRAMMAR and spelling error.

YOU'RE welcome!

RE: Nvidia needs to change
By Dingmatt on 11/9/2009 5:02:27 AM , Rating: 2
Touché, lol

RE: Nvidia needs to change
By Dingmatt on 11/9/2009 5:04:09 AM , Rating: 3
Though I think you've missed the point i was making.

RE: Nvidia needs to change
By kondor999 on 11/10/2009 10:24:44 AM , Rating: 2
Dude, you chastised someone for making grammatical errors, and, while doing so, managed to make two of your own.

Either you're incredibly subtle (to the point of being obtuse), or just another idiot who got pwned.

Guess which I'm voting for...

RE: Nvidia needs to change
By phxfreddy on 11/10/2009 9:12:44 PM , Rating: 2
You meant to say "Douche", right? Because it seems that the D.T. (DailyTech, aka Deep Throat) commenters, almost to a man, are total douche bags.

You're all a bunch of wannabe engineers who all appear to be too damned dumb to ever get close to that. So much so that I felt compelled to make this one last comment about how idiotic most of you appear.

RE: Nvidia needs to change
By FaaR on 11/6/2009 9:16:59 AM , Rating: 5
You're being needlessly trite. Many people "hate" (needlessly strong word really) Nvidia for the company's douchebaggy business practices lately, including the disabling of PhysX if a non-Nvidia primary display adapter is in the system.

Customers have paid for their Nvidia card, and thus for PhysX. They have a quite reasonable expectation to be able to use the tech, regardless of the brand of the primary adapter.

I must say even as I'm utterly fascinated and excited about Fermi (I think it's gonna rock the house to its foundations), I am not going to buy any more Nvidia products until they stop behaving so unethically as they've been doing lately.

RE: Nvidia needs to change
By crystal clear on 11/6/2009 11:25:32 AM , Rating: 5
I must say even as I'm utterly fascinated and excited about Fermi (I think it's gonna rock the house to its foundations),

You are in for a dirty surprise, read this:

Nvidia has confirmed that the Fermi card held up by Jen-Hsun on stage wasn't real, but was adamant that the demos were.

RE: Nvidia needs to change
By Jabroney701020 on 11/6/09, Rating: 0
RE: Nvidia needs to change
By callmeroy on 11/9/2009 10:03:05 AM , Rating: 2
Everyone hates Nvidia? Meaning in the business world or the consumer world -- which one? In either case I missed the memo, I guess, because I have been a loyal Nvidia customer for, hell, about a decade by now.

It all started a long time ago when I actually tried ATI cards and was quickly disenchanted with the price/performance (if you dig up performance numbers from around the mid-to-late '90s, Nvidia was stomping on ATI performance-wise) --- and then their drivers annoyed me as well.

So I said screw it and bought Nvidia for the first time, like I said, about 10 or so years ago (I don't remember exactly when, but I know it was at least 10 years back, very easily more). Since then I've never regretted Nvidia; the drivers never give me issues -- and I always feel like I got value for my money with whatever card I buy.

Thus I stick with Nvidia.

RE: Nvidia needs to change
By erple2 on 11/9/2009 3:16:17 PM , Rating: 2
mid-late 90's Nvidia was stomping on ATI performance wise

And in the same time frame, 3dfx was stomping everyone else...

about 10 or so years ago
I've never regretted Nvidia, drivers never give me issues

Then you're pretty much the only person on the planet that didn't have issues with NVidia's drivers back in 1999 and before. I remember the "driver dance" I had to do with the TNT card through the Geforce FX cards. Which was why I switched to ATI (that, and the 9700 Pro was daddy-mack awesome).

However, things have gotten a bit muddier since then. In this day and age, the crown seems to pass between the two for any number of "metrics" you choose to pick: performance per dollar, outright performance, you name it. ATI and NVidia have more or less traded blows on that front for the past couple of years. NVidia had issues with value when the GTX 280 came out; ATI had issues with value when the 2900 came out.

The point is that 10 years ago is about 5 generations ago in the computer industry. Anything done back then doesn't apply any more. ATI/AMD is much better than it used to be. NVidia is also much better than it used to be.

Hmm
By ApfDaMan on 11/6/2009 6:24:29 AM , Rating: 5
I wonder what processors Nvidia employees use?

RE: Hmm
By jdietz on 11/6/2009 6:43:48 AM , Rating: 2
Who cares -- irrelevant.
Either nVidia is trying to get away with not paying for a license, or Intel refuses to license, or they are asking significantly more than R&C charges for a license.

RE: Hmm
By Chocobollz on 11/8/2009 3:20:26 AM , Rating: 2
Who cares - irrelevant.

LOL, you obviously didn't catch the irony; the answer would be either an "Intel" processor or an "AMD" processor ;-)

They should let their products do the talking
By grebe925 on 11/6/2009 7:57:20 AM , Rating: 3
and not some crappy cartoons. I lost a laptop because incompetent Nvidia engineers thought they were designing a cooking range instead of a graphics chip, and fried it. OTOH, every Intel-based product I've bought over the decades has worked flawlessly.

RE: They should let their products do the talking
By FaaR on 11/6/2009 9:10:16 AM , Rating: 1
I suppose you never owned a "F00Fed" Pentium, or a 1133MHz Pentium III. The i850 chipset had a partially busted PCI bridge, the Northwood P4 ran slower with hyperthreading enabled than with it off, and so on.

Every company in the tech industry screws up every now and then, it's pretty much unavoidable considering the complexity of the products involved.

The broken mobile GPUs you mention were because of flaws in the *packaging*; as you may know, Nvidia has no part in the actual manufacturing of their designs; they outsource that to other companies. They did screw up though by not immediately fessing up to the problem, but instead tried to downplay and sweep it under the rug (a natural reaction for any big company; Intel has done the same in the past.)

By erple2 on 11/9/2009 3:57:02 PM , Rating: 2
because of flaws in the *packaging*

Yes and no. The flaw was in the packaging, but I seem to recall that the packaging was created based on the performance envelope prescribed by NVidia Engineers. If anything, it was a breakdown in communication between NVidia and the vendors.

Somebody has to do it
By Sharpiefiend on 11/6/2009 10:19:02 AM , Rating: 4
Here's the G4Saurus Defectus video, much more clever IMO.

RE: Somebody has to do it
By tfk11 on 11/6/2009 2:11:48 PM , Rating: 2
"Will render for food" - lol

By Leper Messiah on 11/6/2009 12:06:47 PM , Rating: 5
I mean, nVidia bashing Intel for anti-competitive practices when we've got bumpgate, batmangate, fermigate... rebrandgate. We could go on and on about nVidia screwing over consumers lately.

By StevoLincolnite on 11/5/2009 11:34:12 PM , Rating: 2
Just looked at all the comic strips, great stuff! Gave me a good laugh if nothing else. :)

I just hope nVidia doesn't poke the sleeping lion too much though. :P

I am wondering
By oliveneo on 11/9/2009 1:32:15 AM , Rating: 2
Very strange news. I wonder what processors Nvidia employees use?

By EasyC on 11/9/2009 12:41:13 PM , Rating: 2
Is there really a pissing contest going on here about the best IGPs for gaming??? I mean, seriously. If you're using an IGP for gaming, you probably shouldn't be gaming.

If you're going to game (and are actually serious about it), don't use an IGP. Period. Get a dedicated graphics card. As far as those go, nVidia needs to concentrate more on AMD destroying them right now than silly comics attacking Intel.

By johnstiff on 11/12/2009 2:31:05 PM , Rating: 2
Nvidia harasses Intel, now that's a funny thing. To be honest, I don't like nvidia that much; IMO they were better before!

obligatory dailytech proofread plea
By someguy123 on 11/6/09, Rating: -1
RE: obligatory dailytech proofread plea
By sprockkets on 11/6/2009 12:21:42 AM , Rating: 3
Yeah, who knows. The blog section contains actual blog stuff, like

Then the hardware section contains, that's right "Jason Mick (blog)"

The only good stuff left on this site is the articles by Anand Lal Shimpi, and those occur maybe 4 times a year now.

RE: obligatory dailytech proofread plea
By asliarun on 11/6/2009 3:00:55 AM , Rating: 3
Come on, you have a point, but you're going overboard with it. Don't define "good stuff" only in terms of grammar and spelling; please consider content as well. I think DailyTech has done decently, if not great, in this regard. The articles do lack depth, but I get the feeling that DailyTech's premise is to straddle the space between AnandTech and The Inquirer. In fact, I think the grammatical errors are mostly because they're trying to get content out the door as quickly as possible, without much effort put into editing.

By StevoLincolnite on 11/6/2009 9:45:55 AM , Rating: 2
Personally the spelling errors don't bother me.

What bothers me is when an article has gaping "holes" in its "stories" -- or when the information is severely incorrect.

However, DailyTech at least corrects the articles when it's pointed out by the posters (most of the time).

All in all, DailyTech is my one-stop shop for getting all the latest information on the happenings in the tech industry without having to look under every single minute stone on the Internet.

Yes, some articles are not worth the time and some don't interest me, but you do get the odd good one every now and then.

By Griswold on 11/6/2009 4:13:19 AM , Rating: 1
It's DailyMick - didn't you know?

fun site! but not new to Intel
By kknd1967 on 11/5/09, Rating: -1
RE: fun site! but not new to Intel
By Makaveli on 11/6/2009 12:25:30 AM , Rating: 3
what the hell does this mean?

"(marketing FAKE quadcore on Intel's behalf)"

RE: fun site! but not new to Intel
By Meaker10 on 11/6/2009 3:39:19 AM , Rating: 4
AMD claimed that the Core 2 Quads were "fake" quads because they were just two dual-core dies on one package... A pretty weak argument.

RE: fun site! but not new to Intel
By TSS on 11/6/2009 9:35:35 AM , Rating: 2
Didn't they later admit that they should've done the same with the X2s instead of going dual-socket or native quad?

I thought they said that sometime after Phenom was out and failing.

The argument stemmed from the cores of the quads having to communicate via the FSB (and share it with memory, though I'm not sure on that), while AMD's cores were directly connected and still had the advantage of an integrated memory controller. So the argument did have merit.

Currently, though, that whole discussion has become moot, since Intel ditched the FSB in the i7s for QuickPath Interconnect. It'll be interesting to see where the core wars take us next though.

RE: fun site! but not new to Intel
By just4U on 11/8/2009 12:02:16 AM , Rating: 2
It wasn't so much that they should have done the same thing, but rather that they did too much all at once, and that cost them heavily in the overall price of the CPUs and added to the delays in getting them off the ground.

It wasn't a total washout, but I am sure that if AMD were able to do it all over again, they'd probably have taken the approach Intel did. The key thing to remember is that Intel "HAD" to do it that way, since they hadn't yet developed a true quad -- which I think is all part of their "tick-tock" way of doing things (a little bit at a time, not all at once).

RE: fun site! but not new to Intel
By Motoman on 11/8/2009 10:18:01 PM , Rating: 2
Well, it's an argument that didn't matter to consumers. It had its technical merits, but the vast majority of people weren't sure why they should care.

By GaryJohnson on 11/6/2009 12:45:30 AM , Rating: 2
I would have said TI rather than Qualcomm. Why Qualcomm?

nVidia - do more than drawing cartoons!
By Cookoy on 11/6/09, Rating: -1
By Griswold on 11/6/2009 4:16:28 AM , Rating: 5
To be fair, Intel isn't catching up at all -- not until they release something that is remotely comparable. And it doesn't look like the first iteration of Larrabee will be anything to write home about.

By tviceman on 11/6/2009 9:08:30 AM , Rating: 5
What has intel done to catch up?


RE: nVidia - do more than drawing cartoons!
By FaaR on 11/6/2009 9:20:40 AM , Rating: 4
Catching up fast, surely you jest?

Intel's integrated graphics is as shite as it's always been.

Sure, it's faster than it used to be, relatively speaking. Now you can actually run high-end games from four or five years ago on it! Games which would crawl on previous-generation products.

However, today's games still perform as badly as ever on Intel's POS integrated graphics.

RE: nVidia - do more than drawing cartoons!
By Thorburn on 11/6/09, Rating: 0
RE: nVidia - do more than drawing cartoons!
By Motoman on 11/6/2009 2:07:52 PM , Rating: 3
Crank your video settings to max on Trackmania Nations and let us know how that turns out. I'd be surprised if it ran at half-way settings.

Whether or not it can run Aero is important to, I guess, everybody...but anyone who does any gaming at all is going to be making serious compromises running an Intel graphics solution.

RE: nVidia - do more than drawing cartoons!
By PrinceGaz on 11/6/2009 5:58:17 PM , Rating: 2
You don't have to crank your video settings to max to enjoy a game. The whole point of being able to choose the video settings is to allow people with older/slower machines to play the game, usually with only a slightly inferior visual appearance.

I know a lot of people here probably consider "going into the graphics settings and cranking everything up to max" almost as a mandatory part of the installation process for a game, but most ordinary people aren't like that and get very satisfactory results by using the default settings even with integrated-graphics from Intel and others in most games.

I've got nothing against people who want to buy high-end discrete cards so they can use max graphics settings (my ageing GeForce 8800GTS still packs quite a punch and lets me use quite high settings still, certainly higher than any integrated solution) but the majority of people enjoy their games just as much with far more modest hardware.

By Alexstarfire on 11/8/2009 3:15:46 AM , Rating: 2
I'm sorry, but when an IGP can't even run CS at max settings without lagging in the smoke..... IDK what you expect. You could toss in a FX5200 and have it run CS at max settings. Not CS:S, but the original CS. I don't expect to be able to turn everything up to max, but I also shouldn't have to turn everything to the bare minimum just to get it to run smoothly. Granted, this isn't the top of the line IGP from Intel, but if you need top of the line just to run above bare minimum then that's not saying very much.

RE: nVidia - do more than drawing cartoons!
By hyvonen on 11/6/2009 4:58:40 PM , Rating: 5
Why would you want to run "high-end games" on an IGP?? You have completely missed the point of integrated graphics. IGPs are there for the masses who don't NEED gaming performance -- if you had to pay extra for power you don't need, it would be a waste.

If you want to play the games, there are plenty of good discrete graphics options. So quit whining about Intel's 'shite' IGPs. They are doing the job they were designed to do, and doing it well.

By heffeque on 11/7/2009 3:12:39 PM , Rating: 2
They also provide processing power for GPU-accelerated programs. Not many do so right now, but with Windows 7's DirectCompute and Snow Leopard's OpenCL... things will change :-)


Copyright 2016 DailyTech LLC.