


Rumors also say NVIDIA will report another quarterly loss

NVIDIA is certainly on the skids financially thanks to several different factors. The company is traditionally one of the most profitable in the GPU and chipset business and has commonly turned in significantly improved profits each quarter.

That all changed in Q2 of fiscal 2009, when it announced a loss attributed in part to a massive one-time charge relating to higher-than-normal failure rates of certain notebook GPUs it sold to computer makers like HP and Dell.

In August, rumors started to circulate that NVIDIA would be leaving the chipset business. NVIDIA strongly denied these rumors and said that the reports were false. NVIDIA pointed to the fact that it held 60% of the chipset market for AMD platforms in Q2 2008, that SLI was the preferred multi-GPU platform, and that its 790i SLI chipset was preferred by editors worldwide as reasons the rumors were false.

Later in August, the announcement was made that NVIDIA would be enabling SLI on Intel's new X58 chipset without requiring the use of its nForce 200 chip. NVIDIA had previously not allowed anyone to enable SLI without including this chip in the design. This move has been seen by some as an indication that NVIDIA is softening its stance on requiring its chipsets -- a possible setup for an exit from the chipset business without affecting its SLI market.

NVIDIA has been unable to shake the rumors that it is leaving the chipset business. CNET News reports that the rumor has surfaced again, this time revived by a Pacific Crest analyst who said, "our checks confirm" NVIDIA will be leaving the chipset business next year.

Further speculation has NVIDIA pre-announcing another loss for its Q3, which ends in October. A report of additional losses would be no big surprise, with the computer industry as a whole seeing significant revenue reductions due to the weak economy.

Alongside the rumor that NVIDIA will leave the chipset business, another rumor is propagating: that NVIDIA will provide graphics chips for the MacBook systems expected to be announced on October 14. Reports claim that NVIDIA is showing internal prototypes of Mac systems running its GPUs. NVIDIA already provides the graphics chip in MacBook Pro systems, so it would not be a big shock to find out this rumor is correct.

Adding more fuel to the rumor that NVIDIA will see continued reductions in profits is the prediction from Pacific Crest analysts that NVIDIA could lose market share in the notebook segment to the integrated graphics of Intel's Montevina platform.

CNET News reports that one of the signs pointed to as evidence that NVIDIA will be providing GPUs for Apple is a simple graphic on the NVIDIA site that some see as a possible MacBook design. It's easy enough to imagine a foreign maker of iPod accessories leaking Apple designs by unveiling cases too early, but it would be hard to see NVIDIA making that sort of mistake.

Odds are the image is nothing more than a stock graphic the NVIDIA web designer placed on the site. An even more likely scenario has Apple updating the NVIDIA GPUs used in its MacBook Pro notebooks.



Comments



Interesting
By FITCamaro on 10/7/2008 12:37:52 PM , Rating: 1
Wonder why Apple would wanna go to NVIDIA's chips, considering ATI's use less power.




RE: Interesting
By silversound on 10/7/2008 12:50:56 PM , Rating: 2
I don't think Apple will even consider an NVIDIA chipset; it's just a rumor. They're probably still using an Intel chipset with Core i7. ATI graphics cards run much faster than NVIDIA cards in Macs.


RE: Interesting
By Mitch101 on 10/7/2008 1:15:18 PM , Rating: 2
Apple would need to do it just for the HD video abilities of the GPU, be it decoding or encoding, or for things like the recent talk of GPU-accelerated Photoshop.

I suspect Apple might be looking for a chipset solution as opposed to an add-on graphics solution, and ATI doesn't have a chipset for Intel CPUs.

An ATI solution would have to be an add-on graphics board, which would carry a higher power requirement. So maybe on a higher-end model?

This is evolutionary, but expect Apple fans to act as if life without it would be meaningless.


RE: Interesting
By mindless1 on 10/7/2008 6:24:10 PM , Rating: 2
For one, Apple isn't paying their customers' electric bill. Total integration cost for desired features can still be lower even if the larger heatsink on top costs 25 cents more.


The only chipset I ever had trouble with...
By Motoman on 10/7/2008 10:44:21 PM , Rating: 1
...was the nForce3. I've built computers for about 13 years now...a few hundred. Not exactly a huge system integrator...but I've been around the block. I *never* had any issues with any Via chipsets, starting at the K6-2 platform. *Never* had any issues with SiS chipsets...or AMD chipsets...ULi chipsets...ServerWorks, Broadcom, or even Intel. I could have said the same for Nvidia - except the nForce3 chipset was HORRIBLE. I'm pretty sure it was a 100% failure rate. Each and every system I sent out with an nForce3 mobo came back...and virtually all of them got replaced with an ECS 755-A2 (SiS) mobo, which proved to be bulletproof.

Ugh. Every once in a while I still see an nForce3 model on the market. And every time I get a little twitch and have to resist the urge to kill somebody.

That being said, I'd be sad to see Nvidia leave the chipset industry. Their other chipsets have been fine, and AMD/ATI needs the competition for the AMD market. If Nvidia leaves, there's no choice other than AMD/ATI for an AMD-based system anymore. Less choice is bad.




RE: The only chipset I ever had trouble with...
By Reclaimer77 on 10/8/2008 7:00:22 PM , Rating: 2
You have built hundreds of computers and have never had a problem with that horrible VIA chipset ??

hmmm.. oookay. Riiight.


RE: The only chipset I ever had trouble with...
By Motoman on 10/9/2008 5:19:03 PM , Rating: 2
Nope. Never had issues with Via chipsets, which led me to believe that reports of their instability were greatly exaggerated.

On the other hand, I had horrible results with Creative sound cards often enough to come to the conclusion that reports of their awesomeness were greatly exaggerated.


By Cypherdude1 on 10/10/2008 3:25:45 PM , Rating: 2
You've never had any problems with nVidia's 750, 780, and 790 chipsets? I've been reading threads, some of them with over 1700 replies such as on eVGA's forum:
http://www.evga.com/forums/tm.asp?m=253891&mpage=5...

which rant about the nVidia 700 series "artifact" and crashing problem. nVidia's 700 series also have other stability problems which make me shy away from their chipset and go for Intel's X48 or their next memory controller-less chipset. While I am eagerly waiting for Intel's next gen CPU with built-in memory controller, Asus's X48-based P5E64 WS Evolution with 4 PCIe x16 (2 run @PCIe x4) slots is a very nice mobo:
http://www.asus.com/products.aspx?l1=3&l2=11&l3=64...

BTW, GigaByte will be making a mobo with 4 x16 slots for Intel's next-gen CPU with built-in RAM controller.

quote:
I had horrible results with Creative sound cards often enough...

You've had problems with the Creative's X-Fi series? I'm sorry to hear that. I always thought the X-Fi series were the best, especially their top-of-the-line $300 card with external box.

I have a Hercules GTXP with external box which I purchased in 2002, and it works pretty well. The drivers are very stable. However, while the S/N ratio is pretty good, about 85 dB, I suspect the frequency response is not a full 20Hz-20KHz. It's probably about 40Hz-19KHz. Also, the 10-band equalizer, which is built into the sound chip, only works on the first (front) channel. I know from other owners that Creative's X-Fi series 10-band equalizer works on all channels.

Because I play music and DVD's (ported via s-video to TV) using my computer, these specs are important to me. I do not play games anymore. PC-makers don't seem to think a hardware 10-band equalizer is important, but it is. Using a software-based 10-band equalizer is not an option because of the lag and higher CPU usage.


I thought this was a given now?
By bludragon on 10/7/2008 2:08:45 PM , Rating: 5
nVidia have not (AFAIK) got a license for Core i7 chipsets, so they were forced to enable SLI on the Intel ones.

AMD now owns the Athlon chipset market, of which there is no high-end segment left (so no real demand for SLI).

nVidia recently announced a set of redundancies to concentrate on their core products...




dailytech discovers time travel
By jjaomni on 10/13/2008 6:06:09 PM , Rating: 2
quote:
That all changed in Q2 of fiscal 2009 when it announced a loss attributed in part to a massive one time charge relating to higher than normal failure rates of certain notebook GPUs it sold computer makers like HP and Dell.


Ummm, last time I checked it was still 2008!! Did DailyTech find a TARDIS?




Production rights?
By Mr Perfect on 10/7/2008 1:53:17 PM , Rating: 1
Has Intel given NVIDIA the rights to produce Core i7 chipsets yet, or are they still holding out?

If NVIDIA won't be getting the license for making i7 chipsets, then maybe that's what this rumor is referring to.




Quelle surprise....
By on 10/7/08, Rating: -1
RE: Quelle surprise....
By theapparition on 10/7/2008 12:57:39 PM , Rating: 5
I know I'll take flak for this, but I've had issues with every single chipset I've ever owned.......except Intel. They are the gold standard when it comes to chipsets in my mind.

BTW,
Any reason for that screen name? Jabbing at someone are we?


RE: Quelle surprise....
By FITCamaro on 10/7/2008 1:38:03 PM , Rating: 1
He's just an idiot.


RE: Quelle surprise....
By therealnickdanger on 10/7/2008 1:45:47 PM , Rating: 5
A fat idiot, apparently.


RE: Quelle surprise....
By Pirks on 10/7/2008 2:20:56 PM , Rating: 4
No, a f1t one


RE: Quelle surprise....
By FITCamaro on 10/7/2008 2:46:42 PM , Rating: 2
For the record, the FIT in my name actually stands for something. I picked this name in college and have kept it.


RE: Quelle surprise....
By Pirks on 10/7/2008 2:51:29 PM , Rating: 3
I said f1t, not fit.


RE: Quelle surprise....
By tastyratz on 10/7/2008 4:06:01 PM , Rating: 2
I know you are but what am I?

But back on what was said, I have to agree. I have had good luck with Intel chipsets since I made the switch from AMD to Intel when the C2D came out. I noticed they just didn't seem to be as fussy or quirky as what I have used in the past.

I do have to say that I never had issues with VIA chipsets, however. I like VIA as a company and miss their competition. They do still design and market products that aren't in the mainstream (such as mobo/CPU combos for the HTPC and carPC market).
Until the Atom setups came along they were the top choice.


RE: Quelle surprise....
By FITCamaro on 10/7/2008 4:27:19 PM , Rating: 2
Oh I know. Wasn't saying you were. Just pointing it out.


RE: Quelle surprise....
By theapparition on 10/7/2008 7:03:14 PM , Rating: 2
Would that be Fashion Institute of Technology???

No wonder they changed their name to Florida Tech.

Ah, the good 'ol days in Browning hall and killing brain cells at the Rat.


RE: Quelle surprise....
By SLEEPER5555 on 10/8/2008 1:25:39 AM , Rating: 3
Fruit In Training?

Is that what all the college guys called you?


RE: Quelle surprise....
By clovell on 10/7/2008 2:04:13 PM , Rating: 2
Eh, I haven't been through too many, but NVIDIA's chipsets are the only ones that have given me trouble. Not trying to condone the OP (jerk), but my personal experience isn't very supportive of NVIDIA.


RE: Quelle surprise....
By Clauzii on 10/7/2008 2:28:33 PM , Rating: 2
Still have an nForce2 machine running. The sound is shite, and I can't install the original drivers, but it's rock-solid with the AsRock ones. I also have an old VIA-based machine (KT266A) - rock-solid too. The Intel chipsets seem good also. I remember when I had a BX chipset for my PIII - it clocked to FSB133 with a PIII 500E (@666MHz) - it was rock-solid too. Right now I'll await the new types, and see what I'll get for my next rig.


RE: Quelle surprise....
By blppt on 10/7/2008 8:44:38 PM , Rating: 1
Agreed on the nForce2---IMHO the best, most stable AMD chipset ever made. Parents still have an XP 2400+ XPC with an nForce2 chipset, still chugging along.

Via has been pretty much nothing but a nightmare for me, even the Intel based ones like the horrid Apollo Pro 133 mobo I had...the Abit KT7A (KT133A) I had was good, though, once they fixed that stupid SBLive data corruption problem.

The SiS 735 ECS K7S5A I had was the only non nforce2 AMD mobo I ever had that could be considered rock solid.


RE: Quelle surprise....
By Targon on 10/8/2008 8:46:40 AM , Rating: 2
The problems that many people have come from adding tons and tons of features, which means there is a greater chance of one component or another breaking. Then you have the quality of the motherboard and the BIOS on the board as a serious issue.

People who buy cheap motherboards with a given chipset will generally have more problems than those who buy the higher end/higher quality boards. It has been years, but I remember problems with FIC brand motherboards with VIA chipsets on them back in the days of Windows 98.

You want pain? Try not being able to install the OS properly on a clean install without installing the chipset drivers in safe mode halfway through the installation. On some of those VIA chipsets, Windows would not talk to the chipset properly to assign IRQs with the default drivers. It was the reason I initially started to avoid VIA, but as time went on, I encountered more and more problems with VIA chipsets.

For NVIDIA, I used to like their chipsets until they decided to put the damned firewall in their chipsets. The implementation was so bad that the firewall would kick in randomly, causing the Ethernet to stop working unless you played games with the drivers and software. These days, I just have not seen any real advantages over the ATI/AMD-based boards, since the vast majority of people will not be running an SLI setup anyway. The integrated graphics (for systems that do not need a dedicated video card) on NVIDIA-based motherboards also are not great, though still a lot better than Intel's.

NVIDIA had a huge edge back in their nForce2 days with Soundstorm. The audio quality was high enough that, for the first time, many people did not bother with a sound card. When they dropped Soundstorm, it made it less attractive to go with NVIDIA for chipsets. It was a mistake to assume that adding a firewall to the chipset would offset what they dropped.


RE: Quelle surprise....
By deeznuts on 10/7/2008 4:35:49 PM , Rating: 2
I remember the AMD 760 served me well. That's going back a few years, but when AMD finally decided to make a chipset, it was ok. Not sure why they stopped with the consumer chipsets.


RE: Quelle surprise....
By Oregonian2 on 10/7/2008 1:09:44 PM , Rating: 2
Or at least design some... seeing as how they're not making chips themselves anymore (they're made by a foundry).


RE: Quelle surprise....
By fuser197 on 10/7/2008 2:37:56 PM , Rating: 2
As long as we're all on the anecdotal evidence wagon: I'm actually having pretty good luck with the NVIDIA 780i SLI chipset; it's way faster than the Intel P965 I was on, using the same hardware.


RE: Quelle surprise....
By on 10/7/08, Rating: -1
RE: Quelle surprise....
By murphyslabrat on 10/7/2008 3:26:18 PM , Rating: 2
quote:
You see, anyone that claims one chipset is WAY faster than another is not to be believed. What is way faster 5%? 6%?

Dude, chill. How do you know they aren't referring to overclocking success? and he even pre-disclaimed it as his own non-objective feeling (anecdotal evidence).


RE: Quelle surprise....
By mindless1 on 10/7/2008 6:20:43 PM , Rating: 2
There have been plenty of cases of one chipset being more than 10% faster than another unless one only uses very isolated synthetic tests unrepresentative of real world use. Via 693 vs Intel BX memory bandwidth? Via KT(anything) vs anyone else's PCI performance? AMD's on-die CPU memory controller versus anything else when it was first introduced? Intel's ported USB2? nVidia's integrated video until ATI began to catch up?

Historically Intel has produced some of the best chipsets but plenty of people run basic vanilla systems with other chipsets and never notice the difference. Most people couldn't even tell you whose chipset was in their system, probably not even what a chipset is because there wasn't any difference significant enough for it to be an issue in their use.


RE: Quelle surprise....
By rudolphna on 10/7/2008 3:12:01 PM , Rating: 2
As for VIA... thank god, no more VIA; their chipsets suck. SiS I never had problems with, though. Actually, the SiS 735 board I had was absolutely the most stable platform I have EVER had. NVIDIA chipsets are just too power hungry. AMD chipsets are pretty good, but have a bit more work to do.


RE: Quelle surprise....
By Screwballl on 10/7/2008 4:22:50 PM , Rating: 2
The nforce2 was the last stable chipset nvidia put out... I have had nothing but problems with them ever since.

Yet never a problem with Intel, VIA, SIS or AMD chipsets (at least not due to a hardware/company issue).


RE: Quelle surprise....
By BladeVenom on 10/7/2008 4:51:50 PM , Rating: 2
Yes, the nForce2 chipset was great; one of my favorites.


RE: Quelle surprise....
By Meinolf on 10/7/2008 4:34:40 PM , Rating: 1
I never had an issue; might be user error.


Back to the future?
By wannabemedontu on 10/7/08, Rating: -1
RE: Back to the future?
By JasonMick (blog) on 10/7/2008 12:54:22 PM , Rating: 5
Hmm... not sure if you're joking, but I hope you know companies' fiscal quarters are often dated as much as a year ahead of the calendar.

Nvidia's fiscal Q2 2009 finished in July. And this is not unusual at all in the business world. Only some companies have fiscal quarters that line up with the calendar date.
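For anyone still confused by the offset, the mapping is simple arithmetic. Here's a rough sketch in Python, assuming (as a simplification) that NVIDIA fiscal year N begins on February 1 of calendar year N-1; the company's actual fiscal year ends on a Sunday in late January, so real boundaries can shift by a few days. The function name is just for illustration.

```python
from datetime import date

def nvidia_fiscal_quarter(d: date) -> str:
    """Approximate NVIDIA fiscal quarter for a calendar date.

    Simplifying assumption: fiscal year N runs Feb 1 of year N-1
    through Jan 31 of year N (the real year-end is a late-January
    Sunday, so edges are approximate).
    """
    # Count whole months since Feb 2000, so February maps to month 0
    # of a fiscal year; the fiscal-year label runs one year ahead.
    months_since_feb = (d.year - 2000) * 12 + (d.month - 2)
    fiscal_year = 2000 + months_since_feb // 12 + 1
    quarter = months_since_feb % 12 // 3 + 1
    return f"FY{fiscal_year} Q{quarter}"

# July 2008 falls in fiscal 2009's second quarter:
print(nvidia_fiscal_quarter(date(2008, 7, 27)))   # → FY2009 Q2
# And the Q3 "that ends in October" mentioned in the article:
print(nvidia_fiscal_quarter(date(2008, 10, 26)))  # → FY2009 Q3
```

So "Q2 of fiscal 2009" ending in July 2008 is exactly what this scheme predicts, no TARDIS required.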


RE: Back to the future?
By bighairycamel on 10/7/2008 1:06:22 PM , Rating: 5
Fiscal year != Calendar year


RE: Back to the future?
By therealnickdanger on 10/7/2008 1:44:01 PM , Rating: 2
Not sure if you're sarcastic or not... Either way:

FAIL

;-)


RE: Back to the future?
By Cullinaire on 10/7/2008 5:38:35 PM , Rating: 2
On second thought, no, I don't want to be "u".



Copyright 2014 DailyTech LLC.