


NVIDIA reports another bad quarter

It's been a rough year for NVIDIA. Typically, the graphics giant is one of the most profitable companies in the technology industry. That all changed in 2008, with NVIDIA posting its first quarterly loss in a very long time.

NVIDIA's first loss in 2008 was due in part to a one-time charge it took for the repair and replacement of faulty GPUs in notebooks from some of the biggest computer makers around, including Dell and HP.

NVIDIA announced its Q3 fiscal 2009 financials this week and reported a significant 20% reduction in quarterly revenue compared to the same quarter in fiscal 2008. NVIDIA took a one-time charge of $8.3 million related to the layoff of 360 employees globally. The pre-tax charge was to cover the cost of severance and related expenses.

Jen-Hsun Huang, NVIDIA President and CEO, said in a statement, "We made good progress on multiple fronts during the quarter. Improving gross margin while managing operating expenses enabled us to significantly improve our operating fundamentals. We transitioned our performance segment GPUs to 55 nanometers and are now poised to recapture lost share."

"We entered the fastest growing segment of the PC market with our first notebook chipset for Intel processors, and delivered on several exciting new growth initiatives -- 3-way SLI for the Intel Core i7 processor platform, Quadro CX for Adobe CS4 creative professionals and the Tesla supercomputing processor."

For the three-month period ending October 26, NVIDIA's profit sank 74% to $61.7 million, working out to 11 cents per share. In the same quarter last year, NVIDIA turned a tidy profit of $235.7 million.
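A quick sanity check on the reported decline, using only the profit figures quoted above:

```python
# Year-over-year profit decline, from the figures in the article
prior_profit = 235.7    # Q3 FY2008 profit, millions USD
current_profit = 61.7   # Q3 FY2009 profit, millions USD

decline_pct = (prior_profit - current_profit) / prior_profit * 100
print(f"{decline_pct:.0f}%")  # prints "74%", matching the reported drop
```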

NVIDIA has big expectations for its 9400M notebook GPU, which it believes could capture as much as 30% of the notebook market. At this point, NVIDIA has to be hoping the new chip will help it return to its past financial performance.



Comments



RE: If nVidia is losing money...
By Targon on 11/7/2008 7:39:58 PM , Rating: 2
AMD had been losing money for a while as a whole, mostly due to the underperforming CPU division. But now that the ATI merger has come together, we are seeing some huge improvements thanks to the success of the Radeon 4800 series of graphics products.

So, last quarter AMD pulled a profit, and the big question is whether AMD can pull another profitable quarter. With any luck, the 45nm Phenom processors will provide a needed boost to the CPU division as well.

When it comes to stock purchases, it is a very difficult time to decide if or when to buy. AMD is trying to spin off its fab business, but it is unknown whether this will happen (foreign regulations), or whether it will even benefit AMD in the long term.


RE: If nVidia is losing money...
By Clauzii on 11/7/2008 10:11:05 PM , Rating: 2
AMD really needs to up the Phenoms by more than the 'usual' 10-20% if they want to compete with Nehalem. Nehalem posts some almost scary figures if you look at rendering, 3D and encoding/decoding, being up to more than 3 times faster than current Phenoms.

Intel just showed that the same GHz number can give a MUCH faster CPU if done right (just like AMD did when they entered the scene years ago with the K6-II and K7).

I think it would be time for AMD's 'Fusion': a single chip with, say, a 4-core Phenom and some hundreds of 'ATI shaders' to boost calculations and/or provide GPU functions (it would boost calculations even if they kept the shaders at current GPU speeds). Also with, maybe, a 4-channel DDR3 interface, a faster CPU-to-CPU interconnect than the current ~3GB/s (Nehalem: ~25GB/s!), and a bigger L2/L3 cache.

AMD, Your move...


RE: If nVidia is losing money...
By teldar on 11/9/2008 8:43:50 AM , Rating: 2
The number of GPU cores they could put on a die that already has 4 CPU cores would be minimal. And I don't know that there is enough software out there to take advantage of it even if it existed. It's more about saving money on the chipset and bundling it all into a more complete processor package.
I do agree that they're going to need more bandwidth when they start packing GPU cores into their CPUs.


RE: If nVidia is losing money...
By Clauzii on 11/9/2008 2:10:04 PM , Rating: 2
When they change to 45nm, the extra space could go to those 'shaders'. AMD could even brand it 3DNow-Pro (or something) and make it an extension to the opcodes, like SSE and MMX were. I don't really see a big problem with that.


"This week I got an iPhone. This weekend I got four chargers so I can keep it charged everywhere I go and a land line so I can actually make phone calls." -- Facebook CEO Mark Zuckerberg

Related Articles
NVIDIA Cuts 360 Employees Globally
September 19, 2008, 12:53 PM













Copyright 2014 DailyTech LLC.