
"We are very proud to have achieved our first billion dollar quarter. And, while it is a wonderful milestone to reach, we believe this is just the beginning," said NVIDIA President and CEO Jen-Hsun Huang.   (Source: NVIDIA)
NVIDIA records revenue of $1.12 billion USD for 2008 fiscal third quarter

After hearing quarterly earnings reports from Intel, AMD, Apple and Microsoft, it's now time to hear what's shaking from the guys in Santa Clara, California. NVIDIA today reported record revenue of $1.12 billion USD for its fiscal 2008 third quarter, which ended on October 28, 2007.

"We are very proud to have achieved our first billion dollar quarter. And, while it is a wonderful milestone to reach, we believe this is just the beginning," said NVIDIA President and CEO Jen-Hsun Huang. "Our core businesses are continuing to grow as the GPU becomes increasingly central to today's computing experience in both the consumer and professional market segments."

The $1.12 billion USD tally marks the first time that a GPU company crossed the $1 billion USD threshold. NVIDIA also recorded net income of $235.7 million USD which represented a 121 percent increase year-over-year.

Total revenue for NVIDIA thus far through fiscal 2008 is $2.90 billion USD, while net income stands at $540.7 million USD. This compares with $2.19 billion USD and $285.3 million USD respectively for the first nine months of fiscal 2007.
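The growth percentages above can be cross-checked from the dollar figures in the article. A quick sanity-check sketch (all amounts in millions USD, taken straight from the text; the implied prior-year figure is derived, not reported):

```python
# Sanity check of the growth figures reported in the article.
# All dollar amounts are in millions USD.

q3_net_income = 235.7          # fiscal Q3 2008 net income
yoy_growth = 1.21              # reported 121% year-over-year increase
implied_prior_q3 = q3_net_income / (1 + yoy_growth)
print(f"Implied Q3 FY2007 net income: ~${implied_prior_q3:.1f}M")

# Nine-month comparison: fiscal 2008 vs. fiscal 2007
revenue_growth = (2900 - 2190) / 2190
income_growth = (540.7 - 285.3) / 285.3
print(f"Nine-month revenue growth: {revenue_growth:.1%}")
print(f"Nine-month net income growth: {income_growth:.1%}")
```

This puts the year-ago quarter's net income at roughly $106.7 million USD, with nine-month revenue up about 32% and net income up about 90%.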

"This is the era of visual computing and NVIDIA is at the forefront. People want a delightful, compelling experience when they interact with their computing devices, whether it's on a phone, notebook, game console, or workstation," Huang added. "NVIDIA is leading the way in making this experience more intuitive and rewarding through our relentless pace of innovation and focus on execution."

NVIDIA attributes the record quarter to a 33 percent increase in desktop GPU products and a 120 percent increase in mobile GPU products. Other star performers included NVIDIA's Quadro family of professional graphics processors and the new Tesla desk-side supercomputer products.

The news on the NVIDIA front has been pretty fierce over the past week. NVIDIA first set tongues wagging with the announcement of the GeForce 8800 GT. The GPU, which is based on the 65nm G92 architecture, is priced from $199 to $249 and brings back memories of the NVIDIA GeForce 4 Ti 4200 and Radeon 9800 on a performance-per-dollar basis.

Many in the hardware community let out a collective yawn when NVIDIA announced its new Enthusiast System Architecture (ESA). ESA aims to give enthusiasts control over a wide gamut of hardware components through a centralized software app.

More news on upcoming NVIDIA GPUs was also revealed by DailyTech earlier today. The company is set to unleash yet another 65nm variant of the 8800 GTS making for some very confused enthusiasts this holiday season.


By wordsworm on 11/9/2007 3:29:27 AM , Rating: 0
Everyone talks about fps. What about the other variables? Is the quality of each frame the same? Is it possible that a graphics card could skimp on quality to increase the quantity of frames? It just seems a little strange that the tables would have been turned so dramatically. Is it possible that they're cheating on quality, or are all things really equal in each frame? Otherwise, I really can't see how these 320-bit cards manage to equal 512-bit cards that have better and more RAM (GDDR4). Usually tech sites can point to indicators that shed light on why one CPU is better than the next, but here they only seem to show how many FPS there are.

Also, most sites seem to be measuring them against each other with Vista 32. A lot of sites, including AnandTech, often pit high-end hardware against each other, but do so using inferior 32-bit Vista rather than 64-bit Vista. If high-end hardware is going to be compared to high-end hardware, why are they all using the 32-bit software? These aren't the days of 32/64-bit XP, as 32-bit software can be run on both Vista variants.

Finally, where are the tests using 64-bit games? I do believe they exist, and it would be interesting to see if this variable shows any difference.

RE: Quality
By otispunkmeyer on 11/9/2007 8:43:15 AM , Rating: 1
If you bothered to read any G80 reviews you'd know that AA quality is first rate and that NVIDIA's anisotropic filtering implementation is pretty much perfect, better even than ATI's HQ AF that was present on the X1900 series.

Also, there really is more to performance than just memory bandwidth. For all the bragging rights the 512-bit bus gave AMD, really it was a shot too far: you just can't utilize it; there's more bandwidth than most games can make use of.
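The bandwidth point is simple arithmetic: peak theoretical bandwidth is bus width times effective data rate. A minimal sketch, using the commonly quoted launch memory clocks for these cards as illustrative assumptions:

```python
# Rough theoretical memory bandwidth: bus width (bits) x effective data
# rate (MT/s). The clock figures below are commonly quoted launch specs,
# included here only as illustrative assumptions.

def bandwidth_gb_s(bus_width_bits, effective_mt_s):
    """Peak theoretical bandwidth in GB/s (1 GB = 10^9 bytes)."""
    return bus_width_bits / 8 * effective_mt_s * 1e6 / 1e9

cards = {
    "Radeon HD 2900 XT (512-bit)":      (512, 1650),
    "GeForce 8800 GTX (384-bit)":       (384, 1800),
    "GeForce 8800 GTS 320MB (320-bit)": (320, 1600),
}
for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gb_s(bus, rate):.1f} GB/s")
```

On paper the 512-bit card leads comfortably, yet the G80 parts win in games, which is exactly the point: the bus is only one variable.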

Add to that the design choices and problems with R600 and you'll see why G80 wins. You really are going to have to sit and read because it's quite lengthy, but basically ATI's design choices are almost ahead of their time: it's hurting them now, but it could come good in the end.

Also, seeing how 99% of games are written for 32-bit, using a 32- or 64-bit OS doesn't really matter. Most people don't run 64-bit either; it's just not 64-bit's time.

GDDR4? Yeah, it might be better, but GDDR3 is cheaper and not significantly slower, so what's the point? GDDR5 is on the books now too.

just get clued up before you come here spouting off

RE: Quality
By wordsworm on 11/9/2007 11:39:51 AM , Rating: 2
just get clued up before you come here spouting off
I guess you don't know the significance of the question mark. I was asking why the indicators I recognize (faster RAM, equal or more shaders, a wider bus, etc.) haven't resulted in better performance. Also, when looking at CPUs, I can usually find fairly detailed overviews, but for GPUs the information just doesn't seem quite as good.

Also, my question as to whether there's a performance difference between 32- and 64-bit implementations of the video card drivers is completely valid. I do think this is the time of 64-bit. I think Vista is going to be the crossover point and that most new systems in two years will be sold with the 64-bit variant loaded on. Most websites don't test these products using 64-bit software. I don't know whether or not this would make a difference.

I have an Nvidia in my computer: a modest 8800 GTS with 320 MB. I'm relatively satisfied with it even if I can't get AA+HDR to run simultaneously on Oblivion.

Next time maybe you can take a question for what it is: someone wanting answers to questions. I haven't run into a good article explaining how it is that Nvidia has managed to take the performance crown so convincingly. If it was a margin of 5% it would be easy for me to understand. But 25%+ is hard to swallow, and it makes me wonder if there's anything missing that I'm just not seeing. If someone has information they can link me to, then I'm interested in reading it.

I don't know how you get off with your attitude that I should know everything before I ask questions. That's got to be the dumbest logic I've ever seen. Why don't you go to finishing school and learn some manners.

RE: Quality
By Savvin on 11/9/2007 8:56:44 AM , Rating: 1
I really don't think there is anything to base your implied "conspiracy theory" on. Did you feel the same way when, in the past, it was the other way around? Remember the Radeon 9700? It's called competitive engineering. It happens in any field. Look at the past when AMD's Athlon dominated the performance charts. It's just the way things go; AMD/ATI will have their day again at some point in the future.

As for your 32-bit vs. 64-bit question, there are several reasons why most sites don't use 64-bit very much, if at all. First off, the install base of 64-bit users is extremely small, so I would assume that most sites don't want to allocate resources to test for such a small base at this time. Secondly, there are even fewer games that actually use 64-bit to any great effect. Even though the 64-bit OS is available, true 64-bit applications are not. Regardless, some 64-bit game comparisons have been done, and they have pretty much shown negligible differences. I'd give it another six months or so before software begins to show the benefits of 64-bit.

RE: Quality
By wordsworm on 11/9/2007 11:51:17 AM , Rating: 2
I really don't think there is anything to base your implied "conspiracy theory" on.
I wasn't trying to imply a conspiracy. I was just wondering whether any analysis had been done on image quality to see if compromises have been made in order to bring those FPS up.
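The kind of analysis being asked about here is measurable. A minimal sketch, with made-up frames standing in for real screenshots: render the same scene on each card and compare pixels against a reference image.

```python
# A toy image-quality check: measure per-pixel difference between two
# renders of the same frame. Frames are nested lists of (R, G, B)
# tuples; a real test would load captured screenshots instead. All
# values below are invented for illustration.

def mean_abs_error(frame_a, frame_b):
    """Average per-channel absolute difference between two frames."""
    total, count = 0, 0
    for row_a, row_b in zip(frame_a, frame_b):
        for (r1, g1, b1), (r2, g2, b2) in zip(row_a, row_b):
            total += abs(r1 - r2) + abs(g1 - g2) + abs(b1 - b2)
            count += 3
    return total / count

reference = [[(120, 130, 140), (10, 20, 30)]]
candidate = [[(118, 130, 141), (10, 22, 30)]]
print(mean_abs_error(reference, candidate))  # small value = close match
```

Review sites did run comparisons like this in the filtering-quality debates of the era; identical scores at higher FPS would rule out the "cheating on quality" worry.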

64-bit ought to be around the corner. It seems that games like Crysis and Oblivion are testing the limits of 32-bit, and the headroom must be found in 64-bit programming. I'm not a programmer, so if I'm mistaken, please feel free to correct me.
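The headroom question comes down to address-space arithmetic: a 32-bit process can address at most 2^32 bytes, and on 32-bit Windows a game normally gets only 2 GiB of that by default. A quick sketch of the numbers:

```python
# The 32-bit headroom issue in concrete numbers: address-space limits
# for a single process.

total_32bit = 2 ** 32            # 4 GiB of addressable memory
user_space_windows = 2 ** 31     # default 2 GiB user-mode limit on 32-bit Windows
total_64bit = 2 ** 64            # effectively unlimited for 2007-era games

print(f"32-bit address space: {total_32bit / 2**30:.0f} GiB")
print(f"Typical 32-bit Windows user space: {user_space_windows / 2**30:.0f} GiB")
print(f"64-bit address space: {total_64bit / 2**30:.0f} GiB")
```

So the commenter's intuition is roughly right: memory-hungry games of the time were bumping against the 2 GiB user-space ceiling, and 64-bit removes that ceiling entirely.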

Most websites make comparisons using hardware that's out of the reach of most people. Since a lot of them like to push the envelope, I can't help but wonder why they haven't tried doing the same with the OS.

"Game reviewers fought each other to write the most glowing coverage possible for the powerhouse Sony, MS systems. Reviewers flipped coins to see who would review the Nintendo Wii. The losers got stuck with the job." -- Andy Marken

Copyright 2016 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki