
The Mercury News puts Jen-Hsun Huang on the hot seat

The graphics industry was turned on its head yesterday with the announcement that AMD is acquiring ATI Technologies. But while that news is of great significance, ATI rival NVIDIA has plans of its own when it comes to the future of graphics. Dean Takahashi of The Mercury News had the opportunity to interview Jen-Hsun Huang, CEO of NVIDIA. The two talk about the GeForce FX failure, why the company is dealing with Sony in the console arena this time around instead of Microsoft, and how far away we are from the "Toy Story" standard.

In regard to greater photorealism in games, Huang feels that we are still at least 10 years away. Techniques like depth of field, motion blur and HDR are being refined, but we still have a long way to go before computer-generated characters can move with the fluidity of human beings. Many movies today feature human actors performing in front of blue screens as action takes place around them. The effect is (for the most part) seamless to movie viewers, but gamers still have a ways to go before we see that kind of photorealism on the PC. “With 'Superman Returns,' you can't really put a camera on a person and have him fly through a metropolis. That entire movie was animated. It was one big computer-generated movie with a guy in tights in front of a blue screen. You look at that imagery, and you know we are nowhere near that level of imagery,” said Huang.

Although the interview was conducted before the AMD-ATI merger went public, it would be interesting to see what Jen-Hsun Huang has to say about the deal. Speculation is running rampant as to what the merger means for the industry as a whole – a merger that affects not only AMD and ATI, but also Intel and NVIDIA. Intel hasn't commented on the merger, but we do expect to hear from them before the week is out.


Why not nVidia?
By AtaStrumf on 7/25/2006 12:34:30 PM , Rating: 2
What I wonder is why AMD didn't buy nVidia instead of ATi. Would seem like a better choice to me. ATi is still pretty new in the chipset business and still has some stupid problems with its southbridge.

RE: Why not nVidia?
By Furen on 7/25/2006 12:38:07 PM , Rating: 1
Because nVidia is worth as much as AMD is. Also, ATI's shader and memory controller technology is better than nVidia's. So basically ATI had better tech and was much cheaper.

RE: Why not nVidia?
By Knish on 7/25/2006 12:53:51 PM , Rating: 5
Furen, you usually post fairly well-thought-out posts, but that was pretty subpar for you. AMD bought ATI to get a foothold in the following, in this order of importance:
1.) Mobile
2.) TVs
3.) XBOX
4.) Core Logic
5.) GPU

If AMD wanted to spend $5 billion to design a graphics chip, I can pretty much damn well assure you it'd be better than what Intel, NVIDIA or ATI has. ATI was struggling financially, had a decent foothold in markets AMD wanted into, and had management that wouldn't put up a fight. The shaders and memory controllers are really insignificant in the scale of the deal.

In all honesty, AMD and Intel are still companies that cater to 99.9% of the industry, not enthusiasts like you and me. Graphics will take a step back for a while, but your mom and grandma's next AMD computers will give the Intel alternatives a decent run for the money.

NVIDIA? Considering they have a tendency to agitate everyone in the industry, they just lost their number one channel partner, and they directly compete with all of Intel's graphics solutions (which is a multi-billion-dollar business for Intel), I don't really see an NVIDIA/Intel partnership going anywhere. Good luck to them selling $900 video cards to 7 kids at VoodooPC.

RE: Why not nVidia?
By Xavian on 7/25/2006 3:05:19 PM , Rating: 2
AMD is worth some $2.5B more than Nvidia, so they are certainly not equal, Furen.

RE: Why not nVidia?
By sircuit on 7/25/2006 12:39:58 PM , Rating: 2
Maybe Intel will.......

RE: Why not nVidia?
By Enron on 7/25/2006 5:16:40 PM , Rating: 3
Intel has no reason to buy Nvidia; they have nothing to gain from them.

RE: Why not nVidia?
By defter on 7/25/2006 1:21:00 PM , Rating: 2
The reason:

Incidentally, according to Patrick Moorehead, vice president of advanced marketing at AMD, the company did indeed entertain the idea of merging with nVidia. But that would have been a merger of equals, and AMD's top brass would not have been in charge.

AMD wanted to be in charge, thus they picked a smaller company.

Full of it
By on 7/25/2006 1:04:18 PM , Rating: 3
I read through the article and all I took away from it was how out of touch Huang seems to be with the market and computing in general. As brought up in the text, it's almost as if nVidia was hit by luck more than anything.

For example, he pontificates about how 'great' it is to work with Sony on the PS3 and its new, untested technologies. But he (along with Sony) seems to miss the point that new technology by itself doesn't determine the market [and at the PS3's price, both Sony and nVidia may learn this lesson the hard way].

He blathers repeatedly about nVidia spending 3/4 of a billion dollars on R&D but doesn't seem to know where that money is actually going.

He talks about moving into the cell-phone GPU market, likening cellphone evolution to early desktop computers vs. typewriters. Editing and multiple print generations on a computer (over typewriters) is clearly an advantage, but he doesn't have any convincing arguments for why I'd need a specialized GPU in a cellphone. To me, battery life is the area where cellphones need R&D, not expensive, power-sucking graphics.

Maybe Huang just isn't good at public interviews, but he seems aloof.

RE: Full of it
By PitbulI on 7/25/2006 1:24:49 PM , Rating: 2
But, don't you want to play Battlefield 2142 on your small 1x2 inch screen? At a Blazing 30 frames per second I might add. Hehehe.

Well, nVidia wants to get into the cellphone market because it wouldn't just be the graphics chip they'd be supplying. A lot of cell phones are sold, more so than graphics cards and motherboards. There is money to be made there, but I think that market is crowded enough. nVidia would have to make a huge splash.

RE: Full of it
By lsman on 7/25/2006 3:21:18 PM , Rating: 2
There is a gaming market for cellphones in Japan and the rest of the world.

RE: Full of it
By rushfan2006 on 7/25/2006 3:25:59 PM , Rating: 2
"But, don't you want to play Battlefield 2142 on your small 1x2 inch screen? At a Blazing 30 frames per second I might add. Hehehe."

Ok, sorry for the somewhat off-topic post to this thread, but... dude... Battlefield 2142 looks "sexy". Read the preview in CGW. This game is definitely one I'm looking forward to (and that's saying something; it really hasn't been since WoW was released that I was this excited for a PC game).

RE: Full of it
By caboosemoose on 7/25/2006 2:45:31 PM , Rating: 3
I agree with Generic Guy; this interview shows up Huang as the bullshitter he is. He touts the GX2 as a new GPU, when we all know that it's a couple of G71s stuck on one assembly. He also spouts some pretty bad BS about the FX family.

Oh really?
By tfranzese on 7/25/2006 2:32:45 PM , Rating: 2
"You can't build chips for all the game consoles. That's not possible. They would all like a slightly different style from the others. Difference is important. The same chip company would have difficulty designing chips for the different styles. It's also so high stakes that you need to focus. No one has enough extraneous resources around to build chips for all the game consoles."

IBM does a pretty good job of it (Gamecube, 360, Wii, PS3).

RE: Oh really?
By Xavian on 7/25/2006 3:08:39 PM , Rating: 2
Except for the fact that IBM is a chip-manufacturing colossus? IBM probably has more fabs than Intel.

RE: Oh really?
By tfranzese on 7/25/2006 4:24:30 PM , Rating: 2
Except that Jen-Hsun Huang doesn't make any such distinction, so tell me again your point?

RE: Oh really?
By DigitalFreak on 7/25/2006 4:08:20 PM , Rating: 2
Well, ATI did 2 out of 3 this time around (Xbox 360, Wii). Old Wun Hung Lo is a dingle berry.

By Enron on 7/25/2006 5:16:01 PM , Rating: 2
ATI has better technology than Nvidia, at a cheaper price.

By phymon on 7/25/2006 11:14:24 PM , Rating: 2
Some of the reasons: the company is cheap, ATI managed to win two of the next-gen game consoles, it has unified shaders, and it has more horsepower than the competition (nVidia).

Missed the boat?
By jabber on 7/26/2006 12:02:16 PM , Rating: 2
The big problem facing the graphics firms is the current trend that brute-force graphics processing is everything. All it's giving us is more power consumption and heat for marginal benefit.

Far smarter for AMD would have been to snap up PowerVR tech for things like smartphone gaming, etc. Refine the PowerVR method and that could be quite interesting: good frame rates with far less power required.

Hmm, but who has just snapped up licensing for PowerVR instead? Intel. Could make for interesting mid-range integrated solutions.

Fun times ahead hopefully.

Related Articles
AMD-ATI: A Done Deal
July 24, 2006, 5:00 AM

