


NVIDIA announces GeForce 9M, takes subtle jabs at Intel

NVIDIA is looking to up the ante in the realm of notebook GPUs with its new GeForce 9M series. NVIDIA is banking on both power and efficiency to win over OEMs and end-users with the new GPUs.

NVIDIA claims that the new GeForce 9M GPUs are up to 40% faster than the previous generation GeForce 8M parts and as much as ten times faster than a certain chip giant's "generic" integrated GPUs -- namely, Intel. All of the new GPUs incorporate PureVideo HD and support DVI, HDMI 1.3, DisplayPort 1.1, and Blu-ray Profile 2.0. All GeForce 9M GPUs comply with the MXM 3.0 graphics module specification.

One other feature to consider with the GeForce 9M is the inclusion of Hybrid SLI technology. This allows OEMs to incorporate a low-power GPU for everyday desktop duties and a higher-performing part for graphics-intensive work -- NVIDIA SLI support may be added at a later date.

"Beginning this summer, GeForce 9M GPUs and Hybrid SLI, paired with AMD and Intel CPUs, will enable a new breed of notebooks," said Jeff Fisher, NVIDIA's GPU Senior VP. "These new notebooks will be optimized to deliver a visual experience and raw computing performance that traditional cookie-cutter notebooks with integrated graphics simply can’t touch."

In addition, NVIDIA says the new GPUs feature a multi-core architecture that speeds up not only entertainment applications but also today's lifestyle applications, such as encoding video from a PC for a small personal media device, where video conversion can be up to 5x faster with the GeForce 9M family of GPUs.

NVIDIA has broken the GeForce 9M into three categories: Value, Mainstream, and Performance. The Value sector will be solely represented by the GeForce 9100M G. The Mainstream sector will be propped up by the GeForce 9400M, GeForce 9300M GS, and GeForce 9200M GS. Finally, the Performance sector features the GeForce 9600M GT, GeForce 9600M GS, and GeForce 9500M G.

Unlike some of ATI's latest graphics offerings, the new GeForce 9M series will not support DirectX 10.1. NVIDIA says that consumers will base their buying decisions on price and performance and that support for a particular API is not of extreme importance.

For those looking to take advantage of PhysX -- a recent acquisition of NVIDIA's -- drivers are expected to be made available during Q3 2008.

NVIDIA definitely isn't holding back in its ribbing of Intel with this latest GPU release. Intel has expressed its intention to bulk up its integrated GPU offerings and expand into discrete graphics. NVIDIA is fighting back by peppering its press releases with subtle jabs ("generic," "cookie-cutter," etc.) and with not-so-subtle comments from its CEO.

It should be interesting to see how things in the graphics market pan out as Intel and NVIDIA continue to cross paths over the next 18 months.



Comments



Wait...
By jbizzler on 6/3/2008 9:52:13 AM , Rating: 2
Wait, these aren't new. My brother already has an ASUS F8SN with a 9500M GS.




RE: Wait...
By benx009 on 6/3/2008 9:56:23 AM , Rating: 2
Yeah, the Geforce 9500M GS has been out for a while now...


RE: Wait...
By Brandon Hill (blog) on 6/3/2008 10:00:39 AM , Rating: 2
I don't see the 9500M GS listed anywhere in the article...


RE: Wait...
By JasonMick (blog) on 6/3/2008 10:27:33 AM , Rating: 3
You're a bit confused.

To sum things up....
quote:
"Nvidia is ready for its next-generation GPU launch. According to sources at graphics card makers, the company plans to launch its GeForce 9 series GPU after the Lunar New Year in February.

The first chip to roll out in the GeForce 9 family will be the D9E, a high-end product that adopts 65nm manufacturing. The new product will also support DirectX 10.1 and Shader Model 4.1, revealed the sources.

In addition to the D9E, Nvidia will roll out a mid-range GeForce 9 family product named D9P in June 2008. The new GPU will adopt 55nm processing, the sources pointed out.


Source: http://www.digitimes.com/mobos/a20071129PD216.html

This article is on the D9P series rollout; the D9E was already released.

For those interested, the only two current hardware partners for the D9E and D9P releases are Acer and Asus. Currently they offer models with the 9650M GS, the 9500M GS (which you mentioned), and the 9300M G (mentioned in the article, now offered by Asus).

Source: http://www.nvidia.com/object/wtb_notebooks.html#ni...

Also, keep in mind the 9500M GS you mentioned is just a renamed GeForce 8600M GT shrunk to a 65nm process, with different core/memory clocks.


RE: Wait...
By jbizzler on 6/3/2008 6:11:31 PM , Rating: 3
Ah, I see. The article title is misleading, making it look like these are the first 9Ms.


RE: Wait...
By carage on 6/4/2008 11:03:40 AM , Rating: 2
Technically the 9500M GS is not a G9X-generation chip.
It is simply a rebadged 8600M GT with a die shrink.
I was also suckered into buying one of these thinking it was a G9X chip.


Nvidia still hasn't adressed thier weakness
By theapparition on 6/3/2008 9:59:00 AM , Rating: 1
I've long been a supporter of Nvidia since they offer a superior implementation of OpenGL (which my applications primarily use) and thier drivers (IMHO) have been much more stable. Thier Quadro series can't be touched.

But boy was I disappointed when I upgraded graphics cards in my media center PC. POS video quality. I thought it would have gotten better, but no. ATI has a much better solution here. Nvidia needs to get thier act together with video. I don't see anything on thier roadmap to indicate that thier doing that.

Still it is nice to see them increase graphics performance for laptops. But with more and more laptops also running video, I still see this as a weakness.

Also,
What's with the jabs at Intel? If you have a better product, people will come. Apparantly, name calling is the new marketing.




RE: Nvidia still hasn't adressed thier weakness
By knitecrow on 6/3/2008 10:02:55 AM , Rating: 2
I think their chipset drivers are worse than their graphics drivers. I had to return my 780i board because of all the trouble it gave me and got an Intel-based chipset motherboard instead.


By FITCamaro on 6/3/2008 11:46:46 AM , Rating: 2
I got a 680i and haven't had any issues. But that was because of price more than anything. They do run hot, though.

Anyway, as far as the article is concerned, I'm excited about something like Hybrid SLI for laptops. I've always wanted a laptop that runs off a low-power integrated GPU when just browsing the internet and switches to a better-performing dedicated GPU when playing games. Alienware offered this for a short time with a switch to select which GPU to use. This goes one better by making the switch pretty much on the fly.

So hopefully we can soon have a kick ass gaming laptop that still gets good battery life.


By theapparition on 6/3/2008 12:59:43 PM , Rating: 2
That may be, but I was only commenting on thier graphics card drivers.


RE: Nvidia still hasn't adressed thier weakness
By Jephph on 6/3/2008 1:27:42 PM , Rating: 2
quote:
But boy was I disappointed when I upgraded graphics cards in my media center PC. POS video quality. I thought it would have gotten better, but no. ATI has a much better solution here. Nvidia needs to get thier act together with video. I don't see anything on thier roadmap to indicate that thier doing that.


Sorry. Had to. I get annoyed when people use their/they're/there wrong, but this is extreme. First off, it should have been "they're" Secondly, even if it were "their" it's spelled "their", not "thier".


By theapparition on 6/4/2008 9:55:24 AM , Rating: 2
I'll never win a spelling bee or grammar award, that's for sure.

But at least I sleep well at night with the knowledge that I'm not an internet loser who is so insecure that he has to point out anothers grammatical errors, because he has no other relevant view on the topic.

Sorry, had to. :P

Corrections should be pointed out when they obfuscate the intent of the story. Other than that, why does anyone care?


By DallasTexas on 6/3/2008 10:59:00 AM , Rating: 2
All that is missing from Nvidia is the infamous billboard along Times Square that AMD made a stink with. Clearly, Nvidia is no AMD (they actually execute), but Nvidia is going down the same path as AMD with rhetoric not seen since AMD's Opteron 15 minutes of fame.

Kicking dirt on the Intel monster, which has near-infinite resources, technology, and know-how, does nothing but fuel the fires that have repeatedly resulted in Intel trouncing the competition with far superior products. Nvidia should take a slice of humble pie and leave the cocky "can of whoop-ass" rhetoric to AMD.




By StevoLincolnite on 6/3/2008 6:54:13 PM , Rating: 2
I think nVidia is in a better position in terms of graphics technology right now. Larrabee probably won't be the rasterisation speed demon that many hoped for, and nVidia also has 3dfx under its hood (the company many believe kick-started the 3D acceleration era).
Mind you, their job on the GeForce FX was not all that flash, but since then they have had a solid line-up.

There is only one problem I see with the nVidia graphics cards, and that is the lack of DirectX 10.1 support. Some people claim "it's not worth worrying about" -- but realistically, is this history repeating itself?

For instance, the GeForce 4 MX series was a DirectX 7 part without any shaders; although more powerful than the GeForce FX, it could not play any games which required shaders. (Like Oblivion, where the GeForce FX could run it rather well with OldOblivion; if the GeForce 4 MX had had Pixel Shader 1 support, that would have been awesome.)

Another example is the Radeon X8xx series, with their lack of SM3 support. In terms of raw power they could easily run a game like BioShock, but unfortunately, because of the lack of that one feature, they have no hope of running the game (unless you use Shadershock shader replacements).
The card itself was a screamer back in the day -- more powerful than the X1600 and X1650 series that came after it, and the X850 XT PE overclocked out-performed the 7900 GS -- but unfortunately it couldn't run some games. People didn't worry about it when purchasing the card, thinking SM3 wasn't worth it or wouldn't be needed any time soon. Well, news flash: it was.

Will History repeat itself?


By winterspan on 6/4/2008 3:02:50 AM , Rating: 2
I don't work in the industry, nor am I an obsessed gamer, but I do follow the technology. And as far as I know, leaving out support for DirectX 10.1 is nowhere near as critical as leaving shaders off a DX7-class part or leaving out SM3.0 support was. DirectX 10.1 has no major technological differences from DirectX 10.0, so I don't see how a situation like the ones you described in the other cases could occur here...


By StevoLincolnite on 6/4/2008 7:01:46 AM , Rating: 3
One of the main improvements touted by Microsoft in DirectX 10.1 is improved access to shader resources - In particular, this involves better control when reading back samples from multi-sample anti-aliasing. In conjunction with this, the ability to create customised downsampling filters will be available in DirectX 10.1.

Floating point blending also gets some new functionality in DirectX 10.1, more specifically when used with render targets - New formats for render targets which support blending will be available in this iteration of the API, and render targets can now be blended independently of one another.

Shadows never fail to be an important part of any game title's graphics engine, and Direct3D 10.1 will see improvements to the shadow filtering capabilities within the API, which will hopefully lead to improvements in image quality in this regard.

On the performance side of things, DirectX 10.1 will allow for higher performance in multi-core systems, which is certainly good news for the ever growing numbers of dual-core users out there. The number of calls to the API when drawing and rendering reflections and refractions (two commonly used features in modern game titles) has been reduced in Direct3D 10.1, which should also make for some rather nice performance boosts. Finally, another oft-used feature, cube mapping, gets its own changes which should help with performance, in the form of the ability to use an indexable array for handling cube maps.

One of the major additions which will impact image quality in DirectX 10.1 regards precision, in a couple of different disciplines. Firstly, this revision of the API will see the introduction of 32-bit floating-point filtering over the 16-bit filtering currently on show in DirectX 9 and 10 -- this should improve the quality of High Dynamic Range rendering that uses this functionality.
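
To make one of these additions concrete, here is a minimal C++ sketch of the independent per-render-target blending mentioned above, using ID3D10Device1::CreateBlendState1 and the structures from the public d3d10_1.h header. It assumes a Direct3D 10.1 device has already been created elsewhere and omits error handling; it is an illustration of the API, not code from NVIDIA or from this article.

#include <d3d10_1.h>

// Build a blend state in which render target 0 uses standard alpha blending
// while render target 1 uses additive blending. Under Direct3D 10.0 all render
// targets shared a single set of blend factors and operations; 10.1 adds
// IndependentBlendEnable so each target can be configured separately.
ID3D10BlendState1* CreateIndependentBlendState(ID3D10Device1* device)
{
    D3D10_BLEND_DESC1 desc = {};
    desc.AlphaToCoverageEnable  = FALSE;
    desc.IndependentBlendEnable = TRUE;

    // Render target 0: classic src-alpha / inv-src-alpha blending.
    desc.RenderTarget[0].BlendEnable           = TRUE;
    desc.RenderTarget[0].SrcBlend              = D3D10_BLEND_SRC_ALPHA;
    desc.RenderTarget[0].DestBlend             = D3D10_BLEND_INV_SRC_ALPHA;
    desc.RenderTarget[0].BlendOp               = D3D10_BLEND_OP_ADD;
    desc.RenderTarget[0].SrcBlendAlpha         = D3D10_BLEND_ONE;
    desc.RenderTarget[0].DestBlendAlpha        = D3D10_BLEND_ZERO;
    desc.RenderTarget[0].BlendOpAlpha          = D3D10_BLEND_OP_ADD;
    desc.RenderTarget[0].RenderTargetWriteMask = D3D10_COLOR_WRITE_ENABLE_ALL;

    // Render target 1: additive blending, configured independently of target 0.
    desc.RenderTarget[1].BlendEnable           = TRUE;
    desc.RenderTarget[1].SrcBlend              = D3D10_BLEND_ONE;
    desc.RenderTarget[1].DestBlend             = D3D10_BLEND_ONE;
    desc.RenderTarget[1].BlendOp               = D3D10_BLEND_OP_ADD;
    desc.RenderTarget[1].SrcBlendAlpha         = D3D10_BLEND_ONE;
    desc.RenderTarget[1].DestBlendAlpha        = D3D10_BLEND_ONE;
    desc.RenderTarget[1].BlendOpAlpha          = D3D10_BLEND_OP_ADD;
    desc.RenderTarget[1].RenderTargetWriteMask = D3D10_COLOR_WRITE_ENABLE_ALL;

    ID3D10BlendState1* state = NULL;
    device->CreateBlendState1(&desc, &state);  // error handling omitted for brevity
    return state;
}

On hardware that stops at DirectX 10.0, such as the GeForce 9M parts discussed in the article, a Direct3D 10.1 device can only be created at feature level 10.0, so features like this remain unavailable.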


MXM 3.0 means exactly what??
By SpaceRanger on 6/3/2008 9:45:12 AM , Rating: 2
From what I've heard, these cards are not upgradeable directly, or am I wrong on this?




RE: MXM 3.0 means exactly what??
By Goty on 6/3/2008 1:53:04 PM , Rating: 2
That's a touchy subject. Technically, yes, the module would be upgradable, but first you'd have to actually find another MXM module to put in the laptop, and then you'd have to find one that is actually the same form factor as the card you've got. The situation before has been that most OEMs develop their own variations of the MXM form factor to get the cards to fit in their laptops.

A little more information:

http://en.wikipedia.org/wiki/MXM


Decent chips
By wetlegs6 on 6/3/2008 11:59:15 AM , Rating: 2
I got an Asus F8Sg with a 9300M G, and it isn't bad for integrated graphics. I've never been a fan of Nvidia cards -- especially with all the hell my friends with Nvidia hardware have been having with their Vista drivers.

But, so far so good.




RE: Decent chips
By larson0699 on 6/4/2008 4:02:53 PM , Rating: 2
You use Vista.

I use Linux.

NVIDIA FTW.


AVIVO?
By BigLan on 6/3/2008 10:30:28 AM , Rating: 2
quote:
In addition, this new GPU features a multi-core architecture which will not only speed up entertainment applications, but will also speed up today’s lifestyle applications, like video encoding from a PC to a small personal media device, where the speed up in the video conversion is up to 5x faster with the GeForce 9M family GPUs."


So is this going to be like ATI's AVIVO video converter and allow the GPU to do the encoding? I hadn't heard of Nvidia offering this before -- is it available on their desktop line as well? Does it only work in certain applications?




Driver
By dickeywang on 6/3/2008 10:55:58 AM , Rating: 2
Look at the series of drivers Intel has released for the X3100 video chip, which can only be described as disappointing (even though the X3100 isn't as powerful as the Nvidia 8M series, it is certainly capable of delivering better performance than what we get with the current Intel driver).

If Intel wants to catch up with Nvidia/ATI in the mobile GPU business, it needs to put much more effort into developing good drivers for its chips.




physx
By FITCamaro on 6/3/2008 11:50:04 AM , Rating: 2
quote:
For those looking to take advantage of PhysX -- a recent acquisition of NVIDIA's -- drivers are expected to be made available during Q3 2008.


Is this just for the 9x00 series? Because I thought they promised Physx support for the 8800 series and up.




still low performance...
By zshift on 6/3/2008 12:25:21 PM , Rating: 2
It still upsets me that Nvidia is releasing all of these mobile graphics chips without driver support. I've been able to modify the INF files of desktop drivers to work with my laptop, but then you lose the energy-saving benefits and battery life is shattered. When are they going to come up with a mobile GPU that gives decent performance with today's games while offering good cooling and battery life? If Intel could make the switch from the Pentium (overheating and poor battery life) to the Core 2 series (much cooler with way better performance), then I'm sure Nvidia can do the same.

Also, Nvidia's naming system can be confusing to those who don't know the specifics. For example, the 9600 GT has 64 shader cores, while the 9650M GS mobile GPU has only 32. Any average Joe buying a gaming laptop will think they're getting a great GPU, but in the end it's still only mediocre performance.




What about Quadro M?
By pauldovi on 6/3/2008 6:57:00 PM , Rating: 2
Any love for the Quadro product line? I would like a lower power bracket 570M.




By FXi on 6/3/2008 9:05:22 PM , Rating: 2
Don't get me wrong, Nvidia makes decent mobile chips, but these are 256-bit memory interface parts that are a minor refresh of two-year-old designs.

A cut-down GT200 on a 55nm process would be the real deal. You can bet that AMD has a series of mobile parts coming, based on the R700 (which is already at 55nm), that will be true next-gen parts and will make the ones above look like dogs. Every mobile buyer knows there is a time to buy and a time to hold off, because the parts are rarely upgradeable and thus you have to live with a choice for a long time.

Finally they are following the standard form factor, but Nvidia and the vendors have to promise to build upgradeable parts at least on the mid to high end. Otherwise it's marketing babble that is worth less than the paper it's printed on.

These make nice placeholders until we can get the real next gen parts, but they really aren't anything more than a die shrink and some power savings. The real deal comes with parts that are 55nm and are based on the GT200.

At the prices they are charging for these things, you should get the real deal.




"Paying an extra $500 for a computer in this environment -- same piece of hardware -- paying $500 more to get a logo on it? I think that's a more challenging proposition for the average person than it used to be." -- Steve Ballmer













