


Next generation nForce gets official... on paper... again... really

We just got our hands on the newest NVIDIA core logic roadmap.  Several weeks ago, NVIDIA announced the nForce 500 series chipsets based on the MCP55 controller.  The 500 series family includes the nForce 590, 570 and 550, along with SLI derivatives.  NVIDIA has told its partners that it will now "officially" launch its MCP55 lineup on May 23, 2006, for both Intel Conroe and AMD AM2 motherboards.  Expect the nForce 590 to be available on each platform on the launch day of its respective processor.

nForce 590 will be the highest-performance NVIDIA chipset for AMD AM2 and Intel Socket 775.  The chipset will be specifically tweaked for SLI and Quad SLI and will feature a new technology called "LinkBoost."  LinkBoost will, supposedly, offer increased bandwidth between the GPU and the MCP if, and only if, the system uses NVIDIA-only components.  Currently, only 90nm GeForce graphics cards will support LinkBoost, but future high-end cards will as well.
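NVIDIA has not published how LinkBoost actually decides when to speed up the GPU-to-MCP link, so the sketch below is only a guess at the general mechanism the slides imply: check the installed graphics card's PCI vendor and device IDs against a whitelist, and only then retune the link clock.  The device IDs, the 125MHz figure and the apply_link_clock() helper are all placeholders for illustration, not anything taken from NVIDIA documentation.

/* Hedged sketch of a LinkBoost-style gate: boost only for whitelisted NVIDIA GPUs. */
#include <stdio.h>
#include <stddef.h>
#include <stdint.h>
#include <stdbool.h>

#define PCI_VENDOR_NVIDIA 0x10DE  /* NVIDIA's PCI vendor ID */

/* Hypothetical whitelist of 90nm GeForce device IDs eligible for the boost. */
static const uint16_t boost_whitelist[] = { 0x0290, 0x0291 };

struct pci_device {
    uint16_t vendor_id;
    uint16_t device_id;
};

static bool device_supports_boost(const struct pci_device *dev)
{
    if (dev->vendor_id != PCI_VENDOR_NVIDIA)
        return false;                               /* non-NVIDIA GPU: stay at stock */
    for (size_t i = 0; i < sizeof boost_whitelist / sizeof boost_whitelist[0]; i++)
        if (dev->device_id == boost_whitelist[i])
            return true;                            /* known 90nm part: allow boost */
    return false;
}

/* Stand-in for whatever chipset register write actually retunes the link. */
static void apply_link_clock(unsigned mhz)
{
    printf("GPU-to-MCP reference clock set to %u MHz\n", mhz);
}

int main(void)
{
    struct pci_device gpu = { PCI_VENDOR_NVIDIA, 0x0290 };   /* pretend probe result */

    /* 100MHz is the standard PCIe reference clock; the boosted value is a guess. */
    apply_link_clock(device_supports_boost(&gpu) ? 125 : 100);
    return 0;
}

In other words, an ATI card (or an older NVIDIA card) would simply be left at the stock clock rather than risk being run out of spec.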

The nForce 590 SLI, 570 SLI and 570 Ultra chipsets also feature an option called "FirstPacket."  FirstPacket is apparently NVIDIA's first attempt at packet prioritization, or Quality of Service, on the NVIDIA firewall.  The new chipsets will also feature "teaming," which allows for some rudimentary traffic shaping while using both Ethernet connections on the MCP.
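NVIDIA has not detailed how FirstPacket classifies or schedules traffic, but packet prioritization of this kind usually comes down to a strict-priority transmit path: latency-sensitive packets (game or VoIP traffic) are dequeued ahead of bulk transfers.  The toy scheduler below is a generic illustration of that idea, not NVIDIA's implementation; the traffic classes and queue depth are invented.

/* Hedged sketch of strict-priority packet scheduling, two FIFO traffic classes. */
#include <stdio.h>

#define QUEUE_DEPTH 64

struct packet {
    const char *desc;
};

/* A simple FIFO used as one traffic class. */
struct tx_queue {
    struct packet slots[QUEUE_DEPTH];
    int head, tail;
};

static void enqueue(struct tx_queue *q, struct packet p)
{
    if (q->tail < QUEUE_DEPTH)
        q->slots[q->tail++] = p;
}

/* Strict priority: drain the latency-sensitive queue before bulk traffic. */
static const struct packet *dequeue(struct tx_queue *hi, struct tx_queue *lo)
{
    if (hi->head < hi->tail) return &hi->slots[hi->head++];
    if (lo->head < lo->tail) return &lo->slots[lo->head++];
    return NULL;
}

int main(void)
{
    struct tx_queue hi = { .head = 0, .tail = 0 };
    struct tx_queue lo = { .head = 0, .tail = 0 };

    enqueue(&lo, (struct packet){ "bulk FTP segment" });
    enqueue(&lo, (struct packet){ "BitTorrent segment" });
    enqueue(&hi, (struct packet){ "game UDP packet" });      /* queued last... */

    const struct packet *p;
    while ((p = dequeue(&hi, &lo)) != NULL)
        printf("transmit: %s\n", p->desc);                   /* ...but sent first */
    return 0;
}

Even with only two queues, the bulk segments queued first still wait while the game packet goes out, which is the whole point of the feature.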

The nForce 590 is a two-chip package.  The SPP is built on TSMC's 90nm process, while the MCP is a 130nm TSMC part.  Other MCP55-based chipsets will use similar setups, but will not "relaunch" until later this year.  NVIDIA's MCP61 showed up on company roadmaps earlier this year and will be the company's first single-chip motherboard chipset in many years.

And yes, NVIDIA's slides also confirm high definition audio will appear with nForce 590.  SoundStorm2 proponents may or may not see this as a sign of the apocalypse.


Comments



wow
By Xorp on 5/9/2006 1:15:29 AM , Rating: 3
Better performance with a nVidia graphics card...

what a ******* scam




RE: wow
By Dom on 5/9/2006 1:19:12 AM , Rating: 2
I was just logging in to post that. I'm not a fanboy of either. If ATI is more to my liking at the time I buy a new PC, then this is BS. Some features will only work with Nvidia cards. You can have it then; I'm not wasting my money on something I can't upgrade to my liking. Even if I get an Nvidia card initially, later I won't be able to get an ATI card I really like. The way motherboards are tied to video cards these days is just plain retarded. It just takes away from customization.


RE: wow
By chickenselects on 5/9/2006 2:05:32 AM , Rating: 2
If nVidia made the chipset, it only makes sense not to support the competition. Don't think that SLI and Crossfire are the same thing.

my 2 cents


RE: wow
By xsilver on 5/9/2006 2:16:26 AM , Rating: 2
yes
and ATI is doing the same thing
so what are you going to do? get intel eXtreme graphics? ;)


RE: wow
By Furen on 5/9/2006 2:41:54 AM , Rating: 2
Jeez people, this time it ACTUALLY MAKES SENSE to lock non-NVIDIA parts out of using this feature, because it's basically a PCI-e overclock. Perhaps not an overclock in the conventional sense, since I doubt it'll change the operating clock, but changing the specs of the standard for hardware that you can't be sure of and don't care to test would be worse (remember that even on NVIDIA's side only 90nm parts will work with this).


RE: wow
By plewis00 on 5/9/2006 5:08:56 AM , Rating: 2
Exactly, this actually makes sense. You lot would be much more upset if you plugged in an ATi card which it overclocked and consequently crashed the system, wouldn't you?


RE: wow
By Live on 5/9/2006 6:55:39 AM , Rating: 2
No, letting me as a user decide whether I want it on or off makes sense. Not this bull about letting Nvidia decide for me.
It's just another artificial way to segment the market and avoid real competition. That only makes sense for Nvidia's profits, but it doesn't help me as a consumer in any way.


RE: wow
By theprodigalrebel on 5/9/2006 7:46:40 AM , Rating: 2
Again, it's not supported by ALL nVidia cards, just the 90nm ones. Maybe nVidia designed those cards to handle whatever this 'Link Boost' technology will (may?) provide.

Anyway, sounds like marketing-speak. Dual-16X still isn't a compelling improvement (overall) over Dual-8X. Link Boost will 'probably' end up offering a 1-2% improvement in SLI configs running at 1920x1200 with 4xAA... an improvement that will only show up in timedemos and benchmarks.

Kudos to them for not making it available to users who 'want to try it themselves and see if they want it'. They are probably saving people from potential installation headaches.


RE: wow
By GoatMonkey on 5/9/2006 8:17:11 AM , Rating: 2
Just wait for the final product before condemning it. It's possible that some motherboard manufacturers will disable this "feature".


RE: wow
By PAPutzback on 5/9/2006 9:16:19 AM , Rating: 2
I agree. I imagine it will all work out fine. I love my Shuttle box with the nForce chipset and SoundStorm, and the only video cards I have run in it have been ATI: 9500, 9800 and now an X800XL. Never an issue.


RE: wow
By dgingeri on 5/9/2006 11:07:21 AM , Rating: 2
Issue here: the LinkBoost technology says it will "offer increased bandwidth between GPU and MCP," meaning it will probably only boost video/audio decoding and improve HD video playback performance. It likely won't have anything to do with gaming. It also will likely only work with nvidia GPUs because it sends the compressed audio stream directly to the HD audio subsystem to decode. That is why it wouldn't be compatible with ATI video cards. ATI will likely have something similar. It is simply something that is not defined in the PCI Express spec, and would only be an nvidia feature. I dunno if I like the sound of that, personally. Things done outside of the standards usually just affect stability, not performance. Their attempts at making IDE and SATA subsystems faster simply resulted in instability.


When will we stop using the term "Leak"?
By rushfan2006 on 5/9/2006 10:32:04 AM , Rating: 2
I just find it funny that the IT news media (not just this site but all over) use the term "leak". Every time I hear the term "leaked" I think of government or military information or details being leaked.

Dude, there is no such thing as "leaks" with IT products anymore... the companies WANT the information to get out... it's nothing special... so why do we keep making it sound exciting or groundbreaking by saying "leaked"? lol




RE: When will we stop using the term "Leak"?
By Tebor0 on 5/9/2006 10:42:53 AM , Rating: 2
The word "Leak" helps the advertiser (Dailytech) out just as much as it does the company trying to sell it.


By KristopherKubicki (blog) on 5/9/2006 12:10:31 PM , Rating: 2
NVIDIA's legal department has contacted me before, claiming what I write is illegal. I'm pretty sure they wouldn't lean on me if they wanted the information out there.

That aside, mainstream media already has this information but signed non-disclosure agreements. DailyTech does not sign NDAs so when a manufacturer hands us the information, I think leak is perfectly acceptable.


By Tebor0 on 5/9/2006 2:30:14 PM , Rating: 3
The overall point I think is being made here is that sure, you can call this a "leak," and sure, there may be lawyers out there trying to stop this "leak," but the word "leak" has lost its strength and much of the excitement it used to bring in the IT industry, because it is very apparent that the word is a marketing tool for many and a "leak" quite often really isn't a leak.



Sweet
By Maximilian on 5/8/06, Rating: 0
RE: Sweet
By Sh0ckwave on 5/9/2006 12:38:09 AM , Rating: 2
Looks pretty good, will have to wait and see how it compares to RD580 though.


RE: Sweet
By OrSin on 5/9/2006 9:08:55 AM , Rating: 2
Don't get in too much of an uproar about LinkBoost. I doubt it will have any real advantages for the high-end cards. The PCI-E bus is not full, so any type of boost will most likely be very small (less than 2%). What it might help with is the low-end cards that use system memory. That would make more sense and would give a reason why it would only work with NV cards.
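For what it's worth, here is a rough back-of-the-envelope version of that headroom argument. PCIe 1.x carries about 250MB/s per lane per direction after 8b/10b encoding, so an x16 slot offers roughly 4GB/s; the 100MB/s of GPU upload traffic and the 25% boost factor below are purely assumed numbers for illustration.

/* Rough headroom estimate under assumed traffic and an assumed boost factor. */
#include <stdio.h>

int main(void)
{
    const double lane_mb_s    = 250.0;           /* PCIe 1.x, per lane, per direction */
    const double x16_mb_s     = 16 * lane_mb_s;  /* ~4000 MB/s for an x16 slot */
    const double traffic_mb_s = 100.0;           /* assumed GPU upload traffic */
    const double boost_factor = 1.25;            /* hypothetical LinkBoost speed-up */

    double busy_ms_stock = traffic_mb_s / x16_mb_s * 1000.0;
    double busy_ms_boost = traffic_mb_s / (x16_mb_s * boost_factor) * 1000.0;

    printf("link utilization at stock: %.1f%%\n", traffic_mb_s / x16_mb_s * 100.0);
    printf("bus time per second: %.1f ms stock vs %.1f ms boosted\n",
           busy_ms_stock, busy_ms_boost);
    return 0;
}

With those assumptions the link is only about 2.5% utilized, so shaving a few milliseconds of bus time per second can't move frame rates much, which lines up with the small real-world gains guessed at above.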


RE: Sweet
By Trisped on 5/9/2006 1:29:13 PM , Rating: 2
Not to mention the fact that NVIDIA likes to release "finished" products in a beta-testing stage, like with the 4-GPU cards. I hope early adopters are still able to use the boards without problems. I would hate to upgrade my motherboard just to end up with one that doesn't work with my 7900GTX SLI setup.


RE: Sweet
By akugami on 5/9/2006 3:15:28 PM , Rating: 2
I also can't help but get the feeling this is nothing more than an SLI booster and for low-end cards that use system memory. I believe ATI does something similar when using their cards in Crossfire mode, but don't quote me on that. For mid-range systems and high-end single-card systems, this is likely to do nothing.


Wasn't activearmor a bust?
By Dfere on 5/9/2006 8:11:19 AM , Rating: 2
I thought the Nvidia firewall tech was nothing but a headache anyways?

Also- won't this take several months to work the bugs out when it eventually does hit? If you don't opt for SLI what would be the advantage?




RE: Wasn't activearmor a bust?
By Trisped on 5/9/2006 1:32:19 PM , Rating: 2
Same for the first year+ of SLI; that didn't stop people from spending 1200+ to get the setup.

SLI had both major problems and minor incentives. I wouldn't worry about this stuff being any different.


Do you mean traffic <Sharing>?
By sxr7171 on 5/9/2006 7:11:48 PM , Rating: 2
I think that's what you mean.




I don't
By smn198 on 5/10/2006 8:47:21 AM , Rating: 2
^


I am permanently boycotting NVidia motherboards
By AlexWade on 5/9/2006 8:27:21 AM , Rating: 1
I am currently using an nForce3 motherboard, and NVidia has forgotten all about me. Their Windows XP drivers have a major bug: if you use a SATA drive, all your drives are slow. Does NVidia care? No. And they never released a final nForce3 driver for Windows XP x64. My system still works great; why should I have to upgrade just to get good drivers? If it happens once, it will happen to you again. So, I will never own another NVidia motherboard. ATI's and NVidia's upper-end chipsets are always about the same, so I won't be losing much.




By mxzrider2 on 5/9/2006 11:28:17 PM , Rating: 2
Yeah, I don't think you're right. My nForce3 board works just fine with the drivers on nvidia.com, works just as well as my nForce4. The thing I think is weird was when I was getting drivers for an old system with an nF2, it had an x64 version. How does that work, x64 with a 32-bit proc? What?


By R3MF on 5/9/2006 8:24:02 AM , Rating: 2
Oh, how I long for an mATX motherboard with the 570 chipset with:
> 8x electrical (16 physical)
> 8x electrical (16 physical)
> 4x electrical
> 4x electrical

They will eventually release the X-Fi and PPU on PCI-E...




OO!
By Alphafox78 on 5/9/2006 4:31:04 PM , Rating: 2
Soundstorm!!!! OOOOH!!!!
God be praised!
It's probably castrated with AC'97 audio...




Conroe + Soundstorm
By AggressorPrime on 5/9/2006 5:06:43 PM , Rating: 2
Yay! Now my next soundstorm (I have an Athlon XP with nForce 2 now) can come with my next CPU (cheap Conroe).




"This week I got an iPhone. This weekend I got four chargers so I can keep it charged everywhere I go and a land line so I can actually make phone calls." -- Facebook CEO Mark Zuckerberg

Related Articles
NVIDIA 2006 Core Logic Roadmap
April 17, 2006, 7:41 AM
NVIDIA Announces nForce 500 Series
March 7, 2006, 10:19 AM












