


The GeForce 6150 and nForce 430 are updated as the "MCP68" GeForce 7050 and nForce 630a, now a single-chip design

NVIDIA has another chipset in the works – the GeForce 7050 and nForce 630a, also known as MCP68. The new GeForce 7050 and nForce 630a will arrive in time to compete with AMD’s upcoming RS690 family. AMD’s RS690 hit a snag and, unfortunately, has yet to launch in mass quantities.

Although the GeForce 7050 and nForce 630a appear to be a brand-new chipset, the feature set is similar to current GeForce 6150 and nForce 430 offerings. It does feature a GeForce 7-series-derived graphics core, as does the MCP61 series, though the MCP61 family carries the GeForce 6100 moniker.

NVIDIA’s new GeForce 7050 integrated graphics core features native HDMI and DVI support. Integrated HDCP keys allow the GeForce 7050 series to display protected video content, when connected to an HDCP-compliant display of course. PureVideo HD video processing is supported for hardware acceleration of VC-1 and H.264 video formats. It is up to motherboard manufacturers to decide the video output capabilities of a GeForce 7050 IGP-based motherboard, though.

Although GeForce 7050 and nForce 630a based motherboards feature a GeForce 7-series IGP, there are still plenty of expansion options, such as a single full-speed PCIe x16 slot. Other notable features of the GeForce 7050 and nForce 630a combination include dual-channel DDR2-533/667/800 support, PCIe x1 and PCI slots, high-definition audio, ten USB 2.0 ports, four SATA 3.0Gbps ports with RAID, and Gigabit Ethernet.
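Because the IGP has no dedicated frame buffer and carves its memory out of system RAM, the supported DDR2 speed largely bounds graphics throughput. A minimal back-of-the-envelope sketch in Python, assuming standard 64-bit-per-channel DDR2 transfer rates rather than NVIDIA-published chipset numbers:

# A minimal sketch (assumed standard JEDEC DDR2 figures, not NVIDIA-published
# chipset numbers) of the peak memory bandwidth the shared-memory IGP has to
# work with at each supported DDR2 speed.

CHANNEL_WIDTH_BYTES = 8   # each DDR2 channel is 64 bits wide
CHANNELS = 2              # dual-channel configuration

for mt_per_s in (533, 667, 800):          # DDR2-533/667/800 transfer rates
    peak_gb_s = mt_per_s * 1e6 * CHANNEL_WIDTH_BYTES * CHANNELS / 1e9
    print(f"DDR2-{mt_per_s}: ~{peak_gb_s:.1f} GB/s peak, shared between "
          f"the CPU and the integrated graphics core")

# Expected output: DDR2-533 ~8.5 GB/s, DDR2-667 ~10.7 GB/s, DDR2-800 ~12.8 GB/s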



Comments



This reminds me of GF4MX...
By bottle23 on 2/14/2007 6:19:50 AM , Rating: 2
People thought they were getting a budget GF4-series solution. They were not happy when enthusiasts told them the GF4MX was nothing more than an improved GF2MX (and that it had none of the GF4 features found on the more expensive mainstream lines).

What was the sweet-spot back then? GF4 Ti4200?




RE: This reminds me of GF4MX...
By StevoLincolnite on 2/14/2007 7:48:40 AM , Rating: 2
The GeForce MX440 was actually a decent performer back in the day; the only thing it lacked was pixel shading.
Taken from Wikipedia about vertex shading:

"It should be noted that older versions of the drivers (versions 53.xx for example) for the NV18 based GeForce4 MX and GeForce4 Go supported vertex shader model 1.1 via hardware assisted software emulation, however at some point this support was dropped completely. The newer drivers report that they support vertex shader model 0.0. On certain games which are able to take advantage of vertex shading, using the older drivers can actually result in a significant performance increase."
Original source: http://en.wikipedia.org/wiki/GeForce_4_Series

The MX 440 did support some technologies only found in the Ti series, plus one thing the Ti series lacked, not to mention that it also outperformed the GeForce FX 5200, and in certain instances all GeForce 2s and the Radeon 9000. The MX440 even runs Doom 3 and Half-Life 2. Not bad for an old card :)

The technologies it shares with the Ti series are:
LMA II (Lightspeed Memory Architecture II)
Accuview antialiasing engine
Visibility subsystem (doesn't bother to render scenes you can't see)
nView multi-display technology

It also has a full MPEG-2 decoder, which is something the Ti series lacked (hardware-accelerated DVD playback, for instance, meaning low CPU usage).
The GeForce 4 MX440 was a great card for the price. It was released in 2002, and updates to the card continued through 2004 or maybe later.

It's still true that at heart it sides with the GeForce 2, but the GeForce 2s were great cards back in the day.


RE: This reminds me of GF4MX...
By mino on 2/14/2007 9:03:30 AM , Rating: 2
No offence, but after the R9700 introduction the MX440 found itself competing with the R8500LE and later the R9000, which were simply superior solutions.

The problem with the naming was that MANY people actually bought a GF4 (i.e. an MX420) because it was said to be superior to anything at the time, and got just an overclocked (and tweaked) GF2 which was already obsolete. And mentioning the FX5200 as justification is crazy; that was one of the most consumer-screwing products as far as I remember.


RE: This reminds me of GF4MX...
By StevoLincolnite on 2/14/2007 10:10:00 AM , Rating: 2
The Radeon 9000 was released months after the MX440.
Still, in some cases the MX440 outclassed the Radeon 9000. But why bother buying the 9000? The MX440 was cheaper (good for people on an uber-tight budget), plus the Radeon 9000 didn't have stellar performance; in fact I doubt you could run many games in all their pixel-shader glory.
Plus, with the advent of the MX440-8X, NVIDIA increased the memory clock speed on the cards, yielding better performance again.
Everyone complained about the MX440, yet ATI did the same thing with the Radeon 9xxx series: the Radeon 9000 and 9200 were based upon a modified Radeon 8500. What's the difference? Double standards? I know the Radeon had pixel shading, but a card that slow would probably be better off without it, perhaps? (I haven't yet looked up benchmarks on pixel-shader-enabled games for those cards.)

And the GeForce 2/4 MX had support for pixel shading of a sort, if you count the GeForce 2's primitive NVIDIA Shading Rasterizer.


RE: This reminds me of GF4MX...
By KernD on 2/14/2007 11:28:00 AM , Rating: 2
The GF4 MX was the worst thing that could happen; it has forced game developers to support the fixed pipeline even to this day, because a lot of young gamers still have them in their dad's old computer... so when we make games based on animated movies, for example, we're asked to support these cards.


RE: This reminds me of GF4MX...
By StevoLincolnite on 2/14/2007 1:31:53 PM , Rating: 2
That also means the Radeon 9000/9200 series "held back game developers," as it only supported Pixel Shader 1.4.
But then again, games like Oblivion don't support it out of the box, do they? No, you are required to get OldOblivion.
A lot of gamers still had a GeForce 2 when the GeForce 4 MX came rolling around; people just upgraded to the beefier hybrid GeForce 4/2. And what about Intel's graphics market? They currently hold about 40% of graphics systems sold. Why hasn't that stunted the industry? The developers could always have put a bit more time into using the GeForce 2 registers to get the "primitive" pixel shaders working via NVIDIA's shading rasterizer.

Saying it's the worst thing that could happen is a little much. It was released in an era when there were hardly any pixel-shader games, and games like Morrowind ran fine on it. WarCraft 3 and World of Warcraft work fine on the GeForce 4 MX, as do Half-Life 2 and Doom 3; I even got it to run Far Cry. The only problem I can see for developers when it comes to pixel shaders versus fixed function is when you try to apply pixel or vertex shaders to colour data, and where this problem becomes more apparent is with bump mapping. I can see the issues you may be suggesting, but that's only an issue for lazy developers. To maximize profits you NEED to support the largest possible market, which unfortunately does date back to the TnL era with no pixel or vertex shading.

And games based on animated movies? Well, a lot of them, I'm afraid, are something kids want and enjoy, and not everyone can afford the latest and greatest, or to pay a rather large (in my opinion) cost for a game only to find their system cannot play it.

I know a lot of people with sub-2GHz Pentium 4 or Athlon XP systems, GeForce 2/4 MX cards, and 256MB of RAM. Yet the rest of the machine doesn't hold the developers back? That gets me. Instead let's flame the MX440 ;) I mean, it only provided a cheap solution that gave good performance and had good features back in 2002.


By thecoolnessrune on 2/15/2007 8:58:39 PM , Rating: 2
My comp has a GeForce 4 MX 420 and my bro has a 440. They are REALLY bad today, but back then they were great. And as you said, I can get 40 FPS in HL2 at 1024x768 with everything on low.


RE: This reminds me of GF4MX...
By PrinceGaz on 2/14/2007 8:17:22 AM , Rating: 2
The thing is, there is no real difference between the GeForce 6 architecture and the GeForce 7; they have basically identical feature sets. At launch the GF7 appeared to introduce transparency anti-aliasing, which the GF6 did not support, but it turned out the GF6 also supported transparency AA once they enabled it in the 91.47 and later drivers. As such, GeForce 6 and 7 cards should really all be grouped together as the GeForce 6/7 generation and only differentiated by performance.

So whether they call it the GeForce 6100 or 7050 is irrelevant, as they both mean the same thing: a very low-performance model of the GF6/7 generation. At least they lowered the model number from x100 to x050 when they moved it from the GF6 to the GF7 group.


RE: This reminds me of GF4MX...
By Samus on 2/14/2007 1:04:37 PM , Rating: 2
The GeForce 7 runs cooler (many don't even have fans) and has H.264 acceleration, a very important feature.


RE: This reminds me of GF4MX...
By Samus on 2/14/2007 1:03:40 PM , Rating: 2
NVIDIA sweet spots:

all were ~$200 at launch

GeForce2 GTS
GeForce3 Ti200
GeForce4 Ti4200 128MB (~$30 more than the 64MB)
GeForce FX 5700 (with 128-bit memory)
GeForce 6600GT
GeForce 7900GS


...
By yehuda on 2/14/2007 10:42:44 AM , Rating: 2
Hi, a minor correction here -

The official name for MCP61P is GeForce 6100 and nForce 430. It's 430, not 405.




RE: ...
By Anh Huynh on 2/14/2007 11:04:11 AM , Rating: 2
The GeForce 6100 and nForce 430 combination is the MCP51G, a two-chip design. The nForce 430 is also paired up with the GeForce 6150, 6150 LE and 6150 SE. The GeForce 6100 with the nForce 405 or nForce 400 are MCP61 parts.


RE: ...
By johnsonx on 2/14/2007 11:30:24 AM , Rating: 2
Compare and contrast:

http://www.ecs.com.tw/ECSWebSite/Products/Products...

That's the single chip MCP61P aka GeForce 6100 & nForce 405.

http://www.ecs.com.tw/ECSWebSite/Products/Products...

That's the dual-chip MCP51G, aka GeForce 6100 & nForce 410. The higher-featured nForce 430 is usually paired with the higher-clocked GeForce 6150 (though I have seen a few 6100/430 boards). In any case, it appears the single-chip MCP61P will continue to replace the dual-chip setup.


RE: ...
By Anh Huynh on 2/14/2007 11:59:07 AM , Rating: 2
You are correct. My mistake. It would seem the complete GeForce 6100 line-up was replaced with MCP61 variants, which is a bit annoying since they kept the same names.


RE: ...
By johnsonx on 2/15/2007 2:46:30 AM , Rating: 2
Actually Anh, I was adding additional information to your correction of the OP. I wasn't correcting you at all. The only way you were remotely mistaken was not mentioning that the 410 is the common companion to the old GeForce 6100/MCP51G, but I wasn't being that nit-picky!

Agreed on the naming thing. It's especially confusing that the single-chip MCP61P has two names even though it's one chip, and doubly so that the names are the same (or nearly the same) as the previous MCP51G two-chip set. Add to that the fact that nVidia now has different sets of drivers for the different flavors, and you've got product-name hell. The new rename is welcome for that reason alone, and probably should have been done from the beginning.

I still don't see why one chip needs two names though.


Not quite like ATI
By MonkeyPaw on 2/14/2007 7:45:29 AM , Rating: 2
If the article is referring to the X300 and the "new" X1100 being the same thing with different names, that's not totally true. While they are indeed the same core, the X1100 runs 33% faster than the X300, according to ATI.

I really like my 6100 IGP system, but it will not compare to the RS690--should we ever see it! What really disappoints me is that this is nVidia's apparent answer to the RS690. It also suggests that they've got nothing for Intel either. I fail to understand their apparent lack of desire to fill this market.




RE: Not quite like ATI
By Conroe on 2/14/2007 8:35:14 AM , Rating: 2
I've read NVIDIA is making an Intel IGP. It's probably better for it to spend its resources in a market where it has no IGP. I have a 6100 mobo and have found it lacking. It's too slow to use any of the good features; I have to run just about any game at the lowest settings to get lousy FPS. That rig is just for the internet. I do hope to see faster IGPs in the future. Maybe the nForce Intel IGP will be a pleasant surprise?


RE: Not quite like ATI
By therealnickdanger on 2/14/2007 8:37:43 AM , Rating: 2
I'd like to know where NVIDIA is going to stand with integrated DX10-level graphics. We know Intel is going there and AMD is as well. Where's the 8100IGP?


RE: Not quite like ATI
By shabodah on 2/14/2007 9:00:22 AM , Rating: 2
Yesterday's review links had the ECS 690 board reviewed against the 6150 IGP boards, and really, the difference in performance is minimal. It was a little better, but about as much better as the 6150 is compared to the 6100. Certainly nothing to write home about.


PureVideo is supported by all GeForce 6100
By 13Gigatons on 2/14/2007 9:30:25 AM , Rating: 2
According to Nvidia's site: http://www.nvidia.com/page/gpu_mobo_tech_specs.htm...

I think the difference is SDTV vs HDTV.

This of course brings up the question of why the hell they need so many crappy versions of the IGP.

They all pretty much suck compared to video cards, there isn't much price difference between a 6100 and a 6150 mobo, and yet they have 31 flavors.

Just offer the GeForce 6150 and MCP430 and dump the rest.




RE: PureVideo is supported by all GeForce 6100
By Mitch101 on 2/14/2007 2:38:12 PM , Rating: 2
I agree.

Unless the GeForce 6100 is some sort of failed 6150 that doesn't operate 100%, making it a 6100, there is no reason for the 6100 going forward.


RE: PureVideo is supported by all GeForce 6100
By Anh Huynh on 2/14/2007 3:10:28 PM , Rating: 2
The GeForce 6150, while it may appeal more to enthusiasts, most likely has a smaller market share than the GeForce 6100. It may be more powerful and have more features, but it costs more. The GeForce 6100 is low-cost, single-chip, and runs Windows Aero well enough. It also allows OEMs to ship low-cost or business-oriented AMD systems, which is the bread and butter of the market.


By 13Gigatons on 2/15/2007 8:57:11 AM , Rating: 2
You're missing the point: the price difference is zero between the GeForce 6100 and GeForce 6150. Then there are the MCP405, 410, 430, etc.

NVIDIA would save a lot of confusion if they just offered the GeForce 6150 and MCP430. Looking at motherboard prices, it's $80 for a 6100 and $80 for a 6150. This does nothing but add confusion for the consumer.

Click the link above and you will see what a joke NVIDIA's 31 flavors of IGP are; Intel, on the other hand, just offers the X3000.


runs Vista Aero Glass just fine
By johnsonx on 2/14/2007 11:42:51 AM , Rating: 2
I just delivered my first customer system with Vista, and I can tell you that the MCP61P runs Aero perfectly at the standard LCD res of 1280x1024. BUT, you do have to give the IGP 128MB of RAM. When I had it set for only 64MB, it would only run Aero up to 1024x768.

With 128MB for the IGP, the Windows Experience score for the graphics subsystem is 3.0. Any less RAM than that and the score drops to 2.4. This is with DDR2-533 RAM, the cheap and safe choice for office systems. I don't know if the IGP performs better with faster RAM, but I presume it would.
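A rough sanity check of the 64MB-vs-128MB behaviour described above, as a minimal Python sketch. The per-pixel arithmetic is straightforward; the 1,310,720-pixel cutoff is the commonly cited Vista Aero guidance and should be read as an assumption, not an NVIDIA or Microsoft spec quoted from this article.

# A minimal sketch of why a 64MB carve-out struggles at 1280x1024: raw
# 32-bit surface sizes per resolution. The 1,310,720-pixel cutoff is the
# commonly cited Vista Aero guidance (an assumption here, not a chipset spec).

def surface_mb(width, height, bytes_per_pixel=4):
    """Size of one 32-bit colour surface at the given resolution, in MB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

for width, height in ((1024, 768), (1280, 1024)):
    pixels = width * height
    print(f"{width}x{height}: {pixels:,} pixels, "
          f"~{surface_mb(width, height):.1f} MB per surface")

# 1024x768  ->   786,432 pixels, ~3.0 MB per surface
# 1280x1024 -> 1,310,720 pixels, ~5.0 MB per surface
# Aero (DWM) keeps an off-screen surface per open window on top of the
# front/back buffers, so the working set climbs well past a single frame,
# consistent with needing the 128MB setting at 1280x1024.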

This is all a bit of a relief to actually see, as I've been telling customers for over a year that all my GeForce 6100 boxes would be Vista Ready. I don't know if any of them will actually want an upgrade, but it's nice to know that it will work.

Has anyone else reading run Vista on a single channel socket-754 GeForce 6100 board? I've got dozens of those in the field, but none available for testing ATM.




RE: runs Vista Aero Glass just fine
By Slack3r78 on 2/14/2007 12:04:27 PM , Rating: 2
quote:
This is with DDR2-533 RAM, the cheap and safe choice for office systems.

Why in the world are you still using DDR2 533? 667 has been going for the same price as 533 for ages now and I've seen zero stability difference between the two.


By johnsonx on 2/14/2007 11:46:02 PM , Rating: 2
I've been burned a couple of times using DDR2-667, so I now default to DDR2-533 for non-performance-critical systems (you know, regular run-of-the-mill office systems). I have no doubt that a BIOS update could have fixed the problems I encountered, but the fact was there was no fix at the time. In both cases swapping in DDR2-533 solved the problem. For the minuscule additional performance, DDR2-667 hasn't been worth the potential headache so far.

You're right that the price has largely equalized now, so perhaps I shouldn't have said 'cheap'.


RE: runs Vista Aero Glass just fine
By Webgod on 2/14/2007 1:53:40 PM , Rating: 2
I've seen a GF 6200TC with 128MB of RAM on the card get a 3.0.


HDCP over DVI
By hellokeith on 2/14/2007 10:26:34 AM , Rating: 2
I wonder if this board supports HDCP over DVI? That would be a boon to HTPC builders who want HD DVD & BD playback.




RE: HDCP over DVI
By Webgod on 2/14/2007 1:57:40 PM , Rating: 2
Not really. It probably has built-in MPEG-2 acceleration which would do ok for the earliest MPEG-2 Blu-Ray movies. But I doubt NVIDIA would put VC-1 or H.264 acceleration in something this cheap, which you'd want for guaranteed performance with everything else Blu-Ray or HD-DVD.


"I'd be pissed too, but you didn't have to go all Minority Report on his ass!" -- Jon Stewart on police raiding Gizmodo editor Jason Chen's home













