
The GeForce 6150 and nForce 430 are updated as the "MCP68" GeForce 7050 and nForce 630a, now a single-chip design

NVIDIA has another chipset in the works – the GeForce 7050 and nForce 630a, also known as MCP68. The new GeForce 7050 and nForce 630a will arrive in time to compete with AMD’s upcoming RS690 family. Unfortunately, AMD’s RS690 has hit a snag and has yet to ship in mass quantities.

Although the GeForce 7050 and nForce 630a appear to be a brand new chipset, the feature set is similar to that of the current GeForce 6150 and nForce 430 offerings. It does feature a graphics core derived from the GeForce 7 series, as does the MCP61 series, though the MCP61 family carries the GeForce 6100 moniker.

NVIDIA’s new GeForce 7050 integrated graphics core features native HDMI and DVI support. Integrated HDCP keys allow the GeForce 7050 series to display protected video content – when connected to an HDCP-compliant display, of course. PureVideo HD video processing is supported for hardware acceleration of the VC-1 and H.264 video formats. It is up to motherboard manufacturers to decide the video output capabilities of a GeForce 7050 IGP-based motherboard, though.

Although GeForce 7050 and nForce 630a based motherboards feature a GeForce 7-series IGP, there are still plenty of expansion options, such as a single full-speed PCIe x16 slot. Other notable features of the GeForce 7050 and nForce 630a combination include dual-channel DDR2-533/667/800 support, PCIe x1 and PCI slots, high-definition audio, ten USB 2.0 ports, four SATA 3.0Gbps ports with RAID, and Gigabit Ethernet.

Comments

This reminds me of GF4MX...
By bottle23 on 2/14/2007 6:19:50 AM , Rating: 2
People thought they were getting a budget GF4-series solution. They were not happy when enthusiasts told them the GF4MX was nothing more than an improved GF2MX (with none of the GF4 features found on the more expensive mainstream lines).

What was the sweet-spot back then? GF4 Ti4200?

RE: This reminds me of GF4MX...
By StevoLincolnite on 2/14/2007 7:48:40 AM , Rating: 2
The GeForce4 MX440 was actually a decent performer back in the day; the only thing it lacked was pixel shading.
Taken from Wikipedia about vertex shading:

"It should be noted that older versions of the drivers (versions 53.xx for example) for the NV18 based GeForce4 MX and GeForce4 Go supported vertex shader model 1.1 via hardware assisted software emulation, however at some point this support was dropped completely. The newer drivers report that they support vertex shader model 0.0. On certain games which are able to take advantage of vertex shading, using the older drivers can actually result in a significant performance increase."
The MX440 did support some technologies found only in the Ti series, plus one thing the Ti series lacked, and it also outperformed the GeForce FX 5200, all GeForce 2s, and, in certain instances, the Radeon 9000. The MX440 even runs Doom 3 and Half-Life 2. Not bad for an old card :)

The technologies it shares with the Ti series are:
LMA II (Lightspeed Memory Architecture II)
Accuview antialiasing engine
Visibility subsystem (doesn't bother to render scenes you can't see)
nView multi-display technology

It also has a full MPEG-2 decoder, which is something the Ti series lacked (hardware-accelerated DVD playback, for instance, meaning low CPU usage).
The GeForce4 MX440 was a great card for the price. It was released in 2002, and updates to the card continued through 2004 or maybe later.

It's still true that at its heart it sides with the GeForce 2, but the GeForce 2s were great cards back in the day.

RE: This reminds me of GF4MX...
By mino on 2/14/2007 9:03:30 AM , Rating: 2
No offence, but after the R9700 introduction the MX440 found itself competing with the R8500LE and later the R9000, which were simply superior solutions.

The problem with the naming was that MANY people actually bought a GF4 (i.e. an MX420) because it was said to be superior to anything at the time, and got just an overclocked (and tweaked) GF2, which was already obsolete. And to mention the FX5200 as justification is crazy; that was one of the most consumer-screwing products as far as I remember.

RE: This reminds me of GF4MX...
By StevoLincolnite on 2/14/2007 10:10:00 AM , Rating: 2
The Radeon 9000 was released a year or two after the MX440.
Still, in some cases the MX440 outclassed the Radeon 9000. But why bother buying the 9000? The MX440 was cheaper (good for people on an uber budget), plus the Radeon 9000 didn't have stellar performance. In fact, I doubt you could run many games in all their pixel shader glory.
Plus, with the advent of the MX440-8X, NVIDIA increased the memory clock speed on the cards, yielding better performance again.
Everyone complained about the MX440, yet ATI did the same thing with the Radeon 9xxx series: the Radeon 9000 and 9200 series were based upon a modified Radeon 8500. What's the difference? Double standards? I know the Radeon had pixel shading, but perhaps a card that slow would be better without it? (I haven't yet looked up benchmarks on pixel-shader-enabled games for those cards.)

And the GeForce 2/4 MX had support for pixel shading of a sort, if you count the GeForce 2's primitive NVIDIA Shading Rasterizer.

RE: This reminds me of GF4MX...
By KernD on 2/14/2007 11:28:00 AM , Rating: 2
The GF4 MX was the worst thing that could happen. It forces game developers to support the fixed pipeline even to this day, because a lot of young gamers still have them, in their dad's old computer... so when we make games based on animated movies, for example, we're asked to support these.

RE: This reminds me of GF4MX...
By StevoLincolnite on 2/14/2007 1:31:53 PM , Rating: 2
That also means the Radeon 9000/9200 series "held back game developers", as it only supported Pixel Shader 1.4.
But then again, games like Oblivion don't support it out of the box, do they? No, you are required to get OldOblivion.

A lot of gamers still had a GeForce 2 when the GeForce 4 MX came rolling around; people just upgraded to the beefier hybrid GeForce 4/2. And what about Intel's graphics market? They currently hold about 40% of graphics systems sold. Why hasn't that stunted the industry? Developers could have always put a bit more time into using the GeForce 2 registers to get the "primitive" pixel shaders working via NVIDIA's shading rasterizer.

Saying it's the worst thing that could happen is a little moot. It was released in an era when there were hardly any pixel shader games, and games like Morrowind ran fine on them. Warcraft 3 and World of Warcraft work fine on the GeForce 4 MX, as do Half-Life 2 and Doom 3; I even got it to run Far Cry. The only problem I can see for developers when it comes to pixel shaders versus fixed function is when you try to apply pixel/vertex shaders to colour data; the problem becomes more apparent with bump mapping. I can see the issues you may be suggesting, but that's only an issue for lazy developers. To maximize profits you NEED to support the largest possible market, which unfortunately does date back to the TnL era with no pixel or vertex shading.

And games based on animated movies? Well, a lot of them, I'm afraid, are something kids want and enjoy. And not everyone can afford the latest and greatest, or to pay a rather large (in my opinion) cost for a game, only to find their system cannot play it.

I know a lot of people with sub-2GHz Pentium 4 and Athlon XP systems, GeForce 2/4MX cards, and 256MB of RAM. Yet the rest of the machine doesn't hold developers back? That gets me. Instead, let's flame the MX440 ;) I mean, it only provided a cheap solution that gave good performance and had good features back in 2002.

By thecoolnessrune on 2/15/2007 8:58:39 PM , Rating: 2
My comp has a GeForce4 MX 420 and my bro has a 440. They are REALLY bad today, but back then they were great. And as you said, I can get 40FPS in HL2 at 1024x768 with everything on low.

RE: This reminds me of GF4MX...
By PrinceGaz on 2/14/2007 8:17:22 AM , Rating: 2
The thing is, there is no real difference between the GeForce 6 architecture and the GeForce 7; they have basically identical feature sets. At launch the GF7 appeared to introduce transparency anti-aliasing, which the GF6 did not support, but it turned out the GF6 also supported transparency AA once it was enabled in the 91.47 and later drivers. As such, GeForce 6 and 7 cards should really all be grouped together as the GeForce 6/7 generation and differentiated only by performance.

So whether they call it the GeForce 6100 or 7050 is irrelevant, as they both mean the same thing: a very low-performance model of the GF6/7 generation. At least they lowered the model number from x100 to x050 when they moved it from the GF6 to the GF7 group.

RE: This reminds me of GF4MX...
By Samus on 2/14/2007 1:04:37 PM , Rating: 2
GeForce 7 runs cooler (many don't even have fans) and has H.264 acceleration, a very important feature.

RE: This reminds me of GF4MX...
By Samus on 2/14/2007 1:03:40 PM , Rating: 2
NVIDIA sweet spots (all were ~$200 at launch):

GeForce2 GTS
GeForce3 Ti200
GeForce4 Ti4200 128MB (~$30 more than the 64MB)
GeForce FX 5700 (with 128-bit memory)
GeForce 6600GT
GeForce 7900GS

