
MCP73 finally hits the market for Intel processors

NVIDIA is set to launch its long-awaited MCP73 integrated graphics solution for Intel processors – the GeForce 7 series and nForce 600i series. The new GeForce 7 series IGP solutions for Intel processors arrive in three variants – GeForce 7150 and 7100 with the nForce 630i and the GeForce 7050 with nForce 630i or nForce 610i.

The GeForce 7 series IGP features Windows Vista Aero Glass-compatible graphics with DirectX 9 and Shader Model 3.0 support. NVIDIA equips the GeForce 7150 and 7100 IGPs with a native HDMI interface with HDCP compatibility for Blu-ray and HD DVD high-definition video playback. The GeForce 7100 and higher support a single-link DVI output with HDCP compatibility. The cost-effective GeForce 7050 lacks DVI, HDMI and HDCP.

NVIDIA’s flagship GeForce 7150 features a clock speed of over 600 MHz while the lesser GeForce 7100 has a 600 MHz clock speed. The lower GeForce 7050 has a 500 MHz clock speed.

Memory support differs on the GeForce 7150/7100 and 7050. The GeForce 7150 and 7100 feature support for DDR2-800 while the GeForce 7050 is limited to DDR2-667.

All IGPs paired with the nForce 630i feature support for 1333 MHz front-side bus processors. The GeForce 7050 with nForce 610i is limited to 1066 MHz front-side bus processors.
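
For rough context, peak theoretical bandwidth follows directly from these transfer rates. The sketch below is a back-of-the-envelope calculation, assuming a 64-bit (8-byte) single-channel memory bus and an 8-byte-wide front-side bus; actual board configurations may differ.

    // Back-of-the-envelope peak bandwidth from the transfer rates above.
    // Assumption: 64-bit (8-byte) single-channel memory bus and an 8-byte-wide FSB.
    #include <cstdio>

    int main() {
        const double bus_bytes = 8.0;  // 64-bit data path

        printf("DDR2-800, single channel: %.1f GB/s\n",  800.0 * bus_bytes / 1000.0); // 6.4
        printf("DDR2-667, single channel: %.1f GB/s\n",  667.0 * bus_bytes / 1000.0); // ~5.3
        printf("1333 MT/s front-side bus: %.1f GB/s\n", 1333.0 * bus_bytes / 1000.0); // ~10.7
        return 0;
    }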

Other notable features of the GeForce 7 series IGP and nForce 600i series include support for one PCIe x16 slot, two PCIe x1 slots, four SATA 3.0Gbps ports, two PATA channels and RAID. NVIDIA differentiates the nForce 630i and 610i with a few networking, USB and storage features. The nForce 630i supports Gigabit Ethernet, ten USB ports and RAID 0, 1, 0+1 and 5. The nForce 610i is limited to 10/100 Ethernet, eight USB ports and RAID 0 and 1.

NVIDIA also offers a graphics-less nForce 630i with the same features as the IGP variants, including DDR2-800 and 1333 MHz front-side bus support.

Expect NVIDIA GeForce 7 series IGP motherboards for Intel processors to show up in the coming weeks with varying prices. NVIDIA expects the GeForce 7100 with nForce 630i to take on Intel’s G33 Express with a sub-$60 build cost.


Comments

IGP waits for dx10, but for what?
By nerdye on 9/25/2007 12:31:44 AM , Rating: 3
Sure, many people are disappointed that we have yet to see integrated DX10 graphics solutions from the likes of NVIDIA, ATI, and Intel, but as a gamer, what is the point? ATI and NVIDIA have very nice dedicated high-end DX10 cards right now in the 2900 and 8800 series respectively, but who wants an underpowered 2600 or 8600 when DX9 games are still prominent and higher-powered 7950 GTs and X1950 XTs sit in the same price range? The same argument can be made in the IGP market: what good is a DX10 GPU if it has no power to play games? ATI, NVIDIA, Intel, give us full HD decode in an IGP solution and you will see the sales, and people will finally stop complaining about graphics power that will never be there in an IGP solution.




RE: IGP waits for dx10, but for what?
By iGo on 9/25/2007 1:15:14 AM , Rating: 2
I think Intel already has IGPs that are DX10 compliant: the X3500 (on the G35 chipset) and the X3100 (mobile variant of the X3000, but DX10 compliant). Though I haven't seen any board with the X3500 yet.


RE: IGP waits for dx10, but for what?
By Anonymous Freak on 9/25/2007 2:29:44 AM , Rating: 2
G35 (and, Intel claims, also G965 and GM965) will be DX10 capable with a software update, not on launch. (Latest estimate I've seen is early 2008.) Heck, the 15-month-old X3000 in the G965 only just got T&L support a month ago! (And still doesn't have it in Vista.)

So, in short, all of Intel's "X3x00" series IGPs are supposedly hardware-capable of DX10, but the software hasn't come out yet. (Previous Intel IGPs had no real hardware rendering at all, they were just hardware conduits for software rendering. The X-series contains real T&L hardware, but still relies on the main processor for a bit of work, so the drivers have to have DX10 'software' in them.)
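
To make that concrete, here is a minimal, hypothetical C++ sketch (not from any post here) that asks the installed Direct3D 9 driver which shader model it actually exposes; the hardware may be capable of more, but an application only sees what the driver reports. A DX10 check works differently: the application attempts to create a Direct3D 10 device, and the call simply fails when the driver or hardware cannot support it.

    // Query the shader model exposed by the installed Direct3D 9 driver.
    // Illustrative sketch only; build against the DirectX 9 SDK and link d3d9.lib.
    #include <d3d9.h>
    #include <cstdio>

    int main() {
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;

        D3DCAPS9 caps;
        if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
            // The version DWORDs pack major/minor numbers in the low two bytes.
            printf("Vertex shader %lu.%lu, pixel shader %lu.%lu\n",
                   (caps.VertexShaderVersion >> 8) & 0xFF, caps.VertexShaderVersion & 0xFF,
                   (caps.PixelShaderVersion  >> 8) & 0xFF, caps.PixelShaderVersion  & 0xFF);

            if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
                printf("Driver exposes at least Shader Model 3.0\n");
        }
        d3d->Release();
        return 0;
    }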


RE: IGP waits for dx10, but for what?
By defter on 9/25/07, Rating: 0
By IntelUser2000 on 9/25/2007 4:42:50 AM , Rating: 3
quote:
G35/G965/GM965 will NOT support shader model 4.0 which is the main new feature of DX10. Just having a DX10 driver available doesn't mean that the chip supports all DX10 features...


And you are sure of this because how?? G35/G965/GM965 all support different levels of DirectX. Please stop being ignorant and search for it before posting: http://www.intel.com/products/chipsets/G35/index.h...

http://softwarecommunity.intel.com/articles/eng/14...

G965: DX9 SM3.0, OpenGL 1.5
GM965: DX10 SM4.0, OpenGL 1.5
G35: DX10 SM4.0, OpenGL 2.0


RE: IGP waits for dx10, but for what?
By ET on 9/25/2007 4:44:35 AM , Rating: 2
Yes it does. DX10 has very few optional features.


RE: IGP waits for dx10, but for what?
By iGo on 9/25/2007 8:10:24 AM , Rating: 2
Are you sure about G35 not supporting Shader Model 4.0 ???
http://en.wikipedia.org/wiki/Intel_GMA

The table above says it does... can anyone confirm the facts please?


By KristopherKubicki (blog) on 9/25/2007 10:06:29 AM , Rating: 2
It did not on its first incarnation. I believe it does now.


By tcsenter on 9/25/2007 12:05:34 PM , Rating: 2
quote:
G35/G965/GM965 will NOT support shader model 4.0 which is the main new feature of DX10.
Well you have it partially right. DX9.0C and SM3.0 will be the limit for G965 (X3000).

GM965 and GL960 (X3100), and G35 (X3500) will in fact support DX10 with full SM4.0 support in hardware (and support for on-demand software vertex/geometry processing just like in X3000), but Intel won't have a driver ready to enable SM4.0 until Q1/2008 (probably bet on Q2/2008).

quote:
Just having a DX10 driver available doesn't mean that the chip supports all DX10 features...
Apparently, you aren't familiar with the concept of fully programmable unified shaders or execution units.


RE: IGP waits for dx10, but for what?
By ET on 9/25/2007 4:48:35 AM , Rating: 2
quote:
Previous Intel IGPs had no real hardware rendering at all, they were just hardware conduits for software rendering.

That's a gross misunderstanding (or misrepresentation). Only vertex processing was in software. Pixel processing, and all other processing, was of course in hardware. This was true for most integrated graphics over the years, from Intel, NVIDIA, ATI, SiS and VIA (S3) alike.


By cheetah2k on 9/25/2007 1:18:58 AM , Rating: 1
I wonder if we will ever see onboard swappable IGPs with SLI capabilities?

This could be great for those of us who want a small form factor multimedia center but with gaming capabilities surpassing what is currently available in the IGP market now.


RE: IGP waits for dx10, but for what?
By Maskarat on 9/25/2007 5:48:33 AM , Rating: 3
An underpowered, passively cooled 2600/8600 is the ideal card for an HTPC. You would have to be crazy to put a bigger, hotter card in a small form factor enclosure!! IGPs with HDMI and HDCP would further remove the need for a discrete graphics card, especially where gaming is of no issue.

What would be interesting to know is whether the new chipsets support PureVideo. Hardware-accelerated HD content would help push these boards, especially when paired with lower-powered, cooler CPUs!


RE: IGP waits for dx10, but for what?
By Ajax9000 on 9/25/2007 9:39:13 PM , Rating: 2
I've just read the following Nvidia pages and the news is somewhat disappointing.

Summary PDF -- http://www.nvidia.com/object/IO_35712.html
AMD (MCP78) features -- http://www.nvidia.com/object/mobo_gpu_features_ben...
AMD (MCP78) specs -- http://www.nvidia.com/object/mobo_gpu_tech_specs.h...
Intel (MCP73) features -- http://www.nvidia.com/object/mcp_features_benefits...
Intel (MCP73) specs -- http://www.nvidia.com/object/mcp_intel_techspecs.h...

PureVideo HD is only listed for the MCP78 (7050PV+630a) combination; none of the other AMD chipsets, and none of the Intel chipsets, have it.

If, in the future, they release an MCP73 using (say) 7050PV+630i, then memory will be limited to DDR2-667.

There are no details thus far, but it would be good if the new chipset fixes the HD Audio problem that all current HDMI video cards seem to suffer from (i.e. the chipset supports HD Audio, but the video cards can only accept S/PDIF-grade audio for HDMI pass-through).


By Maskarat on 9/26/2007 2:44:45 AM , Rating: 2
I think this is really, really disappointing. Hardware-accelerated HD content is the main reason for putting discrete graphics in an HTPC. Nvidia is already falling behind, in my opinion, on the HD front, especially in quality terms. No doubt, gaming-wise they have very powerful chips, but driver support and HD are definitely lacking.

I really hope AMD/ATI kick it up a notch, especially with their new Linux agenda.


Niiice
By RjBass on 9/24/2007 11:46:22 PM , Rating: 1
In my opinion NVidia has made some of the better onboard graphics solutions. I have an older AMD Socket 754 board down in my shop with onboard NVidia graphics that can take up to 128MB of system memory. When loaded with Vista Ultimate it can run in full Aero mode just fine. Another board with ATI onboard graphics struggles with Aero.




RE: Niiice
By Spuke on 9/25/2007 12:00:52 AM , Rating: 2
Isn't this chipset supposed to support DX10?


RE: Niiice
By Cylee22 on 9/25/2007 12:13:58 AM , Rating: 4
No, because to my knowledge the GeForce 7 series is a DX9 chip. The GeForce 8 series chips are the ones that are DX10 compatible.


RE: Niiice
By The Sword 88 on 9/25/2007 1:13:04 AM , Rating: 2
Exactly


RE: Niiice
By LogicallyGenius on 9/25/07, Rating: -1
RE: Niiice
By wordsworm on 9/25/2007 2:13:49 AM , Rating: 2
quote:
Poor NVIDIA. with multi cores there is no need of special graphics chips, just one core is more than enough to do the job.
I agree. Who needs a chip that's good at floating point and has pixel shaders, etc, for graphics? 2D is enough, right guys?


RE: Niiice
By Samus on 9/25/2007 4:08:25 AM , Rating: 2
you're both cracked.


RE: Niiice
By Kim Leo on 9/25/2007 2:43:53 AM , Rating: 2
Today's AMD/ATI-based boards work just fine, even better than nVidia's, with lower consumption and higher performance. According to some reviews of the two boards, nVidia's usually overclock more, but this should not be an issue since this is mostly a board you would put in the average user's PC.


RE: Niiice
By RjBass on 9/25/2007 10:13:13 AM , Rating: 2
Ya, I wouldn't know about that. Both the boards I have are older, on the 754 socket. I just like the way the Nvidia board performs over the ATI board. But for my main rig I'm running a Radeon X800 XL which, even though it too is older, still performs nicely.


Corporate Synergy
By thestereotype on 9/25/2007 12:05:42 AM , Rating: 2
It seems Intel/NVIDIA have more synergy than AMD/ATI... I thought it was a funny thought.




RE: Corporate Synergy
By AmberClad on 9/25/2007 2:25:46 PM , Rating: 2
You make it sound as if Intel and nVidia are buddy-buddy with each other...

Intel wants a bigger chunk of the graphics market and to expand from just offering low end integrated solutions -- hence Larrabee. Nvidia, on the other hand, wants to take away some of Intel's dominance in the integrated market by offering its own (and better) IGPs.

And in any case, there's a second article today that says Nvidia is also planning to release an AMD IGP solution. So I wouldn't exactly call Intel and Nvidia best buds.


RE: Corporate Synergy
By thestereotype on 9/25/2007 8:48:44 PM , Rating: 2
I was referring more to the struggles AMD/ATI seem to have, despite being one "company."


RE: Corporate Synergy
By tcsenter on 9/27/2007 7:16:40 PM , Rating: 2
quote:
Intel wants a bigger chunk of the graphics market and to expand from just offering low end integrated solutions -- hence Larrabee.
Larrabee in all likelihood will never be a discrete GPU. Rather, you will see it as a co-processor configured or optimized for graphics (and other data sets, depending on the target application), not unlike a hardware game physics processor, or Mercury Computer's Cell-based PCI-E accelerator board. See:

http://www.mc.com/products/productdetail.aspx?id=2...

Beyond that, high-end discrete consumer/gaming graphics was never the primary objective of Larrabee. The Larrabee project is aimed squarely at next-gen terascale HPC and supercomputing applications, with the ultimate goal of total convergence of the GPU, CPU, and certain components of the core logic such as interconnects, memory controller, and I/O.

Anyone who thinks Larrabee is about taking on NVIDIA or ATI in the 3D graphics market can't see the proverbial forest for the trees. Think bigger.
quote:
Nvidia, on the other hand, wants to take away some of Intel's dominance in the integrated market by offering its own (and better) IGPs.
NVIDIA has had better IGPs for 5+ years now. It is Intel that is stepping up its IGP offerings significantly relative to everyone else.
quote:
And in any case, there's a second article today that says Nvidia is also planning to release an AMD IGP solution.
GF 7025/7050 for AMD based on GeForce 7-Series (MCP78) has already been released. The MCP73 IGP is the Intel version of MCP78 IGP.

NVIDIA might add a 'higher end' part to round out the AMD line-up, such as a 7100, but it will be the same design (with higher clocks or something).


RE: Corporate Synergy
By tcsenter on 9/27/2007 8:45:39 PM , Rating: 2
Oops, that should read MCP73 for AMD, not MCP78.


Waiting for dual-mode notebook graphics.
By Anonymous Freak on 9/25/2007 2:40:09 AM , Rating: 2
I'm still waiting for the notebook graphics solutions that use an IGP for 2d or 'on-the-go' rendering, and switch to an external PCI-Express chip for 'high performance' 3d, or plugged-in rendering. (By external I mean an external chip on the notebook's motherboard, not an actual external-to-the-notebook card.) Either AMD/ATI or nVidia was talking about it a few months ago, where you could have (to use an nVidia example,) a low-power IGP 7100 chip (for example) that you would use most of the time, and it would seamlessly transition to a PCI-Express-connected GeForce Go 8800GTX (or something like it) for gaming when plugged in to wall power. (Or if you pick "I know my battery life will only be 5 minutes, but do it anyway" mode.)

This would be the perfect solution for SLI notebooks so they could actually be used on battery. What does the Alienware notebook get, like an hour of life? (I see that it fails the MobileMark battery life test!) It can't even finish a single DVD movie! (89 minutes was the max for a previous-gen model with less-hungry video cards, which wasn't enough to even run a full cycle of the MobileMark test; and it only got 75 minutes while playing a movie.)

I mean, my MacBook Pro may not be a powerhouse, but it does get almost 4 hours of battery life. I'd love it if I could get even longer by trading off 3d game performance. It's not like I play games while on battery power anyway. Whenever I play games on it, it's plugged in.




By FITCamaro on 9/25/2007 7:00:30 AM , Rating: 2
Alienware has been offering that solution for a few years.


Disappointment
By MGSsancho on 9/25/2007 2:50:15 AM , Rating: 2
I am disappointed the budget chip will only have 10/100Mb Ethernet. All of Intel's offerings support Gigabit Ethernet. Hell, all of VIA's new stuff supports gigabit networks. Good thing none of these will pop up in Dell computers on business networks. I do not feel like I must explain the advantages of this. Oh well, I'm not the decision maker at NVIDIA.




RE: Disappointment
By Utoryo on 9/25/2007 5:38:33 AM , Rating: 2
Uhm, are you sure you know what you're talking about? Unless VIA's website is massively outdated, *none* of their southbridges integrate Gigabit ethernet. It's all 10/100: http://via.com.tw/en/products/chipsets/southbridge...

However, as noted in that table, they support a separate Gigabit Ethernet controller chip. Since you say VIA motherboards have Gigabit Ethernet, I can only presume they tend to add that tiny bit of extra cost to most of their motherboards.

It is important to realize, however, that motherboard manufacturers could add such a Gigabit controller to a 7050-based motherboard just as well. This is not a disadvantage for MCP73 - quite on the contrary, compared to VIA, integrating Gigabit at all in the higher-end models is an advantage. Compared to Intel, this is of course a disadvantage - but the 7050 will presumably be priced lower than any Intel chipset.


Took their sweet time!
By fk49 on 9/24/2007 11:16:14 PM , Rating: 2
Finally, some decent Core 2 boards for the $50-$75 range...this is where AMD still beats out Intel as a platform.

I've been needing one of these for a mini-pc. Hopefully it overclocks as well as its siblings!




Finally
By AcAuroRa on 9/25/2007 12:18:10 AM , Rating: 2
It's nice to see that nVidia / Intel finally released a good combination. A friend of mine just built a small HTPC and used an Abit AMD mATX mobo with HDMI that uses the 7050. I believe the model is the AN-AM2 HD (the AN-AM2 is the HDMI-less version of the AN-AM2HD, with DVI instead and no FireWire).




7150 for AMD?
By mxnerd on 9/25/2007 1:29:05 AM , Rating: 2
Anyone know if 7150 IGP will be available for AMD?
Only 7050 is available at this time.




By Utoryo on 9/25/2007 5:49:09 AM , Rating: 2
I find it interesting that MCP73 is clocked so much higher than MCP68, which is the AMD equivalent. The GeForce 7050 PV/nForce 630a is clocked at 425MHz, while the MCP73-based GeForce 7150 will be clocked above 600MHz.

That's a 50% difference, which could make things a lot more interesting: even Intel's chipsets are competitive against MCP68 (when they can actually play the game, that is) but now, I'm not sure which will win the most benchmarks. You'd certainly expect MCP73 to lose some at least because of its lower bandwidth (single-channel...)
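
As a quick back-of-the-envelope check on that gap (a sketch using only the figures quoted in the post; the exact 7150 clock is not stated):

    // Rough arithmetic on the quoted IGP clocks: 425 MHz for the GeForce 7050 PV /
    // nForce 630a versus "above 600 MHz" for the MCP73-based GeForce 7150.
    #include <cstdio>

    int main() {
        const double mcp68_mhz = 425.0;  // AMD-side GeForce 7050 PV / nForce 630a
        const double mcp73_mhz = 600.0;  // lower bound quoted for the GeForce 7150

        double gain_pct = (mcp73_mhz - mcp68_mhz) / mcp68_mhz * 100.0;
        printf("Gain at 600 MHz: %.0f%%\n", gain_pct);                            // ~41%
        printf("Clock needed for a full 50%% gain: %.0f MHz\n", mcp68_mhz * 1.5); // ~638 MHz
        return 0;
    }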




What is the clock speed?
By nitrous9200 on 9/25/2007 5:20:12 PM , Rating: 2
"NVIDIA’s flagship GeForce 7150 features a clock speed of over 600 MHz while the lesser GeForce 7100 has a 600 MHz clock speed"

I'm sure it does. What is the actual clock speed of the 7150 and will it even be worth it on integrated graphics?




It's about value, and about AMD.
By gochichi on 9/25/2007 8:13:17 AM , Rating: 1
I think for budget systems and budget laptops, it's going to be important to deliver these few extra ounces of graphics capability that really relieve some of the graphics bottlenecking that is bound to take place even in basic usage scenarios.

I know that when deciding what to buy, I do look favorably at embedded NVIDIA or ATI over Intel where a laptop is concerned. I was surprised I could get some decent, if basic, Half-Life 2 action going on an AMD laptop with NVIDIA graphics.

In order to keep AMD at bay, Intel needs to cede some of its integrated graphics segment to NVIDIA (or come up with something as beefy or beefier itself) to deliver on the value equation at the price points that most interest consumers.

In terms of desktops, I think the $100 video card range never looked so good, with the 2600 XT with HDMI (with audio!!) delivering fantastic 3D acceleration and seriously effective video decoding hardware. Same goes for the 8600 GT. Most importantly, these use very little power, so you can slap them into just about any full-size budget desktop without having to overhaul the cheap power supplies from popular vendors such as Gateway, HP, Dell, etc. This is a seriously good time for budget gaming and affordable media center PCs.

I think that as users experience the inexpensive hardware they just bought and talk to others about their experiences, it's going to be important for integrated graphics to deliver more. Most people don't sit around and benchmark, but with Aero and HDTV (plus HD DVD, Blu-ray, etc.) people are going to know whether their new computers zip through this content or not, and that's going to equate to sales.

One more thing: VGA connectors are really just downright inappropriate on a new computer these days. Computers, even laptops, should be coming with either a DVI port or both VGA and HDMI ports (DVI can carry analog or digital so it can stand alone; HDMI can only carry digital, which on its own would cause a million problems because video projectors are typically VGA). So adding the HDMI option always broadens your target market with a mere $5.00-$10.00 of added expense.




"Folks that want porn can buy an Android phone." -- Steve Jobs

Related Articles
New NVIDIA Intel Desktop Chipsets for 2007
February 13, 2007, 4:49 PM













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki