



[Image: Image quality comparison between GMA 950 (left) and GMA X3000 (right)]

[Image: Intel G965 block diagram]

[Image: A quick comparison of 945GM (left) and GM965 (right); specifications not final]
DX9, Pixel Shader 3.0 and hardware T&L

DailyTech previously reported rough details of the integrated graphics core in the upcoming Intel G965 Express chipset—Intel Graphics Media Accelerator X3000. Early roadmaps showed the graphics core as being Microsoft Windows Vista Premium compatible with support for Aero Glass. It was also said the graphics core would have Clear Video Technology to improve video playback.

DailyTech recently had the opportunity to pick apart some of the specifications of Intel's GMA X3000 architecture and compare it to other architectures available today. Perhaps the largest improvement of the GMA X3000 over the GMA 950 is the move away from a fixed-function pipeline in favor of a programmable pipeline; for comparison, NVIDIA and ATI abandoned fixed-function pipelines in 2001.
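
To illustrate what that move means for developers, here is a generic OpenGL sketch, not Intel-specific code: with a fixed-function pipeline the application can only toggle and configure stages baked into the silicon, while a programmable pipeline accepts arbitrary shader programs. The snippet assumes an OpenGL 2.0 context (on most platforms the 2.0 entry points come through an extension loader) and two hypothetical GLSL source strings, vertexSrc and fragmentSrc.

```cpp
#include <GL/gl.h>

// Hypothetical GLSL source strings, assumed to be defined elsewhere.
extern const GLchar* vertexSrc;
extern const GLchar* fragmentSrc;

void SetupPipelines()
{
    // Fixed-function (OpenGL 1.x style): only stages baked into the
    // silicon can be switched on and configured.
    glEnable(GL_LIGHTING);   // canned per-vertex lighting
    glEnable(GL_LIGHT0);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

    // Programmable (OpenGL 2.0 style): the application supplies its
    // own per-vertex and per-fragment programs instead.
    GLuint vs = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vs, 1, &vertexSrc, NULL);
    glCompileShader(vs);

    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fs, 1, &fragmentSrc, NULL);
    glCompileShader(fs);

    GLuint prog = glCreateProgram();
    glAttachShader(prog, vs);
    glAttachShader(prog, fs);
    glLinkProgram(prog);
    glUseProgram(prog);   // replaces the fixed stages above
}
```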

Intel's latest motherboard update has more detailed information on the Graphics Media Accelerator X3000. DirectX 9 features such as Pixel Shader 3.0 and Vertex Shader 3.0 are supported. This time around the Vertex Shader 3.0 units are hardware based, instead of the software-based shaders found in the previous GMA 900/950 and Extreme Graphics cores. A hardware transform and lighting (T&L) engine has also been integrated, a significant improvement over the previous software T&L engine. High dynamic range rendering is also supported for greater realism in games. Lastly, the GMA X3000 graphics core will be clocked at up to 667 MHz -- quite a bit higher than current budget ATI and NVIDIA offerings.
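
In practice, a Direct3D 9 game discovers this difference through the device caps. A minimal sketch (error handling trimmed, function name ours) of how a hardware SM 3.0 part like the X3000 would be distinguished from the software-vertex-shader GMA 900/950:

```cpp
#include <d3d9.h>

// Query the HAL device's reported shader versions. A part with
// hardware SM 3.0 reports VertexShaderVersion >= 3.0; GMA 900/950
// do not, and games fall back to software vertex processing there.
bool SupportsHardwareSM30()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return false;

    D3DCAPS9 caps = {};
    HRESULT hr = d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);
    d3d->Release();

    return SUCCEEDED(hr) &&
           caps.VertexShaderVersion >= D3DVS_VERSION(3, 0) &&
           caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0);
}
```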

Video output capabilities of the GMA X3000 are limited to a native VGA output. HDMI, DVI, UDI, component, composite and S-Video outputs can be added through the SDVO port or with an ADD2 expansion card, as with the previous GMA 900/950 graphics cores. This more or less means HDCP compliance will be left up to the motherboard or ADD2 card manufacturer. The GMA X3000 will support resolutions up to 2048x1536, including 720p/1080i/1080p in 16:9, 4:3 and letterbox aspect ratios.

Intel Clear Video Technology will provide algorithms and features to improve video playback. Clear Video Technology has enough graphics power to simultaneously play back one high-definition and one standard-definition video stream for picture-in-picture. Hardware acceleration for high-definition MPEG-2 and VC-1 is supported; however, it doesn't look like Intel will offer hardware acceleration for H.264 at this time. An advanced de-interlacing algorithm is also integrated for improved quality with interlaced sources such as DVDs and cable programming. The built-in pixel-adaptive de-interlacing algorithm supports standard- and high-definition content up to 1080i.
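
Intel hasn't published the internals of its pixel-adaptive algorithm, but the underlying problem is easy to show: each interlaced field carries only every other scanline, and the de-interlacer must synthesize the rest. Below is a deliberately minimal "bob" de-interlacer in C++ (plain linear interpolation on a luma plane; purely illustrative and far cruder than Clear Video's motion-adaptive approach, which chooses per pixel between spatial interpolation like this and weaving in the previous field):

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// src is a woven interlaced frame (8-bit luma). Keep the scanlines of
// one field, then synthesize the other field's lines spatially.
std::vector<std::uint8_t> BobDeinterlace(const std::vector<std::uint8_t>& src,
                                         int width, int height, bool topField)
{
    std::vector<std::uint8_t> out(src.size());

    // Pass 1: copy the scanlines that belong to the chosen field.
    for (int y = 0; y < height; ++y) {
        if (((y % 2) == 0) == topField) {
            std::copy(src.begin() + y * width,
                      src.begin() + (y + 1) * width,
                      out.begin() + y * width);
        }
    }

    // Pass 2: fill the opposite field's lines by averaging the kept
    // neighbours above and below.
    for (int y = 0; y < height; ++y) {
        if (((y % 2) == 0) != topField) {
            const int above = (y > 0) ? y - 1 : y + 1;
            const int below = (y < height - 1) ? y + 1 : y - 1;
            for (int x = 0; x < width; ++x) {
                out[y * width + x] = static_cast<std::uint8_t>(
                    (out[above * width + x] + out[below * width + x]) / 2);
            }
        }
    }
    return out;
}
```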

Intel previously used a single integrated graphics core across all of its integrated core logic, but this time around there are at least two different graphics cores. While the consumer-level G965 Express receives the GMA X3000 graphics core, Q965 Express chipsets receive the GMA 3000 graphics core. The chief difference is the lack of Intel Clear Video Technology on the GMA 3000. This isn't too surprising, as the Q965 is part of Intel's Stable Image Platform Program aimed at business and corporate users. Aside from the missing Clear Video Technology, the GMA 3000 core retains compatibility with Windows Vista Premium and the Aero Glass interface, like the GMA X3000. There's no word on whether the GMA 3000 is clocked as high as the GMA X3000.

One thing that is important to remember about the GMA X3000 family is that it is a completely programmable pipeline architecture -- meaning Intel only needs to update the microcode to add support for features like Shader Model 4.0. This opens the door to a few possibilities for where Intel can take the architecture. For example, since the Santa Rosa notebook platform is based on G965 but will not launch until next year, Intel may take the opportunity to add better features to the core.

G965 Express is expected to launch the last week of July with Core 2 processors while Q965 Express is expected to launch the first week of September with Intel’s vPro business platform.



Comments



clock
By Myrandex on 6/30/2006 12:27:21 PM , Rating: 2
The slide says ~400MHz, but the article says 600+, so which is it?




RE: clock
By Anh Huynh on 6/30/2006 12:41:24 PM , Rating: 2
A recent marketing brochure shows 667 MHz so it'll be 667 MHz.


RE: clock
By KristopherKubicki (blog) on 6/30/2006 1:01:59 PM , Rating: 2
The slide is for GM965 -- for mobile devices. The desktop chipset is G965, which the article is referring to when it says up to 667MHz.


RE: clock
By OrSin on 6/30/2006 1:06:28 PM , Rating: 2
ATI's chipsets will be using an X700 core by then. Intel will stay bottom dog.


RE: clock
By brystmar on 6/30/2006 1:14:25 PM , Rating: 2
Maybe in terms of performance they'll still be third, but clearly that isn't of concern to them. Keep in mind that Intel is still *BY FAR* the largest graphics provider in the world.


RE: clock
By stncttr908 on 6/30/2006 1:44:50 PM , Rating: 2
But it's better than the bottom-of-the-barrel crap that 50% of the market is running currently; you have to give them that.


RE: clock
By stupid on 6/30/2006 1:44:50 PM , Rating: 2
So what?

Intel is not in the business of performance graphics. Their main concern is to make sure their IGP will be compatible with Windows Vista. That way system builders need not add a discrete GPU, which would only increase the price to their customers, who are mostly big businesses.


RE: clock
By IntelUser2000 on 6/30/2006 11:40:32 PM , Rating: 2
quote:
So what?

Intel is not in the business of performance graphics. Their main concern is to make sure their IGP will be compatible with Windows Vista. That way system builders need not add a discrete GPU, which would only increase the price to their customers, who are mostly big businesses.


There are rumors going around that Intel may go discrete graphics again.


RE: clock
By Lonyo on 6/30/2006 1:56:14 PM , Rating: 2
The X700 isn't SM3 or SM4 compatible though.
It will likely be faster, but won't be able to do the latest things (though those would likely run too slowly on either piece of hardware).


RE: clock
By Thmstec on 6/30/2006 2:21:57 PM , Rating: 2
ATI has stated that the X700 IGP will be updated to support SM3 at least, if not SM4.


RE: clock
By coldpower27 on 6/30/2006 3:13:53 PM , Rating: 2

No, from what we know now, it's the RS700 that will support DX10 and Shader Model 4.0, and it's based on RV6xx technology; the RS600, which is RV4xx-derived, will remain at Pixel Shader 2.0b.


How many pipelines?
By Bladen on 6/30/2006 12:06:15 PM , Rating: 1
?




RE: How many pipelines?
By ForumMaster on 6/30/2006 12:10:37 PM , Rating: 2
Dunno. Wish they'd at least double the pipelines from 4 to 8, maybe even triple that, but I doubt it. The core speed of 667MHz will definitely require a heatsink and fan. But it's about time that Intel finally provided good IGCs.


RE: How many pipelines?
By Anh Huynh on 6/30/2006 12:12:19 PM , Rating: 2
Completely passive.


RE: How many pipelines?
By ForumMaster on 6/30/2006 1:27:40 PM , Rating: 2
How? My 6600 that runs at 300MHz requires a heatsink and fan. If that chip runs at twice the speed, how won't it require a fan? And FYI, Intel Extreme Graphics 3, the 3rd generation of Intel's IGC, has 4 pipelines. Why would they lower it to 2? And I doubt that even an 8-pipeline card would run F.E.A.R., but it would help for when everything is GPU accelerated. And it's about time that Intel provided an IGC with programmable pipelines. Makes it so much better.


RE: How many pipelines?
By masher2 (blog) on 6/30/2006 2:04:55 PM , Rating: 2
There's more to the equation than clock speed. The 6600 is built on what, a 110nm process? It also has more transistors. Taken together, that means considerably higher power consumption.
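
The standard first-order relation for CMOS switching power backs this up (illustrative numbers only, not the actual figures for either chip):

$$P_{\text{dyn}} \approx \alpha\, C\, V^{2}\, f$$

where $\alpha$ is the activity factor, $C$ the switched capacitance (roughly proportional to transistor count and process geometry), $V$ the supply voltage and $f$ the clock. Halving $C$ (fewer transistors on a smaller process) and dropping $V$ from, say, 1.4V to 1.2V scales $C V^{2}$ by $0.5 \times (1.2/1.4)^{2} \approx 0.37$, so even doubling $f$ leaves power at roughly 73% of the original part -- which is how a 667MHz IGP can stay passive while a 300MHz discrete card needs a fan.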


RE: How many pipelines?
By shabodah on 6/30/2006 2:28:42 PM , Rating: 2
7300GT= 8 Pipelines = Runs Fear


RE: How many pipelines?
By Pirks on 6/30/06, Rating: -1
RE: How many pipelines?
By coldpower27 on 6/30/2006 3:11:23 PM , Rating: 1
I would say it should do acceptably at 640x480 with no AA or AF.


RE: How many pipelines?
By IntelUser2000 on 6/30/2006 11:38:42 PM , Rating: 2
quote:
How? My 6600 that runs at 300MHz requires a heatsink and fan. If that chip runs at twice the speed, how won't it require a fan? And FYI, Intel Extreme Graphics 3, the 3rd generation of Intel's IGC, has 4 pipelines. Why would they lower it to 2? And I doubt that even an 8-pipeline card would run F.E.A.R., but it would help for when everything is GPU accelerated. And it's about time that Intel provided an IGC with programmable pipelines. Makes it so much better.


Why would it necessarily require a fan? The current integrated graphics from Intel, the GMA 950, along with its chipset, the 945G, runs passively even though it's clocked at 400MHz. Is that enough for you? Not to mention it's on a 130nm process.


RE: How many pipelines?
By peldor on 6/30/2006 12:38:16 PM , Rating: 2
That they've not revealed that crucial bit of info makes me pretty skeptical about the whole thing.

8 pipelines would surprise me; I'd guess 4. I think Intel is primarily interested in having something that can handle Aero Glass, not F.E.A.R.


RE: How many pipelines?
By FrozenCanadian on 6/30/2006 1:18:50 PM , Rating: 2
4 would surprise me. I'm guessing 2.

In any case, I will never own a machine with Intel graphics ever again. I really hate how mobos with Intel graphics rarely have AGP or PCI-E slots.


RE: How many pipelines?
By coldpower27 on 6/30/2006 3:07:11 PM , Rating: 2
It's unlikely they will go down in pipeline count; remember, the GMA 950 already has 4 pipelines.

If anything I would guess this core has 4 pipelines, with some hope it could be 8.


RE: How many pipelines?
By BaronMatrix on 6/30/2006 11:37:38 PM , Rating: 2
Yes, it's about time. But this isn't the first time Intel has promised wonders and delivered mediocrity in IGPs.

I'm more concerned about GUI responsiveness than whether it can play HD video, especially since a slightly higher-end Core 2 will blaze through that anyway.


Unified Shaders
By ltcommanderdata on 6/30/2006 9:25:06 PM , Rating: 4
It has been confirmed that the new GMA X3000 has 8 "pipelines", double that of the GMA 950. Combined with the high 667MHz clock speed, the GMA X3000 will probably be able to compete with modern low-end graphics cards like the X1300HM and vanilla X1300.

http://www.laptoplogic.com/news/detail.php?id=951
(here they indicate 8 pipelines)

My question is whether you can confirm that these 8 shaders are unified shaders.

http://www.hkepc.com/bbs/itnews.php?tid=610998

There has been some speculation that Intel's phrase "multi-threaded array of programmable execution units" implies unified shader units. Intel's diagram in that link certainly indicates that is a possibility. It would make sense too, since the GMA X3000 is supposed to offer DirectX 10 support, including SM4.0, through a BIOS or driver update closer to Vista's release. Unified shaders in the GMA X3000 would also put Intel in a good position technologically if rumours about them re-entering the discrete graphics card market are true. The G965's die seems very big though, because of the 8 shaders, which is probably why it hasn't launched yet. I wonder whether HKEPC's image of the G965 shows a 90nm part or an early 130nm trial, which might explain its large size.

Any confirmation on anything I've mentioned would be appreciated.




RE: Unified Shaders
By KristopherKubicki (blog) on 6/30/2006 10:07:20 PM , Rating: 2
Keep in mind, that's 8 programmable pipelines; some are used for the VS, etc. Intel has not announced how many will be pixel pipelines.


RE: Unified Shaders
By ltcommanderdata on 6/30/2006 11:31:46 PM , Rating: 2
So do you think it's possible that the 8 shaders are unified, like in Xenos and R600? It would make it easier for them to offer DirectX 10 support, since the 8 unified shaders could switch between PS and VS modes now, as in Xenos, and then, with a BIOS update and the proper driver, switch between VS, PS, and GS in Vista. 8 unified shaders, 4 TMUs, and 4 ROPs would make a nice combination.


RE: Unified Shaders
By NextGenGamer2005 on 7/1/2006 4:14:37 AM , Rating: 2
It would HAVE to have unified shaders if there is any truth to the DirectX 10 support through a BIOS update. DX10 adds a new shader, the geometry shader (it sits between the vertex and pixel shader stages), so the GMA X3000 would have to be using unified shaders that already contain support for the geometry shader if only a BIOS update is needed to make it DX10 compliant.

My guess is that at launch it will have 8 unified shaders, but 6 will be "locked" as pixel shaders and 2 will be "locked" as vertex shaders. Then, closer to the Vista launch, the BIOS update will "unlock" these to full unified shaders that can dynamically change between pixel, vertex, and geometry operations depending on the current graphics load. 4 TMUs and 4 ROPs sounds about right to me as well. And a 90nm process would be required for this to operate passively (in fact, I wouldn't be too surprised if this were actually using a 65nm process, just because the northbridge still has to contain the memory controller and PCI Express lanes as well; that's a lot to fit into one chip).
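
Whether or not that is what Intel ships, the "locked vs. unlocked" idea above is easy to model: a unified design is a pool of identical units plus a scheduler that hands any pending work type to any capable unit, while a locked configuration pins each unit to one type. A purely conceptual C++ sketch (all names ours; nothing here reflects Intel's actual hardware or drivers):

```cpp
#include <queue>
#include <vector>

// Conceptual model only: "locked" vs. unified shader scheduling.
enum class WorkType { Vertex, Pixel, Geometry };

struct ShaderUnit {
    bool unified;         // true: unit can execute any work type
    WorkType lockedType;  // consulted only when !unified
    bool CanRun(WorkType t) const { return unified || t == lockedType; }
};

// Hand each queued job to the first unit capable of running it. A real
// scheduler would also track busy units, balance load, and so on.
void Dispatch(std::vector<ShaderUnit>& units, std::queue<WorkType>& jobs)
{
    while (!jobs.empty()) {
        const WorkType job = jobs.front();
        bool assigned = false;
        for (ShaderUnit& u : units) {
            if (u.CanRun(job)) {
                assigned = true;   // "execute" job on unit u (omitted)
                break;
            }
        }
        if (!assigned) break;      // no capable unit: this work type stalls
        jobs.pop();
    }
}
```

With 6 pixel-locked and 2 vertex-locked units, geometry jobs in the queue can never be assigned; flip every unit to unified and the same queue drains. That is the gist of the argument that DX10's geometry shader implies unified hardware underneath.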


RE: Unified Shaders
By squeezee on 7/1/2006 7:09:09 AM , Rating: 2
Shaders don't have to be unified at the hardware level; that isn't a DX10 requirement. Having separate or semi-separate PS/GS/VS units in hardware doesn't affect the ability to bypass or disable the GS in software (driver or firmware).

Regardless of how they've done it, hopefully it will be a big jump from the abysmal performance of their current IGPs.


RE: Unified Shaders
By Zoolook on 7/1/2006 5:45:10 PM , Rating: 2
Have to agree; 667MHz on an IGP sounds like a 65nm process. Intel already has three 65nm fabs running and will add one more, plus a trial 45nm fab, in the autumn. Sounds like they are finally turning up the heat on ATI and NVIDIA on the integrated front; I've always wondered why they haven't pushed it harder before. Maybe they have been too complacent with their large market share.


RE: Unified Shaders
By ltcommanderdata on 7/1/2006 9:33:53 PM , Rating: 2
Chipset production is switching to 90nm now that those fabs are done producing processors. The 65nm fabs are needed to supply current processors, especially the Core 2s, since Intel has decided to do 3 launches (Woodcrest, Conroe, and Merom) in 3 months. Not to mention Celeron D production is switching to 65nm with the Cedar Mill-based parts, Celeron M is on 65nm with the Yonah-based parts, and the 65nm Tulsa for Xeon MP is launching in Q3. Needless to say, the 65nm fabs are overworked.

The 96x series of chipsets is the first to be on the 90nm process, which may be why they're a bit slow to market, since not only does the logic need to be debugged and optimized, but it has to accommodate a process transition too. Even on the 90nm process, with 8 shaders on the IGP and all the other northbridge features, the die seems to be very large, if the picture in the HKEPC article I originally linked is any indication. I just hope the G965 can be passively cooled. The Inquirer found that the P965, the graphics-less version, consumes 25% more power than the 945P.

http://www.theinquirer.net/default.aspx?article=32...

Hopefully that's only an early engineering sample, possibly a 130nm trial, or else it doesn't bode well for the G965. It's even more worrisome for the mobile version of the G965 in Santa Rosa.

Regarding the implementation of unified shaders in a DirectX 9 environment, I don't think it's necessary to actually "lock" the pipelines as either PS or VS. ATI's upcoming R600 can determine whether incoming data is vertex, pixel or geometry work and assign it to an open pipeline. They plan to use the same mechanism in DirectX 9 games, where it only needs to decide between vertex and pixel data, bringing the flexibility of unified shaders to DirectX 9 titles. Intel could use the same type of mechanism to avoid having to "lock" its pipelines. Given the good relations between ATI and Intel, I wouldn't be surprised if they're actually working together on a unified shader implementation.

http://www.theinquirer.net/default.aspx?article=32...


Intel drives open source
By tbtkorg on 6/30/2006 3:07:31 PM , Rating: 4
I prefer Intel integrated graphics because Intel provides true open-source graphics drivers. You can delve in and explore Intel chips, program them yourself, not just play canned games sold to you by someone else. Some people don't mind nVidia's and ATI's closed-driver business tactics; some aren't interested in hacking their chips. That's fine, but if you care about hardware hackability, if you care about open source, then Intel's integrated graphics are much to be preferred.

That Intel's integrated graphics add a mere ten dollars to the cost of a PC, that they consume little power and generate little heat, is an added bonus.

There are indeed good reasons for a gamer to deploy an nVidia or ATI card in his PC, but for many of us there are even better reasons to stand fast on the old reliable rock of Intel.




RE: Intel drives open source
By mlau on 6/30/2006 3:45:06 PM , Rating: 2
Yeah, XGL is going to fly on these things :) I wish Intel had also added H.264 decoding support; it would make an excellent MythTV box!


RE: Intel drives open source
By bob661 on 6/30/2006 6:15:47 PM , Rating: 1
quote:
There are indeed good reasons for a gamer to deploy an nVidia or ATI card in his PC, but for many of us there are even better reasons to stand fast on the old reliable rock of Intel.
Ah yes! Ye ole reliable Intel. Never a problem with them. Intel stuff just runs and runs. Hardware perfection, I say.


RE: Intel drives open source
By stmok on 7/1/2006 3:58:13 AM , Rating: 2
Definitely... If this new IGP performs reasonably well (to my satisfaction), I wouldn't mind switching over to this solution and using it with a Merom/Conroe setup under Linux. At least then we won't need to wait for ATI or NVIDIA to come up with patches and updates every time a kernel or Xorg change occurs!

I notice it supports the OpenGL 2.0 spec. :-)
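
A quick way to see what the driver actually advertises (besides running glxinfo) is to query the strings yourself; a minimal sketch, assuming a current OpenGL context has already been created:

```cpp
#include <GL/gl.h>
#include <cstdio>

// Print the driver's advertised vendor, renderer and GL version.
// Requires an active OpenGL context (e.g. via GLX or SDL).
void PrintGLInfo()
{
    std::printf("Vendor:   %s\n", (const char*)glGetString(GL_VENDOR));
    std::printf("Renderer: %s\n", (const char*)glGetString(GL_RENDERER));
    std::printf("Version:  %s\n", (const char*)glGetString(GL_VERSION));
}
```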


SUN
By Alphafox78 on 6/30/2006 5:15:53 PM , Rating: 2
If you use one of these IGPs it will make the sun brighter!!! Look at the pic!




RE: SUN
By peternelson on 7/1/2006 12:00:18 PM , Rating: 3

No, in the first pic, it was definitely that small MOON over there.

Wait a second...... that's no moon, that's a space station!


RE: SUN
By Lord Evermore on 7/2/2006 4:25:23 AM , Rating: 2
A fraction of a second later, that buggy was obliterated by the obvious particle cannon death beam; the camera captured the light given off by the air being burned as the beam shot through the atmosphere, the light moving just slightly faster than the beam itself.


thank ms
By Visual on 7/2/2006 6:02:32 AM , Rating: 2
Gotta hand it to Microsoft -- if it weren't for the Aero Glass requirements in their next OS, we'd hardly be seeing something this good from Intel. Pity it got delayed so much, though.

And nice going for Intel too, if it turns out to be true.




By Blackraven on 7/2/2006 10:54:41 AM , Rating: 2
Intel GMA X3000 = 128 MB video card

Or perhaps more? What about the other specs & details?


Gawd did we need another X?
By Lord Evermore on 7/2/2006 4:34:49 AM , Rating: 1
Seriously, have marketing drones just lost all their creativity? WinXP, Athlon XP, Athlon X2, GeForce MX and FX, X3/5/6/7/800, Radeon X1K, Xpress chipsets, various GTX and XT models, Audigy X-Fi, Core 2 Extreme X6800, and now the GMA X3000 (and I'm sure plenty of other X products). I was all X'ed out by the time the GeForce FX came out; I don't know how much more I can take before I start boycotting anything with an X in it.




RE: Gawd did we need another X?
By Griswold on 7/2/2006 11:14:51 AM , Rating: 2
In the famous words of Indiana Jones:

"X" never ever marks the spot.


looks good
By R3MF on 6/30/2006 12:43:57 PM , Rating: 2
This, more than anything, is making me consider going Intel for my HTPC box.




By Lord Evermore on 7/2/2006 4:06:41 AM , Rating: 2
Maybe I'm missing something, but what do you mean one of the new chipsets won't have hardware shading? The GMA 3000's only missing feature is Clear Video Technology.


Video output
By Chriz on 7/2/2006 11:05:33 AM , Rating: 2
Native video output is still limited to VGA. Decisions like this pretty much ensure that VGA will never be replaced by something better like DVI.




First comment!
By caboosemoose on 6/30/06, Rating: -1
Why Intel is better than AMD
By bjacobson on 6/30/06, Rating: -1
RE: Why Intel is better than AMD
By shabodah on 6/30/06, Rating: 0
By IntelUser2000 on 6/30/2006 11:36:04 PM , Rating: 2
quote:
7600GS = 35 watts. Intel NB/SB chipsets usually surpass that number.


No, it doesn't. The combined number for the P965 plus southbridge is ~19W.


"Game reviewers fought each other to write the most glowing coverage possible for the powerhouse Sony, MS systems. Reviewers flipped coins to see who would review the Nintendo Wii. The losers got stuck with the job." -- Andy Marken














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki