
Intel recruiting for upcoming discrete graphics solutions and a hybrid CPU/GPU product

Intel is currently recruiting employees for its Visual Computing Group, or VCG, according to Beyond3D. The new recruits will work on unannounced Intel discrete graphics products and integrated CPU/GPU products to compete with AMD’s upcoming Fusion architecture.

According to the job description:
Intel's Visual Computing Group (VCG) has the mission to establish the future of computing for high-throughput workloads. We are focused on developing discrete graphics products based on a many-core architecture targeting high-end client platforms. Our vision is that the resulting ingredients and technology will extend to mobile clients, servers, and embedded platforms over time. VCG will initially focus on discrete graphics products but will also expand the previous charter to include developing plans for accelerated CPU integration.
VCG positions are available at Intel Hillsboro, Oregon and Austin, Texas campuses with various engineer and developer positions available. It is unknown when Intel's first discrete graphics product since the ill-fated i740 will materialize.

Shortly after the AMD acquisition of ATI, the company announced it would pursue a project called Fusion, a development tree aimed at integrating GPU elements into a CPU. Since then the company has remained silent about discrete graphics products.

With the arrival of Vista, analysts predict upgradeable graphics may become a hot market.  To fully utilize the Aero Glass and Flip 3D features in the premium versions of the OS, a computer needs considerably more graphics horsepower than most IGP setups deployed today can provide.  Cost-effective, low-end discrete graphics cards would bring the majority of last year's hardware up to snuff with Vista's hardware requirements.
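As a practical aside (an editor's hedged sketch, not part of the original report): on Windows Vista and later, dwmapi.dll exposes DwmIsCompositionEnabled, which reports whether the DWM compositor behind Aero Glass is actually running on a given machine. On non-Windows platforms this sketch simply reports that the check does not apply.

```python
import ctypes
import sys

def aero_composition_enabled():
    """Return True/False on Windows Vista+, or None on other platforms."""
    if sys.platform != "win32":
        return None  # DWM is Windows-only; nothing to check here
    enabled = ctypes.c_int(0)
    # HRESULT DwmIsCompositionEnabled(BOOL *pfEnabled) -- dwmapi.dll, Vista+
    ctypes.windll.dwmapi.DwmIsCompositionEnabled(ctypes.byref(enabled))
    return bool(enabled.value)

print(aero_composition_enabled())
```

If this returns False on a Vista machine, the system has fallen back to the Basic theme, which is exactly the situation cheap discrete upgrade cards were expected to fix.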

Comments

I want Intel in!
By MrDiSante on 1/22/2007 9:01:52 PM , Rating: 5
I want Intel in the GPU market - more competition's always a good thing, and hopefully they can reverse this horrid trend of ever more power-hungry GPUs.

RE: I want Intel in!
By FITCamaro on 1/22/2007 9:07:00 PM , Rating: 2
Any GPU from Intel is going to be not much better than integrated graphics. They're not going after the high end. They're going after what's necessary to run the premium Aero Glass effects. That's it.

RE: I want Intel in!
By MrDiSante on 1/22/2007 9:14:46 PM , Rating: 2
You never know - I don't know the numbers but I imagine margins on higher-end video cards would be a bit higher than entry level. Furthermore even if it's a bit of pressure from the lower end it still helps.

RE: I want Intel in!
By wien on 1/23/2007 7:56:07 AM , Rating: 2
We are focused on developing discrete graphics products based on a many-core architecture targeting high-end client platforms.

RE: I want Intel in!
By Targon on 1/23/2007 8:08:25 AM , Rating: 2
The thing about having a high end machine is the CPU tends to have a LOT of processing ability, which can compensate via drivers for a lack of GPU power.

As it stands now, Intel's latest GPUs support DirectX 9 due to drivers that compensate for what the hardware lacks.

RE: I want Intel in!
By Khato on 1/23/2007 11:18:04 AM , Rating: 2
As it stands now, Intel's latest GPUs support DirectX 9 due to drivers that compensate for what the hardware lacks.

Oh really now? And how exactly is the graphics architecture in the current G965 lacking? Sure, it's not quite a DirectX 10 architecture yet, but what DirectX 9 features aren't there?

RE: I want Intel in!
By Bluestealth on 1/22/2007 9:47:07 PM , Rating: 2
Does anyone know how broad Intel's cross-licensing with AMD is? If it were to include ATI's patent portfolio, they would have a pretty nice selection already. Besides, if they wanted to, they could probably just buy NVidia.

By Bluestealth on 1/22/2007 9:58:39 PM , Rating: 2
Hmmm.... Intel signed a cross licensing agreement with Nvidia a few years back... which is probably still active... looks like Intel just has to worry about the little patent trolls then...

By Targon on 1/22/2007 10:07:49 PM , Rating: 2
I believe that the Intel/AMD agreement only applies to x86 instruction sets, and not even the exact implementation. This is part of the reason why Intel tried to get away from x86 with the Itanium, so AMD wouldn't be able to make Itanium-compatible chips.

x86-64, aka AMD64, is still just an extension of the old x86 instruction sets, so Intel had the right to make their own implementation (though I still have doubts about Intel's right to make up their own name for AMD's invention).

By Bluestealth on 1/23/2007 3:08:59 AM , Rating: 2
From what I have heard it was all-encompassing as far as the companies' patent portfolios go, though I may not be fully understanding what was meant. I believe this allows them to recreate AMD's 64-bit instruction set in their own products. I doubt AMD would have been happy if they used the same name on a similar but different implementation of their product. There are some key differences between AMD64 and EM64T.

By DallasTexas on 1/23/2007 9:59:18 AM , Rating: 2
Really? Ya think Intel acquired critical technologies and patents from Lockheed Martin when they acquired Real3D? What's special about NVIDIA licenses that Intel "needs" and evidently AMD does not need?

You're full of crap, IMHO. As the #1 PC graphics vendor in the world for years, ya think Intel needs to 'catch up'? Intel could have participated in high-end graphics but chose not to. Why? Well, it's not their core business, and the graphics industry was thriving with many competitors. Note Intel does not sell coffee, either. What a dumb schmuck.

By FITCamaro on 1/24/2007 6:51:02 AM , Rating: 2
They're #1 in market share due to their low-end integrated chipsets that go into millions of Dells and other platforms. That hardly makes them the best at graphics, or free of any need to catch up.

Yes, they could have entered the high-end GPU market. But they didn't, and most likely still won't, because it's doubtful they would be able to compete at the high end with ATI or Nvidia.

Oh Really?
By Zurtex on 1/22/2007 9:11:36 PM , Rating: 2
"a development tree aimed at integrating GPU elements into a GPU "

Wow, AMD must have learnt so much when they bought out ATi :P

RE: Oh Really?
By KristopherKubicki on 1/22/2007 9:13:05 PM , Rating: 2

RE: Oh Really?
By rebturtle on 1/23/2007 2:15:02 AM , Rating: 1
Wow, AMD must have learnt so much when they bought out ATi

How about learned, soo, and AT I ? Clearly, hooked on phonics didn't work for you.....

RE: Oh Really?
By Zurtex on 1/23/2007 4:48:28 AM , Rating: 2
I think you'll find learnt is actually an accepted modern alternative :P. But I did have to check that up. I'm quite convinced, ATi is what I was going for and I think "Soo" was a character out of The Sooty Show ;)

RE: Oh Really?
By S3anister on 1/23/2007 8:00:52 PM , Rating: 2
it's ATi, either that or it doesn't matter, because for the longest time (and i believe still) the ati logo was "ATi"

i know this because that's what it says on my videocard heatsink... that and all of the other images associated with ATi.

hum...
By Vokus on 1/22/2007 8:47:34 PM , Rating: 2
Come on how long will I have to wait...

I need this now in my laptop...

RE: hum...
By KristopherKubicki on 1/22/2007 9:05:21 PM , Rating: 2
I need this now in my laptop...

Don't bet on it. The whole idea with discrete GPUs is that you plug them into existing hardware. Intel would just build an IGP for a notebook.

RE: hum...
By Khato on 1/23/2007 3:20:44 AM , Rating: 2
Eh, every Intel chipset of late is technically an IGP; the graphics functionality just isn't always enabled. The problem is that the 3D graphics core, even with a measly number of execution units, takes up an awful lot of die space comparatively. Intel doesn't want to make that any bigger, since it also makes the chips that don't have graphics enabled bigger; cost analysis limits how big the integrated 3D core can get. Anyway, laptops are a great starting market for a discrete part, because laptops never use the ultra high end, and Intel's manufacturing edge can be used to far greater effect.

Makes me all the more interested to see some actual benchmarks of the Broadwater integrated graphics, especially under Vista. Though by the time this discrete part comes around, the architecture will have gone through two generations or so. Still, exciting to finally hear something about it in the 'news'.

Methinks not...
By Thoreau on 1/22/2007 10:07:40 PM , Rating: 2
I don't care what department the jobs are for... I'm NEVER going back to work for Intel! EEEEEEEEEEEEVILLLLL!!!!!!

Well, they DID have a nice employee smoking area at least...

RE: Methinks not...
By GoatMonkey on 1/23/2007 8:53:29 AM , Rating: 2
Was it just their office politics that you thought were evil, or the company as a whole?

RE: Methinks not...
By DallasTexas on 1/23/2007 7:18:11 PM , Rating: 2
Were you one of the first chunks of dead wood they let go or one of the last?

moneys on
By Comdrpopnfresh on 1/23/2007 1:43:59 AM , Rating: 3
My bet is that they are going to provide a CPU/GPU hybrid. That's what AMD is doing for the low end, so it's logical. Also, there's too much competition in discrete graphics, unless they plan on using the Core architecture in a GPU. If they did that, I'd suppose their spin would be that, because of the efficiency, you can reach higher clocks, make less noise, draw less power, and get better performance; plus it'd be easier to put multiple GPUs on one PCB, which means fewer expansion slots taken up. They also have better fabrication techniques, with 65nm mastered and smaller processes already being sampled.

RE: moneys on
By Griswold on 1/23/2007 5:06:21 AM , Rating: 3
Why on earth do you think their core architecture can be used for a GPU?

"G80 is more CPU than GPU "
By crystal clear on 1/22/2007 11:57:28 PM , Rating: 2
As I scan a vast number of websites, I remember reading the article below:


"G80 is more CPU than GPU "

A WELL-CONNECTED source says Nvidia's G80 is as programmable as the average CPU. Nvidia’s flagship chip powering the Geforce 8800 GTX and GTS has, he said, almost everything that a CPU needs.
This may answer the question, can and will Nvidia make a CPU? Certainly the chip is not meant to be X86 compatible but it is apparently super programmable. We hear that it could probably handle X86 instructions as well.

AMD techies took a deep look at the G80 and confirmed the claims. G80 has all the features of CPU but it is big, hot and bulky. However, Nvidia still has to shrink the chip to make it cost effective.


*This reminds me of a story: whilst the two cats (Intel/AMD) fight over the cheese, the mouse (Nvidia) takes it away.

*Can Nvidia come up with a CPU & GPU combined as one component?

*Can anybody out there check/verify, not the story, but: IS THE G80 PROGRAMMABLE JUST LIKE A CPU, and what is its potential?

*"G80 is more CPU than GPU "-Can somebody put this to test?

Very interesting.

RE: "G80 is more CPU than GPU "
By zsouthboy on 1/23/2007 11:17:09 AM , Rating: 2
It's not a CPU.

Turing complete, probably. However, Turing completeness makes no claim about how long calculations take...

It would not do the things a CPU does as efficiently as a... CPU. Period.
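The distinction above (expressiveness versus efficiency) can be caricatured in a few lines of Python. This is purely an editor's illustration, not G80 code: a branchy per-element loop is natural on a CPU, while wide SIMD-style hardware prefers a branch-free formulation in which every lane runs the same instructions, even though both compute the same function.

```python
def clamp_negatives_cpu(xs):
    # "CPU-style": a data-dependent branch per element.
    out = []
    for x in xs:
        if x > 0:
            out.append(x)
        else:
            out.append(0)
    return out

def clamp_negatives_gpu_style(xs):
    # "GPU-style": branch-free arithmetic, identical work on every element.
    # (x + |x|) / 2 is x for positive x and 0 for negative x.
    return [(x + abs(x)) // 2 for x in xs]

data = [3, -1, 4, -1, 5, -9]
assert clamp_negatives_cpu(data) == clamp_negatives_gpu_style(data) == [3, 0, 4, 0, 5, 0]
```

Both are Turing-machine-expressible computations; which one runs well depends entirely on the hardware underneath, which is zsouthboy's point.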

Correction Needed
By politicalslug on 1/23/2007 12:39:19 AM , Rating: 2
DirectX 10.0 Hardware isn't needed for anything other than DirectX 10.0 specific games, which will likely still include a DirectX 9 & 8 shader path.

That said, for Aero Glass & Flip3D the only thing required is DirectX 9 compliant hardware that supports shader model 2.0 or higher.

Where do they find the idiots who write these articles on technology, who haven't the slightest clue what they're writing about?

RE: Correction Needed
By deeznuts on 1/23/2007 1:55:29 PM , Rating: 2
That must have changed, then. I remember reading several months ago, and have believed since, that almost all features are available to a DX9 card and only some minor eye candy is exclusive to a DX10 card.

I don't see this combo.
By mindless1 on 1/24/2007 6:32:16 AM , Rating: 2
Are people really going to get the premium version of vista then ponder IGPs or a new GPU entry into the market?

Seems a bit of a mismatched system to me.

RE: I don't see this combo.
By jmuffat on 1/24/2007 7:07:04 AM , Rating: 2
There are more uses for a GPU than just 3D graphics nowadays, and I see this as the main reason why Intel wants to enter this market. This is as much a reply to ATI and nVidia as to Sony/IBM's Cell technology...

By IGoodwin on 1/23/2007 12:25:33 PM , Rating: 2
There are a few items here that make the future much more interesting.

Scanning through the comments, there are remarks about IGPs being disabled on motherboards when a discrete solution is added. If the ability to share GPU power is coming, then an add-on card that works in combination with the motherboard's own graphics would give that card an advantage, and not waste silicon on the motherboard.

The question then becomes which manufacturers' cards would work, especially as current motherboards are already going GPU-solution specific. As a market leader, maybe Intel would have to publish the interface to enable competition.

By Pirks on 1/22/07, Rating: -1
RE: DX10?
By InsaneScientist on 1/22/2007 10:08:10 PM , Rating: 2
That's not a lie at all...

First, to be technical, to be lying, one must have an intent to deceive... and I doubt very much that DailyTech is trying to deceive us.
If you're not intending to deceive someone, it's just a mistake.

Anyhow, with that said... The majority of the Aero Glass features are there with DirectX 9, but there is a small functionality boost going up to DX10.
So, in reality, it should probably say "In order to fully utilize Aero..."

I don't feel like typing the explanation all out, especially since I explained this a couple of days ago, so I'll just copy and paste from my own post. ;)

It's not really a huge thing (it doesn't look any better or anything; it's just a small functionality issue.)

It isn't Aero Glass, per se, that doesn't function at its full capabilities, but rather the Desktop Window Manager (DWM) engine.

This might not be exactly correct, it is just my best understanding of what's going on... if someone has better information, please enlighten us.
Basically, DirectX 9 cards can only process one graphical stream at a time. Since the Aero glass interface requires a constant stream of GPU acceleration, if something else that requires GPU horsepower is started on the system, Aero glass needs to shut down so that the other process can get access to the resources of the GPU. (and it does)
Most of the time you don't see this, because the majority of the things that require graphics acceleration are full screen games... there are a few things, though, that will revert Vista to the Windows Vista Basic theme.

For example, one of them is Java. If you want to check this out, install Sun's Java package (if you don't already have it) and then load a webpage (any page) that has Java embedded in it. Vista should revert to the Windows Vista Basic theme until you close your internet browser. (I haven't actually done this with anything other than IE, but considering how the engine is supposed to work, I would assume that it's the same with Firefox, Opera, etc.)

Anyhow, DX10 allows for simultaneous processing of multiple graphics streams, allowing Vista to continue using the glass interface, while still giving the other process the resources it needs.

That's all it really is... I didn't mean to make it seem like a big deal. It's not... but there is a little bit more functionality that you get with a DX10 part over a DX9 one.

P.S. Forgive my incoherency... I'm extremely tired at the time of this post. Hopefully I got my point across.

From here:

RE: DX10?
By borowki on 1/22/2007 11:17:18 PM , Rating: 5
Dude, don't just make stuff up. Here's what Microsoft has to say on the subject:

ExtremeTech: Speaking to that point, we know that the desktop in Vista is drawn using DirectX 9. What happens if you have a DirectX 10 card? Does the desktop still use DX9, or does that switch over and use DX10?

Blythe: It continues to use DirectX 9. Largely the reason for that is when we built the desktop, that was being done concurrently with the design of DirectX 10. It becomes somewhat more complicated to build both the low-level technology and the thing on top of it, concurrently. It's better to sort of have a time gap between those. At the same time, we were making some minor tweaks to DirectX 9 to accommodate new features that were needed to do the desktop. For us, it's best to have one consistent platform. Even though we could imagine there being benefit to the desktop using DX10, it's better to do all the debugging and get it to work with DX9 and ship that. Then over time, as the hardware base builds up for DX10, by the time we do the next major release, we'd be looking at trying to move the entire desktop onto 10.

As for the Java issue, it is described in detail in a bug report:

Has nothing to do with the 3D render pipeline.

RE: DX10?
By InsaneScientist on 1/22/2007 11:48:41 PM , Rating: 3

That was the original explanation I saw when the issue first came up in the beta testing phase...

I guess this is what I get for not keeping current on how they're developing things. D'oh...

I sit corrected. Thank you for setting me straight.

/foot in mouth :P

