

Haswell CPUs will contain vector processors and a more powerful on-die GPU. The chips are designed to power the next generation of "Ultrabooks".  (Source: ASUSTek)

An Intel corporate blog post seemed to confirm both the presence of vector coprocessor silicon and a 2013 release date for the 22 nm Haswell.  (Source: Intel)
Company looks to new 22 nm architecture to hold off AMD and ARM Holdings

Intel Corp. (INTC) has dropped a few hints about its upcoming 22 nm Haswell architecture, currently under development by the company's secretive Oregon team.  In a post on the Intel Software Network blog titled "Haswell New Instruction Descriptions Now Available!", the company reveals that it plans to launch the new CPU in 2013.

Haswell will use the same power-saving tri-gate 3D transistor technology that will debut with Ivy Bridge in early 2012.  Major architectural changes reportedly include a totally redesigned cache, fused multiply-add (FMA3) instruction support, and an on-chip vector coprocessor.

The vector processor, which will work with the on-die GPU, was a major focus of the post.  The company is preparing a set of instructions called Advanced Vector Extensions (AVX), which will speed up vector math.  It writes:

Intel AVX addresses the continued need for vector floating-point performance in mainstream scientific and engineering numerical applications, visual processing, recognition, data-mining/synthesis, gaming, physics, cryptography and other areas of applications. Intel AVX is designed to facilitate efficient implementation by wide spectrum of software architectures of varying degrees of thread parallelism, and data vector lengths.
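For readers wondering what "vector math" looks like in practice, here is a minimal sketch in C using the existing 256-bit AVX intrinsics (immintrin.h, built with -mavx). This is generic illustrative code, not anything taken from Intel's Haswell documentation.

/* Minimal AVX sketch: add two float arrays eight elements at a time.
 * Build (assumption): gcc -O2 -mavx avx_add.c -o avx_add
 * Illustrates the data-parallel style AVX accelerates; not Haswell-specific. */
#include <immintrin.h>
#include <stdio.h>

static void add_arrays(const float *a, const float *b, float *out, int n)
{
    int i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256 va = _mm256_loadu_ps(a + i);               /* load 8 floats */
        __m256 vb = _mm256_loadu_ps(b + i);
        _mm256_storeu_ps(out + i, _mm256_add_ps(va, vb)); /* 8 adds at once */
    }
    for (; i < n; i++)            /* scalar tail for leftover elements */
        out[i] = a[i] + b[i];
}

int main(void)
{
    float a[12], b[12], c[12];
    for (int i = 0; i < 12; i++) { a[i] = (float)i; b[i] = 2.0f * i; }
    add_arrays(a, b, c, 12);
    printf("c[11] = %.1f\n", c[11]);   /* expect 33.0 */
    return 0;
}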

According to CNET, Intel's marketing chief Tom Kilroy indicates that Intel hopes for the new chip's integrated graphics to rival today's discrete graphics.  

Intel has a ways to go to meet that objective -- its on-die GPU in Sandy Bridge marked a significant improvement over past designs (which were traditionally housed in a separate package), but it still fell far short of the GPU found in Advanced Micro Devices' (AMD) Llano Fusion APUs.

Intel has enjoyed a love/hate relationship with graphics makers AMD and NVIDIA Corp. (NVDA).  While it's been forced to allow their GPUs to live on its motherboards and alongside its CPUs, the company has also fantasized about usurping the graphics veterans.  Those plans culminated in the company's Larrabee project, which aimed to offer discrete Intel graphics cards.

Now that a commercial release of Larrabee has been cancelled, Intel has seized upon on-die integrated graphics as its latest answer to try to push NVIDIA and AMD out of the market.  Intel is heavily promoting the concept of ultrabooks -- slender notebooks like Apple, Inc.'s (AAPL) MacBook Air or ASUSTeK Computer Inc.'s (TPE:2357) UX21, which feature low-voltage CPUs and -- often -- no discrete GPU.

Mr. Kilroy reportedly wants ultrabook manufacturers using Haswell to shoot for a target MSRP of $599 USD, which would put them roughly in line with this year's Llano notebooks from AMD and partners.  That's about $100 USD less than current Sandy Bridge notebooks run.

Intel also faces pressure from a surging ARM Holdings plc (ARMH), which is looking to unveil notebook processors sometime next year.



Comments



wow no kidding
By tastyratz on 6/24/2011 12:28:46 PM , Rating: 5
Something to be released in 2 years will rival the technology of today? Amazing!

What a useless comment. Cool if it is competitive with discrete laptop graphics - but usually "it's as good as 2-year-old tech" is not the bragging point to focus on.




RE: wow no kidding
By kleinma on 6/24/2011 12:46:40 PM , Rating: 3
Well they didn't say exactly which discrete cards of today it would rival. If they are saying the top of the line ATI radeon 6XXX series performance will be available in an integrated GPU, that is pretty impressive, even 2 years away. But discrete graphics cards come in a broad range, and they didn't elaborate really, so it is hard to know if I should be impressed or not.


RE: wow no kidding
By Mitch101 on 6/24/2011 1:05:06 PM , Rating: 1
I hope they do; I would like a lot more physics in my games. More than just a torn flag waving or other cheap eye candy.

Could also lead to cheap $400.00 laptops that rival today's mid-to-high range gaming rigs.


RE: wow no kidding
By bug77 on 6/24/11, Rating: 0
RE: wow no kidding
By Targon on 6/24/2011 8:17:26 PM , Rating: 2
Ageia failed because the framerates were horrible when the PPU was being used. Think back to the original 3D cards, where 3DFX dominated, and WHY. There were the 3DFX Voodoo cards, and then you had all the others, most of them not really improving the framerates, even though graphics clarity WAS better.

The Voodoo and Voodoo 2 offered better graphics and faster performance; it was a win/win. Look at physics, where if you turn it on, your framerates go down so much you want to turn it off. Only compared against software acceleration with the feature on would the PPU seem like an improvement.


RE: wow no kidding
By epobirs on 6/24/2011 9:34:56 PM , Rating: 2
Ageia was always doomed, even in their best-case scenario of delivering something in their discrete product that couldn't be matched on pure performance by the GPU makers. The best they could hope for was to turn a big profit on a buyout by one of the big GPU companies.

The problem was that they could never hope to have a slam-dunk, "you either have our card or your stuff is hopelessly inferior" situation. The most they could hope for was a brief window when a few games would get a lot of mileage out of the card. The technology wasn't sufficiently different from what GPGPU enables, meaning the next GPU generation would always offer a substantial chunk of what the discrete card offered, but effectively for free in the end user's eyes.

Worse, high-end users faced a choice between adding a PPU card or a second GPU card. The latter choice was arguably more versatile, albeit more costly in most cases where the card is fairly recent and strong.

So, even if it worked perfectly and gained wide support, Ageia couldn't ever offer a killer value proposition against the likes of Big GPU.


RE: wow no kidding
By Samus on 6/25/2011 7:34:29 AM , Rating: 2
Hearing Intel talk up their 'next-gen graphics' is like listening to a broken record, and dates back about as far as vinyl as well.

I simply fail to imagine how Intel will out-engineer AMD in GPU design. Having an enemy in nVidia and a direct competitor in AMD, they will have to source their own engineering team, which no matter how well funded, will simply not outpace nVidia or AMD in 2 years.


RE: wow no kidding
By ekv on 6/26/2011 2:43:41 AM , Rating: 2
quote:
I simply fail to imagine how Intel will out-engineer AMD in GPU design.
By and large I'd have to agree with you there. However, it is fair to say that Intel has superior manufacturing capabilities, and that their recent stumbles (the Cougar Point SATA bug) have been relatively minor (contrast with Phenom). In comparison, AMD/ATI and NVidia are rather tied to TSMC, and neither TSMC nor GlobalFoundries et al. can match Intel.

Intel has also shown somewhat surprising adaptability recently in terms of CPU architecture. AMD has seemingly been stagnant, though quite the opposite is likely true, perhaps due to budget / cash-flow difficulties. Which goes to show once again that a sound economic position can help you [buy engineers].


RE: wow no kidding
By HollyDOL on 6/29/2011 2:42:53 PM , Rating: 2
I wouldn't underestimate Intel. Don't forget they can afford to throw much more money at the project -- probably much more than AMD and nVidia could put in together.
That, in combination with the technological advantage they currently hold, could create a path for them to a genuinely competitive GPU. Not saying it will, just that it could be possible.


RE: wow no kidding
By SPOOFE on 6/26/2011 9:11:11 PM , Rating: 2
quote:
Really, are you that bothered by the fact that when you toss a grenade it has a perfect parabolic trajectory, completely disregarding the air friction?

What a strangely specific example.

I'm of the opinion that stuff looks more real if it moves like it's real. You can slap all the textures and shaders and effects onto a 3D model that you like, even to the point of making it look photorealistic for stills, but if it still moves like a mannequin having a stroke, the effect is ruined.


RE: wow no kidding
By bug77 on 6/27/2011 3:52:41 AM , Rating: 3
That's what I was looking for: an example where real physics would have a noticeable impact. Because we have ragdoll physics right now. All I have seen in tech demos was smoke, cloth (both of which will go unnoticed in a fast-paced game - maybe useful in an RPG, but that's more about the story and gameplay) and particles. And particles are useless, since we saw that while you may be able to compute a million of them each second, no video card will cope with displaying that many.


RE: wow no kidding
By DanNeely on 6/24/2011 1:35:53 PM , Rating: 2
Actually, getting something close on paper wouldn't be that hard. On paper Llano is 25% as fast (lower in reality due to the memory bottleneck), and GPUs are still seeing enough design improvements to double yearly instead of every 18 months. 22 nm could get them halfway there all by itself, and Intel is claiming tri-gate's performance boost is the equivalent of a process node, which gets them most of the rest of the way there.


RE: wow no kidding
By Belard on 6/24/2011 4:25:25 PM , Rating: 2
For all we know, and most likely... they'll have an on-die GPU that'll be equal to today's discrete video cards... Since they are NOT specific, and Intel is, thankfully, bad at video (just stick to CPUs and SSDs), they could be talking about the ATI 6450, a bottom-end video card, which is a bit better than what AMD is offering in their Fusion C-series CPU/APUs.

So... two years from now, AMD will have their 9000 series cards (again) and Intel will be 2-4 years behind, as usual. And of course, AMD Fusion chips will be more advanced than what they are today.

intel... sometimes does stupid things. That's nice.


RE: wow no kidding
By fic2 on 6/24/2011 10:41:43 PM , Rating: 3
quote:
intel... sometimes does stupid things. That's nice.


The biggest thing I can think of in their recent "GPU competition" mode was ripping out half of the GPU in most of the Sandy Bridge chips sold. Whose stupid idea was it to include the HD 3000 only in the 'K' series overclocker parts?


RE: wow no kidding
By fteoath64 on 6/25/2011 3:53:41 AM , Rating: 2
Intel's own stupid idea, of course! If they had the HD 3000 in the lower-end chips, they would be competing against themselves, which is a bad idea. So crippling the lower-end chip is a "marketing" decision.

I still think they ought to ship variants of the SB chips without any GPU cores in there, since they suck and OEMs put in a discrete GPU anyway, so why pay for an energy-wasting part which is not used?


RE: wow no kidding
By PhatoseAlpha on 6/25/2011 7:38:36 PM , Rating: 2
A quick look over today's PC games shows a very simple reality: Huge numbers of them are console ports, designed to run on hardware that's 6 years old already. The 360 isn't scheduled for replacement until 2015.

You don't need a 2013 graphics card to play a console port. You don't even need a 2011 graphics card unless you're doing something like 6xMultimonitor and Anti-aliasing.

In that light...well, a CPU that can smoothly play WoW and the vast menagerie of console ports without a graphics card doesn't seem quite so stupid.


RE: wow no kidding
By 85 on 6/24/2011 12:47:54 PM , Rating: 4
quote:
Something to be released in 2 years will rival the technology of today? Amazing!


LMAO!

I was thinking the same thing. Reminds me of the Guinness commercials where the guys with mustaches say "brilliant!"


RE: wow no kidding
By MozeeToby on 6/24/2011 12:49:10 PM , Rating: 1
You don't have to have the most powerful product available to be successful; there is more to picking a solution than raw processing speed.

If I could get performance equal to today's high end graphics cards in an integrated solution on a laptop two years from now I would be thrilled; they'd have lower power consumption and heat than a discrete card, fewer moving parts to fail and make noise, and a smaller form factor. And 99% of PC games are designed to run well on 3 year old hardware, so I'd have at least a year of being able to run every AAA title out there.


RE: wow no kidding
By StevoLincolnite on 6/24/2011 3:17:07 PM , Rating: 4
quote:
And 99% of PC games are designed to run well on 3 year old hardware, so I'd have at least a year of being able to run every AAA title out there.


Maybe even less than 3 years.
Intel has historically been slow to adopt new DirectX standards; heck, past iterations never even supported the full DirectX standard. (I'm looking at you, GMA 900, 910, 915, 950, 3000, 3100 chips with no TnL or vertex shaders.)

The end result is that despite the chips having "just enough" grunt to handle a modern game even at the lowest of settings... they didn't have the feature set to run them without additional software like SwiftShader, Oldblivion, etc.

Another example is Sandy Bridge; it is still stuck in the DirectX 10 era despite DirectX 11 being available on nVidia and AMD GPUs for a couple of generations now.

Plus, Intel drivers are just plain bad: slow to be updated, and the compatibility just isn't there.
It took Intel what... a year or two just to enable SM3 and TnL on the X3100 chips? Even then TnL was only half done, as some games still ran it in software.

Unfortunately, for the past decade I've told people that if you wish to play video games, get even a low-end solution from nVidia or AMD and skip the Intel IGP.
The drivers at the very least make for a much less painful experience.


RE: wow no kidding
By Motoman on 6/24/11, Rating: 0
RE: wow no kidding
By croc on 6/25/11, Rating: 0
RE: wow no kidding
By justjc on 6/25/2011 4:20:56 AM , Rating: 2
quote:
Personally, ANY space on the CPU die devoted to graphics alone is wasted die space in my opinion...


Perhaps you're right on the Intel side, as Intel feels it would hurt their processor dominance if GPU acceleration is used.

On the AMD and ARM side, however, the space used for GPU processors will more and more be the part delivering the processing power. After all, today most browsers have graphics acceleration, Flash has graphics acceleration, Office 11 has graphics acceleration, and that's just the programs available today I could remember.

No doubt GPU acceleration will play a bigger role in the future, and Intel will have to change their ways.


RE: wow no kidding
By SPOOFE on 6/26/2011 9:19:26 PM , Rating: 2
quote:
Adds some cost to the chipset, but is far more flexible...

In today's cost-conscious environment, more cost = less flexibility.


RE: wow no kidding
By Motoman on 6/24/2011 1:02:35 PM , Rating: 5
Hey now, this is Intel we're talking about. Being only 2 years behind would be an enormous achievement for them.


RE: wow no kidding
By Mitch101 on 6/24/2011 1:41:51 PM , Rating: 2
Now that I think about it, Intel was going a different direction than AMD/NVIDIA. Intel was optimizing its design for ray tracing, whereas current GPUs from AMD/NVIDIA are optimized for rasterization. Since game engines are written around rasterization, this might explain why AMD/NVIDIA have such a lead over Intel's GPUs.

See Image Ray Tracing vs Rasterization
http://www.cdrinfo.com/images/uploaded/Ray-tracedV...

If I recall correctly from a number of articles, real-time ray tracing is a ways off, but they would adopt a middle ground combining the best of both before reaching the ultimate goal.

My thought is: if Intel can achieve 60 FPS through ray tracing while AMD/NVIDIA achieve 200 FPS through rasterization, then although the AMD/NVIDIA offering is faster, the end result is that Intel could have more realistic visuals.

Of course, game engines would need to be written to render via ray tracing, which as I understand it is easy in principle, but the details aren't.

Just my thoughts. Don't count out Intel.


RE: wow no kidding
By Motoman on 6/24/2011 2:08:05 PM , Rating: 1
...considering that Intel's marketshare in the Gaming Graphics market is, um, 0%...I wouldn't hold my breath waiting for any game manufacturers to start writing special rendering code for Intel chips.


RE: wow no kidding
By Mitch101 on 6/24/2011 3:47:05 PM , Rating: 2
In Gaming Graphics yes
http://store.steampowered.com/hwsurvey/videocard/

But Intel dominates in Integrated graphics.
http://software.intel.com/en-us/articles/common-mi...

Of course, who knows how many of those integrated graphics machines aren't actually using the IGP and have a video card plugged in.

Don't underestimate companies with deep pockets.


RE: wow no kidding
By ClownPuncher on 6/24/2011 3:50:26 PM , Rating: 2
They really blew my socks off with larrabee. My larrabee card can play Crysis 4 maxed out!


RE: wow no kidding
By Alexvrb on 6/25/2011 3:28:21 PM , Rating: 2
Oh yeah well I'm running 4 Larrabee cards in SLIfireX on my NF1290FXH97 and I can run Crysis 4 in 6 monitor 3D mode maxed out!


RE: wow no kidding
By Motoman on 6/24/2011 7:50:15 PM , Rating: 1
Call me crazy but I don't think game producers build games with the intention that they'll be played on PCs not intended to play games. Like ones with integrated graphics.


RE: wow no kidding
By SPOOFE on 6/26/2011 9:20:59 PM , Rating: 2
They're absolutely crazy to ignore the largest market in existence. The guys that made Torchlight were geniuses; that game runs excellently on Intel graphics.


RE: wow no kidding
By Motoman on 6/27/2011 11:22:13 AM , Rating: 1
Uh-huh. Never heard of it.

Reckon COD runs well on Intel graphics? Rift? Dare I say... Crysis?

If your assertion is correct, then every major gaming company is crazy. None of them do any work to accommodate Intel graphics - most of them buddy up with either ATI or Nvidia as it is.


Hah
By dagamer34 on 6/24/2011 12:18:08 PM , Rating: 5
I'll believe it when I see it.




RE: Hah
By tallcool1 on 6/24/2011 12:33:34 PM , Rating: 2
Agreed! They said the same kind of hoopla about Larrabee...


RE: Hah
By Breakfast Susej on 6/24/2011 1:05:49 PM , Rating: 1
Nvidia's CEO is a rampaging douchebag. However he does at least deserve credit for being right on the money when he referred to larrabee as laughabee.


RE: Hah
By Pessimism on 6/24/2011 1:49:34 PM , Rating: 2
His refusal to own up to bumpgate prompted me to permanently boycott all NVIDIA products. Shame too, because Tegra2 is spreading through smartphones and tablets like a virus.


RE: Hah
By Mitch101 on 6/24/2011 2:07:56 PM , Rating: 3
I'm glad I missed Bumpgate, but my boycott comes from the NVIDIA hard drive fiasco that caused me to lose data. The solution, I vaguely recall, was to turn off some advanced hard drive option in the BIOS and use the Microsoft drivers. Both the NVIDIA drivers and the BIOS setting could cause data loss, and I did lose data. NVIDIA never owned up to the problem.

It's a shame; I was a cheerleader for them back when the TNT came out, but they are not the same company now.


RE: Hah
By MrTeal on 6/24/2011 2:16:36 PM , Rating: 2
I feel the same way about OCZ today. There have been open issues with the Vertex 2 series where the drives don't respond to commands and will BSOD after resuming from sleep. OCZ's recommendation for almost the last year has been "don't sleep your laptop, turn it off instead".


RE: Hah
By DanNeely on 6/24/2011 2:35:53 PM , Rating: 2
A recent article on Anandtech implies they're finally able to reproduce the problem. The issue went away when they hooked up any sort of debugging tools, making it a nightmare to diagnose. Hopefully this means they'll actually be able to fix it soon...


RE: Hah
By MrTeal on 6/24/2011 3:17:04 PM , Rating: 2
That's the Vertex 3 issues. The Vertex 2 issues don't seem to have an end in sight.


RE: Hah
By DanNeely on 6/24/2011 3:43:31 PM , Rating: 2
I thought it was the same issue for both generations of the SF controller.


RE: Hah
By MrTeal on 6/24/2011 4:11:44 PM , Rating: 3
I don't believe so. According to the last edit on the OCZ support thread on this topic, it's still an issue.

quote:
EDIT by RyderOCZ on March 30: DO NOT hibernate your SSD or use sleep. Shut the machine down when you are not using it. That is our best recommendation at this time. Hibernation and/or sleep may cause the drive to no longer be recognized when you wake it up.


RE: Hah
By Alexvrb on 6/25/2011 3:39:16 PM , Rating: 1
At some point I've been hit by Nvidia's Bumpgate, HDD controller issues (corruption), and their "hardware-based" firewall that never worked reliably. So while I won't bash their discrete graphics, I am leery of their chipsets and integrated graphics.


RE: Hah
By B-Unit on 6/27/2011 10:17:13 AM , Rating: 2
My boycott relates to the furious rebranding of the G92 generation of cards (8800GT -> 9800GT -> GTX(S?)250). I felt that it was inexcusable to simply rename the same card. Granted, most who would bother to buy a discrete GPU should have known, but for the average consumer, there was no way to know you were buying a two-generation-old product.


RE: Hah
By Chadder007 on 6/24/2011 12:40:37 PM , Rating: 2
Exactly what I was going to post.


What a useless comment.
By MrTeal on 6/24/2011 12:22:55 PM , Rating: 5
quote:
According to CNET, Intel's marketing chief Tom Kilroy indicates that Intel hopes for the new chip's integrated graphics to rival today's discrete graphics.


I'm sure it will exceed today's discrete graphics. The Radeon 6450 is a pretty weak chip; I don't think Intel could possibly be slower than that, since Sandy Bridge already is.

Or did he mean faster than a 6990? Good luck with that. So, what that quote indicates is that the new iGPU will have... some level of performance. Way to be specific.




RE: What a useless comment.
By killerroach on 6/24/2011 12:40:38 PM , Rating: 5
Congratulations. You summed up marketing in a nutshell.


RE: What a useless comment.
By corduroygt on 6/24/2011 12:46:05 PM , Rating: 1
As long as it's as fast as a discrete GPU with the same thermal/power envelope, I'd consider it to be very good.


RE: What a useless comment.
By nafhan on 6/24/2011 1:24:37 PM , Rating: 2
Technically... Sandy Bridge rivaled "today's" discrete graphics at the time it came out, considering "today" at the time of Sandy Bridge's release included the 5450. If they had made that announcement about Sandy Bridge in 2008/9, it would have been even more true.


RE: What a useless comment.
By Alexvrb on 6/25/2011 3:30:28 PM , Rating: 2
Intel: "Llano graphics in 2013."


OK Intel
By Smilin on 6/24/2011 4:34:54 PM , Rating: 5
Benchmarks or it didn't happen. Until then STFU. Your track record sucks too bad to take your word.




By T2k on 6/24/2011 12:55:16 PM , Rating: 4
Seriously: it is pathetic, this absolutely dumb, stupid marketing attempt to sell your junk. Just STFU and ship it; even the dumbest investors know you are full of sh!t - when it comes to GPUs, Intel is literally CLUELESS.




The focus
By Jaybus on 6/24/2011 2:33:52 PM , Rating: 2
Having a faster GPU on chip is not the focus. The point is treating it like a coprocessor. Long ago, this was the path floating point processors took. They started out as separate chips on separate sockets. Data had to be shipped into and out of both processors over a bus. Eventually, it was moved on die and integrated into the CPU core, the control unit shipping instructions from the same (cached) instruction queue to either the fp or integer unit at equal cost.

Now we are finally seeing the beginnings of the integration of a vector floating point unit (not counting the very limited SIMD unit). It is a huge difference, and is why the vector unit was the focus. The key differences are that the data doesn't have to be shipped over the PCI-E bus, but over far faster on-die and RAM memory channels. Program code doesn't have to be shipped to a separate memory space for the separate instruction queue of a GPU card/chip, but is inline in the same cached instruction queue. It is a paradigm shift, because it makes developing compilers and software that utilize vector processing much, much simpler.
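As a small illustration of that point (a hedged sketch, not anything from the article): because the vector unit shares the CPU's caches, memory, and instruction stream, an ordinary C loop can be auto-vectorized by the compiler with no separate kernel and no copies across PCI-E.

/* Hedged sketch: a plain C loop the compiler can auto-vectorize for the
 * on-die vector unit (e.g. gcc -O3 -mavx saxpy.c). On a discrete GPU the
 * same work would need device allocation, a copy of x and y over PCI-E,
 * a separate kernel, and a copy back. */
#include <stdio.h>

void saxpy(int n, float a, const float *restrict x, float *restrict y)
{
    for (int i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];   /* one multiply-add per element */
}

int main(void)
{
    float x[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float y[8] = {0};
    saxpy(8, 0.5f, x, y);
    printf("y[7] = %.1f\n", y[7]);   /* expect 4.0 */
    return 0;
}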




RE: The focus
By xyzCoder on 6/24/2011 9:46:12 PM , Rating: 2
"The key differences are that the data doesn't have to be shipped over the PCI-E bus, but over far faster on die and RAM memory channels."

You are comparing an integrated solution with a non-integrated solution. Compare Intel's supposedly brilliant 'vector floating point unit' against AMD's latest integrated offerings and they are similar, but AMD/NVIDIA come out ahead, in part because theirs support frameworks like OpenCL.

And unless code widely gets compiled specifically to use these instructions, they are going to end up as wasted space on the silicon of 99% of customers, even if specific benchmarks give great results.
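To illustrate why that adoption matters, here is a hypothetical sketch using GCC/Clang's __builtin_cpu_supports; the transform_* function names are made up for the example. Shipping software typically has to detect the feature at run time and dispatch to a separately compiled path, otherwise the new silicon sits idle.

/* Hedged sketch of runtime dispatch: new instructions only pay off if the
 * application checks for them and ships a path compiled to use them. */
#include <stdio.h>

static void transform_scalar(float *data, int n)
{
    for (int i = 0; i < n; i++) data[i] *= 2.0f;   /* baseline path */
}

static void transform_avx(float *data, int n)
{
    /* In real code this would live in a separate translation unit built with -mavx. */
    for (int i = 0; i < n; i++) data[i] *= 2.0f;
}

int main(void)
{
    float data[4] = {1, 2, 3, 4};
    __builtin_cpu_init();                       /* GCC/Clang builtin */
    if (__builtin_cpu_supports("avx")) {
        puts("AVX detected: using the vector path");
        transform_avx(data, 4);
    } else {
        puts("No AVX: using the scalar fallback");
        transform_scalar(data, 4);
    }
    printf("data[3] = %.1f\n", data[3]);        /* expect 8.0 */
    return 0;
}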


AVX is already available...
By nickblack on 6/26/2011 3:50:25 PM , Rating: 2
AVX is already out, and included on most if not all Sandy Bridge processors:

[skynet](0) $ grep avx /proc/cpuinfo | sort -u
flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx rdtscp lm constant_tsc arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc aperfmperf pni pclmulqdq dtes64 monitor ds_cpl vmx est tm2 ssse3 cx16 xtpr pdcm sse4_1 sse4_2 popcnt aes xsave avx lahf_lm ida arat epb xsaveopt pln pts dts tpr_shadow vnmi flexpriority ept vpid
[skynet](0) $

The FMA extensions, however, are not part of Sandy Bridge's AVX implementation.




RE: AVX is already available...
By silverblue on 6/27/2011 7:58:57 AM , Rating: 2
They are, however, part of Bulldozer, albeit as FMA4. Intel originally opted for 4, AMD for 3, and both companies switched from one to the other. Allegedly, AMD may adopt FMA3 for compatibility reasons if Intel refuses to change.
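For anyone unfamiliar with the distinction, here is a hedged sketch of what the operand-count difference means. The assembly mnemonics in the comments are the standard Intel/AMD forms, and the intrinsic shown is the FMA3 one (immintrin.h, built with -mfma), used purely as an illustration rather than as Haswell- or Bulldozer-specific code.

/* Both FMA3 and FMA4 compute d = a*b + c with a single rounding; they differ
 * in how operands are encoded:
 *   FMA3 (3 operands, destructive):
 *       vfmadd231ps ymm0, ymm1, ymm2     ; ymm0 = ymm1*ymm2 + ymm0
 *   FMA4 (4 operands, non-destructive):
 *       vfmaddps ymm0, ymm1, ymm2, ymm3  ; ymm0 = ymm1*ymm2 + ymm3
 * Build (assumption): gcc -O2 -mfma fma_demo.c */
#include <immintrin.h>
#include <stdio.h>

int main(void)
{
    __m256 a = _mm256_set1_ps(2.0f);
    __m256 b = _mm256_set1_ps(3.0f);
    __m256 c = _mm256_set1_ps(1.0f);
    __m256 d = _mm256_fmadd_ps(a, b, c);   /* 2*3 + 1 = 7 in every lane */
    float out[8];
    _mm256_storeu_ps(out, d);
    printf("%.1f\n", out[0]);              /* expect 7.0 */
    return 0;
}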


Food for thought
By ChipDude on 6/26/2011 11:51:27 PM , Rating: 2
Laugh, throw stones, or whatever; regardless of whether you hate or love them, think about this:

About 20 years ago there was RISC and some pretty rich, entrenched companies like IBM, DEC, HP, and SUN, to name a few. Their architecture was superior by a long shot. Meanwhile, there was this stodgy company making a product hobbled by complex and unwieldy features from the dark ages.

Spring forward to today and what is running where? Yes, there is a new upstart, but if you really look, this is the RISC wars all over again.

Don't underestimate this company that seems to do no right but makes close to 10 billion bucks a year. Sure, they have wasted tens of billions as well, but remember: having the technology to cram the most transistors into a little chip gives said company huge room to waste and still, in the end, win...




RE: Food for thought
By SPOOFE on 6/27/2011 12:19:11 AM , Rating: 2
Like it or not, Intel knows where the market is: Graphics-intensive gaming is still a niche compared to overall sales, and the incentive to get involved is small.


Big news.
By jfelano on 6/25/2011 12:23:39 PM , Rating: 3
So in 2 years Intel will have graphics as good as today's graphics cards. Amazing.




What is discrete?
By Goty on 6/24/2011 12:41:17 PM , Rating: 2
A 6470 is a discrete graphics card. Just sayin'.




Ummmm
By atlmann10 on 6/24/2011 1:17:05 PM , Rating: 2
This is just trash talk; it has to be, if you think about it. If AMD's on-board GPU beats anything Intel has now, it will do the same in two years, right? I seriously doubt AMD or Nvidia, or anyone else for that matter, will be producing graphics hardware in two years on the same level as today's GPUs -- especially given how fast both AMD and Nvidia have moved with their hardware development over the last five years. Now, the idea of Intel absorbing Nvidia might give them some valid point, but for them to absorb Nvidia and have an internal GPU on die, they had better do it before this year ends, which I don't see happening anyway.

So this is pure marketing, like when the first Llano rumors were sub-officially released.

As far as it goes, Intel is facing some heavy, heavy competition now. AMD is about to drop a whole new series of processors, quad-core ARM will supposedly be out by October, and TI is all over the place, as is Nvidia in ARM-core processing, as are Samsung and all kinds of other companies such as Tilera. Then there's the LightScribe vs. USB 3 stuff, which seems to still be an Apple laptop specialty like FireWire was, which never really took off. On and on and on!




Intel Says
By wwwcd on 6/24/2011 2:16:59 PM , Rating: 2
Intel Says...slowly, slowly baby and we come ...to fail together




By GeorgeOu on 6/25/2011 9:19:47 AM , Rating: 2
Intel Sandy Bridge is a much older processor than the just released AMD Fusion. There was no Fusion to fall short when Sandy Bridge was released.

Ivy Bridge will be in the range of Fusion performance but probably push the power consumption down.




The echo
By atlmann10 on 6/25/2011 1:52:32 PM , Rating: 2
I feel like I have heard this from Intel before, about their upcoming GPU and its capabilities. Am I wrong, or is there an echo?




Too little, too late
By M4gery on 6/30/2011 12:57:44 PM , Rating: 2
Imo, this is just Intel trying to save face after a brutal smackdown by Llano (and make no mistake, it's damned brutal).

Besides, unless Intel can somehow pull off asymmetrical CrossFire or SLI, Llano is still going to have a distinct advantage there. Based on my builds and usage over the last several years, I am strongly considering building a Llano rig since it works with CrossFire (just a shame that it doesn't help DX9 titles).




"A lot of people pay zero for the cellphone ... That's what it's worth." -- Apple Chief Operating Officer Timothy Cook














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki