
Intel hopes to reach full production capacity after a two-month delay

Just months ago at the 2011 Intel Developer Forum, executives with the world's largest traditional personal computer chipmaker, Intel Corp. (INTC), were all boast and bravado, saying their competitors were years behind in process technology.  Indeed, the talk of dramatic gains in power efficiency and clock speed from Intel's proprietary 22 nm FinFET 3D-transistor design sounded very impressive.

But the first chinks in the armor perhaps began to show at the 2012 Consumer Electronics Show, when Intel was caught faking its 22 nm Ivy Bridge DirectX 11 demo during its ultrabook pitch.  Intel brushed off the trickery, but the incident raised some serious questions.  If the 22 nm chip was launching in April at production volume and had already been taped out in final form, why would Intel have to use canned video?  Why couldn't it show its real product?  Why the obfuscation?

Well, DigiTimes is reporting that multiple OEM sources have shared that Ivy Bridge is being delayed from April to June.  While not a huge delay, the report raises questions about whether Intel's 22 nm process is as stable as it claims.

To be fair, the OEMs appear to be claiming that the delay is due to inventories:
Because most first-tier notebook vendors are having trouble digesting their Sandy Bridge notebook inventories due to the weak global economy, while Intel is also troubled by its Sandy Bridge processor inventory, the CPU giant plans to delay mass shipments of the new processors to minimize the impact, the sources noted.

In other words, PCs didn't sell well in 2011, Intel built up a surplus of CPUs, and so it wants to delay its release.  This is all very plausible, and indeed lines up with write-offs found in Intel's earnings reports.  

But it is also possible that Intel isn't being entirely forthcoming, and that Ivy Bridge wasn't being delivered at the reliably high volumes it had hoped.  It could very well be a bit of both factors -- excess inventories and some struggles on the process front.

Regardless, it sounds like customers will have to wait a bit for Ivy Bridge.

That's good news for the competition.  AMD hopes to aggressively roll out its Trinity accelerated processing units (APUs) later this year.  The chips are built on a 32 nm process (GlobalFoundries), but still aim to be competitive with Ivy Bridge in terms of power consumption and graphics performance.  AMD is gambling that while Trinity will lose to Ivy Bridge in raw processing speed, it will be "good enough" for most consumers.

Trinity in the wild
AMD's Trinity APU (center) will launch later this year and aggressively target would-be Intel Ivy Bridge buyers by offering improved graphics and power efficiency at a lower price. [Image Source: Jason Mick/DailyTech]

AMD hopes to price its chip-plus-chipset package hundreds of dollars beneath Intel's.  Whereas Intel is targeting systems at $700 and up, AMD has stated to us that Trinity systems will retail for $500 or less.  Strong 2011 sales of AMD's initial APUs, its first swing at this strategy, made it look like a home run.

Likewise, ARM CPU makers, including Qualcomm, Inc. (QCOM), are looking to invade laptops and compact desktops late this year with the introduction of 28 nm ARM CPUs compatible with Microsoft Corp.'s (MSFT) new Windows 8.  The Q4 2012 devices are expected to follow a strategy similar to AMD's -- strong power efficiency at a low price.

The delay is also good news for third-party USB 3.0 chipmakers like Renesas Electronics Corp. (TYO:6723), ASMedia Technology Inc., and Etron Ltd.  As Ivy Bridge is the first Intel chip to include on-die USB 3.0 support, it was expected to render these competitors' designs obsolete.  But the delay has bought them a bit more time.

Intel's core hope for maintaining its dominant position is to beat the competition in process technology and trickle those improvements down into its budget models, mitigating cost and architectural disadvantages.  Intel has made big promises regarding Atom-powered smartphones, but without 22 nm technology it appears to be forgoing any sort of big mobile push in 2012.  The longer it waits, the more advantage it hands its hungry rivals.  Intel should hope that the delay does not set back its very aggressive 22 nm Atom rollout.

Sources: DigiTimes, MaximumPC [faked Intel Demo]


Lack of competition from AMD?
By ultimatebob on 2/17/2012 1:16:52 PM , Rating: 1
Heh... Intel probably decided that they didn't need to rush Ivy Bridge considering that they already have the fastest desktop and mobile processors available... by a considerable margin.

Think about it... why should Intel rush putting out new faster parts when they have plenty of existing inventory that mops the floor with their competition? Hell... AMD's fastest desktop processor can't keep up with their mid-range $200 Core i5 chip in most applications.

RE: Lack of competition from AMD?
By LRonaldHubbs on 2/17/2012 1:46:20 PM , Rating: 4
why should Intel rush putting out new faster parts when they have plenty of existing inventory that mops the floor with their competition?

Ivy Bridge is a die shrink of the current CPUs, and should be cheaper to make as a result of fitting more chips on a wafer. It is in their interest to shrink the design no matter what the competition does simply because it gives them a cost advantage.
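The die-shrink economics above are easy to sketch with rough numbers.  The sketch below uses a standard first-order gross-die-per-wafer estimate and approximate quad-core die areas; these figures are illustrative assumptions, not Intel's actual yield math:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """First-order gross-die estimate: wafer area / die area,
    minus a correction term for partial dies lost at the wafer edge."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Approximate quad-core die areas (illustrative assumptions, not official figures)
sandy_bridge_mm2 = 216  # 32 nm
ivy_bridge_mm2 = 160    # 22 nm

sb = dies_per_wafer(sandy_bridge_mm2)
ivb = dies_per_wafer(ivy_bridge_mm2)
print(f"Sandy Bridge 4C: ~{sb} gross dies per 300 mm wafer")
print(f"Ivy Bridge 4C:  ~{ivb} gross dies per 300 mm wafer")
print(f"Shrink yields ~{ivb / sb:.2f}x the candidate dies per wafer")
```

With these assumed die sizes the shrink gives roughly 40% more candidate dies per wafer, before accounting for defect rates, which is the cost advantage the comment describes.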

RE: Lack of competition from AMD?
By Dylock on 2/17/2012 2:10:34 PM , Rating: 2
Still, the law of supply and demand comes into play here. While a smaller chip saves Intel money, in no way would they want to reduce their profit margin by flooding a market that isn't thirsty for more chips.

RE: Lack of competition from AMD?
By Daemyion on 2/17/2012 2:19:39 PM , Rating: 2
It's in Intel's interest to clear Sandy Bridge stock without reducing margins on them. That stock is already manufactured and sitting in warehouses because of slow PC sales. If AMD were competitive, Intel would move Ivy Bridge into channels quicker, and the surplus SB profit margins would suffer as a result.

RE: Lack of competition from AMD?
By kenyee on 2/17/2012 3:20:29 PM , Rating: 2
If AMD had something that competed, Intel would have been more than happy to write off excess inventory to roll out something that would stomp them.

RE: Lack of competition from AMD?
By Targon on 2/20/2012 8:10:05 AM , Rating: 3
And even with this, AMD still beats Intel in terms of overall quality of laptops, and even desktops, at the low end of the market (<=$520). Yes, Intel is ahead in CPU design, but Intel's GPU technology is FAR, FAR behind, and have you ever noticed how many problems there are with Intel-based networking, due primarily to poor drivers?

Since we now have GPU acceleration in web browsers and Flash (yes, people around HERE hate Flash, but it is still a huge part of the web right now, and normal consumers don't share the dislike), GPU power matters. All those systems with Intel graphics end up not looking so much better, even with faster CPUs in them.

This is why the AMD A6 and A8 based machines really do sell well, not because they are the fastest in terms of CPU power, but because they ARE competitive on a system level in terms of price/performance/features.

RE: Lack of competition from AMD?
By AntDX316 on 2/21/2012 5:12:54 AM , Rating: 2
I think they are talking about the 3930K and 3960X.

I wouldn't be surprised if those aren't selling much because of the price tag, and because the 2500K-2700K is good enough for gaming.

I shut HT off on my 4.9 GHz 2600K and everything felt worse.

The 3930K is $599, plus you need a $200+ motherboard, when you can get a 2600K for around $300 and a $150 motherboard. It's not really ideal, because someone on a budget can spend the difference on better parts elsewhere and get the same CPU performance for consumer (non-developer/renderer) use.

By retrospooty on 2/17/2012 3:01:35 PM , Rating: 2
" It is in their interest to shrink the design no matter what the competition does simply because it gives them a cost advantage."

You're both right. Intel will make more $$$ with a die shrink, but initially yields are always low. Because they have no competition, they can afford to spend another two months fine-tuning the process to get yields higher. Guaranteed, if AMD had a competitive offering you'd see it released in April. You would also see Ivy Bridge on socket 2011 right off the bat, instead of waiting. Intel is so far ahead, they can afford to milk it.

RE: Lack of competition from AMD?
By TakinYourPoints on 2/17/2012 8:39:11 PM , Rating: 2
They still have existing Sandy Bridge CPUs that they can clear out for full price. Why not take advantage of it when AMD parts aren't competing?

RE: Lack of competition from AMD?
By bbbarry on 2/17/2012 11:29:05 PM , Rating: 2
Here's the thing: "lack of competition" no longer applies. AMD is going to stomp Intel this year and next in the laptop market, while Intel will still hold the PC market. Why, you ask?
1. AMD ultrathins are a much better deal than Intel ultrabooks.
2. AMD "Trinity" cores can support light DirectX 11 games, and do it well. Most people, including myself, still would not recommend an Intel laptop without a discrete graphics card.
3. Removing the discrete graphics card means less power consumption.
4. Intel's manufacturing may be ahead, but AMD's architecture is becoming better than Intel's current one, and with Intel all focused on die shrinks, this could mean a hazy future for Intel.
5. Any CPU performance above 3 GHz in a laptop, and 3.5 GHz in a desktop, is more than enough. There are very rare situations where you would need more than this.
6. AMD simply has better graphics, and with the next Fusion core getting the GCN architecture, it will probably pull further ahead of Intel.

I'm not a fanboy; I have never owned an AMD chip that isn't graphics. I am simply looking at it from the outside, because I'm looking at these new "all in one" chips for uni. And honestly, looking at it, AMD is coming with a vengeance and ARM is coming with the salt.

RE: Lack of competition from AMD?
By TakinYourPoints on 2/17/12, Rating: 0
RE: Lack of competition from AMD?
By Targon on 2/20/2012 8:17:49 AM , Rating: 3
A lot of the "Intel dominates laptops" talk exists only because Intel has a huge presence in the mindset of many, not because people have really tested the machines side by side and made a decision based on their needs.

Now, what sort of performance in real-world tests will Intel bring with Ivy Bridge? I don't care about select benchmarks; I am talking about things like testing browsing speed under Firefox with Flash (there are a LOT of Flash apps out there). If AMD-based machines are competitive in terms of price and offer other benefits, they will be fine. The A8 really does well in terms of heat dissipation in laptops; you don't burn yourself, unlike the Intel machines, which ALL feel like they would burn you if you let them sit on your lap for a few hours.

RE: Lack of competition from AMD?
By Reclaimer77 on 2/18/2012 1:48:46 AM , Rating: 1

AMD is going to stomp Intel this year / Next year in the laptop market, Why you ask?

Because you ate shrooms when you came up with this?

RE: Lack of competition from AMD?
By bbbarry on 2/18/2012 2:41:26 AM , Rating: 1
Lol. That really is ignorant.

Think of what people want: a laptop that can do everything, at a cheap price, with long battery life. I very much doubt people will want to pay $200+ more for the same thing.

DW, many people are already saying to buy AMD stock, and almost every site has moved it off hold/neutral to buy (unlike poor NVIDIA).

When I think of the general public, I think we should really include light gaming. You're right in saying that if someone wanted a gaming laptop, you would go Intel plus an ATI (AMD) or NVIDIA card. But really, who needs that anymore?

Most people here think Intel = best because of the name. Not so. Think of 2006, back when AMD was on top and Intel was in a bad way. Then all of a sudden Intel pretty much mopped the floor with AMD with its new architecture. Now the shoe is on the other foot: Intel has been ignoring AMD and going mobile, and AMD has a new architecture. A university already made the Fusion core's performance go up by 20% by moving one function from the CPU part to the GPU part. Mate, Trinity > Ivy Bridge this time round, sorry to say.

Oh, and if you want the best gaming around, I'm pretty sure the best graphics card at the moment is AMD anyway (7970) ;)

RE: Lack of competition from AMD?
By bennyg on 2/19/12, Rating: 0
RE: Lack of competition from AMD?
By bennyg on 2/19/2012 5:32:56 AM , Rating: 2
1. There's more to it than just price... Intel has margin to play with; it could compete with AMD on price if it wanted to. Ultrabooks are better; that's why they're more expensive. Intel makes the same profit on one CPU as AMD does on about five, by my guess. Revenue is for ego, profit is for reality.
2. So what... most people don't care about gaming and wouldn't have a clue what DX11 means. To that crowd, the "2GB VRAM" line on the spec sheet means more...
3. Wrong on two counts: crunching FP numbers costs power whether it's an on-die IGP or a dGPU... and Optimus powers down the dGPU quite effectively. I haven't yet met an all-AMD notebook of equivalent power with better battery life than an Intel/NVIDIA option, regardless of price...
4. Lol, I'd rather have Intel's R&D budget than AMD's any day. Look at the IPC increase from 45nm Core 2 to 45nm Nehalem, and from 32nm Nehalem to 32nm SB; 15% IPC from pure architecture is something AMD hasn't done anywhere from K8 to K10 to Phenom to Dozer. Remember when everyone was saying "Intel's dead, Prescott is a stinker with current-leakage issues, it'll be the last CPU they make"... then they came out with Conroe...
5. Roflol, are you still using a P100 with Win95 because it is "good enough" for typing a Word doc? If power becomes available, it's immediately used to create a better experience (or, occasionally, just less-optimized software).
6. If all AMD can say is that it's cheaper and the IGP on its chipset is a bit faster... and Intel can still say its chips are faster, have lower power consumption, have more features, and are more reliable... Intel will still own the CPU space for anything over $200.

Ask anyone who knows the slightest thing about business: you do NOT want to own the low end of the market without a bigger strategic benefit flowing on to the high end... discounting for market share is a sign of utter desperation.

Nicely done, it's nice to have a change from all the iTrolls.

By Hyghlander465 on 2/20/2012 1:44:39 PM , Rating: 2
AMD sucks when it comes to CPUs, but their GPUs rock all the way.

The problem with Ivy Bridge is that the firmware for the embedded GPU doesn't work. One more time, Intel Israel majorly fraked up; last year's chipset problem was an Intel Israel design too. But the real issue is rushing products to market before they're ready. Maybe Intel will learn! The Intel Oregon designs, though, are good ones.

RE: Lack of competition from AMD?
By KOOLTIME on 2/22/2012 3:00:37 PM , Rating: 2
A big missing point beyond raw power is price, and AMD is mopping up Intel in that area.

On cost/price/performance ratio, AMD is winning in the sales market right now. Intel may be better, but its cost is defeating it. Same reason a Ferrari is not in everyone's driveway, regardless of how good it may be.

Too-expensive pricing is why they have excess inventory, not a quality issue. Most folks don't have hundreds of extra dollars to buy an Intel CPU-and-mobo combo; average prices are a lot more than an AMD combo.

Not a bad thing
By TheRequiem on 2/17/2012 1:21:36 PM , Rating: 2
This is actually not a bad thing, as it will prevent me from getting a new PC earlier than I should. Most of the next-generation laptops, like the Alienware M17x-R4, are only supposed to come equipped with GTX 675Ms initially, which is kind of annoying, lol. Now NVIDIA should be able to match this timeline and get us some GTX 680Ms, which will be lovely, since those are also slated for Q3. I'm willing to wait for the higher-end Kepler if I am going to splash out on an Ivy Bridge laptop. I haven't bought a desktop in years... laptops are fast enough for me these days, have lower power consumption, and I can pack 'em up and roll with 'em. I have a Dell XPS 3D right now that I hook up to a 23-inch 1080p 3D screen at home, and it's fast as hell; it's a higher-end Sandy Bridge.

RE: Not a bad thing
By geddarkstorm on 2/17/2012 2:27:59 PM , Rating: 2
I share your sentiments but... June :(.

RE: Not a bad thing
By TormDK on 2/17/2012 2:46:29 PM , Rating: 2
No, like this :

Juuuuuuuuuuune!? :(((((((((((

Ah well, I suppose the bank will get to keep my money till then; my old QX9650 and GTX 480 still work.

RE: Not a bad thing
By TakinYourPoints on 2/18/2012 12:02:36 AM , Rating: 2
Same here. My i7 860 with SLI GTX 460 cards is still plenty fast, even outputting to a 2560x1440 display. I'm a-ok sitting on this for a while longer, it will make for probably the longest I've held onto a CPU/mobo/RAM for a gaming PC, three years.

Hell, if I get a Kepler GPU first and it makes it more than good enough I may hang onto this CPU even longer. I haven't thought "good enough" for so long with a gaming PC, but it looks like we're there.

I guess it is because cross-platform development has lowered the performance ceiling for games (just look at Skyrim), plus the games I currently love (Starcraft 2, DOTA 2 beta, Diablo 3 beta) don't even need all that much juice to look really good and play smoothly.

RE: Not a bad thing
By TormDK on 2/18/2012 6:54:56 AM , Rating: 2
As a proud member of the PC Master Race, it's a dangerous school of thought to have - but I really consider 2012 the year that we don't need further upgrades, as long as we stay at a 1920x1080 resolution on our monitors of choice.

You could argue that the turning point already came in 2011 with the release of the 560 Ti, but a maxed-out AAA game still needs more horsepower.

It'll be interesting to see what NVIDIA and AMD come out with this year, but I am also hoping for a push for better monitors, because otherwise new hardware will be redundant in a lot of situations.

RE: Not a bad thing
By SlyNine on 2/18/2012 10:01:27 AM , Rating: 2
I'd say 2009: Core i7s were out, 5870s came out. Hell, I could almost say 2008. A Core i7 920 with a 4870 is still a good gaming rig, and it was not too expensive when it was released.

RE: Not a bad thing
By FaceMaster on 2/18/2012 10:31:10 AM , Rating: 2
Most games are based around consoles, which are based on the GeForce 7900 / ATI X1900. Add another generation to get the same performance at 1080p (GeForce 8000 / ATI 4000) and you've got a PC that will last until the next generation of consoles! Anything above that will just make ultra detail settings / AA / multi-monitor displays run smoother. My Core 2 Duo + 8800 still runs most games at reasonable settings. Try doing the same in 2006 with a 2001 computer build!

RE: Not a bad thing
By TakinYourPoints on 2/18/2012 11:23:05 PM , Rating: 2
Your reasoning sort of works. PC hardware is clearly more powerful, but at the same time console hardware can scale better over longer periods of time since the platform is static and there is significantly less overhead compared to a normal desktop operating system.

Developers can squeeze every bit of performance out of that hardware. Compare a launch PS2 game from 1999 like The Bouncer to a 2007 game like God Of War 2. It looks like a multigenerational leap in visuals, and it stood up very well against the XBox 360 and PS3 games at the time.

Obviously there is only so much you can push hardware, and the PS2 was bled dry by the time God Of War 2 came out. That said, a 1999 PC certainly wouldn't be able to run even a 2005 game well.

Because of that, comparing CPU/GPU specs of consoles to PCs doesn't make for the best argument, only because developers can optimize for static, low overhead platforms more than they can with hardware that is a moving target with a full operating system running on top of it.

Otherwise I agree with you, a 2007 build is much more legit in 2012 compared to a 2001 build in 2007. It is the same logic that applied to office software and things like that, there was a point where hardware started to matter much less about 10 years ago, and the same is happening with games now.

RE: Not a bad thing
By jabber on 2/20/2012 12:54:15 PM , Rating: 2
I think the major turning point was the advent of proper dual core CPUs.

Once we hit those, 90% of the world's users were sorted, power-wise, for browsing, Word, Excel, etc. - able to do more than two things at once without the whole PC grinding to a halt.

Really the truly high-end CPU market is only for true research/number crunching or benchmark junkies.

Then again do the labs buy the $1000 Intel CPU or do they buy four $250 ones instead?

RE: Not a bad thing
By TormDK on 2/19/2012 7:16:08 AM , Rating: 2
It would really depend on the resolution.

The point I was trying to make is that currently we do not go higher than 1920x1080 or 1920x1200. I recall the difference when going from 1280x1024 to 1680x1050 - the difference was night and day; not so much when I moved from 22" to 24". I'm hoping things will pick up when I move to a higher resolution again, but I'm doubting it.

I'm not expecting to get that same sort of feeling from my next hardware purchase because right now I do not see the decent monitors that go higher than HD resolutions, so things will not be prettier to look at, it will only run better/smoother.

Which isn't bad per se, it just takes away some of the thrill of buying new cutting-edge hardware.

We also have to consider the average consumer: even if a new console is released late this year/early next year, the baseline does not jump; it just delivers the final death blow to Windows XP-based gaming and (hopefully!) DirectX 9.0, since current hardware would allow consoles to run a DX11 runtime.

RE: Not a bad thing
By TakinYourPoints on 2/18/2012 6:08:17 PM , Rating: 2
I completely agree. The only reason I'm running SLI cards and am awaiting Kepler GPUs is because of the 2560x1440 resolution of my display. Without very high resolutions, cutting edge hardware really isn't as necessary as it used to be. If I was on a normal 1920x1080 monitor or lower like the majority of people out there, it wouldn't really matter to me.

Pushing system requirements has flattened out in recent years, partly because of console ports putting a ceiling on system requirements (as I said before), and also because it is more profitable for developers this way. Valve and Blizzard, the two most successful PC game developers out there, manage to make great looking games while also targeting a very wide range of system specs. Targeting directly at the bleeding edge and having software drive hardware upgrades might have worked for Quake 3 back in 1999, but it doesn't make financial sense anymore. Games are riskier bets than ever with increased budgets and more competition. Crysis is a perfect example of that, and I think that game may have put an end to the hardware upgrade arms race. Games started to look "good enough" at that point and the diminishing returns on spending thousands on top of the line hardware didn't make sense anymore.

Visual improvements will continue, but they will be a steady creep based around hardware in the midrange (ie - GTX 560), not the best of the best.
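The resolution argument running through this thread is easy to put numbers on. As a rough sketch (pixel count only, not a full performance model, since geometry and CPU work do not scale with resolution), the jump from 1080p to 2560x1440 means pushing about 78% more pixels per frame:

```python
# Pixel counts for the resolutions discussed in this thread; GPU shading
# load scales roughly linearly with pixel count at fixed settings.
resolutions = {
    "1280x1024": 1280 * 1024,
    "1920x1080": 1920 * 1080,
    "2560x1440": 2560 * 1440,
}

baseline = resolutions["1920x1080"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:>9,} px ({pixels / baseline:.2f}x vs 1080p)")
```

This is why the same midrange card that is "good enough" at 1080p can need SLI or a next-generation GPU at 2560x1440.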

Intel's "faked demo"
By JarredWalton on 2/17/2012 3:41:24 PM , Rating: 3
Really, do we need to go there? "If the 22 nm chip was launching in April at production volume and had already been taped out in final form, why would Intel have to use canned video? Why couldn't it show its real product? Why the obfuscation??"

There's a simple reason for this, and it's called doing a presentation with someone who doesn't want anything to go wrong. If you actually play a game live, you run the risk of random problems: things like, oh I dunno, the driver acting up in DiRT 3 and making you look like an idiot, or there's always the risk of a BSOD (even if small).

I worked software development for a few years back in 98-01, and we put together a demo at one point for a show. A canned video would have been awesome if we could have pulled it off. As it turned out, our CEO decided to branch out from our demo script and "do something really amazing" for the audience, and because the idiot strayed from the tested path he encountered errors and ended up looking like a fool. Well, he was a fool.

Anyway, I saw IVB running DiRT 3 in person on the same ultrabook and it worked. Was it fully stable? Who knows -- they didn't let us play the game for a few hours to find out. They also didn't let us see the final score from the benchmark run, which is one more reason to not do the demo live on stage -- e.g. what if something were to go wrong and the final DiRT 3 benchmark score got revealed? The delays could be from any number of things, but knowing Intel I'm guessing Trinity is probably targeting June as well.

RE: Intel's "faked demo"
By TakinYourPoints on 2/17/2012 8:45:06 PM , Rating: 1
Really, do we need to go there?

You do know that this kind of reporting is par for the course here, right?

I have no idea why you and Anand maintain one of the best tech sites out there and at the same time keep these guys on the sidebar. It is like having The Economist run columns from the National Enquirer.

Oh Noes !
By Beenthere on 2/17/2012 3:58:39 PM , Rating: 1
Only "bad" AMD has production delays and Fab challenges... /s

Whooda thunk InHell could have Fab issues and delay again? Maybe they are still trying to fix their defective chipsets?

RE: Oh Noes !
By Skywalker123 on 2/19/2012 9:29:14 PM , Rating: 2
Whooda thunk you have a brain? You're an idiot

Not really surprising
By GreenChile on 2/17/2012 12:53:21 PM , Rating: 2
22nm has got to be tough in itself but then throw in 3D transistors and you've got a recipe for pain.

I'll believe it when it happens
By Khato on 2/17/2012 1:24:22 PM , Rating: 2
Until there are some reliable sources, this sounds like nothing more than DigiTimes misinterpreting its sources. There's a difference between Intel delaying its launch and the notebook OEMs intentionally delaying their releases to clear out remaining inventory. After all, since when has Intel ever stopped selling the previous generation as soon as the new one comes out?

Oh, and for those interested here's a post from an Intel representative on the subject -

Server chips?
By twhittet on 2/17/2012 3:44:17 PM , Rating: 2
Hopefully this won't impact server CPUs. We're waiting to purchase the next generation of servers, and the profit margin is larger for those anyway.

Our Dell rep just told us last week we should be hearing something March 6th, so I'll hope for that.

Don't forget
By Hector2 on 2/17/2012 4:49:18 PM , Rating: 2
Just a month ago, while announcing another record-breaking quarter and a $54B revenue year, Intel also said it was accelerating its R&D spending to $10B in 2012, both to ramp 22nm and to advance 14nm.

I doubt that, suddenly, as they were kicking in the afterburners to ramp up 22nm, anything came up in the Ivy Bridge process that they weren't already aware of. They wouldn't have planned to increase their budget on new equipment and facilities if they thought the 22nm process wasn't sustainable.

But if ARM wants to hope & believe that Intel will fall apart delivering FinFETs, go ahead. There's plenty of kool-aid to go around.

By p05esto on 2/18/2012 12:16:59 AM , Rating: 2
This sucks, I held off on Sandy and have been waiting anxiously for Ivy....and NOW I have to wait another two months. This sucks. I've really been looking forward to this new build. Darn you Intel, this is the second delay/problem in only one year. AMD needs to step up and get these guys back on their toes!

By riottime on 2/18/2012 2:26:14 AM , Rating: 2
PCSX2, MAME, JPCSP, and Dolphin emulators are hungry for the latest, most powerful CPUs like Ivy Bridge. AMD/ARM can't compete with Intel in emulator performance. :)

By Gungel on 2/19/2012 7:51:40 AM , Rating: 2
Found this article on VR-Zone that said only the dual core mobile chip is delayed and that the desktop ivy bridge is going to be released as planned.

Tasteless shot at ESPN?
By chadwick21 on 2/20/2012 4:35:35 AM , Rating: 2
"But the first chinks in the armor"


Same old Mick
By BSMonitor on 2/20/2012 8:46:41 AM , Rating: 2
But the first chinks in the armor perhaps began to show at the 2012 Consumer Electronics Show, when Intel was caught faking its 22 nm Ivy Bridge DirectX 11 demo during its ultrabook pitch. Intel brushed off the trickery, but the incident raised some serious questions. If the 22 nm chip was launching in April at production volume and had already been taped out in final form, why would Intel have to use canned video? Why couldn't it show its real product? Why the obfuscation??

Nothing like Mick posts.

J Mick you are an idiot.

"I modded down, down, down, and the flames went higher." -- Sven Olsen
