
Intel hopes to reach full production capacity after a two-month delay

Just months ago at the 2011 Intel Developer Forum, executives with the world's largest traditional personal computer chipmaker, Intel Corp. (INTC), were all boast and bravado, claiming their competitors were years behind in process technology.  Indeed, the talk of dramatic gains in power efficiency and clock speed from Intel's proprietary 22 nm FinFET 3D-transistor design sounded very impressive.

But the first chinks in the armor perhaps began to show at the 2012 Consumer Electronics Show, when Intel was caught faking its 22 nm Ivy Bridge DirectX 11 demo during its ultrabook pitch.  Intel brushed off the trickery, but the incident raised some serious questions.  If the 22 nm chip was launching in April at production volume and had already been taped out in final form, why would Intel have to use canned video?  Why couldn't it show its real product?  Why the obfuscation?

Well, DigiTimes is reporting that multiple OEM sources have shared that Ivy Bridge is being delayed from April to June.  While not a huge delay, the report raises questions about whether Intel's 22 nm process is as stable as it claims.

To be fair, the OEMs appear to be claiming that the delay is due to inventories:
Because most first-tier notebook vendors are having trouble digesting their Sandy Bridge notebook inventories due to the weak global economy, while Intel is also troubled by its Sandy Bridge processor inventory, the CPU giant plans to delay mass shipments of the new processors to minimize the impact, the sources noted.

In other words, PCs didn't sell well in 2011, Intel built up a surplus of CPUs, and so it wants to delay its release.  This is all very plausible, and indeed lines up with write-offs found in Intel's earnings reports.  

But it is also possible that Intel isn't being entirely forthcoming and that Ivy Bridge wasn't being delivered at the reliable high volumes it had hoped.  And it could very well be a bit of both factors -- too high inventories, and some struggles on the process front.

Regardless, it sounds like customers will have to wait a bit for Ivy Bridge.

That's good news for the competition.  AMD hopes to aggressively roll out its Trinity accelerated processing units (APUs) later this year.  The chips are built on GlobalFoundries' 32 nm process, but still aim to be competitive with Ivy Bridge in power consumption and graphics performance.  AMD is gambling that while the CPU will lose to Ivy Bridge in raw processing speed, it will be "good enough" for most consumers.

Trinity in the wild
AMD's Trinity APU (center) will launch later this year and aggressively target would-be Intel Ivy Bridge buyers by offering improved graphics and power efficiency at a lower price. [Image Source: Jason Mick/DailyTech]

AMD hopes to price its chip-plus-chipset package hundreds of dollars beneath Intel's.  Whereas Intel is targeting systems at $700 and up, AMD has told us that Trinity systems will retail for $500 or less.  Strong 2011 sales of AMD's first APUs, its initial swing at this strategy, made the approach look like a home run.

Likewise, ARM CPU makers, including Qualcomm, Inc. (QCOM), are looking to invade laptops and compact desktops late this year with the introduction of 28 nm ARM CPUs compatible with Microsoft Corp.'s (MSFT) new Windows 8 [1][2][3][4][5][6].  The Q4 2012 devices are expected to follow a strategy similar to AMD's -- strong power efficiency at a low price.

The delay is also good news for third-party USB 3.0 chipmakers like Renesas Electronics Corp. (TYO:6723), ASMedia Technology Inc., and Etron Ltd.  As Ivy Bridge was the first Intel chip to include on-die USB 3.0 support, it was expected to render these competitors' designs obsolete.  The delay buys them a bit more time.

Intel's core hope for maintaining its dominant position is to beat the competition in process technology and trickle those improvements down into its budget models, mitigating cost and architectural disadvantages.  Intel has made big promises [1][2] regarding Atom-powered smartphones, but without 22 nm technology it appears to be forgoing any sort of big mobile push in 2012.  The longer it waits, the more advantage it hands its hungry rivals.  Intel should hope that the delay does not set back its very aggressive 22 nm Atom rollout.

Sources: DigiTimes, MaximumPC [faked Intel Demo]



Comments

RE: Not a bad thing
By TormDK on 2/18/2012 6:54:56 AM , Rating: 2
As a proud member of the PC Master Race I know it's a dangerous school of thought, but I really consider 2012 the year we stop needing further upgrades, as long as we stay at a 1920x1080 resolution on our monitors of choice.

You could argue that the turning point already came in 2011, with the release of the GeForce GTX 560 Ti, but maxed-out AAA games still need more horsepower.

It'll be interesting to see what Nvidia and AMD come up with this year, but I am also hoping for a push toward better monitors, because otherwise new hardware will be redundant in a lot of situations.


RE: Not a bad thing
By SlyNine on 2/18/2012 10:01:27 AM , Rating: 2
I'd say 2009: Core i7s were out, and the 5870s came out. Hell, I could almost say 2008. A Core i7 920 with a 4870 is still a good gaming rig, and it was not too expensive when it was released.


RE: Not a bad thing
By FaceMaster on 2/18/2012 10:31:10 AM , Rating: 2
Most games are based around consoles, which are roughly on par with the GeForce 7900 / ATI X1900. Add another generation to get the same performance at 1080p (GeForce 8000 / ATI 4000 series) and you've got a PC that will last until the next generation of consoles! Anything above that will just make ultra detail settings / AA / multi-monitor displays run smoother. My Core 2 Duo + 8800 still runs most games at reasonable settings. Try doing the same thing in 2006 with a 2001 computer build!


RE: Not a bad thing
By TakinYourPoints on 2/18/2012 11:23:05 PM , Rating: 2
Your reasoning sort of works. PC hardware is clearly more powerful, but at the same time console hardware can scale better over longer periods of time since the platform is static and there is significantly less overhead compared to a normal desktop operating system.

Developers can squeeze every bit of performance out of that hardware. Compare a launch-era PS2 game from 2000 like The Bouncer to a 2007 game like God of War II. It looks like a multigenerational leap in visuals, and it stood up very well against Xbox 360 and PS3 games at the time.

Obviously there is only so much you can push hardware, and the PS2 was bled dry by the time God of War II came out. That said, a 2000-era PC certainly wouldn't be able to run even a 2005 game well.

Because of that, comparing CPU/GPU specs of consoles to PCs doesn't make for the best argument, only because developers can optimize for static, low overhead platforms more than they can with hardware that is a moving target with a full operating system running on top of it.

Otherwise I agree with you; a 2007 build is much more viable in 2012 than a 2001 build was in 2007. It is the same logic that applied to office software: there was a point about 10 years ago where hardware started to matter much less, and the same is happening with games now.


RE: Not a bad thing
By jabber on 2/20/2012 12:54:15 PM , Rating: 2
I think the major turning point was the advent of proper dual-core CPUs.

Once we hit those, 90% of the world's users were sorted power-wise for browsing, Word, Excel, etc., in terms of being able to do more than two things at once without the whole PC grinding to a halt.

Really, the truly high-end CPU market is only for serious research/number crunching or benchmark junkies.

Then again, do the labs buy the $1,000 Intel CPU, or do they buy four $250 ones instead?


RE: Not a bad thing
By TormDK on 2/19/2012 7:16:08 AM , Rating: 2
It would really depend on the resolution.

The point I was trying to make is that currently we do not go higher than 1920x1080 or 1920x1200. I recall the difference when going from 1280x1024 to 1680x1050 - it was night and day; not so much when I moved from 22" to 24". I'm hoping things will pick up when I move to a higher resolution again, but I'm doubting it.

I'm not expecting that same sort of feeling from my next hardware purchase, because right now I do not see decent monitors that go higher than HD resolutions, so things will not be prettier to look at; they will only run better/smoother.

Which isn't bad per se, it just takes away some of the thrill of buying new cutting-edge hardware.

We also have to consider the average consumer: even if a new console is released late this year or early next year, the baseline does not jump much, other than dealing the final death blow to Windows XP-based gaming and, hopefully, DirectX 9.0, since current hardware would allow consoles to run a DX11 runtime.


RE: Not a bad thing
By TakinYourPoints on 2/18/2012 6:08:17 PM , Rating: 2
I completely agree. The only reason I'm running SLI cards and awaiting Kepler GPUs is the 2560x1440 resolution of my display. Without very high resolutions, cutting-edge hardware really isn't as necessary as it used to be. If I were on a normal 1920x1080 monitor or lower, like the majority of people out there, it wouldn't really matter to me.
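A quick back-of-the-envelope comparison makes the resolution point concrete. The Python sketch below (illustrative only, not from the thread) compares raw pixel counts for the resolutions discussed here; pixel count roughly tracks per-frame GPU shading work, all else being equal:

# Illustrative sketch: raw pixel counts per resolution, which roughly
# track per-frame GPU shading work, all else being equal.
resolutions = {
    "1280x1024": (1280, 1024),
    "1680x1050": (1680, 1050),
    "1920x1080": (1920, 1080),
    "1920x1200": (1920, 1200),
    "2560x1440": (2560, 1440),
}

BASELINE = 1920 * 1080  # the "full HD" norm the thread keeps returning to

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels ({pixels / BASELINE:.2f}x vs 1920x1080)")

At 2560x1440 the GPU pushes roughly 1.78 times the pixels of a 1080p display every frame, which is why SLI pays off there while remaining overkill at the resolutions most buyers actually use.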

Pushing system requirements has flattened out in recent years, partly because console ports put a ceiling on system requirements (as I said before), and also because it is more profitable for developers this way. Valve and Blizzard, the two most successful PC game developers out there, manage to make great-looking games while also targeting a very wide range of system specs. Targeting the bleeding edge and having software drive hardware upgrades might have worked for Quake 3 back in 1999, but it doesn't make financial sense anymore. Games are riskier bets than ever, with increased budgets and more competition. Crysis is a perfect example of that, and I think that game may have put an end to the hardware upgrade arms race. Games started to look "good enough" at that point, and spending thousands on top-of-the-line hardware stopped making sense given the diminishing returns.

Visual improvements will continue, but they will be a steady creep based around midrange hardware (e.g., the GTX 560), not the best of the best.


"People Don't Respect Confidentiality in This Industry" -- Sony Computer Entertainment of America President and CEO Jack Tretton














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki