


iPad 3 looks to be a major bump; it will likely pack Apple's first LTE modem, and may carry its first homemade core, too

Apple, Inc. (AAPL) announced the original iPad in January 2010, and it hit the market in April 2010.  The polished device was undeniably a game-changing product that turned an often-overlooked niche (tablets) into a booming market.  It was about this time last year that Apple sent out invitations to a special event.  That event -- on March 1 -- turned out to be late Apple CEO and co-founder Steven P. Jobs' last major product announcement -- the iPad 2.

The iPad (first gen.) sold 15 million units in 2010.  The iPad 2 sold 40 million units in 2011 [1][2][3][4], keeping Apple far ahead of would-be Android tablet competitors.  Now excitement is mounting for the iPad 3.

I. What is Known Thus Far

The iPad 3 has been semi-confirmed to have a "Retina" display made by LG Display Comp., Ltd. (KS:034220) (sister company to LG Electronics Inc. (KS:066570)).  A report in The Korean Times quotes LG Display CEO Young Soo Kwon as confirming that Retina-display iPads are in production.  The publication writes, "[Kwon] said more smartphone manufacturers will release new models employing LG’s “Retina Display’’ that has been used in iPhones and iPads."

Retina display
The iPad 3 will finally get the LG Retina display. [Image Source: SlashGear]

The Retina display doubles the resolution in each dimension (quadrupling the pixel count), complete with a full three subpixels per pixel (the proper amount).  In the iPhone 4 this works out to a 960x640 resolution (326 ppi).  On the new iPad 3 this will likely work out to 2048x1536 (264 ppi), assuming the aspect ratio doesn't change.
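
For reference, pixel density follows directly from a panel's resolution and diagonal size.  A minimal Python sketch (the 3.5-inch and 9.7-inch diagonals are the nominal figures, which is why the iPhone number lands a few ppi above Apple's quoted 326):

    import math

    def ppi(width_px, height_px, diagonal_in):
        # Pixel density = diagonal length in pixels / diagonal length in inches
        return math.hypot(width_px, height_px) / diagonal_in

    print(round(ppi(960, 640, 3.5)))      # iPhone 4 Retina display  -> ~330 ppi
    print(round(ppi(2048, 1536, 9.7)))    # rumored iPad 3 panel     -> ~264 ppi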

The Retina display was not an Apple idea, but wholly an LG Display idea (it's even coming to Android phones).  The goal was to create a display at the maximum resolution the human retina can resolve at 12 inches, a common distance for smartphone use/navigation.

iPad 2 v. iPad original
iPad 2 (right) made largely cosmetic changes to Apple's popular first generation iPad (left), shrinking the packaging. [Image Source: Telecom Australia]

Despite this ambitious premise, the Retina display isn't perfect.  In an interview with Wired magazine, Raymond Soneira, president of DisplayMate Technologies, claimed that the sharpest human retina can actually resolve up to 477 ppi at 12 inches (305 mm) from the eyes, or 36 arcseconds per pixel.  Discover Magazine Bad Astronomy author Phil Plait clarified the situation further, stating, "If you have [better than 20/20] eyesight, then at one foot away the iPhone 4’s pixels are resolved. The picture will look pixelated. If you have average eyesight, the picture will look just fine."

For those super-vision folks, rival South Korean display maker Samsung Electronics Comp., Ltd. (KS:005930) has paired with Taiwan's HTC Corp. (TPE:2498) for the HTC Rezound -- which packs a 342 ppi display, with a full 3 subpixels per pixel (Samsung calls this a "Super LCD" or "S-LCD").

Rezound vs. iPhone 4S
The HTC Rezound (right) packs a higher PPI display than LG's Retina display (seen left in the iPhone 4S). [Image Source: Wirefly]

The iPad 3 will not unseat the Rezound or even the iPhone 4.  At 264 ppi, its display falls somewhat short of the roughly 300 ppi a user with average eyesight can resolve at 12 inches -- though tablets are typically held a bit farther from the eye than phones.  Still, it should be a major step up for the iPad, which has trailed Android designs in screen resolution.
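
Those thresholds come down to simple geometry: the pixel density an eye can resolve depends only on its angular acuity and the viewing distance.  Here is a rough Python sketch (using Soneira's 0.6 arcminute -- i.e. 36 arcsecond -- figure for the sharpest eyes, and 1 arcminute as the conventional stand-in for 20/20 vision):

    import math

    def resolvable_ppi(distance_in, acuity_arcmin):
        # Smallest resolvable pixel pitch at this viewing distance, expressed as pixels per inch
        pitch_in = distance_in * math.tan(math.radians(acuity_arcmin / 60.0))
        return 1.0 / pitch_in

    print(round(resolvable_ppi(12, 0.6)))   # sharpest eyes at 12 in       -> ~477 ppi (Soneira's figure)
    print(round(resolvable_ppi(12, 1.0)))   # 20/20 vision at 12 in        -> ~286 ppi
    print(round(resolvable_ppi(15, 1.0)))   # 20/20 at a tablet-like 15 in -> ~229 ppi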

II. The Chips

New juicy details emerged three weeks ago when DigiTimes reported that the iPad 3 would come in two models -- model numbers J1 and J2 (iPad3,1 and iPad3,2).  While this is to be expected (Wi-Fi and Wi-Fi+cell modem variants), the exciting news was that Apple was throwing in an LTE modem -- a first for any Apple device to date.

DigiTimes claims that Apple's solution to LTE's power draw is simply to put in a huge battery, bumping the LTE model from 6,500 mAh to 14,000 mAh.  The publication indicates the price will remain constant.

Boy Genius Report today appeared to confirm the dual-model format via leaked boot screen pictures from a "trusted source".

iPad 3 leaked boot screens
[Images Source: Boy Genius Report] (click any to enlarge)

The BGR piece raises a new mystery in that it unveils Apple's new CPU code -- S5L8945X (the Apple A4 model was S5L8930X and the A5 was S5L8940X).  BGR states that the CPU is quad-core and stops there.

Here's where it gets interesting.  Two trusted sources -- one close to ARM Holdings plc, and the other an ex-Apple low-level manager/executive -- provided information indicating that Apple is preparing to build its own custom ARM core for the first time.

I first spoke to the ex-Apple source, who told me that the firm was "building its own quad-core CPU."

At the time I was skeptical of the source's claims.  After all -- this same person was the source of my Apple internet television report.  While other sources have since corroborated that an Apple internet TV is indeed a top-secret R&D project at Apple, my source's timeframe of a probable fall launch, possibly in 2012 -- while still possibly accurate -- seems suspect.  In other words, the source did seem to have some high-level information, but seemed to be a bit behind or incorrect on a couple of the important details.

But a second source at CES 2012, close to ARM Holdings, offered further credibility to this notion, commenting that Apple had reportedly obtained a full ARM instruction set license -- joining only a handful of companies, like Marvell Technology Group, Ltd. (MRVL) and Qualcomm Inc. (QCOM), that license the instruction set for use in custom designs.

Neither Apple nor its chipmaking subsidiaries -- P.A. Semi and Intrinsity -- officially acknowledges holding ARM licenses.  However, Intrinsity was on ARM's list of public licensees as recently as 2010 [thanks, WayBack Machine!], though this was presumably an IP-core-related license.

For the A4, most of the work was done by Samsung, but Apple devoted its Intrinsity staff to souping up the homely Cortex-A8, allowing it to run at a faster clock speed.  Hence the "Hummingbird" core -- found on the iPhone 4 and the Galaxy S -- was launched.  For the Apple A5, Apple worked with Samsung on a similar approach, taking ARM Holdings' Cortex-A9 core, souping it up, and sourcing the finished SoC to Samsung for production.

But Apple -- likely via the combined talents of Intrinsity (a logic block maker) and P.A. Semi (founded by StrongARM creator Daniel W. Dobberpuhl, now retired) -- reportedly is now building a brand new core, which will not be a mere modification of an ARM Holdings IP Core.

CPU Baking with Kanye
Apple is reportedly baking its own CPU cores in a bid to get better chips and woo more tablet customers. [Original Image Source: The Onion (modifications: DailyTech/Jason Mick)]

What this means for performance, battery life, etc. remains to be seen.  If Apple indeed does opt to take this R&D design to production, the results should be interesting to witness. P.A. Semi reportedly suffered a lot of attrition -- both in retirements and defections -- after the Apple acquisition.  It remains to be seen if the combined P.A. Semi and Intrinsity teams (along with Apple's recent hires) have the experience necessary to make a core for the world's best-selling tablet.

Apple is a savvy competitor, and it likely won't drop this chip until it's ready.  So don't be surprised if the A6 is not yet a fully custom design, though it's equally possible that it will be.

It also remains to be seen whether the new chip design our sources describe is the A6.  But the timing certainly seems to add up.  Quad-core chip, check.

The humorous thing is that even if Apple is baking its own cores for the A6, it will reportedly still have to rely on Samsung for production.  Apple wants the A6 produced on the 28 nm node and was hoping to ditch Samsung -- which it has grown into a bitter legal [1][2][3][4] and sales [1][2] rival of.

But Taiwan Semiconductor Manufacturing Comp., Ltd.'s (TPE:2330) 28 nm process is reportedly having some serious yield issues.  So Apple had to come crawling back to its ex (fab) -- Samsung.

Some things never change.

(The new A6 is expected to incorporate a new and slightly improved graphics core from Imagination Technologies Group Plc.'s (LON:IMG) PowerVR series on-die.)

Sources: DigiTimes, Boy Genius Report, The Korean Times



Comments

Apple Going Custom vs. Adopting Cortex A15
By ltcommanderdata on 2/1/2012 7:53:52 PM , Rating: 2
Seeing that next-gen ARM SoCs based on the Cortex A15 should start to show up by the end of the year, I wonder what advantages Apple would get from a self-designed custom core, compared to concentrating simply on being the first to put Cortex A15 on the market? I wouldn't expect a custom Apple design to exceed the performance of Cortex A15 since it's already a very significant improvement over Cortex A9. Cortex A15 does have a server focus and may not be the most power efficient, so I suppose Apple could be looking for a custom architecture that provides more modest performance improvements over Cortex A9 while focusing on lower power consumption.




RE: Apple Going Custom vs. Adopting Cortex A15
By B3an on 2/1/2012 8:30:04 PM , Rating: 2
The custom A6 will likely be about the same as ARM's A15 design anyway. The same way Qualcomm's upcoming Krait SoC is also a custom in-house design but still basically a modified A15 with similar performance.


RE: Apple Going Custom vs. Adopting Cortex A15
By ekv on 2/1/2012 10:43:03 PM , Rating: 2
quote:
reportedly is now building a brand new core, which will not be a mere modification of an ARM Holdings IP Core
So if this is a custom A6, what does that do to the compiler tools? etc. It looks as if Apple is a licensee at the ISA level, so they could change the core a lot. I'm just curious how that would affect the software....


RE: Apple Going Custom vs. Adopting Cortex A15
By MGSsancho on 2/2/2012 7:27:36 PM , Rating: 2
That would be Apple's problem. Nobody but Apple would need to write ASM for their chips. Possibly game makers, but that is abstracted away in Xcode.


By ekv on 2/2/2012 7:43:07 PM , Rating: 2
Ok. So Apple then has their s/w engineers, possibly from those two companies they bought, make the changes to the compilers. Nobody sees the ASM, but then Apple has to develop and expose API's to developers, no? Otherwise how do the new-and-improved (TM) instructions help?

[though more importantly, the legal and marketing dept's have to be made aware of the special-newness in order that they may capitalize ... blah, blah, blah].


By milli on 2/2/2012 12:11:38 PM , Rating: 2
quote:
The same way Qualcomm's upcoming Krait SoC is also a custom in-house design but still basically a modified A15 with similar performance.


Not true at all!


RE: Apple Going Custom vs. Adopting Cortex A15
By Fritzr on 2/1/2012 8:55:13 PM , Rating: 3
It is not a matter of performance advantage. Apple bases its whole image on custom design by Apple. This new chip will be a marketing advantage, as Apple will now be able to claim that the new iPad is 'better' because it uses an Apple-designed CPU instead of a generic one like 'those other folks' devices. Those 'other folks' are obviously inferior since they do not have Apple parts.


RE: Apple Going Custom vs. Adopting Cortex A15
By scrapsma54 on 2/1/2012 11:16:21 PM , Rating: 1
Apple is just mad that they can't be as awesome as Nvidia and make something as awesome as Tegra 3.


By scrapsma54 on 2/1/2012 11:23:54 PM , Rating: 2
Or asus for that matter.


RE: Apple Going Custom vs. Adopting Cortex A15
By douchefree on 2/2/2012 12:55:11 AM , Rating: 2
Funny.
Tegra 3 is still nowhere near as fast as MP4.
It's only similar in performance to the Adreno 220 (the Adreno 225 will push it a good deal beyond Tegra) and the Mali-400.
Nvidia is not producing great silicon, and this is their third big release.


By EasyC on 2/2/2012 9:28:42 AM , Rating: 2
Yea, except not.

My TFP scores almost double that of my E3D in graphics. Tegra 3 > Adreno 220. And that's with 505,600 more pixels to account for.

Might want to do a little research next time. Just saying.


By TakinYourPoints on 2/2/2012 2:40:23 AM , Rating: 2
The Tegra 3 is neck and neck in general benchmarks and is significantly slower in GPU performance than the A5, an SoC that is nearly a year old.

http://www.anandtech.com/show/5163/asus-eee-pad-tr...
http://www.anandtech.com/show/5163/asus-eee-pad-tr...

How exactly is the Tegra 3 awesome when it falls short of an old part and a new SoC is right around the corner?


RE: Apple Going Custom vs. Adopting Cortex A15
By LordSojar on 2/2/2012 11:25:59 AM , Rating: 2
quote:
How exactly is the Tegra 3 awesome when it falls short of an old part and a new SoC is right around the corner?


Cherry picking benchmarks is easy to do. Also, that's pre-ICS, and ICS still had 2-3 optimization issues.

Tegra3 wipes the floor with the A5 in anything CPU centric. By simple logic, that makes perfect sense. They are A9 cores, Tegra3's run 400MHz faster and it has 2 more of them + the companion core.

So... has it occurred to you, that by simple math, they aren't in any way, "neck-in-neck". And the word "significantly" is thrown around too much. The Tegra3 is marginally slower in GPU performance than the A5. Additionally, the Tegra optimized games run better than anything else around for a reason. Optimization > hardware brute force. Considering the A5's GPU is actually two GPUs... I feel my point stands.

You're just mad that a company beat your precious fruit flavored Koolaid to the punch (yet again) with a quad core chip.

nVidia's turnaround is incredible, and they will have an A15/A7 big.LITTLE design before anyone else too. They'll improve their GPU exponentially for Tegra4 as well. Currently, there isn't much that really pushes GPUs much further than Tegra3 would need to be pushed (assuming optimized of course). So... why devote more silicon, effectively reducing battery life?


By TakinYourPoints on 2/2/2012 4:34:15 PM , Rating: 2
I checked post-ICS benchmarks. Again, nothing that impressive, especially given that this new Tegra 3 is competing against the A5 that is nearly a year old and about to be replaced.

I really don't care that it is a quad core instead of a dual core, I only care about practical results. The fact that a quad falls so far short means that nvidia still has a lot of catch-up to do in mobile.

quote:
The Tegra3 is marginally slower in GPU performance than the A5.


I wouldn't call those differences marginal, unless you think that 64fps vs 88fps or 78fps vs 148fps is minor. In that case I suggest you tell people with GTX 580 cards that they're only getting a marginal boost over a GTX 460.


RE: Apple Going Custom vs. Adopting Cortex A15
By LordSojar on 2/3/2012 9:06:58 AM , Rating: 2
quote:
I wouldn't call those differences marginal, unless you think that 64fps vs 88fps or 78fps vs 148fps is minor. In that case I suggest you tell people with GTX 580 cards that they're only getting a marginal boost over a GTX 460.


I claim marginal because you can marginalize those results. 148fps? How is that useful in any meaningful way? OH BIG NUMBER, BETTAR. No.... you need to be able to hit 45-60fps in big name games and that's it. If you have a 120Hz screen, then it becomes slightly, and I emphasize, SLIGHTLY better at 100-120fps... but that's again extremely uncommon and irrelevant for most people.

Tegra3 is great for raw power. The fact that I can run a full 1080p Youtube video flawlessly.... that's awesome. I'm in no way disappointed in my Transformer Prime. It does everything other tablets do + some, and it does it faster and better than they do. So... how is it bad or unimpressive again?

You're either stuck comparing Tegra3 to a CPU that still hasn't released, and likely won't release until around the same time Tegra4 is ready to release, or comparing it to a 1 year old CPU made by Samsung.

The A5's CPU elements in no way can keep pace with Tegra3... once you start actually doing things on a tablet that don't turn it into a glorified toy... and see, I actually do that, running statistics on my tablet, editing huge spreadsheets, generating graphs, etc. And I'll get to install Windows8 on my tablet once that releases; you'll still be stuck with your glorified Fisher Price activity center for adults. Have fun!


By TakinYourPoints on 2/4/2012 1:00:17 AM , Rating: 2
quote:
I claim marginal because you can marginalize those results. 148fps? How is that useful in any meaningful way? OH BIG NUMBER, BETTAR. No.... you need to be able to hit 45-60fps in big name games and that's it. If you have a 120Hz screen, then it becomes slightly, and I emphasize, SLIGHTLY better at 100-120fps... but that's again extremely uncommon and irrelevant for most people.


More computational cycles means better graphics. You can hit 60fps with more geometry, more visual effects. It is the same reason people use faster GPUs, it isn't to get 400fps in Quake 3, it is to get 60fps in games that push much higher end graphics.
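
To put rough numbers on it (back-of-the-envelope only, assuming frame cost scales linearly with scene complexity, which real workloads only approximate):

    FRAME_BUDGET_MS = 1000.0 / 60            # ~16.7 ms per frame at 60 fps

    def headroom(benchmark_fps):
        # How much heavier each frame could get while still holding 60 fps
        frame_time_ms = 1000.0 / benchmark_fps
        return FRAME_BUDGET_MS / frame_time_ms

    print(round(headroom(148), 1))   # A5 at 148 fps      -> ~2.5x headroom
    print(round(headroom(78), 1))    # Tegra 3 at 78 fps  -> ~1.3x headroom

That headroom is what turns into extra geometry and effects at a locked 60fps, not wasted frames.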

I thought people here were techy types. Stop being an apologist and argue objectively.

quote:
The A5's CPU elements in now way can keep pace with Tegra3... once you start actually doing things on a tablet that don't turn it into a glorified toy... and see, I actually do that, running statistics on my tablet, editing huge spreadsheets, generating graphs, etc.


CPU tests are roughly in line with the A5, neck and neck in some, slightly faster or slower in others. It is a wash in my book. It is nothing like the blowout in GPU performance though.

As for productivity, I use business apps as well as things like image editing and stock charting. Hell, if you wanted business and productivity apps you'd be on iOS, that's where the developers are.

It is funny how Android tablet apologists deflect criticism of slower hardware with "but it is for productivity", when iOS development in productivity is HUGE. Microsoft Office is coming to iPad this year with zero plans for Android, and that's just the tip of the iceberg. Android tablet app development is anemic, but keep convincing yourself otherwise.

Tegra 3 is widely accepted as a joke given that its level of performance is barely competitive with the A5, which is about to be replaced, and Tegra 4 is still a while away. It's not as bad a joke as AMD's Bulldozer, but it is close. I guess the Android apologists are defending it because it is their only option in tablet hardware, so I shouldn't be surprised that I see posts like yours here. It is still disappointing; you'd think that people could accept a platform based on faster hardware and more/better software rather than denying it because they are fanboys.

As for Windows 8, I'm sticking with that on my desktop. I have no interest in throwing that on ARM until performance is proven to be decent and (very important) the Metro-only apps are there.


RE: Apple Going Custom vs. Adopting Cortex A15
By DarkPhoenix on 2/3/2012 9:44:11 AM , Rating: 2
It seems that Apple fans like to remain ignorant while also staying on their high horse...

Some facts for you:

1) Tegra 3 is about 90mm2 while the A5 is over 120mm2. The A5 has only two cores and massive die area reserved for the GPU. That alone should be enough for you to understand that the A5 GPU is much larger and thus has much more resources than Tegra 3's GPU, which is quite a bit smaller. On the CPU side, Tegra 3 wins without a doubt.

2) Comparing benchmarks on two entirely different systems and hardware is laughable, even more so when one of those systems has content developed for that SPECIFIC piece of hardware. That system being iOS. And still Tegra 3's GPU gets quite close to Apple's A5 in many cases, even though it has fewer resources (especially considering point 1) and also that tiny fact that Apple fanatics such as yourself usually forget: software developed just for iOS runs on just a few SoCs (and is thus optimized for them), while Tegra 3 runs generic software that also has to run on dozens of other SoCs, i.e. with little to no optimizations.

It would be quite interesting to see a comparison of a game optimized for A5 and that same game also optimized for Tegra 3. I'm sure A5 would still win, given point 1, but the difference wouldn't be as great, that's for sure.


By TakinYourPoints on 2/4/2012 12:48:18 AM , Rating: 2
I am fully aware that the A5 has a much larger die than the Tegra 3. It is a plus IMHO, it hasn't contributed to the iPhone 4S being huge while also allowing for PowerVR's excellent GPUs.

quote:
It would be quite interesting to see a comparison of a game optimized for A5 and that same game also optimized for Tegra 3. I'm sure A5 would still win, given point 1, but the difference wouldn't be as great, that's for sure.


Make arguments based on hypothetical situations, I'll keep using hard facts. Also ignore that the Tegra 3 is competing with hardware nearly a year old that is about to be replaced.

Love seeing the people here spin reality.


RE: Apple Going Custom vs. Adopting Cortex A15
By SniperWulf on 2/2/2012 9:35:26 AM , Rating: 2
Tegra 3 isn't all that, GPU wise. Better than most others, but still doesn't catch up to the MP4 in CrApples A5.

I really wish they had just gone with a dual-core + companion core and doubled up on the GPU side of things. I mean the company's roots are in GPUs... You'd think they would be showing the rest of the world "how it's done" lol


RE: Apple Going Custom vs. Adopting Cortex A15
By DarkPhoenix on 2/3/2012 9:55:03 AM , Rating: 2
Disagreed. Tegra 3's GPU is 2 to 3 times faster than Tegra 2's, and Tegra 2's GPU is still quite a capable GPU. When comparing it to a Galaxy S2 SoC for example (a very popular high-end smartphone), Tegra 2 doesn't lose by much in most cases.

As for Tegra 3 losing against the A5 in GPU stress scenarios, as I mentioned in another post, the die area difference alone between both SoCs should be enough to indicate what both companies wanted. Tegra 3 was designed for power efficiency while boosting performance quite a bit compared to the previous version of the SoC -- Tegra 2 -- and they did that quite well. Tegra 3 consumes about the same as Tegra 2 while having a GPU 2-3 times more powerful and twice as many cores (+ companion core).
The A5 was designed with much focus on the GPU portion, because of the iPad 2 that used it first. It was made with a particular product in mind that needed to "drive" a higher resolution. Two different approaches and two very different products.

Plus the software side of things. Everything iOS is optimized to run on just 2-3 SoCs, while everything Android needs to run on ANY SoC, and that means little to no optimizations. That is why benchmarks that pit SoCs in Android phones against SoCs in iOS phones should be taken with a large grain of salt...


By TakinYourPoints on 2/4/2012 1:02:46 AM , Rating: 2
That reads like a list of excuses as to why Android SoCs fall short. Practical end-user experience is all I care about. Why the hardware is slower, why the OS isn't as well optimized, why there are fewer apps, those are things I don't care about, I just know that it fails.


By kattanna on 2/2/2012 9:50:37 AM , Rating: 2
quote:
Apple bases its whole image on custom design by Apple.


correct. this is set to become apples new chip for its devices, and will be replacing the current intel chips in its lower end laptops as well. apple wants direct control once again over their CPU, something they do not now have using intel chips.

i wonder if they will try to lock out people once again being able to boot windows off their laptops.. and require you to only use apple software. i would not put it past them.


powerpc
By hackztor on 2/1/2012 8:08:42 PM , Rating: 1
Maybe they want to try again at building processors like the g5. Could give them an advantage not having to rely on someone else, just seems not worth it.




RE: powerpc
By blueaurora on 2/1/2012 8:35:36 PM , Rating: 2
Perhaps... IBM was the brains behind the PowerPC chip and in the end it almost sunk Apple who saw the writing on the wall. Intel who designs x86 chips and is actually good at it just shows you that amateurs should stay out of the game and optimize and let others innovate.


RE: powerpc
By TakinYourPoints on 2/2/2012 2:46:50 AM , Rating: 4
Apple has always been dependent on CPU designs from outside sources. Their desktops and laptops used RISC CPUs before switching to x86, but they were still sourced from other companies. The G5 PPC was from IBM, similar to the one that also ended up in the XBox 360. Before that they used Motorola CPUs for nearly two decades.

They have asked for custom designs in the past, things like the custom CPU package that ended up in the first generation Macbook Air, but Intel was still responsible for designing and producing the part to Apple's chassis specifications.

That said, their mobile CPUs seem to have been the most "custom" so far. The A5 was a very different design from other Cortex CPUs in terms of the size of the die and the GPU used. The physical CPU die is much much larger than SoCs used in other handsets and tablets, but the tradeoff was things like excellent GPU performance.


300 DPI monitors
By wordsworm on 2/2/2012 6:04:09 AM , Rating: 2
I cannot figure out why they work so hard on small displays but have not brought that kind of pixel density to a desktop monitor. 300 dpi monitors would be a huge seller to the graphic designer crowd.




RE: 300 DPI monitors
By TakinYourPoints on 2/2/2012 7:25:44 AM , Rating: 1
Yields for 10" displays of such high DPI that would go into a tablet have been relatively low. Imagine how hard (and therefore expensive) it is for 15" laptop and 24"/27" desktop displays. GPUs have also been holding this back, but we're finally getting to a point where this won't be a big problem in terms of both performance and cost.

Apple have clearly been planning for double res monitors though, OS X Lion already has 2x scaled assets included in it. All that's missing is the necessary hardware to display it on and the patch that flips the switch on the upgraded assets. There are rumors of a 15" Macbook Pro with a 2880x1800 display, we'll see if it happens. That would be pretty incredible, a higher resolution than the 2560x1440 in my 27".


RE: 300 DPI monitors
By theapparition on 2/2/2012 2:34:27 PM , Rating: 2
300dpi as a benchmark "HD" resolution has as much to do with viewing distance as it does with actual resolution.

People generally hold their mobile devices much closer to the eye than the average monitor. As such, it is more important to have higher resolutions. If you sit closer to a monitor than that, then you're going to put some serious strain on your eyes.


RE: 300 DPI monitors
By wordsworm on 2/6/2012 9:58:59 PM , Rating: 2
It would be nice to have a monitor which accurately represents what you're going to get from printing your graphic designs. Heck, I wouldn't mind seeing a 600dpi monitor come out in the next 10 years. These 75-96dpi monitors are dinosaurs, or they should be anyhow.


Global Foundries ?
By Soulkeeper on 2/2/2012 8:15:01 AM , Rating: 2
will they consider Global Foundries' 28nm as a potential option ?




RE: Global Foundries ?
By ilt24 on 2/2/2012 8:43:59 AM , Rating: 2
After all the screw ups with AMD, I imagine Global Foundries has a ways to go to improve their credibility before a company like Apple or Qualcomm would consider using them for a high volume product on a leading edge process.


By artemicion on 2/2/2012 1:37:51 AM , Rating: 2
Jason, you probably should refrain from disclosing so much information about your "anonymous" sources. You just revealed three critical pieces of information about the guy (relative position in company and knowledge of two separate pieces of information).

Probably wouldn't be hard for Apple to figure out who you're talking about using those three pieces of information and sue the pants off of the poor guy, or otherwise sh!t on his career.




By JasonMick (blog) on 2/2/2012 2:45:15 PM , Rating: 2
quote:
Tens of millions of people and thousands of businesses around the world do? I wouldn't call myself a big fan of RIM, however, it is absolutely essential we have as much competition in Mobile OS'es as humanly possible to stop a possible duopoly from Google and Apple. This would stifle innovation very badly and would result in less options for consumers, higher prices, and so on. There is no reason why RIM shouldn't be able to develop a solid QNX eco-system between its new tablets and new phones and then proceed to license it out as well. I've personally seen some of the programming from this new tech and it is highly advanced and should provide a pretty impressive eco-system. It's never too late to introduce new ground-breaking technology. I think this will pit RIM against the big boys if they do it right and don't mess up the marketing. Hopefully with the recent shake-ups at RIM headquarters, we can see some decent improvement.

I have several sources close to Apple, and they hear lots of things... for all Apple's efforts and hopes of secrecy, anyone on the management/executive level who's social tends to hear a lot.

One source hearing two things doesn't mean a whole lot.
quote:
Probably wouldn't be hard for Apple to figure out who you're talking about using those three pieces of information and sue the pants off of the poor guy, or otherwise sh!t on his career.

How do you know Apple hasn't already sh!t on her/his career?

Or how do you know she/he isn't self employed?

Apple has thousands of low-ranking managers and executives who have left over the last decade. Good luck trying to track down every one and cyberstalk them.

My sources are safe. I put in enough information to explain why I felt the source to be credible, but not anything that to my knowledge reveals her/his identity.


powerpc
By hackztor on 2/1/2012 8:07:47 PM , Rating: 2
Maybe they want to try again at building processors like the powerpc. Could give them an advantage not having to rely on someone else, just seems not worth it.




Two way street
By semiconshawn on 2/1/2012 9:13:26 PM , Rating: 2
quote:
So Apple had to come crawling back to its ex (fab) -- Samsung.


While this may be true to an extent, Samsung also needs Apple as a customer. The Austin fab was upgraded primarily to act as a foundry for Apple. They can't fill it without Apple, not yet anyway.

On a side note, TSMC has to be dying knowing the golden goose is out there to be had and they can't yield... so close and yet so far.




Bad news for Intel!
By aspartame on 2/2/2012 4:41:57 AM , Rating: 2
I don't see a reason for Apple to develop a new CPU architecture just for a cheap tablet. I think Apple will soon switch from Intel to ARM, unifying all its platforms. The A7 will probably power iMacs as well.




Wrong info
By milli on 2/2/2012 12:10:17 PM , Rating: 2
quote:
Hence the "Hummingbird" core -- found on the iPhone 4 and the Galaxy S -- was launched

Samsung's "Hummingbird" aka Exynos 3110 is not the same SOC as Apple's A4! They use the same building blocks but are internally completely different. Intrinsity optimised the Cortex A8 core independently for both companies.

quote:
P.A. Semi -- a former ARM core maker (they made StrongARM)

P.A. Semi is a former Power ISA core maker. Dan Dobberpuhl (founder of P.A. Semi) worked for DEC (they made StrongARM) and was there the lead designer for the DEC Alpha 21064 and StrongARM processors. The other 150 engineers of P.A. Semi basically have no past experience with the ARM ISA.




Waste of time apple
By ChipDude on 2/4/2012 10:32:00 PM , Rating: 2
This is so stupid. Why did Apple waste all this time on another ARM design in-house?

Let's be honest: in the 90's they used the 68000, then they went with that silly group for PowerPC for their computers, and what happened? Don't forget at the time that was a pretty good group of companies with a little $, and what happened?

There was this company called Intel with huge volumes and the ability to scale silicon, making the x86 handicap become nothing given their superior silicon scaling. Anyone using PowerPC or the 68000 for anything? Nope, because the silicon they are built on sucks and is uncompetitive!

You only need to look at what Intel did with two generations of Atom for the cellphone, and how competitive the new Medfield is, to figure out that in two generations Apple will have no choice but to go to Intel. Why? It's easy: Android will also be on ARM and there are a LOT of good ARM design teams out there; Apple can't beat them, and they all use the same silicon manufacturers. So who will make the leap to fundamentally superior hardware? Who has a generation lead in high-k metal gate, tri-gate and scaling? Design those guys will fix, and with a one-generation lead, he who wants the best hardware must go to x86 and Intel.. duh




But will the glass be better?
By SurfSnow on 2/1/12, Rating: -1
RE: But will the glass be better?
By ebakke on 2/2/12, Rating: 0
RE: But will the glass be better?
By xti on 2/2/2012 2:13:42 PM , Rating: 1
do a sit up, stop landing your butt on it.


By silvaensis on 2/3/2012 6:21:52 AM , Rating: 2
That's actually a fault with their design and someone not knowing how glass works.

It happens because the glass on their devices is flush with the bezel, so the edge of the glass is not properly protected.

Glass is made to be very strong from the front, but this same design makes it very fragile to cracking from the edges and especially the corners. A small hit just right on the edge or the corner will easily crack a screen.

Most companies drop their phones straight down in drop tests, and it's almost impossible to break glass that way, instead of doing corner and side drop tests.

One thing I always look at in a phone/tablet before I buy is whether the edge of the bezel extends past the glass a little, because it is not just Apple products that make this mistake of form over function.


"Paying an extra $500 for a computer in this environment -- same piece of hardware -- paying $500 more to get a logo on it? I think that's a more challenging proposition for the average person than it used to be." -- Steve Ballmer














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki