


He said he should have listened to his gut

Former Intel CEO and longtime employee Paul Otellini had his last day yesterday, and after nearly 40 years of employment with the chipmaker, his only major regret was missing out on the iPhone.

Otellini, who became Intel CEO in 2005, said he was disappointed that he passed up the opportunity for Intel to make chips for the iPhone back before the device's 2007 release. It was his one truly regretful moment during his long career with Intel, and the one time he decided not to go with his gut instinct. 

"We ended up not winning it or passing on it, depending on how you want to view it," said Otellini. "And the world would have been a lot different if we'd done it. The thing you have to remember is that this was before the iPhone was introduced and no one knew what the iPhone would do... At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn't see it. It wasn't one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought."

Apple's early iPhones ended up using processors based on designs from Intel's rival, ARM (and manufactured by Samsung). ARM jumped on the mobile processor bandwagon early, riding the industry's crucial shift from traditional PCs to the exploding mobile sector. 

"The lesson I took away from that was, while we like to speak with data around here, so many times in my career I've ended up making decisions with my gut, and I should have followed my gut," said Otellini. "My gut told me to say yes."


Paul Otellini

While Samsung (Apple's main hardware rival in the mobile industry) has manufactured most of Apple's smartphone and tablet processors for years, the two are looking to split after nasty patent wars and increased competition broke out between them. 

At the Consumer Electronics Show (CES) earlier this year, Samsung's president of LSI business, Stephen Woo, said it's crucial for the South Korean electronics maker to focus on alternatives to Apple in the chip sector.

This being the case, Apple has reportedly been in talks with Intel again to manufacture its chips for iPhones and iPads. This could lessen Apple's reliance on Samsung and help Intel leap further into the mobile processing sector. 

Otellini joined Intel in 1974. He served as senior vice president and general manager of sales and marketing from 1994 to 1996, as executive vice president of sales and marketing from 1996 to 1998, and as executive vice president and general manager of the Intel Architecture Group from 1998 to 2002.

Otellini announced his retirement in November 2012. Earlier this month, Brian Krzanich was named the new CEO. 

Source: The Atlantic



Comments



RE: The full story?
By karimtemple on 5/17/2013 2:40:47 PM , Rating: 2
This is assuming x86 could never reach/surpass ARM power consumption characteristics -- a fallacious premise.


RE: The full story?
By name99 on 5/17/2013 2:54:35 PM , Rating: 3
Oh for crying out loud. Did you READ what I said? This is not about technology, it is about business.

The point is not that Intel cannot create a chip that competes with ARM on power/performance. The point is, Intel is STUPID for wanting to do so. Once they have a really kickass Atom, that's only useful if they sell it at ARM prices. And by selling at ARM prices, they substantially undercut their prices for NON-Atom CPUs. Sure, they can try to keep the price for i3, i5, i7 high; but inexorably people will switch from using those to using Atoms when the only real requirement is x86 compatibility.

This was my point. Intel would be better off, from a BUSINESS perspective, if they'd made a kick-ass ARM chip which would NOT cannibalize the x86 line.
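The argument above is essentially a margin-mix calculation. A minimal sketch, with prices, costs, and the share of switching buyers all invented for illustration (none of these figures come from Intel):

    # Hypothetical margin-mix sketch of the cannibalization argument above.
    # Prices, costs, and the switch fraction are all invented for illustration.
    core_price, core_cost = 200.0, 60.0    # assumed i3/i5/i7-class part
    atom_price, atom_cost = 25.0, 15.0     # assumed Atom sold at "ARM prices"

    def blended_profit(units, atom_share):
        """Total profit if atom_share of buyers only need x86 compatibility
        and buy the cheap Atom instead of a Core-class part."""
        core_units = units * (1 - atom_share)
        atom_units = units * atom_share
        return core_units * (core_price - core_cost) + atom_units * (atom_price - atom_cost)

    units = 1_000_000
    for share in (0.0, 0.2, 0.5):
        print(f"{share:.0%} switch to Atom -> profit {blended_profit(units, share):,.0f}")

Under these assumed numbers, every buyer who trades down from a Core-class part to an ARM-priced Atom removes most of that sale's profit, which is the cannibalization being described.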


RE: The full story?
By Scannall on 5/17/2013 3:00:06 PM , Rating: 2
This is a great point. It would be simple for Apple to go to x86 for iOS, since it is just a subset of OS X. But I'm not sure it would be a very good idea for Intel to try and compete with $20 ARM chips. It would drag down the prices and value of x86 chips.


RE: The full story?
By inighthawki on 5/18/2013 3:21:33 PM , Rating: 1
quote:
It would be simple for Apple to go to x86 for iOS, since it is just a subset of OS X.

And lose every app compiled for iOS on ARM, which is the largest and one of the very few reasons to get an iPhone these days.


RE: The full story?
By karimtemple on 5/17/13, Rating: 0
RE: The full story?
By name99 on 5/17/2013 3:39:13 PM , Rating: 2
"Three, Atom and i5 are not in the same product segment. Atom sales do not cut into i5 sales, lol."

Are you completely unaware of the history of computing?
You don't think i7's TODAY get sold into markets that used to be exclusively Xeon?
You don't think, for example, HTPCs could switch to Atom in a heartbeat?
What about all those turnkey systems used to control medical and industrial equipment? What about all those cash registers?
Everybody who might have considered AMD or VIA before, but wanted an Intel guarantee or whatever, now gets the Intel brand name --- and at prices lower than AMD and VIA were charging.

People USED to buy i3,i5,i7 for two reasons
- performance
- compatibility.
The existence of Atom removes the compatibility motive, leaving only performance. And performance is a weak reed for future MASS sales...


RE: The full story?
By karimtemple on 5/17/2013 4:08:30 PM , Rating: 1
quote:
You don't think i7's TODAY get sold into markets that used to be exclusively Xeon?
No.

quote:
You don't think, for example, HTPCs could switch to Atom in a heartbeat?
Atom-based HTPCs have existed for years.

quote:
What about all those turnkey systems used to control medical and industrial equipment?
What about all those cash registers?
Who the hell is running this on i5?? rofl!!

quote:
Everybody who might have considered AMD or VIA before, but wanted an Intel guarantee or whatever now gets the Intel brand name --- and at prices lower than AMD and VIA were considering.
Uh... You think Intel is worried that they might push into their competition's market share?

quote:
People USED to buy i3,i5,i7 for two reasons
- performance
- compatibility.
Used to? What are they buying now?

Your concerns only seem relevant if one of several impossible scenarios occurs:

1) Software stops advancing and no longer needs advanced hardware.
2) People stop reacting positively to software improvements and refuse to upgrade their systems.
3) Performance hardware stops advancing while low-power hardware catches up.

Eventually, as always, segments will shift and converge, and certain things will cost less than before. I hardly think Intel is afraid of that. Hell, with the right strategies, there's always more money to be made in that.


RE: The full story?
By name99 on 5/17/2013 4:37:37 PM , Rating: 3
quote:
1) Software stops advancing and no longer needs advanced hardware.
2) People stop reacting positively to software improvements and refuse to upgrade their systems.
3) Performance hardware stops advancing while low-power hardware catches up.


(1) HAS largely happened. Name the last software advance in CONSUMER software that REALLY required advanced HW. Which leads to

(2) HAS already happened. Look at the damn HW sales numbers. During the 90s I updated my machine every two years or so. But my Penryn iMac was acceptable for about 5 years. My quad-core IB iMac will probably be acceptable for even longer.

(3) HAS ALSO already happened. The focus of Intel's last few rounds of CPUs has been more power reduction than raw performance improvement. The performance improvements, while impressive if you appreciate µArch, have been nothing compared to the past.

As for your claim that i7s are NOT sold into previous Xeon markets, I can only assume that you are massively ignorant. Back in the late 90s when I was at Apple, our x86 build machine was an incredibly expensive four-processor Xeon machine. Nowadays no-one would buy a $50,000 server for a task like that --- a Mac Mini would probably be good enough, just make sure you give it enough RAM and an SSD.
Likewise for the server in pretty much any small business.

Look, the world did not begin in 2007 --- there is a long history here that SOME of us are aware of. Compare with a company like IBM. IBM has been very careful to segregate its lines. There are z machines at the high end, and p or i machines in the mid-range (both using POWER CPUs). Regardless of what SW and circuit techniques are shared between these, IBM maintains ABSOLUTE SW incompatibility between them. No-one is going to be allowed to save a million dollars by buying a p machine rather than a z machine, and no-one is going to be allowed to save $100,000 by buying a Xeon rather than a p machine.
This hollowing out of Xeons that I described has already gone too far for IBM's tastes. After selling their low-end Intel business to Lenovo, they've recently taken the next step and offloaded the Xeon business to Lenovo as well.

Of course the large scale structure here is inevitable. CPUs will be improving for a while yet. BUT Intel has managed the situation so as to limit how much money they can make out of it. They've lost the chance to make money on the ARM side over the past few years, while simultaneously (now that Atom doesn't suck) speeding up the process of undermining their i3,i5,i7 line.


RE: The full story?
By bah12 on 5/17/13, Rating: -1
RE: The full story?
By name99 on 5/17/2013 6:35:04 PM , Rating: 4
Well I spent my time at Apple writing low-level high performance codec code (including plenty of assembly, based on a careful understanding of CPU µarchitecture), and spent a bunch of time with Pradeep Dubey designing what became AltiVec...

What are your credentials?

But sure, allow yourself to believe that anyone who ever touched a mac is so tainted and ignorant that they have nothing useful to say. Wilful blindness is a sure path to better understanding of the world.


RE: The full story?
By JKflipflop98 on 5/19/2013 9:48:51 PM , Rating: 2
I work in the industry as well. Your whole argument is based around the assumption that software has stopped advancing, and that's just not the case. Maybe YOUR software has stopped advancing, but the world is still moving forwards. There's still a huge demand for faster chips.


RE: The full story?
By JPForums on 5/20/2013 9:44:23 AM , Rating: 3
quote:
Well I spent my time at Apple writing low-level high performance codec code (including plenty of assembly, based on a careful understanding of CPU µarchitecture), ...
Wait, you are a low-level, high-performance codec writer who works at the assembly level and (presumably) squeezes out every ounce of performance a CPU µarchitecture has to offer, and yet you think faster hardware is unnecessary because software has come to a standstill? Writing codecs, particularly at the low level, is a tradeoff between speed, quality, and compression. Higher-end hardware allows for your choice of more quality or compression while still meeting your speed requirement. I would have thought a guy like you would appreciate the advances made possible by new hardware, but alas, it seems you don't even notice that they're happening if you really believe software has hit a standstill.

Consider that the general population might like the idea of transcoding high-quality, high-definition video on demand (in real time), which removes the need to spend time pre-encoding videos. The market is slowly but surely moving to higher-definition (4K, 8K, etc.) video streams. Do you suppose there is a market for ultra-high-definition video, transcoded in real time, on demand, across a no-hassle, low-latency wireless link a la Tony Stark? Current display definitions and wireless display technologies are available, but there are definite hardware limitations. In your line of work, it is unlikely that you don't see how hardware limitations can constrain video/audio codec quality. It might interest you to know, if you don't already, that there is at least one company working on just these things, looking to (eventually) bring Starkesque display technologies to a mansion near you ... and after a few years and some price drops, a house near you. Given your previous statements in this thread, you may be surprised to learn that current processors aren't up to the task. The codecs in consideration are increasingly shifting the burden away from link throughput and into the processor (some rough pixel-rate arithmetic is sketched after this post).
quote:
What are your credentials?
I could come up with several examples of why credentials don't automatically make you an undisputed expert, but in this case, I'll simply mention that being an expert in low-level codec programming doesn't necessarily give you insight into other fields in which CPU processing power in the mass market is relevant. If it's really that important to you, I build hardware that people like you program software for.

Home automation may catch on when it becomes easy, convenient, and inexpensive. One holdup is voice recognition. While voice recognition is available on your phone, the phone doesn't do the actual work and is often more than a little bit annoying to deal with. More than a few people would be less than happy if their home automation system stopped working whenever the internet or servers had issues (or if their activities were snooped on). Hence, more local compute power is required for better voice recognition. If it were cheap and reliable, I'm sure some people would love systems that could unlock their house, car, etc. via facial/voice/pattern recognition (or all of the above), removing the need for keys. All of these would require more work and processing power to work reliably enough to use as a primary security device. Finally, lest I receive lashings for my heresy, I must bring recognition to the ever-insatiable PC gaming market.
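Some rough pixel-rate arithmetic behind the real-time 4K/8K transcoding point above. The resolutions and frame rate are standard figures, but the operations-per-pixel number is a loose assumption, since real codec costs vary enormously with settings and content:

    # Rough pixel-rate arithmetic for real-time encoding of high-definition streams.
    # Resolutions and frame rate are standard; ops_per_pixel is a loose assumption.
    def pixel_rate(width, height, fps):
        return width * height * fps

    streams = {
        "1080p60": pixel_rate(1920, 1080, 60),
        "4K60":    pixel_rate(3840, 2160, 60),
        "8K60":    pixel_rate(7680, 4320, 60),
    }

    ops_per_pixel = 500  # assumed average encode cost per pixel, purely illustrative

    for name, rate in streams.items():
        print(f"{name}: {rate / 1e6:,.0f} Mpixel/s, "
              f"~{rate * ops_per_pixel / 1e9:,.0f} GOPS for real-time encode")

An 8K60 stream carries roughly 16 times the pixel rate of 1080p60, so whatever a codec spends per pixel, the real-time processing burden scales up accordingly.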


RE: The full story?
By karimtemple on 5/17/2013 5:40:06 PM , Rating: 2
quote:
Name the last software advance in CONSUMER software that REALLY required advanced HW.
Video games. A/V compression. AI. Voice recognition. Multitasking.

As hardware advances, things become consumer-grade that were previously professional-grade. Hobbyists can now do movie projects, photo editing, and studio-quality music production that even just 10 years ago was not possible for them.

There is a future where computers are so fast, they can process software designed to reliably perform human tasks like looking at the road, driving for me, and avoiding bicyclists and squirrels. I could just tell it to touch up the photo I took. That's consumer hardware.

That's one of about a million scenarios that wouldn't happen if Intel and others were as afraid of segment shift as you appear to be.

quote:
HAS already happened. Look at the damn HW sales numbers.
The economy was bullish in the 90's and is garbage now. There's also this thing called saturation.

quote:
HAS ALSO already happened.
I don't know what to tell you. ARM has not advanced at a higher rate than x86.

quote:
As for your claim that i7s are NOT sold into previous Xeon markets
My answer was due to your use of the phrase "Xeon market." If someone needs Xeon performance, they need Xeon performance. It seems a little obtuse to say "I replaced a Xeon with an i7" when your Xeon could've been from 2002 or 2012.

Ultimately, it just seems fairly backwards to say that Intel shouldn't produce a low-power part simply because it'll get so fast the performance parts will become irrelevant. Not adapting is the worst business strategy I've ever heard of and it absolutely has less than zero merit in the computer field, where entire industries are eaten alive because they failed to adapt to the tech. The tech is going to emerge regardless -- it behooves Intel to be there when it does. They make CPUs.


RE: The full story?
By someguy123 on 5/18/2013 4:28:09 PM , Rating: 2
I don't understand why people say this. The average user has always kept their system for a relatively long period of time, regardless of actual technological advancements. The first computer I ever used was a 66 MHz Mac, which my father ultimately held on to for years, even after I built my own Athlon system that was FAR superior at 500 MHz. Even after seeing the speed, he just shrugged it off.

It wasn't until he was basically forced to go all-digital thanks to internet adoption that he finally bought a new HP, which was already outdated at the time. The average user and commonly used software have next to no correlation with the push for advancement, other than eventual adoption out of trendiness or lack of alternatives. PC shipments are down, but portable computer shipments are WAY up, and these days they're quite powerful and feature-rich, making the distinction arbitrary. If we were to base everything on the average user, the internet would be nothing more than a database of research journals.


RE: The full story?
By chemist1 on 5/19/2013 3:15:34 AM , Rating: 2
quote:
(1) HAS largely happened. Name the last software advance in CONSUMER software that REALLY required advanced HW.


Generally speaking, I find home computers are still far too slow to optimally run even standard office programs. I.e., I don't get an instantaneous response -- I'm often left waiting, sometimes for a second or two, sometimes for dozens of seconds, for the CPU to complete its task. This is because most office software is single-threaded, while hardware advances have consisted not of making individual cores significantly faster, but rather of offering multiple cores. Specifically, I have a fairly modern machine: a late-2011 MacBook Pro with a 2.4 GHz (3.5 GHz turbo) quad-core i7 Sandy Bridge CPU, a Samsung 830 SSD, and 16 GB of RAM. Yet working in, for instance, Word or Adobe Illustrator still often leaves me waiting several seconds for the computer to complete certain tasks (during which time one of the cores is pegged at 100%, so these tasks are CPU-bound, not I/O-bound).
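A small sketch of why extra cores barely help the single-threaded stalls described above, using Amdahl's law; the 5% parallel fraction is an assumption chosen only to illustrate the point:

    # Amdahl's law sketch: if an office app only parallelizes a small fraction of
    # its work, extra cores barely help. The 5% parallel fraction is an assumption.
    def speedup(parallel_fraction, cores):
        serial = 1.0 - parallel_fraction
        return 1.0 / (serial + parallel_fraction / cores)

    for cores in (1, 2, 4, 8):
        print(f"{cores} core(s) -> {speedup(0.05, cores):.2f}x speedup")  # stays near 1.0x

With only 5% of the work parallelizable, eight cores deliver well under a 1.1x speedup, which matches the experience of one pegged core while the rest sit idle.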


RE: The full story?
By chemist1 on 5/19/2013 3:20:02 AM , Rating: 2
More exactly, my machine has 8 virtual cores, and Word and Illustrator only use one of these. I have the latest version of Word, but my Illustrator is old--CS3--so my comments might not apply to the latest version of the latter.


RE: The full story?
By vision33r on 5/19/2013 11:34:49 PM , Rating: 2
Virtualization is the killer of power computing. These days it's all about companies buying fewer servers, buying only a few powerful CPUs, clustering them, and making them into virtual machine hosts.

Back then you needed 20 servers to handle file, print, and directory services. Nowadays you can do that with just 4; soon you may only need 2 for redundancy.

Intel did not want dual-core; they had Hyper-Threading, and AMD led them into multi-core designs.

Today, Intel's six- and eight-core designs blow away anything AMD has, but it also kills their margins when companies buy only a few of them, and they buy even fewer servers as a result of virtualization.
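A quick consolidation-arithmetic sketch of the virtualization point above; the workload count comes from the comment, but the VMs-per-host capacity is an assumed figure for illustration only:

    # Consolidation arithmetic for the virtualization point above.
    # vms_per_host is an assumed capacity, not a measured one.
    workloads = 20        # "back then you needed 20 servers"
    vms_per_host = 10     # assumed capacity of one modern multi-core host
    redundancy = 2        # keep at least two hosts for failover

    hosts_needed = max(redundancy, -(-workloads // vms_per_host))  # ceiling division
    print(f"{workloads} workloads -> {hosts_needed} physical hosts instead of {workloads}")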

