


He said he should have listened to his gut

Former Intel CEO and longtime employee Paul Otellini had his last day yesterday, and after 40 years of employment with the chipmaker, his only major regret was missing out on the iPhone.

Otellini, who became Intel CEO in 2005, said he was disappointed that he passed up the opportunity for Intel to make chips for the iPhone back before the device's 2007 release. It was his one truly regretful moment during his long career with Intel, and the one time he decided not to go with his gut instinct. 

"We ended up not winning it or passing on it, depending on how you want to view it," said Otellini. "And the world would have been a lot different if we'd done it. The thing you have to remember is that this was before the iPhone was introduced and no one knew what the iPhone would do... At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn't see it. It wasn't one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought."

Apple's early iPhones ended up using processors based on designs from Intel's rival, ARM, and manufactured by Samsung. ARM and its licensees jumped on the mobile processor bandwagon quickly, making the crucial transition from traditional PCs to the exploding mobile sector. 

"The lesson I took away from that was, while we like to speak with data around here, so many times in my career I've ended up making decisions with my gut, and I should have followed my gut," said Otellini. "My gut told me to say yes."


Paul Otellini

While Samsung (Apple's main hardware rival in the mobile industry) has manufactured most of Apple's smartphone and tablet processors for years, the two are looking to part ways after nasty patent wars and increased competition broke out between them. 

At the Consumer Electronics Show (CES) earlier this year, Samsung's President of LSI business Stephen Woo said that it's crucial for the South Korean electronics maker to focus on alternatives to Apple when it comes to the chip sector.

This being the case, Apple has reportedly been in talks with Intel again to manufacture its chips for iPhones and iPads. This could lessen Apple's reliance on Samsung and help Intel leap further into the mobile processing sector. 

Otellini joined Intel in 1974. He served as senior vice president and general manager of sales and marketing from 1994 to 1996, as executive vice president of sales and marketing from 1996 to 1998, and as executive vice president and general manager of the Intel Architecture Group from 1998 to 2002.

Otellini announced his retirement in November 2012. Earlier this month, Brian Krzanich was named the new CEO. 

Source: The Atlantic




RE: The full story?
By bah12 on 5/17/2013 5:02:55 PM , Rating: -1
Man I was so with you until this post. Now that I know you are a Mac user, I've lost all respect for you as a performance/CPU expert. Seriously, anyone choosing to use any of the Macs listed in the post over their PC counterparts of the same era deserves very little credit when talking about performance.

I'll give you the benefit of the doubt; maybe you were just strapped to their crappy software like this poor editor.
http://www.youtube.com/watch?v=cvXZVJXIyqM


RE: The full story?
By name99 on 5/17/2013 6:35:04 PM , Rating: 4
Well I spent my time at Apple writing low-level high performance codec code (including plenty of assembly, based on a careful understanding of CPU µarchitecture), and spent a bunch of time with Pradeep Dubey designing what became AltiVec...

What are your credentials?

But sure, allow yourself to believe that anyone who ever touched a mac is so tainted and ignorant that they have nothing useful to say. Wilful blindness is a sure path to better understanding of the world.


RE: The full story?
By JKflipflop98 on 5/19/2013 9:48:51 PM , Rating: 2
I work in the industry as well. Your whole argument is based around the assumption that software has stopped advancing, and that's just not the case. Maybe YOUR software has stopped advancing, but the world is still moving forwards. There's still a huge demand for faster chips.


RE: The full story?
By JPForums on 5/20/2013 9:44:23 AM , Rating: 3
quote:
Well I spent my time at Apple writing low-level high performance codec code (including plenty of assembly, based on a careful understanding of CPU µarchitecture), ...
Wait, you are a low-level, high-performance codec writer who works at the assembly level and (presumably) squeezes out every ounce of performance a CPU µarchitecture has to offer, and you think faster hardware is unnecessary because software has come to a standstill? Writing codecs, particularly at the low level, is a tradeoff between speed, quality, and compression. Higher-end hardware allows for your choice of more quality or more compression while still meeting your speed requirement. I would have thought a guy like you would appreciate the advances made possible by new hardware, but alas, it seems you don't even notice that they're happening if you really believe software has hit a standstill.
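The speed/quality/compression tradeoff described above is easy to see with an off-the-shelf encoder. Below is a minimal sketch (an illustration, not anything from the original post), assuming ffmpeg with libx264 is installed and using a placeholder input clip: at a fixed quality target (CRF), slower presets spend more CPU time to produce a smaller file, which is exactly why faster hardware buys more quality or more compression at the same throughput.

import os
import subprocess
import time

SOURCE = "sample_1080p.mp4"  # placeholder clip; any decodable video will do

# x264 speed presets: slower presets spend more CPU searching for better
# compression at the same quality target (CRF).
PRESETS = ["ultrafast", "medium", "veryslow"]

for preset in PRESETS:
    out = f"out_{preset}.mp4"
    start = time.time()
    # -crf 23 fixes the quality target, so the remaining variables are how
    # hard the encoder works (preset) and how small the resulting file is.
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE,
         "-c:v", "libx264", "-preset", preset, "-crf", "23", "-an", out],
        check=True, capture_output=True)
    elapsed = time.time() - start
    size_mb = os.path.getsize(out) / 1e6
    print(f"{preset:>10}: {elapsed:6.1f} s, {size_mb:6.1f} MB")

On typical hardware the veryslow run takes many times longer than ultrafast while producing a noticeably smaller file at the same visual quality; faster chips are what relax that tradeoff.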

Consider that the general population might like the idea of transcoding high-quality, high-definition video on demand (in real time), removing the need to spend time pre-encoding videos. The market is slowly but surely moving to higher-definition (4K, 8K, etc.) video streams. Do you suppose there is a market for ultra-high-definition video, transcoded in real time, on demand, across a no-hassle, low-latency wireless link a la Tony Stark? Current display definitions and wireless display technologies are available, but there are definite hardware limitations. In your line of work, it is unlikely that you don't see how hardware limitations can constrain video/audio codec quality. It might interest you to know, if you don't already, that there is at least one company working on just these things, looking to (eventually) bring Starkesque display technologies to a mansion near you ... and, after a few years and some price drops, a house near you. Given your previous statements in this thread, you may be surprised to learn that current processors aren't up to the task. The codecs in consideration are increasingly shifting the burden away from link throughput and onto the processor.
quote:
What are your credentials?
I could come up with several examples of why credentials don't automatically make you an undisputed expert, but in this case, I'll simply mention that being an expert in low-level codec programming doesn't necessarily give you insight into other fields in which CPU processing power in the mass market is relevant. If it's really that important to you, I build hardware that people like you write software for.

Home automation may catch on when it becomes easy, convenient, and inexpensive. One holdup is voice recognition. While voice recognition is available on your phone, the phone doesn't do the actual work and is often more than a little bit annoying to deal with. More than a few people would be less than happy if their home automation system stopped working whenever the internet or the servers had issues (or if their activities were snooped on). Hence, more local compute power is required for better voice recognition. If it were cheap and reliable, I'm sure some people would love systems that could unlock their house, car, etc. via facial/voice/pattern recognition (removing the need for keys). All of these would require more work and processing power to work reliably enough to use as a primary security device. Finally, lest I receive lashings for my heresy, I must bring recognition to the ever-insatiable PC gaming market.
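To make the local-versus-cloud point concrete, here is a minimal sketch (an illustration, not anything from the original post) of an offline voice trigger, assuming the third-party SpeechRecognition and pocketsphinx Python packages are installed; the "unlock the front door" command and the door controller are hypothetical. Everything runs on the local CPU, so accuracy and latency are bounded by local compute rather than by an internet connection.

import speech_recognition as sr

recognizer = sr.Recognizer()

# Capture one utterance from the default microphone.
with sr.Microphone() as source:
    recognizer.adjust_for_ambient_noise(source)
    print("Say a command, e.g. 'unlock the front door'...")
    audio = recognizer.listen(source)

try:
    # Offline recognition via CMU Sphinx: all of the work (and the accuracy
    # ceiling) is bounded by the local CPU. No internet, no server outage,
    # and nothing leaves the house.
    text = recognizer.recognize_sphinx(audio)
    print("Heard (local):", text)
    if "unlock" in text and "door" in text:
        print("-> would signal the (hypothetical) door controller here")
except sr.UnknownValueError:
    print("Local engine could not understand the audio")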


"We shipped it on Saturday. Then on Sunday, we rested." -- Steve Jobs on the iPad launch













