
Intel says Atom is doing very well in the marketplace

When Intel launched its tiny Atom processor, it intended for the small, low-cost CPU to find its way into many cheaper consumer electronic devices such as the new class of netbooks, mobile phones, and Mobile Internet Devices (MIDs). Since its launch, the Atom has found its way into a number of small, lower-cost devices, including the Asus Eee PC, MSI Wind, and Acer Aspire One.

Intel is looking to the low-cost Atom processor to help it grow its business and profits in the face of a slowing trend in the PC market. According to Intel Chief Financial Officer Stacy Smith, the Atom CPU is doing very well. Reuters quotes Smith as saying in an interview, "Atom is off to a very, very rapid start, far exceeding our expectations when we started the year. It's the perfect recession product to have in the marketplace."

According to Intel, the Atom processor is well placed for the mobile market and emerging markets. The low-cost nature of the processor makes it desirable as the CPU in low-cost secondary computers or in low-cost systems aimed at children. Smith does maintain that Intel won't know the complete size of the market for the Atom processor for about six months. Smith also says that the Atom processor seems to be growing the market rather than cannibalizing existing PC sales.

Because the Atom is far cheaper and slower than Intel's mainstream Core 2 processors, Intel says it is not worried that the chip will cannibalize existing CPU sales, with the possible exception of low-end Celeron sales. In fact, Smith told Reuters that the Atom could cannibalize low-end Celeron sales and that he was all for that.

Smith said, "If it's [the Atom] cannibalizing from the Celeron part of the market, I'll take that any day." Reuters also reports that Smith maintains Intel would be able to meet its third-quarter predictions of $10 billion to $10.6 billion in overall revenue.

Intel has offered no insight into its profitability from the Atom processor, but Smith did say Intel is able to get 2,500 Atom processors per silicon wafer. According to Reuters, that should mean Intel makes a healthy profit on its Atom processors. Intel is also looking at the embedded market as a serious marketplace for its Atom processor.
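As a rough illustration of why that per-wafer figure suggests healthy margins, here is a minimal back-of-the-envelope sketch. Only the 2,500 dies per wafer comes from the article; the selling price, wafer cost, and yield below are hypothetical assumptions for illustration.

```python
# Back-of-the-envelope sketch of Atom revenue per wafer. Only the 2,500
# dies-per-wafer figure comes from the article; the price, wafer cost and
# yield are hypothetical assumptions for illustration.

DIES_PER_WAFER = 2500            # figure cited by Intel's CFO
assumed_price_usd = 35.0         # hypothetical average selling price per chip
assumed_wafer_cost_usd = 3000.0  # hypothetical cost of a processed wafer
assumed_yield = 0.80             # hypothetical fraction of good dies

good_dies = DIES_PER_WAFER * assumed_yield
revenue = good_dies * assumed_price_usd
margin = (revenue - assumed_wafer_cost_usd) / revenue

print(f"Good dies per wafer:  {good_dies:.0f}")
print(f"Revenue per wafer:    ${revenue:,.0f}")
print(f"Implied gross margin: {margin:.0%}")
```

Even under much more pessimistic assumptions, per-wafer revenue comfortably exceeds typical wafer-processing costs, which is presumably the basis for Reuters' conclusion.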

Smith says that interest among embedded customers in the Atom processor has been very strong. He does point out, though, that it could take years for Intel to realize revenue from the embedded market because of longer design cycles. Intel does say that once the Atom has been designed into a car or a cable box, the processor would remain there for years.

Intel has been aggressively introducing new models in its Atom CPU line since the initial introduction. In April, Intel announced five new Atom models and more models are on the way.




Hmm.
By Diosjenin on 8/13/2008 3:56:57 PM , Rating: 2
Wasn't it, oh, let's say July 22nd when this very site ran a piece explaining how "many of the industry's largest players are fearful that in undoing the traditional high-profit-margin, high-power hardware model that the industry has operated on for years, they may be put out of business"?

That sure did seem to be the impression Intel and AMD and every notebook OEM out there were giving off. Now all of a sudden Atom is Intel's savior?

Did something special happen in the last three weeks that I was totally unaware of? What gives here?




RE: Hmm.
By masher2 (blog) on 8/13/2008 4:03:13 PM , Rating: 5
There's no dichotomy in the two statements. The high-power, high-profit industry is dying a slow death, and that does worry many players. So what can they do about it, other than push low-power chips like the Atom?

Such a strategy allows Intel to salvage as much as it can from the industry's evolution.


RE: Hmm.
By Pirks on 8/13/08, Rating: -1
RE: Hmm.
By Master Kenobi (blog) on 8/13/2008 5:12:15 PM , Rating: 3
No worries, AMD lacks an Atom competitor.
High End. Intel Core 2 Series.
Mid Range. Intel Core 2 Series.
Low End. Intel Core 2 Series.
Sub-Notebook. Intel Atom Series.

Looks to me like AMD still needs to play catch up.


RE: Hmm.
By Mitch101 on 8/13/2008 5:56:17 PM , Rating: 5
There is still the Via Nano to take on Atom.

HTPC End - AMD Dual Core + 780G or 790GX Mobo


RE: Hmm.
By Chadder007 on 8/13/2008 6:48:12 PM , Rating: 2
Yeah, but who has signed up to use the Nano yet?


RE: Hmm.
By TomZ on 8/13/2008 7:21:10 PM , Rating: 3
Good point. On the one hand, you have Intel, a company with large resources available for developing product, manufacturing product, and working for design wins.

On the other hand, you have VIA, which has really never been anything more than a niche player in the low end of the CPU market.

Even if the Nano were far superior technically (which it isn't), there are still a lot of reasons an OEM/ODM would prefer to go with Intel.


RE: Hmm.
By Mitch101 on 8/15/2008 2:54:52 PM , Rating: 2
The ATOM Splitter
Athlon 64 2000+
http://www.tomshardware.com/reviews/Atom-Athlon-Ef...

Didn't see this coming either.


RE: Hmm.
By Pirks on 8/15/2008 10:22:17 PM , Rating: 2
Hahaha, I wonder if TomZ got a heart attack after reading this article and realizing who's gonna "wipe the floor" as he said :))) Reality is a dangerous thing, Tom, you better stop reading this thread NOW! Or else... ;))


RE: Hmm.
By Mitch101 on 8/16/2008 11:53:03 AM , Rating: 2
Can't call it a win yet. Lima isn't released yet and dual-core Atoms arrive in September. But it's good to see AMD isn't completely blindsided by this. One can only hope that this isn't another outlet for AMD cash bleeding. I suspect AMD's plan isn't to make money on the CPUs as much as it is to make the money on the chipsets.


RE: Hmm.
By flipsu5 on 8/18/2008 1:45:57 AM , Rating: 2
quote:
On the other hand, you have VIA, which has really never been anything more than a niche player in the low end of the CPU market.


Nano is actually positioned in between Atom and Celeron. Quite successfully I might add. HP has already placed the order.

It is true it is a niche player, but believe it or not, Atom will only fill a niche in the near future. It may even be an empty niche.

Intel probably didn't need to put out the Atom. Now it risks cannibalization from Nehalem at the high end and Atom at the low end. In past generations, cannibalization only took place at the high end, where the enthusiasts wait for the newer, better product. Now there is added opportunity for abandonment of product lines, as non-enthusiasts wait for the cheapest possible product.


RE: Hmm.
By flipsu5 on 8/18/2008 1:30:05 AM , Rating: 2
HP


RE: Hmm.
By Pirks on 8/13/08, Rating: -1
RE: Hmm.
By TomZ on 8/13/2008 6:15:21 PM , Rating: 3
Wake up! The E8500 consumes about 33W active and 3W idle, and probably blows the doors off your X2.


RE: Hmm.
By rudolphna on 8/13/2008 7:38:39 PM , Rating: 1
and yet it's still marketed as a 65W CPU.. hmmmm.... makes me wonder, don't ya think maybe the "45 watt" AMD CPUs consume less than the E8500? :)


RE: Hmm.
By Pirks on 8/13/08, Rating: -1
RE: Hmm.
By TomZ on 8/13/2008 8:29:10 PM , Rating: 1
You asked for an Intel chip less than 45W, and I provided one. Now you call me a fanboy. Get a life!

I don't give a rat's @ss about the AMD side, since we know that right now Intel has (a) better overall performance across the board, and (b) better performance per watt across the board. This is shown in tons of articles and benchmarks across the 'net.

When/if AMD turns the tables again in the future, I'll be back with AMD, but only a fool would buy AMD right now and claim anything other than lowest cost.


RE: Hmm.
By Pirks on 8/13/08, Rating: -1
RE: Hmm.
By Khato on 8/13/2008 9:40:46 PM , Rating: 3
http://www.anandtech.com/cpuchipsets/showdoc.aspx?...

Yes, it's about a year old, but have fun trying to convince anyone that the newest version, the 2.5 GHz 4850e, is going to win out on either performance or power compared to Intel. And if you want to continue using the TDP quoted in design spreadsheets, then why don't you include the same footnote as exists in those spreadsheets - "Thermal Design Power (TDP) should be used for processor thermal solution design targets. The TDP is not the maximum power that the processor can dissipate."

You do have a perfectly valid argument if you go for the performance per dollar metric. After all, unlike AMD, Intel actually makes a profit.


RE: Hmm.
By Pirks on 8/13/08, Rating: -1
RE: Hmm.
By maroon1 on 8/14/2008 12:22:30 PM , Rating: 3
quote:
I asked for an Intel chip rated at 45W, and you gave me one rated at 65W


It is rated at 65W, but it consumes just 33W at full load

http://xbitlabs.com/images/cpu/intel-wolfdale/pcon...

Most (if not all) of Intel's Core 2 chips consume less power than their rated TDP

quote:
Maybe AMD chip rated at 45W consumes about 25W, how about that?


Did you say "maybe"?

Please show us proof

Furthermore, the E8500 performs much better than those AMD chips that are rated at 45W. So, I believe the E8500 is an easy winner in performance per watt


RE: Hmm.
By Pirks on 8/14/08, Rating: 0
RE: Hmm.
By maroon1 on 8/14/2008 4:58:21 PM , Rating: 2
quote:
Only after you, sir.


Why ?

You come here to attack Intel because it doesn't have a 45W-rated CPU, and you make the assumption that AMD's 45W chips consume less power than Intel's 65W-rated CPUs. Yet you ask others to show you proof!

Why don't you show us any proof for any of your claims?

At least we showed you that the E8500, which is one of the fastest Wolfdale chips, consumes just 33W at load, even though it is rated at 65W

TDP is a poor measure of power consumption; it's not actual power consumption.


RE: Hmm.
By Pirks on 8/14/2008 6:26:48 PM , Rating: 1
quote:
At least we showed you that the E8500, which is one of the fastest Wolfdale chips, consumes just 33W at load, even though it is rated at 65W
Are you going to show me that a 45W-rated AMD X2 consumes more power than the E8500 ON THE SAME CHIPSET, or are you going to keep the same old blabber going?

There's no sense in continuing the discussion if you're out of arguments.

Everybody boasts that 33W number for Intel but no one can prove it is actually LESS than what a 45W AMD X2 chip draws. Why don't you stop repeating yourself if you can't add any substance/new facts/numbers to the discussion?


RE: Hmm.
By FaceMaster on 8/14/2008 8:30:32 PM , Rating: 2
AMD = 45W, Intel = 33W under load, despite being rated at 65W. Now, prove to me that the AMD 45W consumes less than 33 watts under load. Don't change the subject, just find the facts! (This will make a change for Pirks...)


RE: Hmm.
By Pirks on 8/14/2008 9:15:56 PM , Rating: 1
If you wanna prove that AMD consumes more power despite the lower rating, the burden of proof is on you. I have the lower power rating on my side, hence if Intel typically consumes 50% of its TDP rating, the same must be true for AMD, meaning that the AMD X2 must be consuming 25W, which is LESS than the Intel CPU's consumption.
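For reference, the arithmetic behind this claim looks like the sketch below. The load factor is derived from the E8500 numbers quoted earlier in the thread; whether that factor carries over to AMD's TDP methodology is exactly what other replies dispute, so treat it as the commenter's assumption.

```python
# Sketch of the proportional-TDP argument above. The load factor comes from
# the E8500 figures quoted in this thread (33 W measured vs. 65 W TDP);
# whether that factor transfers to AMD's TDP rating is an assumption.

intel_tdp_w, intel_measured_w = 65.0, 33.0
load_factor = intel_measured_w / intel_tdp_w      # ~0.51

amd_tdp_w = 45.0                                  # 45 W Athlon X2 rating
amd_implied_w = amd_tdp_w * load_factor           # ~23 W, if the factor transfers

print(f"Assumed load factor:          {load_factor:.2f}")
print(f"Implied 45 W X2 draw (if so): {amd_implied_w:.1f} W")
```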


RE: Hmm.
By Lightnix on 8/15/2008 12:33:39 AM , Rating: 2
TDP ratings mean very little, as the way companies gauge them is not done by a set standard but by their own proprietary methods, which may not match another manufacturer's methods, or even their own previous ones. Trying to prove that a CPU is more energy efficient based on TDP rating, which by the way is not a measure of power consumption but a guide toward roughly how much heat a CPU cooler will need to deal with, is futile. I mean, for one thing, AMD and Intel don't use the same microfabrication techniques - one might be more efficient and not put out so much waste heat (which, remember, is what the TDP rating is about).

I turn your attention once again to the xbitlabs.com article,

http://www.xbitlabs.com/articles/cpu/display/intel...

The E8200 processor not only consumes WAY under its TDP, but also outperforms the Athlon X2 6400+ by a fair margin in most tests. It might not be hard, concrete proof, but it's fairly rational reasoning. The AMD CPU pretty much cannot draw a significantly smaller amount of power than the Intel CPU, as the E8200 isn't really drawing a significant amount of power itself (in a desktop rig anyway).

Even if the 4850e only, theoretically speaking, drew 15W, that's only 12W less than the E8200, which in reality is a tiny fraction of a system's overall power consumption, and wouldn't become noticeable unless you were running off a battery or something.
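To put the "tiny fraction" point in numbers, here is a minimal sketch. The CPU figures follow from the numbers used above; the whole-system draw is a hypothetical assumption.

```python
# How much a 12 W CPU difference matters at the system level. The 27 W and
# 15 W CPU figures follow from the numbers used above (a hypothetical 15 W
# 4850e, 12 W less than the E8200); the 250 W system draw is an assumption.

e8200_w = 27.0
x2_4850e_w = 15.0
assumed_system_w = 250.0   # hypothetical whole-system draw under load

delta_w = e8200_w - x2_4850e_w
print(f"CPU difference: {delta_w:.0f} W "
      f"(~{delta_w / assumed_system_w:.0%} of total system draw)")
```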


RE: Hmm.
By Pirks on 8/15/2008 4:27:49 AM , Rating: 1
quote:
outperforms the Athlon X2 6400+
I don't care about a 125W AMD chip, we're discussing 45W chips here. Well, if there's no proof then there's no proof, and I suggest everyone should keep their own beliefs until said proof eventually pops up on the Net. I mean some detailed analysis written by Anand himself that really makes the picture clear with regard to 45W AMD chips. Or something similar.


RE: Hmm.
By FaceMaster on 8/15/2008 5:47:36 AM , Rating: 2
quote:
if Intel typically consumes 50% of its TDP rating - same must be true for AMD


You have no logic. Prove that AMD consumes less than its TDP. Please.


RE: Hmm.
By Pirks on 8/15/2008 2:12:46 PM , Rating: 2
Prove that AMD consumes not less than its TDP, please.


RE: Hmm.
By FaceMaster on 8/15/2008 8:01:01 PM , Rating: 2
AMD says its TDP is 45 watts - that's the point of a TDP. It's up to you to help AMD out of this one, fanboy!


RE: Hmm.
By Pirks on 8/15/2008 10:28:51 PM , Rating: 2
It's up to you to prove that 65W Intel CPUs consume less power than 45W AMD CPUs, fanboy!


RE: Hmm.
By FaceMaster on 8/16/2008 6:17:43 AM , Rating: 2
quote:
Wake up! The E8500 consumes about 33W active and 3W idle, and probably blows the doors off your X2.


There. 33W is lower than 45W. I assume you're going to keep ignoring this evidence because you can't come up with any of your own, because you're a blatant fanboy. I am going to terminate this argument as there is no way you are being serious. Anybody with at least a bit of sense would have looked at the facts and realised that Intel is superior to AMD at the moment. I am not going to reply any more because you are childish and don't understand obvious facts. Have a nice day and feel free to buy AMD processors, I want them to continue... but I'm going to stick with Intel for now, which seems superior in every way. *Expects 'hahahahahahha you lose and are running away!' reply, proving that Pirks is just here to flame*


RE: Hmm.
By Pirks on 8/16/2008 1:02:54 PM , Rating: 2
I assume you're going to keep ignoring the fact that no one has posted 45W AMD CPU consumption numbers on the same chipset, because you're a blatant Intel fanboy


RE: Hmm.
By mindless1 on 8/13/2008 8:51:20 PM , Rating: 5
Actually it's been well known for quite a while that at any given performance level, Intel CPUs use less power. It's been that way for quite a while now; actually it was always nearly equal or less, with the exception of higher-clocked P4s and early P3s.

Getting a Core2Duo to run at 45W or less is easy as pie, and for desktop use it'll still outperform. It's not that I'm an Intel fan, actually I root for AMD since they're the underdog that's needed to keep CPU market prices sane and competition moving along to faster models sooner.


RE: Hmm.
By Pirks on 8/13/08, Rating: -1
RE: Hmm.
By TheSpaniard on 8/14/2008 3:02:46 AM , Rating: 2
why do you keep saying maybe?

listen, the Intel ones can be UNDERCLOCKED to match the AMD's performance level and will be using less power. if you underclock the AMD's chip we might as well start comparing it to an overclocked ATOM*

*Disclaimer: I do not know if any amount of overclocking will make an ATOM perform on par with the previously mentioned AMD processor


RE: Hmm.
By Pirks on 8/14/08, Rating: 0
RE: Hmm.
By mindless1 on 8/14/2008 2:20:00 PM , Rating: 2
I don't have a burden to prove something you can't be bothered to research on your own, what do I really care if you have the Intel alternative when you seem perfectly happy with what you already have instead?

Yes, some Intel processors are more expensive than your AMD CPU would cost today (I have no idea when you bought it so that makes it fairly hard to do price comparisons but we were talking power and performance not price).

The funny thing is, I was talking about power and performance without even considering C1E or EIST. Without those, giving AMD an advantage by keeping its native power reduction features, it might actually come out ahead if a system were left sitting idle, but we were talking performance and in that regard it's no contest.

Did you bother to look at benchmarks then compare Intel's spec sheets? They don't exactly hide this info. THAT is a lot of numbers so go ahead and look it up since you seem to want to argue with a 3rd party instead of learning it for yourself.


RE: Hmm.
By Pirks on 8/14/2008 3:50:46 PM , Rating: 2
quote:
Did you bother to look at benchmarks then compare Intel's spec sheets?
Have you seen benchmarks comparing latest 45W AMD X2s with 65W Intel CPUs _ON THE SAME CHIPSET_?


RE: Hmm.
By TheWay64 on 8/14/2008 7:05:34 PM , Rating: 2
quote:
Because Intel fanboys.


Funny

You accuse others of being an Intel fanboy, yet all your posts clearly show that you are an AMD fanboy !

quote:
can't prove their beliefs with numbers


So, what ?

Maybe because there is no review that compares a 45nm Core 2 Duo directly to a 45W AMD processor!

It seems that you can't prove that the 45W AMD consumes less power either.

Furthermore, you are the one who started making those claims, not us


RE: Hmm.
By Pirks on 8/14/2008 7:46:18 PM , Rating: 2
Yeah, I said that 45W AMD CPUs are rated less than 65W Intel CPUs, hence they must be consuming less. Lower rating ==>> lower power consumption, got it? Intel fanboys here searched across the whole Internet and still could not prove the opposite. I guess we can close the discussion now.

Intel fanboys, come back with numbers, otherwise don't bother to post your beliefs and assumptions here, everybody knows them already.


RE: Hmm.
By Oregonian2 on 8/14/2008 8:35:42 PM , Rating: 2
Less if they're being rated using the same measurement methods and assumptions (which I know their standard methods aren't). Difference could be even bigger or it could not exist at all. Are the 65 and 45 numbers you mention measured the same way with the same assumptions?


RE: Hmm.
By Pirks on 8/15/2008 9:55:00 PM , Rating: 2
It's hard to believe Intel or AMD would severely overestimate the power requirements of their chips. Intel would never rate their chips 65W if in reality they would never cross, for example, the 35W mark. The same is true for AMD, hence there's usually a simple logic behind CPU power ratings. A power rating says: this CPU may consume UP TO XXX WATTS, meaning that usually it'll consume less. So it's totally logical to assume that a lower maximum power rating also means lower typical power consumption. So far I have seen absolutely NO proof of the opposite. There are only a few Intel fanatics spilling their usual BS around here, but no concrete hard facts/numbers proving my common sense logic wrong.


RE: Hmm.
By mindless1 on 8/15/2008 4:12:26 AM , Rating: 2
I already told you how to come up with numbers: hard engineering data on spec sheets. Oops, even then we'd have to ignore the higher performance, so basically you have to do a little bit of math to see that if the Intel CPU is underclocked enough to be performing equally, then that former 65W CPU obviously isn't 65W anymore.

I think that's where you went wrong, not understanding that once you degrade the performance of an Intel CPU enough to seem equal, you have also decreased its power consumption quite a lot.


RE: Hmm.
By Pirks on 8/15/2008 10:32:15 PM , Rating: 2
quote:
once you degrade the performance of an Intel CPU enough to seem equal, you have also decreased its power consumption quite a lot
I already said that I'm not discussing underclocking, as no one in their sane mind would do that.


RE: Hmm.
By mindless1 on 8/17/2008 12:16:13 AM , Rating: 2
No one in their sane mind would look only at power/heat without considering performance, and the fact that you can have the same or lower power/heat with higher performance.

It's like you picked the losing horse in every respect except price. Cheap and low power is good, but there's not a large price difference for Intel's lower end at this point either. It's really easy to have what you claim is important, except you've defined it so narrowly as to make it pretty much non-applicable to anyone's needs.


RE: Hmm.
By TheWay64 on 8/15/2008 12:32:17 PM , Rating: 2
quote:
Yeah, I said that 45W AMD CPUs are rated less than 65W Intel CPUs, hence they must be consuming less. Lower rating ==>> lower power consumption, got it?


No, I didn't get it, because

You are ignoring the fact that the TDP is not the actual power consumption

Some processors consume much less power than their rated TDP (like the 45nm Core 2 Duo)

Some processors consume the same or close to their rated TDP


RE: Hmm.
By Pirks on 8/15/2008 10:42:46 PM , Rating: 2
quote:
Some processors consume much less power than their rated TDP (like the 45nm Core 2 Duo). Some processors consume the same or close to their rated TDP
But you don't know what the numbers are for Intel 65W CPUs and AMD 45W CPUs on the same chipset, so what's your point then?


RE: Hmm.
By William Gaatjes on 8/14/2008 1:51:50 AM , Rating: 2
When Nehalem comes, it will be a more apples-to-apples comparison.

The IMC in AMD's chips needs power too. In reality, part of the power consumption of the Intel northbridge needs to be added to the power an Intel CPU uses to make a fair comparison.
But there is no way an AMD CPU on 65nm can match Intel on 45nm. Now, hypothetically speaking, let's say we could have a Phenom on Intel's 45nm process. Then the Penryn would not be that special anymore.

Intel has a good design but they have an even better process. Intel's biggest problem was leakage currents, and they solved that problem with the 45nm hafnium gate process.

That's why idle power is so incredibly low now on Intel CPUs. But when put under full load, a 40-50W increase is not that strange.


RE: Hmm.
By omnicronx on 8/14/2008 10:27:26 AM , Rating: 2
quote:
In reality part of the power consumption of the Intel northbridge needs to be added to the power an Intel cpu uses to make some comparison.
People always forget this, but it must be taken into consideration as AMD chips have the onboard memory controller. You are exactly right on the Nehalem front too, as both sides will be using an onboard controller.


RE: Hmm.
By Pirks on 8/15/2008 2:33:36 PM , Rating: 2
Also, the local Intel fanboys always forget that it's the platform that matters, not just the CPU. When you take a 45W AMD X2 and a 780G together as a platform, how much power is going to be consumed by a competing Intel platform with the same level of CPU and GPU performance? I won't be surprised if Intel loses on both the power and price points. That's why Intel fanboys are so vitriolic about my posts here and downrate me furiously whenever they can. They just can't stand that Intel turned out to be pretty sucky as a platform, so what can they do besides downrating me and repeating the same unproven assumptions like "AMD chips always consume as much as their TDP rating says"? There you go, a concise and simple explanation of what is going on in this thread.


RE: Hmm.
By boogle on 8/14/2008 4:30:55 AM , Rating: 2
http://www.xbitlabs.com/articles/cpu/display/intel...

Intel uses less power.

Basically Intel gives an entire family a single TDP for the simple reason that OEMs are likely to offer one computer with the entire range of CPUs in that family at different price points. So the fastest CPU in the range dictates the TDP for that series - not the slowest.

TDP is a poor measure of power consumption; it's not actual power consumption and never will be. Especially since AMD recently changed the definition of TDP with Phenom to be 'typical power consumption' rather than 'absolute max heat dissipation', so some of AMD's official figures are actually below the real power consumption of the CPU.

Either way - with most PCs you need to compare idle power consumption, since the CPU will be idle most of the time. You also need to look at performance - if one CPU uses 100W for 10 seconds, is that not better than a CPU that uses 10W for 2 minutes? Especially if the idle power consumption of the two is comparable. Incidentally, Intel CPUs use less power at idle than AMD CPUs, so, yeah, umm, if the PC is doing very little work and is on for long periods of time, Intel CPUs are massively more efficient.
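The race-to-idle point in that last question is easy to make concrete; a minimal sketch using the comment's own hypothetical numbers (idle power is assumed comparable for both chips, as stated, so it cancels out and is ignored):

```python
# Energy for a task is power x time; a faster, higher-power chip can still
# use less total energy. These are the hypothetical numbers from the comment
# above; comparable idle power is assumed to cancel out and is ignored.

fast_chip_j = 100.0 * 10.0     # 100 W for 10 seconds  -> 1000 J
slow_chip_j = 10.0 * 120.0     # 10 W for 2 minutes    -> 1200 J

print(f"Fast, high-power chip: {fast_chip_j:.0f} J")
print(f"Slow, low-power chip:  {slow_chip_j:.0f} J")
```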


RE: Hmm.
By Pirks on 8/14/08, Rating: 0
RE: Hmm.
By JoshuaBuss on 8/14/2008 1:06:22 PM , Rating: 2
if all you're concerned with is low power, why aren't you running an atom?

you're not listening to anyone here pirks.. yes, you might be using less power, but you're getting less performance for the power you're using than if you'd switch to intel.


RE: Hmm.
By Pirks on 8/14/2008 1:32:00 PM , Rating: 2
quote:
why aren't you running an atom?
Why would anyone go for Atom on a desktop? Atom is a purely mobile CPU.
quote:
you're getting less performance for the power you're using than if you'd switch to intel
Why has no one shown me any proof of that, any decent numbers comparing power consumption of 45W AMD X2s with Intel CPUs _ON THE SAME CHIPSET_? Why only words of blind fanboyish belief and not a single scientific argument? Smells fishy, don't you think?


RE: Hmm.
By boogle on 8/15/2008 5:10:05 AM , Rating: 2
I did have a nice post ready, but having seen the numerous other posts it seems you're really enjoying yourself arguing with all these people. So I'll refrain from feeding your addiction any longer.

I did find a photo of you though: http://www.caveyourtrolls.com/img3.jpg

Uncanny!


RE: Hmm.
By Pirks on 8/15/2008 8:45:04 PM , Rating: 2
That's what fanboys start posting when they are out of arguments


RE: Hmm.
By maroon1 on 8/14/2008 12:13:11 PM , Rating: 2
quote:
[enjoying total silence of my new 45 watt Athlon X2] Wake me up when Intel introduces 45 watt desktop CPUs :P


My friend, most of the Intel Core 2 processors consume much less power than their TDP

The E8500 consumes only 33W at load

http://xbitlabs.com/images/cpu/intel-wolfdale/pcon...


RE: Hmm.
By Pirks on 8/14/2008 4:05:03 PM , Rating: 2
My friend, most of the AMD processors consume much less power than their TDP too :P How about an AMD X2 consuming just 25W at load, huh?


RE: Hmm.
By maroon1 on 8/14/2008 4:41:04 PM , Rating: 2
Talking is easy

Please show us proof that supports your claim.


RE: Hmm.
By Pirks on 8/14/2008 6:36:06 PM , Rating: 2
So you gave up 'cause you can't find any proof that E8500 consumes less power than my 45W rated AMD X2? Or are you just going to keep endlessly repeating that "33W" mantra? That won't help 'cause this mantra is worthless without 45W AMD X2 consumption numbers on the same chipset. Good luck repeating yourself ;-)


RE: Hmm.
By StevoLincolnite on 8/13/2008 6:05:10 PM , Rating: 2
I personally thought low-end was reserved for the Pentium 2xxx series and Celerons were for the Ultra Low-end and not the Core 2 Series.

(Unless you meant the Pentium and Celeron series as part of the Core 2 series because of the similar architecture?)

Personally though, I find the low end to be a better deal on AMD's side of the fence with the Athlon 64 X2, when you do not factor in the overclocking potential of the Pentium 2xxx series.


RE: Hmm.
By Master Kenobi (blog) on 8/13/2008 8:26:11 PM , Rating: 2
Yes, the current Pentium and Celerons are nothing more than Core 2's that were cannibalized in various ways to make them "low end chips".


RE: Hmm.
By FaceMaster on 8/13/2008 8:14:39 PM , Rating: 2
Pirks are you insane? It's almost as if you have something against Intel. Not that this is a recurring theme in your posts or anything. Although I love competition, well... I love competition and at the moment AMD may be getting stronger (than they were say, a year ago) but they're still not capable of taking on Intel.

Intel chips overclock better as well. Oh wait, overclocking is pointless because nobody does it.


RE: Hmm.
By Pirks on 8/13/08, Rating: 0
RE: Hmm.
By FaceMaster on 8/13/2008 9:52:54 PM , Rating: 3
quote:
just one or two additional FPS in a high-end game like Crysis


Haha wouldn't that be like, a 100% increase?

Okay seriously now

http://www.overclockersclub.com/reviews/intel_q930...

9 fps boost on Crysis. Higher resolutions rely on GPU power more in a game like Crysis. Observe the Phenom 9600's scores before you say that CPUs don't matter, mind. I know you'll go 'YEAH ITS FASTER THAN ALL OF THEM' ... yeah, by 1 fps in one of the benchmarks, in the others it SERIOUSLY lags behind, particularly on lower resolutions (Which I assume is what 'non enthusiasts' would run it at)

http://www.overclockersclub.com/reviews/intel_q930...

Yum more tasty overclocking scores

http://www.overclockersclub.com/reviews/intel_q930...

I rest my case.

Next?


RE: Hmm.
By Pirks on 8/13/2008 10:17:16 PM , Rating: 2
quote:
9 fps boost on Crysis
1 fps with the resolution I play (1920x1200). Thanks for proving my words with your diagrams. Next?


RE: Hmm.
By EricMartello on 8/14/2008 12:38:25 AM , Rating: 3
quote:
1 fps with the resolution I play (1920x1200). Thanks for proving my words with your diagrams. Next?


In the tests listed on that website at that resolution, the FPS are being limited by the video card (Nvidia 8800GT) and not the CPU...which is why the variance is minimal. It's not a relevant comparison of CPU power...nor does it support your claim that "overclocking doesn't matter anymore". If, in fact, you were to run 8800GT in SLI, you would once again see larger discrepancies in the FPS results, due to the CPU's performance.

quote:
wow [looking again at your diagrams] I had no idea phenoms smash, piss and spit on intel chips at higher resolutions, thanks a LOT for your diagrams AGAIN! [sending them quickly to a few Intel zealots among friends of mine] they will definiely enjoy this, hehehe ;)


If only that were the case...let me break it down for you:

Your AMD X2 is to a low-end Intel C2D as a 1987 Hyundai SCoupe is to a Nissan Maxima.

One is cheaper and gets better fuel economy, but the other one gives you a lot more for the money in terms of performance, efficiency and actual usability - be it for work or play. :)


RE: Hmm.
By Pirks on 8/14/2008 10:51:38 AM , Rating: 1
quote:
the FPS are being limited by the video card (Nvidia 8800GT) and not the CPU
If you buy a pair of 4870X2s and your CPU becomes more overclockable, then you definitely have money to swap your CPU for a faster one, hence your argument is moot - no overclocking is necessary. Who buys $1000 of video hardware and a meager budget CPU to match? Only people from your dreams :)
quote:
Your AMD X2 is to a low-end Intel C2D as a 1987 Hyundai SCoupe is to a Nissan Maxima
Besides Intel fanboys, who needs a "Maxima" when it's smashed by a cheaper AMD machine at higher gaming resolutions?


RE: Hmm.
By EricMartello on 8/14/2008 3:52:15 PM , Rating: 2
quote:
If you buy a pair of 4870X2s and your CPU becomes more overclockable then you definitely have money to swap your CPU for faster one, hence your argument is moot - no overclocking is necessary. Who buys $1000 of video hardware and meager budget CPU to match? Only people from your dreams :)


Some people buy a cheaper CPU and overclock it to match the performance of a more expensive one...or you can also get a CPU like the Q6600 which is not really a "budget" CPU, but overclocks tremendously...lots of people are able to run it at 2.8-3.0 GHz with air cooling and minimal voltage increase.

Two 8800GTs might cost you $600-$700 tops...so for well under $1,500 total you could have a powerful gaming system that excels in pretty much all categories. Even if you're trying to make a power-efficient system, Intel still gives you more computing power per watt than AMD.

quote:

Besides Intel fanboys, who needs "Maxima" when it's smashed by a cheaper AMD machine at higer gaming resolutions?


Can you show me a link to some objective tests where the AMD system is "smashing" an Intel system at higher resolutions? I'm sure you would have provided a link already, if what you stated was in fact true.


RE: Hmm.
By Pirks on 8/14/2008 4:22:11 PM , Rating: 2
quote:
Some people buy a cheaper CPU and overclock it to match the performance of a more expensive one
When you play games in a high resolution, you don't care about CPU, you care only about GPU, so overclocking CPU has a negligible gain, why bother?
quote:
Two 8800GTs might cost you $600-$700
Why buy two 8800GT when you can buy faster 4870 X2 for less?
quote:
Even if you're trying to make a power-efficient system, Intel still gives you more computing power per watt than AMD
Nobody gave me a proof of that with regard to 45W AMD X2s, it's just an urban legend spread around by Intel marketing and happily consumed by numerous Intel fanboys.
quote:
Can you show me a link to some objective tests where the AMD system is "smashing" an Intel system at higher resolutions?
http://www.overclockersclub.com/reviews/intel_q930... <- look at 1920x1200 resolution

http://www.overclockersclub.com/reviews/intel_q930... <- look at 1680x1050 resolution

Enjoying these graphs now, dontcha, fanboy? ;-)


RE: Hmm.
By EricMartello on 8/14/2008 4:50:15 PM , Rating: 2
quote:
When you play games in a high resolution, you don't care about CPU, you care only about GPU, so overclocking CPU has a negligible gain, why bother?


Nah. CPU and GPU always matter, it's just that at lower resolutions the CPU is the bottleneck, and as resolutions increase the GPU becomes the bottleneck.

quote:
Why buy two 8800GT when you can buy faster 4870 X2 for less?


Oooh, let's ride your logic train. Why buy a 3D video card at all? You can get onboard video for even less!

quote:
Nobody gave me a proof of that with regard to 45W AMD X2s, it's just an urban legend spread around by Intel marketing and happily consumed by numerous Intel fanboys.


A simple google search for "amd x2 vs intel c2d" will yield pretty much all the proof you need from a variety of sources. An Intel C2D processor will absolutely mop the floor with your X2. Most benchmarks show an average of 40-50% more processing power from the C2D vs an X2, at the cost of using 30% more electricity (20 watts). Let's keep in mind that many CPUs in AMD's line-up are also using much more than 45W.

quote:

look at 1920x1200 resolution
look at 1680x1050 resolution

Enjoying these graphs now, dontcha, fanboy? ;-)


Um, you seem to forget that at those resolutions, on the systems in that test, the GPU is the limiting factor. I can attribute the minuscule FPS improvement of the Phenom processor to its HyperTransport - the on-CPU memory controller will give it a slight edge here, and that on-CPU memory controller has always been a good feature of AMD's architecture.

However, your links do fail to show that AMD has a CPU that can "smash" an Intel CPU, and this is the second time the reason is being explained to you. BTW- the Phenom 9600BE in that test uses 95W and not 45W.

Lastly, YOUR CPU is not a Phenom...so even if the Phenom was faster than its Intel counterparts, it would be irrelevant because your budget box wouldn't even ping the charts at all.
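Taking the 40-50% performance and 30% power figures quoted above at face value (they are the commenter's ballpark numbers, not measured benchmark data), the performance-per-watt ratio works out roughly as follows:

```python
# Rough performance-per-watt comparison using the ballpark figures from the
# comment above (C2D ~45% more processing power at ~30% more power draw).
# These ratios are assumptions, not measured benchmark results.

perf_ratio = 1.45    # C2D performance relative to the X2
power_ratio = 1.30   # C2D power draw relative to the X2

perf_per_watt_ratio = perf_ratio / power_ratio
print(f"C2D perf/W relative to X2: {perf_per_watt_ratio:.2f}x")   # ~1.12x
```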


RE: Hmm.
By Pirks on 8/14/2008 6:50:18 PM , Rating: 2
quote:
as resolutions increase the GPU becomes the bottleneck
Yeah, and since I game on high resolutions I don't care about CPU overclock. With lower resolution even the cheap 2.5 GHz 45W AMD X2 gives you enough horsepower to game comfortably, hence no need to overclock in either case.
quote:
You can get onboard video for even less
But will it be faster? Nope.

You failed at reading comprehension 'cause you missed the word "faster" in my previous post. Try to read carefully next time.
quote:
A simple google search for "amd x2 vs intel c2d" will yield pretty much all the proof you need
If that were the case I'd be drowning in a sea of links that local Intel fanboy crowd would pour all over me. Why is this not happening?
quote:
YOUR CPU is not a Phenom
So what? I can stick Phenom in my AM2+ mobo any time I want, IF I WANT TO. The problem is I still can't find any application for which I will truly need fast quad-core. If I wanna get a CPU that smashes Intel quads on higher resolution gaming - I can get it anytime, while Intel fanboys will continue sucking their fingers and gawking desperately at those diagrams I showed you above, with Phenom pissing on Intel quad with high resolution games.


RE: Hmm.
By EricMartello on 8/14/2008 9:41:12 PM , Rating: 2
quote:
Yeah, and since I game on high resolutions I don't care about CPU overclock. With lower resolution even the cheap 2.5 GHz 45W AMD X2 gives you enough horsepower to game comfortably, hence no need to overclock in either case.


There is a bit of irony in your claim that you "game at 1920x1200" yet cheaped out and put together a budget system...but simply because you don't want to overclock does not a) make it pointless or b) make CPU power irrelevant. Those Phenom benchmarks you keep linking to are not indicative of YOUR system's performance. Your CPU is probably a bottleneck at 1280x764.

quote:
But will it be faster? Nope.

You failed at reading comprehension 'cause you missed word "faster" in my previous post. Try to read carefully next time.


Your response was pointless. A faster video card is a waste of money if you're using a slow CPU.

quote:
If that were the case I'd be drowning in a sea of links that local Intel fanboy crowd would pour all over me. Why is this not happening?


You're already drowning in a pool of your own stupidity...we're just trying to throw you a line and help you out.

quote:
So what? I can stick Phenom in my AM2+ mobo any time I want, IF I WANT TO. The problem is I still can't find any application for which I will truly need fast quad-core. If I wanna get a CPU that smashes Intel quads on higher resolution gaming - I can get it anytime, while Intel fanboys will continue sucking their fingers and gawking desperately at those diagrams I showed you above, with Phenom pissing on Intel quad with high resolution games.


You mean if you could AFFORD to...and I still haven't seen this mythical "smashing" of Intel by AMD. All I see is a guy who can't afford fast PC components trying to pretend like he can play games at full resolution. :) hahaha


RE: Hmm.
By Pirks on 8/15/2008 9:29:07 PM , Rating: 2
quote:
Your CPU is probably a bottleneck at 1280x764
You obviously never played Crysis on high resolution, that's why you sound so funny to me :)
quote:
A faster video card is a waste of money if you're using a slow CPU
Again, try Crysis at high resolution before making me laugh yet another time
quote:
we're just trying to throw you a line
So where is it? Why am I not seeing any relevant links and numbers? Sorry, but I'm drowning in fanboyish bullsh1t poured all over me by the local Intel worshipers.
quote:
I still haven't seen this mythical "smashing" of Intel by AMD
Because you're an Intel fanatic and refuse to look at the diagrams I showed you above. Whatever.


RE: Hmm.
By Pirks on 8/13/08, Rating: -1
RE: Hmm.
By FaceMaster on 8/14/2008 5:15:13 AM , Rating: 2
Pirks just shut up for once.


RE: Hmm.
By FaceMaster on 8/14/2008 10:17:27 AM , Rating: 2
Thank you.


RE: Hmm.
By Pirks on 8/14/2008 10:39:38 AM , Rating: 2
Why are you talking to yourself?


RE: Hmm.
By FaceMaster on 8/14/2008 8:27:20 PM , Rating: 2
quote:
Why are you talking to yourself?


Proof that reason falls on deaf ears with Pirks about.


RE: Hmm.
By Pirks on 8/15/2008 3:46:54 PM , Rating: 2
How can your talking to yourself prove anything?


RE: Hmm.
By masher2 (blog) on 8/14/2008 1:19:59 AM , Rating: 2
> "when you overclock your CPU you get just one or two additional FPS in a high-end game...Hence for any PRACTICAL purpose...overclocking is dead"

Believe it or not, some people do use computers for more than playing games. :p


RE: Hmm.
By Pirks on 8/14/2008 10:58:18 AM , Rating: 2
Seen any overclocked HPC clusters lately, masher? ;-)


RE: Hmm.
By Chocobollz on 8/14/2008 1:17:56 PM , Rating: 2
quote:
Believe it or not, some people do use computers for more than playing games. :p


I'm sure you mean like.. uhm.. watching pr0n? xD Well just say it already, no need to be shy, actually I'm watching them often too LOL.


RE: Hmm.
By EricMartello on 8/14/2008 12:28:30 AM , Rating: 5
quote:
Wow masher, your words soothe my AMD loving soul, really :-) I expect AMD gaining more and more as people pay less and less for overpriced and totally unnecessary Intel chips.

[typing this from my freshly assembled silent and cheap 45-Watt AMD X2 + 780G microATX box where Vista flies like a rocket]

Intel's overpriced high-end CPUs being in trouble is a very very good sign indeed!


http://en.wikipedia.org/wiki/CPU_power_dissipation...

Pirks, if power usage is such a concern to you, you could have purchased any of the C2D T-Series which are 35W or less and at least 50% faster than your X2...of course, fast and efficient processors are unnecessary according to you...so it is better to buy a slower CPU that uses more power because you "love the brand". That's just being smart! :)


RE: Hmm.
By Pirks on 8/14/2008 11:11:30 AM , Rating: 1
Why do you Intel fanboys like to compare Intel mobile CPUs with AMD desktop CPUs? Out of arguments, huh?


RE: Hmm.
By EricMartello on 8/14/2008 4:03:12 PM , Rating: 2
quote:
Why do you Intel fanboys like to compare Intel mobile CPUs with AMD desktop CPUs? Out of arguments, huh?


You have yet to post a valid argument. To summarize you:

"I love AMD because they currently suck. I'm too poor to afford and Intel system so I wasted my money on an X2 which is based on 4 year old technology. The only good thing I can say about my X2 is that it's rated at 45W. Anyone who buys an Intel CPU is a 'fanboy', not a person who is concerned with higher performance or efficiency."

Funny thing is, there are instances where I will recommend and use AMD CPUs for clients - those are for people who do not need high performance AND are on a budget. AMD is the cheapest option, but in terms of performance per dollar, Intel has it beat even though an Intel system may cost a bit more.


RE: Hmm.
By Pirks on 8/14/2008 6:17:44 PM , Rating: 2
quote:
You have yet to post a valid argument
You have yet to post a valid counterargument. Come back when you have numbers and benchmarks showing desktop Intel vs. desktop 45W AMD X2 power consumption ON THE SAME CHIPSET. And you can keep all your other fanboyish blabber to yourself, I'm not interested.


RE: Hmm.
By EricMartello on 8/14/2008 9:44:58 PM , Rating: 2
quote:
You have yet to post a valid counterargument. Come back when you have numbers and benchmarks showing desktop Intel vs. desktop 45W AMD X2 power consumption ON THE SAME CHIPSET. And you can keep all your other fanboyish blabber to yourself, I'm not interested.


Facts are not disputable, hence I have no need to "argue" with anything you said...it makes me wonder why you are still here spamming the comments.


RE: Hmm.
By Pirks on 8/15/2008 10:35:42 PM , Rating: 2
Yep, facts that AMD CPUs are rated lower than Intel CPUs are not disputable, hence AMD must be consuming less power than Intel, unless you find a proof of the opposite. So you're absolutely right - it's indeed time for you to stop spamming the comments, 'cause you can't add anything valuable to the discussion here.


RE: Hmm.
By ineedaname on 8/14/2008 11:28:42 AM , Rating: 2
AMD already announced that it would not make an Atom competitor. IMO they just shot themselves in the foot.


RE: Hmm.
By Pirks on 8/14/2008 12:46:14 PM , Rating: 2
No, it's you who shot yourself in the mouth ;) Read http://aving.net/usa/news/default.asp?mode=read&c_... and then answer the following question - why the heck does AMD need an Atom competitor WHEN THEY HAVE ONE ALREADY???


RE: Hmm.
By TomZ on 8/14/2008 1:49:04 PM , Rating: 2
Only you would see equivalence between the 2-4W Atom and the 30-35W Turion.

Turion can't even compete in its own market space (laptops) - and you think it will have even a hope in competing in the MID space, for which the Atom was specifically designed? LOL.


RE: Hmm.
By Pirks on 8/14/2008 2:02:10 PM , Rating: 2
LOL indeed. Just look at the double standards of this Intel fanboy TomZ. He says that Intel chips USUALLY consume much less than they are rated for, but then immediately forgets that the same is true for AMD chips too, hence making TomZ's claim of 35W for the 1.2 GHz Turion really worth a good LOL :)


RE: Hmm.
By TomZ on 8/14/2008 2:58:53 PM , Rating: 2
So are you saying that Turion's power consumption is anywhere near the Atom's? That is the implication of your statement.

And for the record, the power figures I put forward for Atom and Turion are both max/TDP. I.e., apples-to-apples. Actual power for both processors will be lower, but under no circumstances would I expect Turion to come anywhere near Atom's power.

It's funny you call me the fanboy, when it's actually clear to everybody reading this thread who the irrational supporter really is.


RE: Hmm.
By Pirks on 8/14/2008 3:44:40 PM , Rating: 2
quote:
Turion's power consumption is anywhere near the Atom's?
Why would anyone start selling netbooks with Turion inside if that was not the case?
quote:
under no circumstances would I expect Turion to come anywhere near Atom's power
The netbook manufacturer that made Turion based netbook doesn't seem to agree with you, so you can keep your beliefs to yourself, I'm not interested. I only care about REAL products like this netbook, and not about unproved assumptions and opinions.


RE: Hmm.
By Oregonian2 on 8/14/2008 8:39:48 PM , Rating: 2
Does AMD get 2,500 Turions per wafer to compete with the Atom in die costs?


RE: Hmm.
By Pirks on 8/15/2008 10:13:03 PM , Rating: 2
Are you serious about comparing die costs for a dual-core, out-of-order CPU and a single-core, in-order CPU?


RE: Hmm.
By TomZ on 8/14/2008 9:02:45 PM , Rating: 2
quote:
The netbook manufacturer that made Turion based netbook doesn't seem to agree with you, so you can keep your beliefs to yourself, I'm not interested. I only care about REAL products like this netbook, and not about unproved assumptions and opinions.

Oh yes, let's all watch for this new product from "Raon Digital" - they're going to set the world on fire with this product.

Actually, I think the biggest market will come from AMD fanboys like you who will continue to buy AMD even though the performance is less and the power consumption is higher. You're their perfect customer!

In reality, companies like Asus, etc. using Atom will wipe the floor with "Raon Digital."


RE: Hmm.
By Pirks on 8/15/2008 9:40:07 PM , Rating: 2
Blah blah blah. Intel fanboys screeching with envy, looking at how AMD doesn't even need any Atom competitors right now 'cause it already has one and is about to start selling netbooks with it. All while Intel fanatics like you keep hissing and spitting, trying to convince anyone stupid enough that the single-core Atom can withstand any sort of competition with Turion. There are a lot of fanatical idiots who are gonna believe that obvious BS around here... so you're gonna get a big cheerful crowd of listeners, Tom ;-)


RE: Hmm.
By crystal clear on 8/14/2008 8:43:31 AM , Rating: 2
quote:
The high-power, high-profit industry is dying a slow death, and that does worry many players.


Yes! If it applies to AMD, that is, but not to Intel, as Intel ensures its high profit margins stay intact by switching over from 65nm to 45nm and later to 32nm.

There is NO evidence of the slow death you refer to.


RE: Hmm.
By Flunk on 8/13/2008 4:16:25 PM , Rating: 2
Intel is not positioning the Atom as a notebook CPU, but as one for embedded devices. Intel is very worried about cannibalizing Celeron sales, but what they really want the Atom to do is take the place of the ARM, MIPS and PowerPC based processors used in the embedded market.


RE: Hmm.
By Pirks on 8/13/2008 4:23:21 PM , Rating: 2
quote:
Intel is not positioning
Who cares about Intel marketing suits "positioning" something? Definitely not the Asus, MSI and many many other netbook manufacturers :P


RE: Hmm.
By Flunk on 8/13/2008 4:28:30 PM , Rating: 2
Intel cares, that was my point. It explains how the chip can be selling so well while Intel is still worried about it cannibalizing their other products.

How is your comment relevant to the topic at hand?


RE: Hmm.
By Pirks on 8/13/2008 4:58:00 PM , Rating: 2
quote:
How is your comment relevant to the topic at hand?
If you didn't notice the words "Intel" and "netbook" in my post, you might want to increase your web browser's font size.


RE: Hmm.
By Oregonian2 on 8/14/2008 8:50:44 PM , Rating: 2
Notebooks, netbooks, embedded, laptop, what's the difference? The MSI Wind, Eee, etc. are low-CPU-power devices for low-compute applications, no matter what one wants to call the box. I'll be getting an MSI Wind for my wife soon, when the 6-cell pink ones come out, more than likely. But it's not a processor-power-centric thing. The Atom is a real wimp, but it's adequate for the purpose, whatever one wants to call it. It's not anywhere near adequate for laptop or even low-end laptop uses (my semantics). There are a lot of people who look at the Atom-powered boxes as absurd crappy devices that nobody could possibly want -- because they make for a lousy laptop (their only concept of what a laptop look-alike could possibly be). My wife has been using a "laptop", one with a 300MHz Pentium in it. It works fine for her uses, but is too heavy, large, and the battery doesn't last long. She just needs the same thing but lighter, smaller, and with a longer-lasting battery. An Atom-based device-of-whatever-name looks just fine.


RE: Hmm.
By rudolphna on 8/13/2008 7:41:02 PM , Rating: 1
I would definitely like to see an Intel Atom in the Wii (you would get much higher clock speeds over the current 750MHz PPC-based unit) at a lower power consumption to boot.


RE: Hmm.
By piroroadkill on 8/13/2008 9:19:12 PM , Rating: 3
What? Change the Wii's architecture completely, breaking all compatibility for no reason at all in this generation? That makes so little sense


RE: Hmm.
By Doormat on 8/13/2008 4:29:02 PM , Rating: 2
Also, it's better for Intel (and, individually, its partners) to cannibalize their own revenue with new products than to let someone else come in, eat their lunch, and leave them with nothing.

The Atom will always have its place - but I don't think it's a big threat until 23nm, when they can make it OOO and fast enough to run Vista.


RE: Hmm.
By masher2 (blog) on 8/13/2008 5:22:23 PM , Rating: 2
> "but I dont think its a big threat until 23nm when they can make it OOO and fast enough to run Vista"

Some manufacturers are already selling Atom-powered Vista machines. I think you'll see Atom make serious inroads into the laptop market at the 32nm node.


RE: Hmm.
By nitrous9200 on 8/13/2008 8:13:48 PM , Rating: 2
I just put together an Atom 230 on an Intel D945GCLF board w/ 2GB RAM. It runs Vista Home Basic great and is very quiet in addition to being very power efficient.


RE: Hmm.
By nitin213 on 8/13/2008 8:17:51 PM , Rating: 2
I remember another report saying that Atom CPUs for netbooks will no longer be produced after 2Q09?
If that is indeed the case, where did these 32nm Atoms come from..

IF your answer is going to be Moorestown.. well.... I really do not see where it is going to put the massive power-consuming chipset.. Langwell just does not cut it.. the same way Poulsbo doesn't


RE: Hmm.
By Targon on 8/14/2008 10:08:43 AM , Rating: 2
There are different sales models that come into play at this point. There is still a need/demand for more processing power, which means that the high-end parts will still need to be developed. There is the "cheap computer" market as well, for those who only want to do word processing. And of course, there is the market for ulta-portable devices, including the "smart phone" and PDA market.

If the business model of any company is focused on the high end of the market, that model will fall apart as low- to mid-level devices slowly grow in power to be "fast enough" for a greater percentage of the market. This is why AMD can still survive: for most people, AMD processors are fast enough to do the job. It is important to continue development at the top end, because that segment of development can be applied to the mid and lower range of the market as well.

So, computer sales are down at this point, mostly because of the economy, but also because of the normal upgrade cycle. Most people will wait 4-5 years between computer purchases, and if we are in year 2 since the last upgrade cycle for many people, sales for those people will be down. In another two years, sales will be up again.

We are also at a bit of a lull in the software industry. The transition from single-threaded to multi-threaded applications has been very slow due to companies wanting to reuse existing software designs rather than develop from the ground up. Once more multi-threaded applications are released, the advantage of a quad- or eight-core processor might be seen, but for now, most people will not see an advantage in going to more cores.


By Comdrpopnfresh on 8/13/2008 5:19:43 PM , Rating: 3
Intel doesn't want the Atom selling too much, because it will cut into Celeron sales and profit. They've managed to keep that from happening by putting it on a highly restrictive and aged platform - the 945GC - which has also reduced their sales of complementary parts. Manufacturers can't put more than one DIMM slot on a board, and it is almost always a SO-DIMM topping out at 533MHz. No PCIe either. Gotta be VGA too. But now that VIA has the Nano - an open, versatile, and faster chip with a supporting platform - Intel needs a PR move to keep sales and apparent openness up. No one wants to hop onto a platform when the company producing it cripples it. If Intel really wanted to increase sales and let Atom take over from Celeron, they'd open up the socket and sell chips rather than bundles. They might even make it socket-compatible with existing mainstream boards. One thing is for sure - if they're genuine in their words, they would open the platform up with haste, before the dual-core (what is it, Diamondville?) chips come out.




By Donkeyshins on 8/13/2008 5:56:41 PM , Rating: 2
Well...lookie here (http://www.silentpcreview.com/forums/viewtopic.php... An ECS board with an Atom processor...and what's this? A PCIe slot? And two DIMM slots? Must be a desert mirage.

For that matter, the D945GCLF2 with a dual-core Atom processor is becoming available even as we speak (also has GigE).

I do agree with you that Intel sucks for pairing this with the aging 945G chipset - putting a 2.4W processor on a board sporting a 22W chipset is just retarded.


By Comdrpopnfresh on 8/13/2008 6:27:43 PM , Rating: 2
I only trust Google as far as I can throw it, let alone the searcher for that matter. First off, it is an ECS product. That's like the Congo developing fusion electrical generation... lol. Nor does it seem to have been released. When I say PCIe, I generally mean a full-size slot of at least the x16, graphics-enabling variety. The only thing this board really does is allow for TV tuning on PCIe. Still a CPU soldered to a 945GC, and still VGA on a mobile graphics chipset from two and a half generations ago.


By rudolphna on 8/13/2008 7:47:17 PM , Rating: 2
I don't know what you're talking about. I've never had a problem with ECS. I'm still running an old ECS K7S5A (SiS 735) downstairs and it's perfectly stable (always has been, never crashed), and that's with the chipset overclocked by 30MHz for the past 5 years or so. Never hiccuped. My parents' computer is a Gateway using an ECS mainboard. Nothing wrong with ECS at all.


By mindless1 on 8/13/2008 9:02:58 PM , Rating: 2
... and for those two you still have running, I've thrown out boxes of ECS boards, including several K7S5As, as those in particular were very susceptible to capacitor failure regardless of good PSUs and good cooling.


By Chocobollz on 8/14/2008 2:20:38 PM , Rating: 2
What Congo? Congo from Taiwan? ~_~ OK, I know that some people thought most of ECS's products were bad, but I think their products were pretty rock solid, except for the fact that they're slow compared to the competition (ASUS, Foxconn, Gigabyte, MSI). Maybe you haven't heard that ECS is one of the 5 biggest mainboard makers in Taiwan? Last time I heard they're at position #3, just below ASUS and Foxconn.


By Comdrpopnfresh on 8/14/2008 3:30:47 PM , Rating: 2
You seem to be thinking of branding. I believe Foxconn manufactures A LOT of boards, including some of ASUS's.


By Khato on 8/13/2008 6:40:22 PM , Rating: 2
quote:
Intel doesn't want the atom selling too much, because it will cut into celeron sales, and profit.


I fail to see any proof for this assumption, especially as the article states the opposite. Why? My guess would be that the profit on a single Atom is as high as, if not higher than, on a Celeron.

quote:
If it's [the Atom] cannibalizing from the Celeron part of the market, I'll take that any day


By mindless1 on 8/13/2008 9:09:13 PM , Rating: 2
% profit might be as high, but total profit? Probably not, when you consider that the sale of one directly replaces the sale of the other. Yes, Intel would be right to suspect it will cut into profits IF the market were stagnant, but it is not. The market has reached a point where the full-powered desktop processor, and even the old or cut-down Celeron version of it, carries more performance, heat, power, and size than our drive to miniaturize everything will allow for.

I don't even see VIA as having a competing product; the equivalent TDP is too high on theirs. What Intel most desperately needs now is a new chipset.
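As a rough illustration of that percentage-versus-total-profit point, here is a minimal sketch. The two list prices are the ones quoted elsewhere in this thread; the 40% margin is a made-up placeholder, not an Intel figure.

# Hypothetical illustration: equal percentage margins, very different totals.
# Margin is a made-up placeholder; prices are the list prices quoted in the thread.
celeron_price, atom_price = 53.0, 29.0
margin_pct = 0.40

profit_celeron = celeron_price * margin_pct   # ~$21 per unit
profit_atom = atom_price * margin_pct         # ~$12 per unit

# If every Atom sale simply replaced a Celeron sale, per-unit profit would drop...
print(profit_atom - profit_celeron)           # -> about -9.6 dollars per substitution
# ...so Atom only helps total profit if it grows the market by enough extra units.
extra_units_needed = (profit_celeron - profit_atom) / profit_atom
print(extra_units_needed)                     # -> ~0.8 extra Atom sales per substitution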


By Khato on 8/13/2008 10:00:32 PM , Rating: 2
Oh, there's no question that in % profit Atom beats out Celeron. See, the issue with Celeron processors has always been that they require the same assembly and test as the higher-end processors. After all, it's been the case before that some Celerons were simply dies from the high-end line with a defect in their cache or the like. Anyway, the only cost difference between a Celeron and a Core 2 Extreme is whatever the die-size difference is. And while that is likely a decent difference, you have to figure that each Celeron still costs a fair deal to produce.

But now you have Atom. First, I'd expect the die size is about a third of a Celeron's. Next, it's designed to be low cost, i.e. the assembly and test costs are reduced as much as possible through the design. Still, we can only guess at what the actual cost of production really is... What's more interesting is how much they're reportedly selling them for. According to Wikipedia, that ranges from $29 for the 4W cheapo, to $44 for the nettop-aimed 2.5W part, to $45-$160 for the low-power MID ones. Heh, now guess what the list price for a Celeron E1400 is - $53. Granted, it does bite into profit if they're replacing a mobile Celeron, since those actually have decent margins, but the desktop versions? Good riddance =D
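To put rough numbers on the per-wafer side of this, here is a back-of-the-envelope sketch. The 2,500 Atom dies per wafer comes from the article and the two list prices from the comment above; the Celeron die count and the yield are assumed placeholders for illustration only.

# Hypothetical wafer economics: 2,500 Atom dies/wafer is from the article;
# the Celeron die count and yield are assumptions, prices are list prices above.
atom_dies_per_wafer = 2500
celeron_dies_per_wafer = 700           # assumption: much larger die
atom_price = 29.0                      # low-end Atom list price
celeron_price = 53.0                   # Celeron E1400 list price
yield_rate = 0.8                       # assumed, same for both to keep it simple

atom_revenue_per_wafer = atom_dies_per_wafer * yield_rate * atom_price
celeron_revenue_per_wafer = celeron_dies_per_wafer * yield_rate * celeron_price

print(f"Atom:    ${atom_revenue_per_wafer:,.0f} per wafer")     # ~ $58,000
print(f"Celeron: ${celeron_revenue_per_wafer:,.0f} per wafer")  # ~ $29,700
# With broadly comparable wafer costs, many cheap dies per wafer can still
# out-earn fewer expensive ones -- which is the point made above.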


By Comdrpopnfresh on 8/14/2008 3:37:25 PM , Rating: 2
quote:
See, the issue with Celeron processors has always been that they require the same assembly and test as the higher end processors


That's exactly what makes them profitable. If a chip doesn't qualify for Core 2 sale, it can get binned as a Celeron. They recoup their losses and make another product line from it. What happens when an Atom fails to meet its binning specs? Also, how many fab lines are aimed at producing Core 2? Atom is a niche in Intel's manufacturing, so profit margins based off raw-material costs versus selling price don't tell the whole story.


Let's get it crystal clear
By crystal clear on 8/14/2008 7:39:08 AM , Rating: 2
quote:
It's the perfect recession product to have in the marketplace."


Yes, at last there is a general acceptance and admission that the USA has moved on from an economic slowdown to a full-fledged recession.

Yes, there will be a spillover effect on other major economies like the UK, which is experiencing an economic slowdown. But overall, other economies are healthy enough to pick up the slack in Intel's sales worldwide.

quote:
Intel is looking to the low-cost Atom processor to help it grow its business and profits in the face of a slowing trend in the PC market.


Wrong. There is no slowing trend in the PC market; rather, most OEMs are experiencing and forecasting an increase in sales and growth potential worldwide.

Dell CEO Michael Dell says worldwide PC sales will hit 300 million by the end of 2008 despite concerns about the slowdown of the economy in the United States and elsewhere. IDC originally called for total worldwide desktop and notebooks sales to reach 296 million by the end of 2008. Dell and HP remain the market leaders in PC sales.


http://www.eweek.com/c/a/Desktops-and-Notebooks/De...

A contradiction: the title says...

Intel Looks to Tiny Atom Processor for Profits in the Face of Slowing PC Sales


while the article itself says:

Smith also says that the Atom processor seems to be growing the market rather than cannibalizing existing PC sales.


In short, Intel is forecasting strong growth worldwide, even though they may not boast about it.

The Atom has created a market for itself without hurting sales of other Intel products, namely notebooks and desktops.

There is NO slowdown for Intel....




RE: Let's get it crystal clear
By DaveLessnau on 8/14/2008 9:04:48 AM , Rating: 4
Year after year, I keep seeing the same headlines about slowing PC sales. I dug through some old Gartner press releases and came up with the following actual (not forecast) PC sales numbers:

2007: 271.2 million (13% increase)
2006: 239.2 million (9% increase)
2005: 218.5 million (15% increase)
2004: 189.5 million (12% increase)
2003: 169.1 million (31% increase)
2002: 128.9 million

After that, I had trouble finding the numbers so I gave up. Every quarter and every year, the headlines blare "the sky is falling." Yet, every year, we get really nice growth. Doesn't the press ever check up on its numbers before printing stuff?
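For what it's worth, the year-over-year percentages above check out against the unit figures themselves; a quick sketch that recomputes them:

# Recompute year-over-year growth from the Gartner unit figures quoted above.
shipments = {2002: 128.9, 2003: 169.1, 2004: 189.5,
             2005: 218.5, 2006: 239.2, 2007: 271.2}  # millions of PCs shipped

for year in sorted(shipments)[1:]:
    growth = (shipments[year] / shipments[year - 1] - 1) * 100
    print(f"{year}: {shipments[year]:.1f}M ({growth:.0f}% increase)")
# Output matches the comment above: ~31%, 12%, 15%, 9%, 13%.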


By Master Kenobi (blog) on 8/14/2008 10:50:51 AM , Rating: 2
quote:
Doesn't the press ever check up on its numbers before printing stuff?

No, lest the facts get in the way of sensationalism and headlines. We can't have that now, can we?


By crystal clear on 8/14/2008 4:42:27 AM , Rating: 2


With the Intel Nehalem chip on its way, Intel is cutting the prices of some of its high-end desktop processors and adding some chips into its lineup. The new Intel chips include the Intel Core 2 Quad Q9650 processor for high-end and gaming desktop PCs.

The changes to Intel’s chip pricing were officially released Aug. 10, and those changes include cutting the price of the high-end Intel Core 2 Quad Q9550 (2.83GHz) by 40 percent from $530 to $316. Intel also added a new Core 2 Quad chip into the mix called the Q9650 (3.0GHz) at a price of $530.

Other additions to the high-end lineup include the Intel Core 2 Quad Q9400 (2.66GHz) at a price of $266. All three of these desktop chips are built on the company’s 45-nanometer manufacturing process.

Finally, Intel added two additional Core 2 Duo desktop chips into the lineup. These additions include the Core 2 Duo E8600 (3.33GHz) for $266 and the E7300 (2.66GHz) for $133.

The newly announced price cuts, effective August 10, 2008, follow the earlier round announced on July 20, 2008.

http://files.shareholder.com/downloads/INTC/386730...




By crystal clear on 8/14/2008 5:07:46 AM , Rating: 2
SANTA CLARA, Calif., Aug. 13, 2008 – Intel Corporation today announced the availability of the Extensible Host Controller Interface (xHCI) draft specification revision 0.9 in support of the USB 3.0 architecture, also known as SuperSpeed USB. The xHCI draft specification provides a standardized method for USB 3.0 host controllers to communicate with the USB 3.0 software stack.

This specification describes the registers and data structures used to interface between system software and the hardware, and is developed to be compatible with the USB 3.0 specification being developed by the USB 3.0 Promoter Group. The Intel xHCI draft specification revision 0.9 is being made available under RAND-Z (royalty-free) licensing terms to all USB 3.0 Promoter Group and contributor companies that sign an xHCI contributor agreement.
http://www.intel.com/pressroom/archive/releases/20...
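Purely as an illustration of what "registers and data structures" means in practice, the sketch below lays out the xHCI capability register block as defined in the later public revisions of the specification; the 0.9 draft described above may differ, so treat the offsets and field names as assumptions taken from those later revisions rather than from the draft itself.

# Illustrative only: xHCI capability register block per later public spec
# revisions; the 0.9 draft may differ, and the sample value is fabricated.
import ctypes

class XhciCapRegs(ctypes.LittleEndianStructure):
    _pack_ = 1
    _fields_ = [
        ("CAPLENGTH",  ctypes.c_uint8),    # 0x00: length of the capability block
        ("Rsvd",       ctypes.c_uint8),    # 0x01: reserved
        ("HCIVERSION", ctypes.c_uint16),   # 0x02: interface version (BCD)
        ("HCSPARAMS1", ctypes.c_uint32),   # 0x04: slots / interrupters / ports
        ("HCSPARAMS2", ctypes.c_uint32),   # 0x08: structural parameters 2
        ("HCSPARAMS3", ctypes.c_uint32),   # 0x0C: structural parameters 3
        ("HCCPARAMS1", ctypes.c_uint32),   # 0x10: capability flags
        ("DBOFF",      ctypes.c_uint32),   # 0x14: doorbell array offset
        ("RTSOFF",     ctypes.c_uint32),   # 0x18: runtime registers offset
    ]

def max_ports(hcsparams1):
    # MaxPorts is bits 31:24 of HCSPARAMS1 in the later spec revisions.
    return (hcsparams1 >> 24) & 0xFF

# Example with a fabricated register value (not read from real hardware):
print(max_ports(0x04000820))  # -> 4 root-hub ports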


"A politician stumbles over himself... Then they pick it out. They edit it. He runs the clip, and then he makes a funny face, and the whole audience has a Pavlovian response." -- Joe Scarborough on John Stewart over Jim Cramer

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki