

[Figure: New AMD graphics branding]
[Figure: The evolution of AMD/ATI branding]
AMD's market research shows that it's time to get rid of the ATI brand

It's been a long four years, but AMD has finally hit its stride after its acquisition of ATI Technologies back in 2006. After agreeing to purchase ATI for $5.4B, AMD was besieged by quarterly losses stemming from the purchase, constant pressure from NVIDIA in the graphics market, and beatdowns from Intel (which wasn't exactly playing by the rules of fair business) in the processor market.

With most of its troubles now behind it, AMD is looking to kill off the long-standing ATI brand and bring Radeon and FirePro graphics solutions solely under the AMD umbrella, according to AnandTech.

Based on its own research in markets around the world, AMD came to the following three conclusions:

  1. AMD preference triples when respondent is aware of ATI-AMD merger
  2. AMD brand [is] stronger than ATI vs. graphics competitors
  3. Radeon and FirePro brand awareness and consideration [is] very high

The move will also help to further consolidate AMD's branding, which has pretty much gotten out of hand in the past few years [see figure on right]. AMD will begin the transition later this year, phasing out ATI branding and moving to a simpler branding scheme. By 2011, AMD's product lineup will consist of Opteron for server processors, Vision (built around a CPU/GPU hybrid) for consumer processors, and Radeon/FirePro for graphics.

With AMD now taking the discrete graphics market lead from NVIDIA (51.1 percent for AMD versus 44.5 percent for NVIDIA) and preparing to take the fight straight to Intel with three new CPU designs, the next year should be a fruitful one for enthusiasts.



Comments



Huh?
By FaceMaster on 8/30/2010 7:26:06 AM , Rating: 5
quote:
AMD brand [is] stronger than ATI vs. graphics competitors


I doubt that.




RE: Huh?
By hughlle on 8/30/2010 7:37:59 AM , Rating: 5
Personally I have a LOT of respect for ATI products, more so than I have for AMD.


RE: Huh?
By Drag0nFire on 8/30/10, Rating: -1
RE: Huh?
By clovell on 8/30/10, Rating: 0
RE: Huh?
By EricMartello on 8/30/10, Rating: -1
RE: Huh?
By tomorrow on 8/30/2010 5:46:10 PM , Rating: 5
quote:
Bang-for-the-buck is for people without money or who have low standards

Wrong. Bang-for-buck users get 80% of your PC's performance for a third of the money.

Bang for buck is for people who want a fast PC but don't want to go bankrupt in the process of buying it.


RE: Huh?
By EricMartello on 8/30/10, Rating: -1
RE: Huh?
By tastyratz on 8/30/2010 11:36:26 PM , Rating: 2
There is very little genuine business need for modern systems' speed, because in general people can't harness a fraction of it. Case in point: all that power, and you use it to post on DailyTech.

In reality, someone who gets 90% of the performance for even 50% of the investment can upgrade their computer twice as often as you and stay on current-generation systems, ahead of the game more of the time than not.

Every purchase is an investment, and if the most expensive top-of-the-line computer is your investment, then unless it's chugging through graphics work 24x7 I see your business model failing. You can't tell me your machine will yield more than five minutes of extra office productivity in a standard business environment. Running Pixar? Then we can talk. Till then it's the same porn, Halo, and Google you get on a cheap machine.
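
The upgrade-cycle math is easy to sketch. Here's a minimal illustration in Python; every price and performance number below is made up purely for the sake of the argument:

    # Hypothetical figures only: one $3,000 flagship build kept 4 years
    # vs. two successive $1,500 mid-range builds over the same period.
    flagship_cost = 3000
    budget_cost = 1500   # replaced after 2 years, so 2 * 1500 = same total spend

    # Assume a mid-range build gives ~90% of flagship performance at launch,
    # and each new generation is ~1.4x faster than the last flagship.
    perf_flagship = [1.0, 1.0, 1.0, 1.0]            # years 1-4, same box
    perf_budget = [0.9, 0.9, 0.9 * 1.4, 0.9 * 1.4]  # replaced at year 3

    print(f"Flagship: {sum(perf_flagship)/4:.2f}x average for ${flagship_cost}")
    print(f"Budget:   {sum(perf_budget)/4:.2f}x average for ${2 * budget_cost}")
    # Under these (invented) assumptions the two cheap builds average ~1.08x
    # the flagship's performance for the same total outlay.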


RE: Huh?
By Amiga500 on 9/2/2010 3:23:21 AM , Rating: 1
What if I buy 3 machines for the price of your 1 machine, hook them up on a network, and then have 240% of your performance? After all... you do need heavy multi-threaded apps to take advantage of all those threads you have.

Who is the person with no standards (or should that be no sense) then?

Yes, a computer is an investment - hence it is important not to pay a disproportionate amount of money for what you are getting. Would you pay 200% more for a top-of-the-line Ferrari compared to its smaller brother with 90% of the performance? Yes, you might - if you had more money than sense.


RE: Huh?
By EricMartello on 9/3/10, Rating: 0
RE: Huh?
By xti on 9/3/2010 10:11:30 AM , Rating: 1
You realize that neither AMD nor Intel cares about you and your bajillion-dollar computer? That's great and all that you're happy with your purchase, but manufacturers focus on their fastest sellers, not the high-margin ones they sell once in a blue moon.

If I were poor, I would still be content that I can understand simple concepts.


RE: Huh?
By Etern205 on 9/3/2010 1:34:52 PM , Rating: 1
Sad news for you as your precious 980x is being replaced by the 990x.

Looks like your e-penis has shrunk by 3 inches. :D :D :D


RE: Huh?
By EricMartello on 9/3/2010 4:17:09 PM , Rating: 1
quote:
Sad news for you as your precious 980x is being replaced by the 990x. Looks like your e-penis has shrunk by 3 inches. :D :D :D


What's your point? For the time being the 980x is the best CPU available from Intel, but there's always going to be something better coming up in the future. For people like me who continually improve, what makes you think I wouldn't get the next best CPU when it becomes available?

Looks like you should just stick to fingering your e-vagina. :D :D :D


RE: Huh?
By Etern205 on 9/4/2010 12:03:29 AM , Rating: 2
Real men use Xeons, so why don't you take your pathetic low-end 980x and shove it.
:D :D :D


RE: Huh?
By EricMartello on 9/5/2010 3:14:17 PM , Rating: 1
Real men know when to use a Xeon and when to use a 980x... and again, who said I didn't also use Xeons in systems where they would be appropriate? You sure are making a lot of fail assumptions here.


RE: Huh?
By Etern205 on 9/9/2010 9:59:59 AM , Rating: 2
What do Xeons have to do with a 980x? There are 6-core versions of the Xeon which are higher-binned versions of the 980x and have dual QPI.

It's obvious that someone gave you that 980x as a gift and now you're running around like some bigshot who knows absolutely nothing about hardware.

And stop trolling.


RE: Huh?
By EricMartello on 9/10/2010 7:23:21 PM , Rating: 1
If you're so enlightened about hardware then you'd know that using CPUs in their intended roles more often than not yields better results. The Xeon is intended for workstation/server duty and is optimized for such work. The 980x is an "enthusiast's CPU" that can overclock like nothing else and has far wider motherboard options.

As for performance, Passmark disagrees with you:
http://www.cpubenchmark.net/high_end_cpus.html

The 980x is on top by a decent margin...and while the Xeon may excel at tasks for which it was optimized - the 980x is still the best CPU for my needs.

Conclusion:
If you're running a Xeon in your desktop rig / gaming system then you're definitely fingering your e-vag... you spent about $700 more than you would have on a 980x for a CPU that runs slower, just so you could say you have a Xeon. LOL


RE: Huh?
By clovell on 8/31/2010 4:19:52 PM , Rating: 1
I thought.... I thought I just said that.... ????


RE: Huh?
By clovell on 8/31/2010 6:40:28 PM , Rating: 3
I see the fanboys are screwing with the rating system.


RE: Huh?
By priusone on 8/31/2010 7:28:44 PM , Rating: 3
A buddy of mine just spent $2,100 building a computer that would 'run circles around my POS computer'. It sure does. Talk about one fast machine. Man, it sure makes my $369 Walmart machine look like crap. It makes my two netbooks, which are hooked up to my LCD TVs, seem like they're running even slower. It even makes my $1000 Dell Studio 15x feel somewhat slow. Problem is, my living room netbook and my bedroom netbook are both set up as media centers and are controllable by either my Droid or my Dell. My Walmart PC is basically a glorified file server, but it does have a $50 ATI Radeon 4570 for the occasional trip to the Wasteland.

Yeah, my friend loves giving me grief about my lowly PC offerings, but they suit me just fine. And my 9TB of storage trounces his 1TB. Different priorities, I guess.

Now, as far as AMD doing away with the ATI namesake: you could take your itty-bitty market share of $2100 machines, or our extremely large market share of $369, $399, $350, and $1000 machines, and what do you know, there are way more of US. Your average PC user probably doesn't know the difference between discrete and integrated graphics cards. So, be it ATI or AMD, chances are they have heard of AMD. You and Curtis can have a blast playing WoW with max settings, but I'm going to go camping while the weather holds out.


RE: Huh?
By Major HooHaa on 9/3/2010 12:15:21 PM , Rating: 2
The components in my PC may once have cost £2,000+ to buy, but for the level of performance they give, I got them for quite a reasonable sum.

Oh and I read a quote from some bloke at AMD, saying that no one cares about branding. But what about the "Intel Inside" campaign? It helped Intel get where it is today.

In one of Terry Pratchett's Discworld novels, there was even a strange primitive computer called 'Hex' with an "Anthill Inside" logo on the side. They could get it to work, if they could just get enough bugs into the system.


RE: Huh?
By spread on 8/30/2010 6:23:17 PM , Rating: 5
Congratulations on spending $500 on a motherboard. I can buy one with 95% of the performance for around $125.

That's called bang for buck.


RE: Huh?
By Motoman on 8/30/2010 7:21:45 PM , Rating: 2
Precisely. And for virtually everything that a user would do with a computer, using a $50 motherboard would not make any perceptible difference in performance to the user.


RE: Huh?
By RW on 8/30/10, Rating: -1
RE: Huh?
By RW on 8/30/10, Rating: -1
RE: Huh?
By Cheesew1z69 on 8/30/2010 10:30:14 PM , Rating: 1
Um...lol


RE: Huh?
By EricMartello on 8/30/10, Rating: -1
RE: Huh?
By rdeegvainl on 8/30/2010 11:08:22 PM , Rating: 4
Yeah, I totally know what U mean brah!!! Like just the other day, I was all like looking at this can of corn, and it cost like only 35 cents... u could totally tell that the can of corn that cost a dollar was better... I mean it had 2 more oz of corn, only some cheapo's would buy that other corn. Me on the other hand, I buy the higher quality can of corn with 2 more oz of corn...
/end retardation

Seriously, once you figure out that 90% of people only use a computer for surfing the web, you may be able to have a conversation that isn't full of your ineptitude and e-peen.
Oh, and when you pay for that quad-SLI mobo and buy four video cards, I laugh, because just 6 months later a single new card will perform just as well, with a quarter of the heat and electricity usage.
But by all means, get what makes you happy; keep up with the Joneses or nobody is gonna respect you.


RE: Huh?
By EricMartello on 9/3/2010 4:33:31 PM , Rating: 1
quote:
Yeah, I totally know what U mean brah!!! Like just the other day, I was all like looking at this can of corn, and it cost like only 35 cents... u could totally tell that the can of corn that cost a dollar was better... I mean it had 2 more oz of corn, only some cheapo's would buy that other corn. Me on the other hand, I buy the higher quality can of corn with 2 more oz of corn... /end retardation


If CPU A performed equally to CPU B but CPU B was 50% cheaper, I would buy CPU B. You see, my retarded friend, the issue is purely performance and not price. AMD has NOTHING that can touch the 980x and probably will not until the tech behind the 980x is outdated. AMD has been lagging behind since the A64 days and has not had a compelling offering since that time. So if you're going to make a comparison at least make an effort to do it right.

quote:
Seriously, once you figure out that 90% of people only use a computer for surfing the web, you may be able to have a conversation that isn't full of your ineptitude and e-peen. Oh, and when you pay for that quad sli mobo and buy four video cards, I laugh when just 6 months later a new card would perform just as much, with a quarter of the heat and electricity usage. But by all means, get what makes you happy, keep up with the joneses or nobody is gonna respect you.


The point of this conversation is not to make you feel as inferior as you really are, but rather to illustrate the ever-widening gap in available performance between Intel and AMD CPUs. Intel hands-down dominates AMD, and despite its best attempts AMD's best cannot come close to Intel's best.

Now even if all you do is general-purpose stuff on your comp, the benefits of high-end hardware don't end there. The lifespan of a 980x system will be several years if not more; since it is so much faster than peasant systems, it will continue to be fast enough, preventing the need for hardware upgrades. Each year that goes by without an upgrade adds value to the purchase. I have no doubt that my system will be more than capable of running any game/app 5 years down the line just as well as it can today... on the other hand, your "bang for the buck" system will show its age within a year or less and start to "feel slow" with newer apps and games.

Yes, my CPU costs more than your car and my motherboard is worth more than your life...but you shouldn't let that make you feel bad. You just need to accept that you fall into that 90% group that you mentioned, and that most people will only ever be average at best in their lives.


RE: Huh?
By mino on 8/30/2010 12:36:23 PM , Rating: 2
Not so much in the business space, which is where this rebrand is targeted.

To those asses still preaching along the "No one got fired for buying IBM" lines.


RE: Huh?
By 2bdetermine on 8/30/2010 3:48:53 PM , Rating: 1
Let's see - what happens if AMD goes out of business tomorrow?

I'll leave it to your imagination.


RE: Huh?
By Cheesew1z69 on 8/30/2010 3:55:32 PM , Rating: 2
Let's see....they won't....


RE: Huh?
By zebrax2 on 8/30/2010 7:41:18 AM , Rating: 4
Ask around what AMD and ATI are and you'll surely get more responses for AMD. With CPUs, people at least look at the brand name. With GPUs, on the other hand, all they care about is whether it has more memory.


RE: Huh?
By blueaurora on 8/30/10, Rating: -1
RE: Huh?
By freeagle on 8/30/2010 8:01:19 AM , Rating: 5
quote:
Larger memory .... no.... video memory doesn't impact performance nearly as much as the GPU design and type.


That's undoubtedly true, but all my friends who don't really understand computer hardware (read: the majority) compare GPUs based on the amount of memory.


RE: Huh?
By Motoman on 8/30/2010 2:22:48 PM , Rating: 2
That is correct. To the vast majority of average consumers, a video card is as good as the amount of memory on it.

As absurdly wrong as that is, it's what the market perception is.

...which is why Ma & Pa don't get it when they go to BBY and buy the new $400 Dell thing for sonny, only for him to complain that it won't play WoW.


RE: Huh?
By spread on 8/30/2010 6:26:11 PM , Rating: 2
I blame this on the vast number of poor salespeople. It was such a pain shopping around for a laptop this year.

"Yes, yes I know it has 1 jigglybits of memory, but on WHAT? What IS in there?" -Me talking to a sales rep.


RE: Huh?
By zebrax2 on 8/30/2010 8:05:55 AM , Rating: 2
Laypersons don't really know that, do they? That's why you see custom low-end cards with 1GB of memory they'll never be able to use, which still sell like hotcakes even though they carry a larger markup.

Don't get me wrong, I know what you're talking about, but that's just how it is with people.


RE: Huh?
By Flunk on 8/30/2010 8:42:33 AM , Rating: 5
But the guy in Best Buy told me that memory is all that matters!


RE: Huh?
By rudolphna on 8/30/2010 9:27:55 AM , Rating: 1
Unfortunately this is the case with most Best Buys. I work in PC sales at an upstate New York BB store, and I've run into the same phenomenon: most people shop for graphics cards, or computers, by the amount of video memory. I then have to explain that just because the integrated graphics chip, or a Radeon 5450, has 1GB of memory doesn't mean it's a fast card.

It would be pointless to get into a discussion about GPU architecture in the store, so I just simplify it to saying that certain models are faster than others, and that memory only really affects the size screen you can run (Which is true). Most people looking to upgrade from integrated graphics who don't game get a recommendation for a 5450 or 5670. Anybody playing games gets a 5670, 5750, or 5770, depending on monitor size and which games.

Usually with these comes a BFG Tech or Thermaltake power supply, since many would otherwise try to run the card on the POS 250W power supply that comes in most OEM computers.


RE: Huh?
By Cheesew1z69 on 8/30/2010 9:56:33 AM , Rating: 2
quote:
that memory only really affects the size screen you can run (Which is true).
Citation for this?


RE: Huh?
By Quadrillity on 8/30/2010 9:58:35 AM , Rating: 2
I agree; I would word it differently when talking to customers though. Sometimes you just have to answer, "this one is faster than this one" lol.


RE: Huh?
By FITCamaro on 8/30/2010 10:25:49 AM , Rating: 2
Memory size can affect how high a resolution you can run with an acceptable framerate. Higher resolutions use more memory due to the increased texture sizes.

Of course a 5450 with 1GB of memory won't beat a 5870 with 512MB of memory at 1920x1200.


RE: Huh?
By Cheesew1z69 on 8/30/10, Rating: -1
RE: Huh?
By leexgx on 8/30/2010 12:01:40 PM , Rating: 2
More than 1GB only gets used if you're running lots of AA on a big monitor (32"-class resolutions) or a 3-monitor setup (Surround/Eyefinity), and even then you may need AA to push the card past 1GB of RAM use (most games' frame buffers are 512MB so they work on everyone's system; normally it's the texture setting that pushes past that).

Two 1GB GTX 460s in SLI can handle Surround at high resolutions up to a point (AA + high res); beyond that you most likely need an NVIDIA card with a bigger frame buffer (GTX 470/480). HardOCP hasn't done a Surround review with the GTX 460 yet, not that I can find.

On the ATI side, the Eyefinity 6-port cards do benefit from 2GB, as usage does go over 1GB at super-high resolutions.

Anyway, for normal customers on lower-end cards and screen sizes, 512MB is normally plenty, but the everyday customer looks at price and RAM size.


RE: Huh?
By torpor on 8/30/2010 1:48:34 PM , Rating: 2
Monitor size is irrelevant.

Displayed resolution is highly relevant.


RE: Huh?
By Hieyeck on 8/30/2010 4:03:24 PM , Rating: 2
Oh geez, all the terrible downrates for truths. We need to ban Best Buy IPs from rating here. Yes, memory size is only a part of a GPU's performance, and while it's not the most significant part, it's still a substantial one. Most monitors can't handle a card's maximum supported resolution, but for the card to claim support for those resolutions, it needs that memory.

Some rough math:
The 5870 advertises 2560x1600 - a little over 4 million pixels. At 32-bit color, each pixel takes 4 bytes: one byte each for R, G, and B, plus one for misc data (such as opacity). That's about 16MB for one frame. Say it needs to buffer 30 frames (half of 60Hz - don't quote me, I don't know the exact relationship); 30 frames is already just shy of 500MB of data to buffer. This doesn't include all the algorithms and other processes (model transforms, transparencies, video rendering and decoding) used to create this data and give you a moving display.
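
For what it's worth, the arithmetic itself checks out, though the 30-frame buffer depth is the poster's own guess; real cards typically double- or triple-buffer. A quick sketch in Python:

    # Frame buffer arithmetic for a 2560x1600 display at 32-bit color.
    width, height = 2560, 1600
    bytes_per_pixel = 4                  # R, G, B + alpha/misc, 1 byte each

    frame = width * height * bytes_per_pixel
    print(frame / 2**20)                 # ~15.6 MiB for a single frame
    print(30 * frame / 2**20)            # ~469 MiB under the 30-frame guess
    print(3 * frame / 2**20)             # ~47 MiB with ordinary triple buffering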


RE: Huh?
By priusone on 8/31/2010 7:38:10 PM , Rating: 2
You can bring on all the math you want, but you are ALL forgetting the most important number. WATTS. Your typical inexpensive tower has a 250Watt power supply. My cousin wanted to be able to play some of the newer FPS's and his coworker had him sold on a $300 graphics card. Oh sure, the card was awesome, but the thing needed well over 200Watts itself. The guy does a good job logging, but you can't imagine the hell I endured in convincing him that the card would toast his PSU and possibly the rest of the system.

In the end, he got a decent graphics card, but only after we put in a new PSU. Do you think the average Best Buy worker makes sure and ensures the customers PSU can handle the new graphics card? Doubtful.
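
That kind of headroom check is trivial to do before buying. A minimal sketch in Python; the wattage figures here are illustrative guesses, not measurements:

    # Rough PSU headroom check with invented example wattages.
    psu_watts = 250        # typical budget OEM power supply
    system_watts = 150     # CPU, drives, fans, board under load (a guess)
    gpu_watts = 200        # the high-end card in question (a guess)

    load = system_watts + gpu_watts
    # Leave ~20% headroom; otherwise plan on a PSU upgrade too.
    ok = load <= 0.8 * psu_watts
    print(f"{load} W load on a {psu_watts} W PSU: "
          f"{'OK' if ok else 'upgrade the PSU'}")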


RE: Huh?
By leexgx on 8/30/2010 11:32:07 AM , Rating: 2
I think he (Flunk) was joking; that comment he made was not a question.


RE: Huh?
By FaceMaster on 8/30/2010 3:28:13 PM , Rating: 2
quote:
Larger memory .... no.... video memory doesn't impact performance nearly as much as the GPU design and type. (look at any design in a 512mb and 1gb variant versus the step up gpu)


I agree with you entirely on this point. I'm guessing you got voted down for the latter part about bad driver support, since I feel that ATI has been roughly equal to NVIDIA in this respect. I mean, I remember when the GeForce 8800 series came out and Source games all had this thick white mist everywhere. People told others to buy an ATI equivalent instead. Aside from the 2900 series being shocking, they too had the same problem! (And later on from NVIDIA - though in their defence, they fixed it in a relatively shorter time span.) I don't feel there's a need for fanboys in an area as nerdy as graphics cards.

Of course, in CPUs it's a totally different matter and I wish that Intel would burn in hell. /sarcasm


RE: Huh?
By PAPutzback on 8/30/2010 9:06:51 AM , Rating: 2
WOW. That's nuts. I figured anyone who knew how to install a graphics card or any other performance part would have a clue about benchmarks, and therefore would have picked up some basic knowledge about CPU/GPU cycles, memory bandwidth, SSD vs. HDD, and the basics.

AMD vs. ATI doesn't matter to me. But I have always enjoyed just typing in ATI and hitting Ctrl+Enter to take me right to the graphics drivers.


RE: Huh?
By VoodooChicken on 8/30/2010 10:30:58 AM , Rating: 2
Ask around, and most people will ask "What the @#$! is a GPU???"


RE: Huh?
By Fracture on 8/30/2010 3:39:56 PM , Rating: 2
I'd dump the AMD name too. You don't give prizes to the last horse in the race, you stick with the winner (ATI).


RE: Huh?
By gamefoo21 on 8/31/2010 12:49:29 PM , Rating: 1
Sooo... now their graphics cards are going to run hot and be slower, but cheap, since they can't compete in any other way... I still love the HP tablet/convertibles with the 2.6GHz AMD X2s: you'll find a 50% throttle on the CPU, and if you remove the throttle the laptop overheats as soon as you try to do anything on it... lol

I think this is a stupid move, one that's being done to try to prop up AMD's lackluster CPU business.


RE: Huh?
By Reclaimer77 on 8/31/2010 5:29:57 PM , Rating: 2
Yeah I'm really perplexed by this move. Another bad move in a history of bad moves by AMD. But that's ok, I guess they can always blame Intel somehow when this blows up in their faces too.


"There's no chance that the iPhone is going to get any significant market share. No chance." -- Microsoft CEO Steve Ballmer














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki