
New AMD graphics branding

The evolution of AMD/ATI branding
AMD's market research shows that it's time to get rid of the ATI brand

It's been a long four years, but AMD has finally hit its stride after its acquisition of ATI Technologies back in 2006. After agreeing to purchase ATI for $5.4 billion, AMD was besieged by quarterly losses stemming from the purchase, constant pressure from NVIDIA in the graphics market, and beatdowns from Intel (which wasn't exactly playing by the rules of fair business) in the processor market.

With most of its troubles now behind it, AMD is looking to kill off the long-standing ATI brand and bring its Radeon and FirePro graphics solutions solely under the AMD umbrella, according to AnandTech.

AMD's own research in markets around the world led it to the following three conclusions:

  1. AMD preference triples when respondent is aware of ATI-AMD merger
  2. AMD brand [is] stronger than ATI vs. graphics competitors
  3. Radeon and FirePro brand awareness and consideration [is] very high

The move will also help consolidate AMD's branding, which has pretty much gotten out of hand in the past few years [see figure on right]. AMD will begin the transition later this year, phasing out ATI branding and moving to a simplified product lineup. By 2011, AMD's products will consist of Opteron for server processors, Vision (a CPU/GPU hybrid platform) for consumer processors, and Radeon/FirePro for graphics.

With AMD now taking the discrete graphics market lead from NVIDIA (51.1 percent for AMD versus 44.5 percent for NVIDIA) and preparing to take the fight straight to Intel with three new CPU designs, the next year should be a fruitful one for enthusiasts.


By FaceMaster on 8/30/2010 7:26:06 AM , Rating: 5
AMD brand [is] stronger than ATI vs. graphics competitors

I doubt that.

RE: Huh?
By hughlle on 8/30/2010 7:37:59 AM , Rating: 5
Personally I have a LOT of respect for ATI products, more so than I have for AMD.

RE: Huh?
By Drag0nFire on 8/30/10, Rating: -1
RE: Huh?
By clovell on 8/30/10, Rating: 0
RE: Huh?
By EricMartello on 8/30/10, Rating: -1
RE: Huh?
By tomorrow on 8/30/2010 5:46:10 PM , Rating: 5
Bang-for-the-buck is for people without money or who have low standards

Wrong. Bang-for-buck users get 80% of your PC's performance for a third of the money.

Bang for buck is for people who want a fast PC but don't want to go bankrupt in the process of buying it.

RE: Huh?
By EricMartello on 8/30/10, Rating: -1
RE: Huh?
By tastyratz on 8/30/2010 11:36:26 PM , Rating: 2
There is an extremely limited actual business need for modern-day systems and their speed, because in general people can't harness a fraction of it. Case in point: all that power and you use it to post on DailyTech.

In reality, someone who gets 90% of the performance for even 50% of the investment can upgrade their computer twice as often as you and stay on current-generation systems, ahead of the game more often than not.

Every purchase is an investment, and if the most expensive top-of-the-line computer is your investment, then unless it's chugging through graphics operations 24x7 I see your business model failing. You can't tell me your machine will yield more than 5 minutes of extra office productivity in a standard business environment. Running Pixar? Then we can talk. Till then it's the same porn, Halo, and Google you get on a cheap machine.

RE: Huh?
By Amiga500 on 9/2/2010 3:23:21 AM , Rating: 1
What if I buy 3 machines for the price of your 1 machine, hook them up on a network, and then have 240% of your performance? After all... you do need heavy multi-threaded apps to take advantage of all those threads you have.

Who is the person with no standards (or should that be no sense) then?

Yes, a computer is an investment - hence it is important not to pay a disproportionate amount of money for what you are getting. Would you pay 200% more for a top-of-the-line Ferrari compared to its smaller brother with 90% of the performance? Yes, you might - if you had more money than sense.

RE: Huh?
By EricMartello on 9/3/10, Rating: 0
RE: Huh?
By xti on 9/3/2010 10:11:30 AM , Rating: 1
You realize that neither AMD nor Intel cares about you and your bajillion-dollar computer? That's great and all that you are happy about your purchase, but manufacturers are focusing on their fastest sellers, not the greatest-margin ones that they sell once in a blue moon.

If I were poor I would still be content that I can understand simple concepts.

RE: Huh?
By Etern205 on 9/3/2010 1:34:52 PM , Rating: 1
Sad news for you as your precious 980x is being replaced by the 990x.

Looks like your e-penis has shrunk by 3 inches. :D :D :D

RE: Huh?
By EricMartello on 9/3/2010 4:17:09 PM , Rating: 1
Sad news for you as your precious 980x is being replaced by the 990x. Looks like your e-penis has shrunk by 3 inches. :D :D :D

What's your point? For the time being the 980x is the best CPU available from Intel, but there's always going to be something better coming up in the future. For people like me who continually improve, what makes you think I wouldn't get the next best CPU when it becomes available?

Looks like you should just stick to fingering your e-vagina. :D :D :D

RE: Huh?
By Etern205 on 9/4/2010 12:03:29 AM , Rating: 2
Real men use Xeons so why don't you take your pathetic low-end 980x and shove it.
:D :D :D

RE: Huh?
By EricMartello on 9/5/2010 3:14:17 PM , Rating: 1
Real men know when to use a Xeon and when to use a 980x... and again, who said I didn't also use Xeons in systems where they would be appropriate? You sure are making a lot of failed assumptions here.

RE: Huh?
By Etern205 on 9/9/2010 9:59:59 AM , Rating: 2
What do Xeons have to do with a 980x? There are 6-core versions of Xeons, which are higher-binned versions of the 980x and have dual QPI.

It's obvious that someone gave you that 980x as a gift and now you're running around like some bigshot who knows absolutely nothing about hardware.

And stop trolling.

RE: Huh?
By EricMartello on 9/10/2010 7:23:21 PM , Rating: 1
If you're so enlightened about hardware then you'd know that using CPUs for their intended roles more often than not yields better results. The Xeon is intended for workstation/server duty and is optimized for such work. The 980x is an "enthusiast's CPU" that can overclock like nothing else and has more ubiquitous motherboard options.

As for performance, Passmark disagrees with you:

The 980x is on top by a decent margin...and while the Xeon may excel at tasks for which it was optimized - the 980x is still the best CPU for my needs.

If you're running a Xeon in your desktop rig / gaming system then you spent about $700 more than a 980x for a CPU that runs slower, just so you could say you have a Xeon. LOL

RE: Huh?
By clovell on 8/31/2010 4:19:52 PM , Rating: 1
I thought.... I thought I just said that.... ????

RE: Huh?
By clovell on 8/31/2010 6:40:28 PM , Rating: 3
I see the fanboys are screwing with the rating system.

RE: Huh?
By priusone on 8/31/2010 7:28:44 PM , Rating: 3
A buddy of mine just spent $2,100 building a computer that would 'run circles around my POS computer'. It sure does. Talk about one fast machine. Man, it sure makes my $369 Walmart machine look like crap. It makes my two netbooks, which are hooked up to my LCD TVs, seem like they are running even slower. It even makes my $1000 Dell Studio 15x run somewhat slow. Problem is, my living room netbook and my bedroom netbook are both set up like media centers and are controllable by either my Droid or my Dell. My Walmart PC is basically a glorified file server, but it does have a $50 ATI Radeon 4570 for the occasional trip to the Wasteland.

Yeah, my friend loves giving me grief about my lowly PC offerings, but they suit me just fine. But my 9TB of storage trounces his 1TB. Different priorities I guess.

Now as far as AMD doing away with the ATI namesake: we could take your itty bitty market share $2100 machine or our extremely large $369/$399/$350/$1000 market share, and what do you know, there are way more of US. Your average PC user probably doesn't know the difference between discrete and integrated graphics cards. So, be it ATI or AMD, chances are they have heard of AMD. You and Curtis can have a blast playing WoW with max settings, but I'm going to go camping while the weather holds out.

RE: Huh?
By Major HooHaa on 9/3/2010 12:15:21 PM , Rating: 2
The components in my P.C. may once have cost £2,000+ to buy, for the level of performance they give, but I got them for a quite reasonable sum.

Oh and I read a quote from some bloke at AMD, saying that no one cares about branding. But what about the "Intel Inside" campaign? It helped Intel get where it is today.

In one of Terry Pratchett's Discworld novels, there was even a strange primitive computer called 'Hex' with an "Anthill Inside" logo on the side. They could get it to work, if they could just get enough bugs into the system.

RE: Huh?
By spread on 8/30/2010 6:23:17 PM , Rating: 5
Congratulations on spending $500 on a motherboard. I can buy one with 95% of the performance for around $125.

That's called bang for buck.

RE: Huh?
By Motoman on 8/30/2010 7:21:45 PM , Rating: 2
Precisely. And for virtually everything that a user would do with a computer, using a $50 motherboard would not make any perceptible difference in performance to the user.

RE: Huh?
By RW on 8/30/10, Rating: -1
RE: Huh?
By RW on 8/30/10, Rating: -1
RE: Huh?
By Cheesew1z69 on 8/30/2010 10:30:14 PM , Rating: 1

RE: Huh?
By EricMartello on 8/30/10, Rating: -1
RE: Huh?
By rdeegvainl on 8/30/2010 11:08:22 PM , Rating: 4
Yeah, I totally know what U mean brah!!! Like just the other day, I was all like looking at this can of corn, and it cost like only 35 cents... u could totally tell that the can of corn that cost a dollar was better... I mean it had 2 more oz of corn, only some cheapo's would buy that other corn. Me on the other hand, I buy the higher quality can of corn with 2 more oz of corn...
/end retardation

Seriously, once you figure out that 90% of people only use a computer for surfing the web, you may be able to have a conversation that isn't full of your ineptitude and e-peen.
Oh, and when you pay for that quad sli mobo and buy four video cards, I laugh when just 6 months later a new card would perform just as much, with a quarter of the heat and electricity usage.
But by all means, get what makes you happy, keep up with the joneses or nobody is gonna respect you.

RE: Huh?
By EricMartello on 9/3/2010 4:33:31 PM , Rating: 1
Yeah, I totally know what U mean brah!!! Like just the other day, I was all like looking at this can of corn, and it cost like only 35 cents... u could totally tell that the can of corn that cost a dollar was better... I mean it had 2 more oz of corn, only some cheapo's would buy that other corn. Me on the other hand, I buy the higher quality can of corn with 2 more oz of corn... /end retardation

If CPU A performed equally to CPU B but CPU B was 50% cheaper, I would buy CPU B. You see, my retarded friend, the issue is purely performance and not price. AMD has NOTHING that can touch the 980x and probably will not until the tech behind the 980x is outdated. AMD has been lagging behind since the A64 days and has not had a compelling offering since that time. So if you're going to make a comparison at least make an effort to do it right.

Seriously, once you figure out that 90% of people only use a computer for surfing the web, you may be able to have a conversation that isn't full of your ineptitude and e-peen. Oh, and when you pay for that quad sli mobo and buy four video cards, I laugh when just 6 months later a new card would perform just as much, with a quarter of the heat and electricity usage. But by all means, get what makes you happy, keep up with the joneses or nobody is gonna respect you.

The point of this conversation is not to make you feel as inferior as you really are, but rather it is to illustrate the ever-widening gap between the available performance of Intel vs AMD CPUs. Intel hands-down dominates AMD and despite their best attempts AMD's best cannot come close to Intel's best.

Now even if all you do is general-purpose stuff on your comp, the benefits of high-end hardware do not end. The lifespan of a 980x system will be several years if not more, since it is so much faster than peasant systems, it will continue to be fast enough thus preventing the need for hardware upgrades. Each year you go by without having to upgrade adds value to the purchase. I have no doubt that my system will be more than capable of running any game/app 5 years down the line just as well as it can today...on the other hand, your "bang for the buck" system will show its age within a year or less and start to "feel slow" with newer apps and games.

Yes, my CPU costs more than your car and my motherboard is worth more than your life...but you shouldn't let that make you feel bad. You just need to accept that you fall into that 90% group that you mentioned, and that most people will only ever be average at best in their lives.

RE: Huh?
By mino on 8/30/2010 12:36:23 PM , Rating: 2
Not so much in the business space. That is where this rebrand is targeted.

To those asses still preaching along the "No one got fired for buying IBM" lines.

RE: Huh?
By 2bdetermine on 8/30/2010 3:48:53 PM , Rating: 1
Let's see - what happens if AMD goes out of business tomorrow?

I'll leave it to your imagination.

RE: Huh?
By Cheesew1z69 on 8/30/2010 3:55:32 PM , Rating: 2
Let's see....they won't....

RE: Huh?
By zebrax2 on 8/30/2010 7:41:18 AM , Rating: 4
Ask around what AMD and ATI are and you'll surely get more responses for AMD. With CPUs, people at least look at the brand name. For GPUs, on the other hand, all they care about is whether it has more memory.

RE: Huh?
By blueaurora on 8/30/10, Rating: -1
RE: Huh?
By freeagle on 8/30/2010 8:01:19 AM , Rating: 5
Larger memory .... no.... video memory doesn't impact performance nearly as much as the GPU design and type.

That's undoubtedly true, but all my friends who don't really understand computer hardware (read "the majority") compare GPUs based on the amount of memory.

RE: Huh?
By Motoman on 8/30/2010 2:22:48 PM , Rating: 2
That is correct. To the vast majority of average consumers, a video card is as good as the amount of memory on it.

As absurdly wrong as that is, it's what the market perception is.

...which is why Ma & Pa don't get it when they go to BBY and buy the new $400 Dell thing for sonny, only for him to complain that it won't play WoW.

RE: Huh?
By spread on 8/30/2010 6:26:11 PM , Rating: 2
I blame this on the vast majority of poor sales people. It was such a pain shopping around for a laptop this year.

"Yes, yes I know it has 1 jigglybits of memory, but on WHAT? What IS in there?" -Me talking to a sales rep.

RE: Huh?
By zebrax2 on 8/30/2010 8:05:55 AM , Rating: 2
Laypersons don't really know that, do they? That's why you see custom low-end cards with 1GB of memory that they will never be able to use, but which sell like hotcakes even though they have a larger markup.

Don't get me wrong i know what you are talking about but that is just how it is with people.

RE: Huh?
By Flunk on 8/30/2010 8:42:33 AM , Rating: 5
But the guy in Best Buy told me that memory is all that matters!

RE: Huh?
By rudolphna on 8/30/2010 9:27:55 AM , Rating: 1
Unfortunately this is the case with most Best Buys. I work in PC sales at an upstate New York BB store, and I've run into the same phenomenon: most people shop for graphics cards, or computers, by the amount of video memory. I then have to explain that just because the integrated graphics chip, or Radeon 5450, has 1GB of memory doesn't mean that it is a fast card.

It would be pointless to get into a discussion about GPU architecture in store, so I just simplify by saying that certain models are faster than others, and that memory only really affects the size screen you can run (Which is true). Most people looking to upgrade from integrated graphics but who don't game get a recommendation for a 5450 or 5670. Anybody playing games, depending on monitor size and what games, gets a 5670, 5750 or a 5770.

Usually with these comes a BFG Tech, or Thermaltake power supply since many would try to run it on the POS 250W power supply that comes in most OEM computers.

RE: Huh?
By Cheesew1z69 on 8/30/2010 9:56:33 AM , Rating: 2
that memory only really affects the size screen you can run (Which is true).
Citation for this?

RE: Huh?
By Quadrillity on 8/30/2010 9:58:35 AM , Rating: 2
I agree; I would word it differently when talking to customers though. Sometimes you just have to answer, "this one is faster than this one" lol.

RE: Huh?
By FITCamaro on 8/30/2010 10:25:49 AM , Rating: 2
Memory size can affect how high a resolution you can run with an acceptable framerate. Higher resolutions use more memory from the increased texture sizes.

Of course a 5450 with 1GB of memory won't beat a 5870 with 512MB of memory at 1920x1200.

RE: Huh?
By Cheesew1z69 on 8/30/10, Rating: -1
RE: Huh?
By leexgx on 8/30/2010 12:01:40 PM , Rating: 2
More than 1GB is only used if you're running lots of AA and a big monitor (like a 32in-class resolution) or a 3-monitor setup (Surround/Eyefinity), and even then you may need AA to push the card past 1GB of RAM use (most games' frame buffers are 512MB so they work on everyone's system; normally it's the texture setting that pushes past that).

Two 1GB GTX 460s in SLI can handle Surround at high screen sizes to a point (AA + high res); past that you most likely need an Nvidia card with a bigger frame buffer (GTX 470/480). HardOCP has not done a Surround test with the GTX 460 yet, not that I can find.

On the ATI side, Eyefinity 6-port cards do benefit from 2GB, as usage does go over 1GB at super high resolutions.

Anyway, for normal customers on lower-end cards and screen sizes, 512MB is normally plenty, but the everyday customer looks at price and RAM size.

RE: Huh?
By torpor on 8/30/2010 1:48:34 PM , Rating: 2
Monitor size is irrelevant.

Displayed resolution is highly relevant.

RE: Huh?
By Hieyeck on 8/30/2010 4:03:24 PM , Rating: 2
Oh geez, all the terrible downrates for truths. We need to ban Best Buy IPs from rating here. Yes, memory size is only a part of a GPU's performance, and while it's not the most significant part, it's still a substantial part. Most monitors can't handle max supported resolutions, but for a card to claim it supports those resolutions, it needs that memory.

Some rough math:
The 5870 advertises 2560*1600 = a little over 4 million pixels. Each color needs about 1 byte (32-bit color); each pixel has 3 colors (RGB) + misc data (such as opacity, etc.) - 4 bytes. That's 16MB for one frame. It needs to buffer... say 30 frames (half of 60Hz - don't quote me, I don't know the exact relationship). 30 frames is already just shy of 500MB of data to be buffered. This doesn't include all the algorithms and other processes (model transforms, transparencies, video rendering and decoding) used to create this data to give you a moving display.
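[Editor's note: the arithmetic above can be sanity-checked in a few lines. The 4 bytes/pixel and 30-frame-buffer figures are the commenter's assumptions, not measured GPU behavior.]

```python
# Back-of-the-envelope framebuffer arithmetic from the comment above.
# Assumptions: 4 bytes per pixel (32-bit RGBA) and a hypothetical
# 30-frame buffer; real GPUs typically double/triple buffer at most.
def frame_mb(width, height, bytes_per_pixel=4):
    """Size of one uncompressed frame in MB (1 MB = 2**20 bytes)."""
    return width * height * bytes_per_pixel / 2**20

one_frame = frame_mb(2560, 1600)   # 15.625 MB for a 2560x1600 frame
thirty = 30 * one_frame            # 468.75 MB if 30 frames were buffered
print(f"one frame: {one_frame:.1f} MB, 30 frames: {thirty:.0f} MB")
```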

RE: Huh?
By priusone on 8/31/2010 7:38:10 PM , Rating: 2
You can bring on all the math you want, but you are ALL forgetting the most important number: WATTS. Your typical inexpensive tower has a 250-watt power supply. My cousin wanted to be able to play some of the newer FPSes, and his coworker had him sold on a $300 graphics card. Oh sure, the card was awesome, but the thing needed well over 200 watts itself. The guy does a good job logging, but you can't imagine the hell I endured in convincing him that the card would toast his PSU and possibly the rest of the system.

In the end, he got a decent graphics card, but only after we put in a new PSU. Do you think the average Best Buy worker makes sure and ensures the customers PSU can handle the new graphics card? Doubtful.
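[Editor's note: the budgeting the commenter describes amounts to a simple headroom check. A minimal sketch follows; every wattage figure here is an illustrative assumption, not a measurement of any real part.]

```python
# Rough PSU headroom check illustrating the comment's concern: a GPU's
# draw must fit the supply alongside everything else, with margin.
def psu_ok(psu_watts, component_watts, headroom=0.8):
    """True if total draw stays within `headroom` fraction of the rating."""
    return sum(component_watts) <= psu_watts * headroom

oem_box = [95, 40, 30, 20]   # assumed draw: CPU, board/RAM, drives, fans
print(psu_ok(250, oem_box))           # stock 250W OEM tower: True (fits)
print(psu_ok(250, oem_box + [200]))   # add a ~200W GPU: False (over budget)
```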

RE: Huh?
By leexgx on 8/30/2010 11:32:07 AM , Rating: 2
I think he (Flunk) was joking; that comment he made was not a question.

RE: Huh?
By FaceMaster on 8/30/2010 3:28:13 PM , Rating: 2
Larger memory .... no.... video memory doesn't impact performance nearly as much as the GPU design and type. (look at any design in a 512mb and 1gb variant versus the step up gpu)

I agree with you entirely on this point; I'm guessing you got voted down for the later part about bad driver support, since I feel that ATI has been roughly equal to NVIDIA in this respect. I mean, I remember when the GeForce 8800 series came out and Source games all had this thick white mist everywhere. People told others to buy an ATI equivalent instead. Aside from the 2900 series being shocking, they too had the same problem! (And later on from NVIDIA - though in their defence, they fixed it in a relatively short time-span.) I don't feel there's a need for fanboys in an area as nerdy as graphics cards.

Of course, in CPUs it's a totally different matter and I wish that Intel would burn in hell. /sarcasm

RE: Huh?
By PAPutzback on 8/30/2010 9:06:51 AM , Rating: 2
WOW. That's nuts. I figured anyone who knew how to install a graphics card or any other performance part would have a clue about benchmarks and therefore would have picked up some basic knowledge about CPU/GPU cycles, memory bandwidth, SSD vs HDD, and the basics.

AMD vs ATI, doesn't matter to me. But I have always enjoyed just typing in ATI and hitting ctrl+enter to take me right to the graphics drivers.

RE: Huh?
By VoodooChicken on 8/30/2010 10:30:58 AM , Rating: 2
Ask around, and most people will ask "What the @#$! is a GPU???"

RE: Huh?
By Fracture on 8/30/2010 3:39:56 PM , Rating: 2
I'd dump the AMD name too. You don't give prizes to the last horse in the race, you stick with the winner (ATI).

RE: Huh?
By gamefoo21 on 8/31/2010 12:49:29 PM , Rating: 1
Sooo... now their graphics cards are going to run hot and be slower, but be cheap, since they can't compete in any other way... I still love the HP tablet/convertibles with the 2.6GHz AMD X2s: you'll find a 50% throttle on the CPU, and if you remove the throttle the laptop overheats as soon as you try to do anything on it... lol

I think this is a stupid move, one that's being done to try and prop up AMD's lackluster CPU business.

RE: Huh?
By Reclaimer77 on 8/31/2010 5:29:57 PM , Rating: 2
Yeah I'm really perplexed by this move. Another bad move in a history of bad moves by AMD. But that's ok, I guess they can always blame Intel somehow when this blows up in their faces too.

Correct move
By corduroygt on 8/30/2010 11:09:58 AM , Rating: 5
This is a win-win:

-Enthusiasts know that AMD is just rebranded ATI and will still buy the GPU's
-Uninformed customers have no clue what ATI is, but they heard of AMD and trust it more (according to AMD's research)

So I don't get the backlash against this...looks like a good move

RE: Correct move
By geddarkstorm on 8/30/2010 1:47:57 PM , Rating: 5
This is the internet, there is always a backlash about everything and anything. Including backlash.

RE: Correct move
By Motoman on 8/30/10, Rating: -1
RE: Correct move
By Cheesew1z69 on 8/30/2010 3:58:00 PM , Rating: 3
Well, they did buy ATI, it's theirs to change as they please :)

Smart move
By KIAman on 8/30/2010 11:13:41 AM , Rating: 2
To all the naysayers: can you recall what the acronym ATI stands for?

Exactly. AMD is pretty easy: Advanced Micro Devices.

I rest my case.

RE: Smart move
By Chocobollz on 8/30/2010 1:22:45 PM , Rating: 1
It was easy because AMD made sure the general public would know their name both as AMD and as Advanced Micro Devices, while ATI simply used "ATI". Why? Because it's easier to say (and to remember), so it doesn't matter if people don't know what the acronym stands for. It's not like they'd care about it.

And also, there's something more with the ATI logo than just the name, it's the color. As you know, ATI's logo is mostly in red, and red means "passion". I think it was one of the reason why ATI (and their employees) always have a strong will to get back up after a failure, because ATI = passion.

Now, AMD has a black color for its logo. What does it mean? From quick searches on Google, black most likely means conservative/serious/conventional. I don't think it's a good color to describe a company that is one of the top GPU manufacturers in the world. Even NVIDIA picked a better color (one of the meanings of green is "envy", which is a good match for their name).

And there's one thing I've just realized: the top 3 GPU companies are ATI, NVIDIA, and Intel. And if you put them into their respective colors, they represent red, green, and blue. Does that sound familiar? ;-) That's a perfect balance! I think this is a bad omen for AMD, really.. ;-)

RE: Smart move
By justjc on 8/31/2010 6:16:29 AM , Rating: 2
Did you have a look at the article?
It looks like, from the presented logos, that AMD will stick with ATI's red and black logos. A great way to differentiate between the two main products (green CPU and red GPU).

RE: Smart move
By Cheesew1z69 on 8/30/2010 3:38:21 PM , Rating: 2
In 1985, ATI was founded as Array Technologies Incorporated by Lee Ka Lau

RE: Smart move
By KIAman on 8/30/2010 4:51:39 PM , Rating: 2
Way to misquote...

In 1985, ATI was founded as Array Technologies Incorporated by Lee Ka Lau[1], Benny Lau and Kwok Yuen Ho[2].

And reference the wiki entry as the source...

Thanks for proving my point.

RE: Smart move
By georges1977 on 8/31/2010 12:06:03 AM , Rating: 1
In 1972, a crack commando unit was sent to prison by a military court for a crime they didn't commit. These men promptly escaped from a maximum security stockade to the Los Angeles underground. Today, still wanted by the government, they survive as soldiers of fortune. If you have a problem, if no one else can help, and if you can find them, maybe you can hire... The ATI-Team.

RE: Smart move
By tedrodai on 9/1/2010 10:25:35 AM , Rating: 2
Yeah, it stands for "pretty damn good graphics cards", at least right now.

AMD currently stands for "wait, don't forget about us!".

Best thing about this?
By nafhan on 8/30/2010 10:39:13 AM , Rating: 3
You'll be able to buy a computer that has both an Intel AND an AMD sticker on it! That just strikes me as kind of funny.
Anyway, with Fusion products right around the corner this needed to happen.

RE: Best thing about this?
By nafhan on 8/30/2010 10:50:08 AM , Rating: 2
Never mind, read the Anandtech article and saw this:
AMD states the AMD-less logos are purely at the request of OEMs who sell systems with Intel CPUs and AMD GPUs. I suspect Intel’s logo program may have some stipulations on being used adjacent to a sticker with an AMD logo on it, although AMD told me it was purely at the request of the OEMs trying to avoid confusion.

RE: Best thing about this?
By Silver2k7 on 8/31/2010 3:14:57 PM , Rating: 2
Bah, that's just too bad... the Intel PCs with Radeons should have AMD on the stickers :-)

RE: Best thing about this?
By Amiga500 on 9/2/2010 3:29:14 AM , Rating: 2
Or "Graphics by AMD" on the sticker - but I would only put those stickers on the high-end cards.

(After all, there's no point in AMD carrying the can when Ma and Pa get wee Jimmy that brand-new Dell with a 5450 on board and then wonder why it cannot play games - they would only blame AMD and then go with *shudders* Intel integrated graphics next time out.)

As has been said before...
By morphologia on 9/2/2010 3:39:09 PM , Rating: 2
I'm like the 16th person to say this, but I must weigh in...
AMD is not doing this because the AMD brand is more recognized or successful than ATI...they're doing it in hopes of boosting the AMD brand by riding the coattails of the ATI brand's popularity.

The AMD brand has a loyal, near-cultlike following, but is largely regarded as the underdog. ATI, meanwhile, has permeated its market so thoroughly that they've almost pushed nVidia to underdog status. It seems clear that they're not really jump-starting their graphics biz by using their CPU brand, they're having their CPU biz gain momentum from their GPU brand like a lazy skateboarder hanging onto a moving car.

(DISCLAIMER: Hanging onto a car while skateboarding is dangerous and dumb. People die that way. Don't do it.)

RE: As has been said before...
By xti on 9/3/2010 10:21:24 AM , Rating: 2
AMD is the big fish out of ATI and AMD.
The discrete market is a small pond; it doesn't matter how much of the niche market follows ATI over nVidia, it's still a small pond.

Now you have a big fish in a small pond.

Let's put it this way: if Intel bought nVidia, and they finally came around and said "we're dropping the nVidia line and calling every GTX 260 the Intel 260 from now on" - what would be the perception?

That is what AMD is going for.

wait a minute...
By inperfectdarkness on 8/30/2010 4:31:16 PM , Rating: 2
So all client-level CPUs are now going to have the same word prefix?


By espaghetti on 8/31/2010 2:11:54 AM , Rating: 2
When I was younger, you were hot.
Now someone has decided you are too old...
Good luck becoming the cougar you were destined to be.

By GoneToPlaid on 8/30/10, Rating: -1
By mattclary on 8/30/2010 8:47:56 AM , Rating: 2
12 computers killed by lightning?!?

1. You have really bad luck

By GoneToPlaid on 8/31/2010 8:05:52 AM , Rating: 2
Thanks Matt,

I use UPS backups on all of my computers, but when lightning hits the power line 200 feet from the house there isn't too much that one can do -- except curse Mother Nature!

By mattclary on 8/30/2010 8:51:08 AM , Rating: 1
And, this will help a lot...


By Blight AC on 8/30/2010 9:04:21 AM , Rating: 4
I guess you missed it, but it's just a rebrand. They aren't dropping the Radeon GPUs, just the ATI name on them.

By Cheesew1z69 on 8/30/10, Rating: -1
By jvillaro on 8/30/2010 12:38:30 PM , Rating: 2
It is a shame that AMD plans to kill off ATI -- just when they were finally getting into stride with the brand and technology acquisition. AMD -- don't kill ATI -- at least try to sell ATI to VIA since VIA updates motherboard and chipset drivers even for products which are years old.

Slap yourself...
The original poster got it wrong and he's just explaining.

By sgtdisturbed47 on 8/30/2010 9:27:20 AM , Rating: 2
That's protection built into the PSU, which isn't brand-specific for CPU protection.

Anyway, I was about to call B.S. on this article, but after they clarified that it is the brand they are killing, not the product, I was able to hold myself back from laughing uncontrollably.

By Taft12 on 8/30/2010 12:08:39 PM , Rating: 1
Save your thanks for the motherboard OEM's (Asus, Gigabyte, ...)

They use the same manufacturing process for Intel and AMD boards.

By justjc on 8/31/2010 6:57:50 AM , Rating: 1
They're not killing ATI, they're killing a company name.

AMD will continue to make GPUs; the only difference is that future products like the Radeon HD 6xxx series will be coming from AMD instead of the subsidiary company ATI. Because of this you can expect AMD graphics drivers to support ATI-branded cards for quite some time. At least that is the case for those of us who have motherboards using ATI chipsets, which AMD rebranded back in 2006.

This is the dumbest move EVER...
By T2k on 8/30/10, Rating: -1
RE: This is the dumbest move EVER...
By justjc on 8/31/2010 8:46:04 AM , Rating: 1
Are you sure it's ATI that people recognize?

I'm quite sure it's Radeon and FirePro, the two brands AMD will continue using, as those are the names used in the card name and in short descriptions of the card's specifications.

Honestly, when did you last buy a graphics card (non-ATI brand) that used ATI in the card name?
Even enthusiasts don't buy ATI cards; they buy Radeon cards, which they might know contain a chip designed by AMD-owned ATI and produced at a TSMC plant.
Based on a quick Radeon search on Tom's Hardware Forums, I can conclude that the ATI name is quite often missing there as well, making the case that it doesn't really matter to the forum's users whether the producer is AMD or ATI, as long as it's a Radeon card.

AMD has very little to lose by ditching the ATI name, but a lot of brand recognition to win for the AMD brand.

By ll333r0y on 9/1/2010 12:44:57 PM , Rating: 2
Love the ATi name... sux to see it go.

