
[Image: New AMD graphics branding]

[Figure: The evolution of AMD/ATI branding]
AMD's market research shows that it's time to get rid of the ATI brand

It's been a long four years, but AMD has finally hit its stride after its acquisition of ATI Technologies way back in 2006. After agreeing to purchase ATI for $5.4 billion, AMD was besieged by quarterly losses stemming from the purchase, constant pressure from NVIDIA in the graphics market, and beatdowns from Intel (which wasn't exactly playing by the rules of fair business) in the processor market.

With most of its troubles now behind it, AMD is looking to kill off the long-standing ATI brand and bring Radeon and FirePro graphics solutions solely under the AMD umbrella, according to AnandTech.

AMD's own research in markets around the world led it to the following three conclusions:

  1. AMD preference triples when respondent is aware of ATI-AMD merger
  2. AMD brand [is] stronger than ATI vs. graphics competitors
  3. Radeon and FirePro brand awareness and consideration [is] very high

The move will also help to further consolidate AMD's branding which has pretty much gotten out of hand in the past few years [see figure on right]. AMD will begin the transition later this year to phase out ATI branding and move to a more simplified product branding lineup. By 2011, AMD's product lineup will consist of AMD's Opteron for server processors, Vision (which consists of a CPU/GPU hybrid) for consumer processors, and Radeon/FirePro for graphics.

With AMD now taking the discrete graphics market lead from NVIDIA (51.1 percent for AMD versus 44.5 percent for NVIDIA) and preparing to take the fight straight to Intel with three new CPU designs, the next year should be a fruitful one for enthusiasts.

Comments

RE: Huh?
By Cheesew1z69 on 8/30/2010 10:38:21 AM , Rating: -1
Sure, it can affect the monitor size and refresh rate, but it affects a lot more than just that; to say that's all it affects for a customer is silly. If that were all it affected, why put up to 2GB on a card? No resolution alone would require 2GB of memory on any card.

RE: Huh?
By leexgx on 8/30/2010 12:01:40 PM , Rating: 2
More than 1GB is only used if you're running lots of AA on a big monitor (around 32in) or a three-monitor setup (Surround/Eyefinity), and even then you may need AA to push the card past 1GB of RAM use (most games' frame buffers are 512MB so they work on everyone's system; normally it's the texture setting that pushes past that).

Two 1GB GTX 460s in SLI can handle Surround at high screen sizes up to a point (AA + high res); beyond that you most likely need an NVIDIA card with a bigger frame buffer (GTX 470/480). HardOCP hasn't done a Surround test with the GTX 460 yet, not that I can find.

On the ATI side, Eyefinity 6-port cards do benefit from the 2GB, as usage does go over 1GB at super high resolutions.

Anyway, for normal customers on lower-end cards and screen sizes, 512MB is normally plenty, but the everyday customer just looks at price and RAM size.

RE: Huh?
By torpor on 8/30/2010 1:48:34 PM , Rating: 2
Monitor size is irrelevant.

Displayed resolution is highly relevant.

RE: Huh?
By Hieyeck on 8/30/2010 4:03:24 PM , Rating: 2
Oh geez, all the terrible downrates for truths. We need to ban Best Buy IPs from rating here. Yes, memory size is only one part of a GPU's performance, and while it's not the most significant part, it's still a substantial one. Most monitors can't handle a card's maximum supported resolutions, but for the card to claim support for those resolutions, it needs that memory.

Some rough math:
The 5870 advertises 2560*1600 = a little over 4 million pixels. Each color needs about 1 byte (32-bit color); each pixel has 3 colors (RGB) + misc data (such as opacity) - 4 bytes total. That's about 16MB for one frame. Say it needs to buffer 30 frames (half of 60Hz - don't quote me, I don't know the exact relationship); 30 frames is already just shy of 500MB of data to be buffered. This doesn't include all the algorithms and other processes (model transforms, transparencies, video rendering and decoding) used to create this data and give you a moving display.
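The rough math above can be checked with a short script. This is just a sketch of the commenter's estimate (frame size times an assumed 30 buffered frames), not how GPUs actually allocate memory; real drivers typically double- or triple-buffer.

```python
# Back-of-envelope framebuffer math from the comment above.
WIDTH, HEIGHT = 2560, 1600
BYTES_PER_PIXEL = 4          # 32-bit color: R, G, B + alpha/misc, 1 byte each

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
frame_mb = frame_bytes / (1024 ** 2)

FRAMES_BUFFERED = 30         # the comment's assumption, not a real driver setting
total_mb = frame_mb * FRAMES_BUFFERED

print(f"one frame: {frame_mb:.1f} MB")            # ~15.6 MB
print(f"{FRAMES_BUFFERED} frames: {total_mb:.0f} MB")  # ~469 MB, "just shy of 500MB"
```

The per-frame figure matches the comment's "16MB for one frame" (the small difference is decimal vs. binary megabytes).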

RE: Huh?
By priusone on 8/31/2010 7:38:10 PM , Rating: 2
You can bring on all the math you want, but you are ALL forgetting the most important number: WATTS. Your typical inexpensive tower has a 250-watt power supply. My cousin wanted to play some of the newer FPSs, and his coworker had him sold on a $300 graphics card. Sure, the card was awesome, but it needed well over 200 watts itself. The guy does a good job logging, but you can't imagine the hell I endured convincing him that the card would toast his PSU and possibly the rest of the system.

In the end, he got a decent graphics card, but only after we put in a new PSU. Do you think the average Best Buy worker ensures the customer's PSU can handle a new graphics card? Doubtful.
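The wattage concern can be sanity-checked with a back-of-the-envelope power budget. The component figures below are purely illustrative (the 200 W card is the one from the comment; the rest are hypothetical round numbers), so check your actual hardware's specs before buying.

```python
# Hypothetical power budget; all wattages are illustrative assumptions.
PSU_WATTS = 250  # the typical inexpensive tower from the comment

components = {
    "CPU": 95,
    "motherboard/RAM/drives/fans": 60,
    "new graphics card": 200,  # the card from the comment above
}

load = sum(components.values())
headroom = PSU_WATTS - load
print(f"estimated load: {load} W, headroom: {headroom} W")
# Negative (or merely tiny) headroom means the PSU needs upgrading first.
```

With these numbers the budget comes out roughly 100 W over, which is exactly the "toast his PSU" scenario described above.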

"Well, we didn't have anyone in line that got shot waiting for our system." -- Nintendo of America Vice President Perrin Kaplan


Copyright 2016 DailyTech LLC.