



New AMD graphics branding

The evolution of AMD/ATI branding
AMD's market research shows that it's time to get rid of the ATI brand

It's been a long four years, but AMD has finally hit its stride after its acquisition of ATI Technologies back in 2006. After agreeing to purchase ATI for $5.4B, AMD was besieged with quarterly losses stemming from the purchase, constant pressure from NVIDIA in the graphics market, and beatdowns from Intel (which wasn't exactly playing by the rules of fair business) in the processor market.

With most of its troubles now behind it, AMD is looking to kill off the long-standing ATI brand and bring Radeon and FirePro graphics solutions solely under the AMD umbrella, according to AnandTech.

Based on its own research in markets around the world, AMD came to the following three conclusions:

  1. AMD preference triples when respondent is aware of ATI-AMD merger
  2. AMD brand [is] stronger than ATI vs. graphics competitors
  3. Radeon and FirePro brand awareness and consideration [is] very high

The move will also help to further consolidate AMD's branding, which has pretty much gotten out of hand in the past few years [see figure on right]. AMD will begin the transition later this year to phase out ATI branding and move to a more simplified product lineup. By 2011, AMD's product lineup will consist of Opteron for server processors, Vision (which consists of a CPU/GPU hybrid) for consumer processors, and Radeon/FirePro for graphics.

With AMD now taking the discrete graphics market lead from NVIDIA (51.1 percent for AMD versus 44.5 percent for NVIDIA) and preparing to take the fight straight to Intel with three new CPU designs, the next year should be a fruitful one for enthusiasts.






RE: Huh?
By Flunk on 8/30/2010 8:42:33 AM , Rating: 5
But the guy in Best Buy told me that memory is all that matters!


RE: Huh?
By rudolphna on 8/30/2010 9:27:55 AM , Rating: 1
Unfortunately this is the case with most Best Buys. I work in PC sales at an upstate New York Best Buy store, and I've run into the same phenomenon: most people shop for graphics cards or computers by the amount of video memory. I then have to explain that just because an integrated graphics chip or a Radeon 5450 has 1GB of memory doesn't mean it's a fast card.

It would be pointless to get into a discussion about GPU architecture in the store, so I simplify it to saying that certain models are faster than others, and that memory only really affects the size of screen you can run (which is true). Most people looking to upgrade from integrated graphics who don't game get a recommendation for a 5450 or 5670. Anybody playing games, depending on monitor size and which games, gets a 5670, 5750, or 5770.

Usually with these comes a BFG Tech or Thermaltake power supply, since many people would otherwise try to run the card on the POS 250W power supply that comes in most OEM computers.


RE: Huh?
By Cheesew1z69 on 8/30/2010 9:56:33 AM , Rating: 2
quote:
that memory only really affects the size screen you can run (Which is true).
Citation for this?


RE: Huh?
By Quadrillity on 8/30/2010 9:58:35 AM , Rating: 2
I agree; I would word it differently when talking to customers though. Sometimes you just have to answer, "this one is faster than this one" lol.


RE: Huh?
By FITCamaro on 8/30/2010 10:25:49 AM , Rating: 2
Memory size can affect how high a resolution you can run with an acceptable framerate. Higher resolutions use more memory from the increased texture sizes.

Of course, a 5450 with 1GB of memory won't beat a 5870 with 512MB of memory at 1920x1200.


RE: Huh?
By Cheesew1z69 on 8/30/10, Rating: -1
RE: Huh?
By leexgx on 8/30/2010 12:01:40 PM , Rating: 2
More than 1GB is only used if you're running lots of AA and a big monitor (around 32in resolutions) or a 3-monitor setup (Surround/Eyefinity), and even then you may need AA to push the card past 1GB of RAM use (most games' frame buffers are 512MB so they work on everyone's system; normally it's the texture setting that pushes past that).

Two 1GB GTX 460s in SLI can handle Surround at high screen sizes up to a point (AA + high res); beyond that you'd most likely need an NVIDIA card with a bigger frame buffer (GTX 470/480). HardOCP hasn't done a Surround test with the GTX 460 yet, not that I can find.

On the ATI side, the Eyefinity 6-port cards do benefit from 2GB, as usage does go over 1GB at super high resolutions.

Anyway, for normal customers on lower-end cards and screen sizes, 512MB is normally plenty, but the everyday customer looks at price and RAM size.


RE: Huh?
By torpor on 8/30/2010 1:48:34 PM , Rating: 2
Monitor size is irrelevant.

Displayed resolution is highly relevant.


RE: Huh?
By Hieyeck on 8/30/2010 4:03:24 PM , Rating: 2
Oh geez, all the terrible downrates for truths. We need to ban Best Buy IPs from rating here. Yes, memory size is only one part of a GPU's performance, and while it's not the most significant part, it's still a substantial one. Most monitors can't handle a card's maximum supported resolution, but for the card to claim support for those resolutions, it needs that memory.

Some rough math:
A 5870 advertises 2560*1600 = a little over 4 million pixels. Each color channel needs about 1 byte (32-bit color); each pixel has 3 channels (RGB) plus misc data (such as opacity), so 4 bytes per pixel. That's about 16MB for one frame. Say it needs to buffer 30 frames (half of 60Hz - don't quote me, I don't know the exact relationship); 30 frames is already just shy of 500MB of data to buffer. This doesn't include all the algorithms and other processes (model transforms, transparencies, video rendering and decoding) used to create this data and give you a moving display.
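The rough arithmetic in that comment can be sketched in Python. Note the 4 bytes per pixel (RGBA) is standard 32-bit color, but the 30-frame buffer depth is the commenter's own guess, not how a real GPU pipeline works:

```python
# Back-of-the-envelope framebuffer math for a 2560x1600 display.
# Assumes 4 bytes per pixel (RGB + opacity); the 30-frame depth is
# the commenter's assumption, not an actual driver buffer depth.

def framebuffer_bytes(width, height, bytes_per_pixel=4):
    """Memory needed for a single uncompressed frame."""
    return width * height * bytes_per_pixel

one_frame = framebuffer_bytes(2560, 1600)   # 16,384,000 bytes, ~16 MB
thirty_frames = 30 * one_frame              # ~492 MB, "just shy of 500MB"

print(f"one frame:   {one_frame / 1e6:.1f} MB")
print(f"30 frames:   {thirty_frames / 1e6:.1f} MB")
```

This confirms the comment's figures: one frame is about 16MB, and thirty of them come to a bit under 500MB.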


RE: Huh?
By priusone on 8/31/2010 7:38:10 PM , Rating: 2
You can bring on all the math you want, but you are ALL forgetting the most important number: WATTS. Your typical inexpensive tower has a 250-watt power supply. My cousin wanted to be able to play some of the newer FPSs, and his coworker had him sold on a $300 graphics card. Oh sure, the card was awesome, but the thing needed well over 200 watts itself. The guy does a good job logging, but you can't imagine the hell I endured convincing him that the card would toast his PSU and possibly the rest of the system.

In the end, he got a decent graphics card, but only after we put in a new PSU. Do you think the average Best Buy worker ensures the customer's PSU can handle the new graphics card? Doubtful.
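The headroom check described above can be sketched as a quick calculation. The ~200 W card figure comes from the comment; the ~150 W estimate for the rest of the system and the 80% safety margin are assumptions for illustration:

```python
# Rough PSU headroom check. gpu_watts is from the comment above;
# rest_of_system_watts and the 0.8 headroom factor are assumed
# rule-of-thumb values, not measured figures.

def psu_ok(psu_watts, gpu_watts, rest_of_system_watts=150, headroom=0.8):
    """True if the estimated load fits within a safety margin of the PSU rating."""
    return gpu_watts + rest_of_system_watts <= psu_watts * headroom

print(psu_ok(250, 200))  # stock 250 W OEM supply
print(psu_ok(500, 200))  # after upgrading the PSU
```

With these numbers the stock 250 W supply fails the check while a 500 W unit passes, which matches the outcome in the comment.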


RE: Huh?
By leexgx on 8/30/2010 11:32:07 AM , Rating: 2
I think he (Flunk) was joking; that comment he made was not a question.


Copyright 2014 DailyTech LLC.