
AMD diverts resources from Quad FX platform development the same year it promised Quad FX was the enthusiast future

AMD excited many technology enthusiasts last year when it introduced the Quad FX platform. AMD representatives touted this platform as the next big thing from AMD.

AMD apparently talked up its Quad FX enthusiast platform so well that Intel decided to roll out a competing product, its unreleased Skulltrail platform. When the Quad FX platform first hit the market in January 2007, it seemed doomed from the start to many, with steep price premiums for the mainboards and the processors. These price premiums led to lethargic adoption of the platform.

According to The Tech Report, AMD representative Suzy Pruitt commented on the future of Quad FX: “The short answer is that while there are still engineering resources focused on future platform offerings that build off Quad FX, the current energy and effort has gone into programs and product initiatives like Spider and AMD has discontinued future planning and development of its eight-core enthusiast platform at this time.”

Pruitt continued, “We will continue to support customers that have an existing Quad FX with DSDC and are also working on an upgrade path for those customers. While AMD is not actively promoting AMD Opteron processor as a 2P enthusiast solution, we recognized that there are enthusiasts who are looking for two-socket solutions and think an Opteron platform is well-suited to meet that demand at this time.”

After all the promises and statements by AMD that Quad FX was the company's enthusiast future, AMD has apparently decided to all but kill the platform off. The few enthusiasts who plunked down the big dollars required to adopt the platform should be feeling a bit uncomfortable right now.

AMD promises to continue support for the platform. However, AMD also promised the platform was the future and the company has all but killed it off the same year. The best Quad FX owners can look forward to is an upgrade to Opteron processors that work with the Quad FX mainboards.

Comments

Bad news for the junkie
By TimberJon on 11/29/2007 4:44:30 PM , Rating: 3
I REALLY feel bad for the guy that bought all the top stuff just to brag about it. Like the article said, Quad-FX was hyped up to the max.

Get me interchangeable processors on a Graphics card and I will invest in that.

RE: Bad news for the junkie
By MPE on 11/29/2007 5:15:52 PM , Rating: 2
Get me interchangeable processors on a Graphics card and I will invest in that.

That sounds good, but I am unsure how effective that would be considering things like the memory, controllers, video processor, etc. would remain the same.

Not to mention, I doubt video card manufacturers would want to extend their products' life cycles (aka fewer cards sold).

RE: Bad news for the junkie
By MonkeyPaw on 11/29/2007 5:51:27 PM , Rating: 3
Not to mention that the cost of an entire graphics card is less than that of the system it goes into. For example, an 8800GT is $220-250 and is considered upper-mid-range. Comparatively, an upper-mid CPU+Mobo+RAM combo will easily cost $400+. Buying an 8800GTX for $500-799 is like buying the best motherboard and CPU you can buy, and we all know that costs way more. I know it's not the most elegant example, but the reality is that while interchangeable parts are great for customization, they don't make things cheaper than an entirely integrated system. The moral of the story is that it's cheaper in the long run because the manufacturer mass-assembled the card for us. Incorporating slots and sockets on a daughterboard would increase prices across the board, not to mention that cards would then need more complex BIOSes to detect changes. No thank you.

RE: Bad news for the junkie
By 16nm on 11/29/2007 8:04:06 PM , Rating: 3
I want the entire system integrated onto the processor. I want to just plug my monitor, keyboard and mouse directly into the processor and go. I want it cheap and FAST! How long do I have to wait for this?

RE: Bad news for the junkie
By GeorgeOrwell on 11/29/2007 8:51:27 PM , Rating: 5
You can buy it today. It's called One Laptop Per Child.

It is so advanced, you do not even have to plug in your monitor, keyboard, or mouse.

You just turn it on and your future will be so fast, you'll have to stomp on your disc brakes.

RE: Bad news for the junkie
By timmiser on 11/29/2007 6:56:55 PM , Rating: 2
In the multicore CPU future, I think the era of the graphics card will come to an end. With more cores available on CPUs, I can see graphics processing being done on the CPU, and therefore no need for a graphics card.

RE: Bad news for the junkie
By afkrotch on 12/3/2007 12:36:14 PM , Rating: 2
I don't see a CPU core being used for graphics. General-purpose procs just weren't designed for that. What I can see is leaving out a core and replacing it with a GPU core. I see the standard home proc ending up more along the lines of a mainframe-style proc.

Take for example this old mainframe proc.

Imagine each block being a separate core, capable of being set to whatever style of processor you want. You could have 50 of them set as general CPUs, 20 for GPU, 20 for physics, and 10 for system memory. Or you can simply customize it to whatever suits your needs.

RE: Bad news for the junkie
By latrosicarius on 12/5/2007 1:09:21 PM , Rating: 2
It matters what instruction set the processor is designed to execute.

When the x86 architecture first came out, there were very few "graphics-intensive" applications like games, so screen-rendering instructions were not a big-enough deal to build into the CPU's instruction set.

After games started requiring more and more speed, they had to "invent" (or at least improve) hardware acceleration in order to keep up--thus, the GPU was born.

If they can simply add GPU instructions to the CPU's instruction set, you will be able to have "true hardware" support for games without a graphics card. The CPU will be like a graphics card.

And with multiple cores, the OS could dynamically assign more cores to execute rendering threads if the game requires more resources, just as you can dynamically assign normal threads to available cores today.
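The dynamic assignment described above already exists in miniature with today's thread schedulers: a program can spawn one worker thread per available core and let the OS spread them across the CPUs. A minimal Python sketch (the `render_tile` function and its busy-work loop are hypothetical stand-ins for real rendering work, not anything from the post):

```python
import os
from concurrent.futures import ThreadPoolExecutor

def render_tile(tile_id):
    # hypothetical stand-in for a rendering workload;
    # just burns some CPU cycles and returns a result
    return sum(i * i for i in range(1000))

# give the "renderer" as many worker threads as there are cores;
# the OS scheduler then distributes those threads across the cores
workers = os.cpu_count() or 1
with ThreadPoolExecutor(max_workers=workers) as pool:
    frames = list(pool.map(render_tile, range(8)))

print(len(frames))
```

Scaling `max_workers` up or down at runtime is the software analogue of "assigning more cores" to a demanding game, though only true GPU instructions in the CPU (as the comment suggests) would make the work itself hardware-accelerated.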

RE: Bad news for the junkie
By Gul Westfale on 11/29/2007 10:53:48 PM , Rating: 2
weren't the original quadFX chips little more than rebranded opterons? so what is stopping the owners of these boards from upgrading to faster opterons?

RE: Bad news for the junkie
By murphyslabrat on 12/3/2007 12:29:51 PM , Rating: 2
Actually, I have owned two ATI RAGE 128 PRO AGP cards that had expandable memory via a SO-DIMM socket. The same GPU integrated into mobos often had the same feature, and some mobos with IGPs offered expandability via a headless AGP VRAM card. An example of this was Dell's OptiPlex series.

My point is that this has been done, but buying a low-volume part to marginally increase performance (look at comparisons between 256MB and 512MB variants of graphics cards) while increasing cost at best proportionally....

Not a good tradeoff

RE: Bad news for the junkie
By DallasTexas on 11/30/2007 8:17:29 AM , Rating: 4
Agreed, but it's fun to do a forum search for 'Quad FX' back to Nov 2006 and see what posters commented on it. It is hysterical.

" will wipe Conroe ..."
"..Intel is dead..."
"..this is the system to get.."

RE: Bad news for the junkie
By Strunf on 12/1/2007 9:21:33 PM , Rating: 2
Funny, if you do a forum search for "Phenom" you get just about the same comments.

RE: Bad news for the junkie
By MaK2000 on 12/1/2007 11:17:15 PM , Rating: 2
And Phenom is getting the same rave reviews. All hype, and still slower than Conroe, much less Penryn.

RE: Bad news for the junkie
By Denithor on 11/30/2007 8:49:12 AM , Rating: 1
Get me interchangeable processors on a Graphics card and I will invest in that.

This is actually similar to a thought I've had for a while now. Why don't they put a GPU socket directly on the motherboard instead of an entire card that connects through a relatively slow slot with limited bandwidth?

I mean come on, if we can install a CPU in a socket, why not a GPU as well? Just choose a motherboard with the applicable sockets, buy the chips, and away you go. And a year down the road, when the GPU is getting old, replace just the chip itself at far lower cost than replacing an entire card.

This would also result in less e-waste going into the landfills (I know, I would never toss electronics but many people don't even know better or are too lazy to be bothered with proper disposal).

RE: Bad news for the junkie
By Gul Westfale on 11/30/2007 9:38:34 AM , Rating: 2
Because a new chip like an 8800 would be handicapped by the narrow memory interface and slow memory chips of an older card.

RE: Bad news for the junkie
By ceefka on 11/30/2007 11:10:56 AM , Rating: 2
Then perhaps GPUs should also have HT (2/3) and an on-board memory controller, maybe even their own memory slots. It makes sense, but I doubt it would be any cheaper in the long run.

RE: Bad news for the junkie
By latrosicarius on 12/5/2007 1:14:22 PM , Rating: 2
Video cards nowadays use GDDR3 RAM, anywhere from 256MB to 768MB. You would have to add more RAM to the system to make up the difference.

Copyright 2016 DailyTech LLC.