
Intel set to cut prices ahead of AMD's "Barcelona" launch

Last week, AMD talked about its desktop plans for the upcoming year in Sunnyvale, California. Although AMD showed its upcoming desktop processor running in single and dual-socket configurations, the company chose not to announce an official launch date for its next-generation desktop processors. According to Robert Rivet, AMD executive vice president and CFO, however, AMD's next-generation processors will be ready by Christmas.

Despite how distant the possible December launch of AMD's native quad-core desktop processors may seem, Intel is already stepping up the competition and will institute a series of aggressive price cuts in July. We originally reported these major price cuts, which target Intel's quad-core desktop and server processors, in March. At the time, the official date of the cuts was unknown; we can now confirm they will take effect on July 22.

Intel Core 2 Quad    Clock       L2 Cache    FSB         July 22
QX6800               2.93 GHz    8MB         1066 MHz    --
Q6700                2.66 GHz    8MB         1066 MHz    --
Q6600                2.40 GHz    8MB         1066 MHz    $266

The first part of the price cuts will center on Intel's quad-core desktop processors. The Q6600, which Intel launched in February, currently sells for $530 in quantities of 1000. When the product was originally launched, it was priced at $851 in quantities of 1000. The next round of price cuts will effectively lower the price to $266. The selling price of the Intel QX6700 will also be lowered, coming in at $530 by the end of July.
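Taken together, the cuts are steep. A quick calculation (just a sketch, using the 1000-unit tray prices quoted above) shows how far the Q6600 will have fallen from its launch price:

```python
# Q6600 1000-unit tray prices quoted in the article
launch_price = 851   # price at launch
current_price = 530  # price before the July 22 cut
new_price = 266      # price after the July 22 cut

def pct_drop(old, new):
    """Percentage decrease from old to new, rounded to one decimal."""
    return round((old - new) / old * 100, 1)

print(pct_drop(current_price, new_price))  # cut relative to today's price
print(pct_drop(launch_price, new_price))   # cut relative to launch price
```

That works out to a 49.8% cut from the current price, and a 68.7% drop from the launch price in the span of a few months.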

Intel Quad Core Xeon DP    Clock       L2 Cache    FSB         July 22
X5365                      3.00 GHz    8MB         1333 MHz    --
X5355                      2.66 GHz    8MB         1333 MHz    $744
E5345                      2.33 GHz    8MB         1333 MHz    --
E5335                      2.00 GHz    8MB         1333 MHz    --
E5320                      1.86 GHz    8MB         1066 MHz    --
E5310                      1.60 GHz    8MB         1066 MHz    --
L5320                      1.86 GHz    8MB         1066 MHz    $320
L5310                      1.60 GHz    8MB         1066 MHz    $273

Intel is also slashing the prices of its quad-core Xeon DP processors. The flagship Xeon DP X5355 will see its introductory $1172 price drop to a more manageable $744. Likewise, Intel's slowest 1333 MHz FSB Xeon DP processor will drop to $316, while Intel's two low-voltage Xeon DP L5320 and L5310 processors will fall to $320 and $273, respectively.

Comments

RE: Intel is out for blood
By FITCamaro on 5/15/2007 11:27:31 PM , Rating: 5
Umm... who exactly games without AA and AF?

*raises hand*

At 1680x1050, the games I play really don't need AA. At resolutions around 1600x1200 and above, AA stops being worth the performance hit, since there are already enough pixels on screen to avoid jagged edges.

And the 2900XT is priced barely above the 8800GTS 640MB card and gets slightly better performance. So how is it priced badly? And the fact is that ATI's card is designed for DX10. Until we know how both companies' cards perform in DX10, you can't call either a winner. If ATI can stay above 60 FPS and excels at DX10 while Nvidia gets higher frame rates in DX9 but is worse at DX10, I'd call ATI the winner.

RE: Intel is out for blood
By redbone75 on 5/16/2007 12:00:54 AM , Rating: 2
I don't have any real loyalties to either company, because they're both out for money. I want to see competition because it's ultimately good for the consumer. So, my post gets voted down because... there is a hint of truth to it?

Face it: sure, you can game at higher resolutions and feel the need to sacrifice AA and AF for performance, but if I'm spending $400 or more on a higher-end part, I want BOTH performance AND great image quality. Sure, both cards were designed with DX10 in mind; that's why people are buying them. And, yes, we won't know which one performs better under DX10 until we get some DX10 titles; however, if history serves as any indicator, how much are you willing to bet that the new DX10 games are going to bring these first-generation DX10 cards to their knees? The fact is these cards are being tested in the now, so their performance in DX9 games is paramount.

"And 2900XT is priced barely over the 8800GTS 640MB card and gets a little better performance."

No, the 8800GTS 320MB even bests the 2900XT with the eye candy turned up, so how is the 2900XT better than the GTS 640MB?

By the time DX10 games are released both companies will likely have released refreshes to their lineups with more horsepower/better drivers/whatever to make them perform better. Like I said, you don't have to justify your purchase to anyone, much less myself. I don't care. I'm not going to spend the high dollar and not get the eye candy and performance I'm expecting, so that means Nvidia gets my money for now. Your reasons for buying either card are exactly that: your reasons. Don't get upset because someone makes a point that's common sense and truthful.

RE: Intel is out for blood
By redbone75 on 5/16/2007 12:08:48 AM , Rating: 2
Oh, a little addendum: these cards were designed for resolutions of 1680x1050 and 1600x1200 or higher, so it's a killer to release a card that can't deliver the goods at those resolutions. Yeah, yeah, we're waiting on DX10 numbers, but we've all been excited about DX10 since it was announced. I'm as anxious as the rest of you to see which card's architecture is better suited for DX10, but if I'm buying one now, it means I'm playing games now. Otherwise, I'll just keep quiet and stick with my X1950.

"Well, there may be a reason why they call them 'Mac' trucks! Windows machines will not be trucks." -- Microsoft CEO Steve Ballmer


Copyright 2016 DailyTech LLC.