
Did I read those benchmarks right? (Source: PC Magazine, June 15, 2007)

Techland's Call of Juarez
The ATI Radeon HD 2900 XT gives the GeForce 8800GTX a run for its money in high end applications

Kelt Reeves is the CEO of Falcon Northwest.  This story was originally published on Kelt's Take.

It would probably come as no surprise to even the most casual PC hardware enthusiast that NVIDIA has been dominating the high-end graphics card market for going on two years now. What was the years-running battle of ATI and NVIDIA, each leapfrogging the other with a faster card every six months, seems to be a thing of the past. The R600, ATI’s long-awaited DirectX10 card, was ATI’s last hope for remaining an option on the high-end enthusiast’s shopping list. Seriously delayed, the R600 finally arrived last month (as the officially named ATI Radeon HD 2900XT), and the press was… less than kind.
  • "ATI Radeon HD 2900 XTX, doomed from the start."
    • DailyTech
  • "...the ATI Radeon HD 2900 XT, in our opinion, is a flop."
    • [H]ardOCP
  • "Sometimes it has a hard time keeping up with a 320MB 8800GTS..."
    • Guru3D
The Radeon HD 2900XT got the kind of press normally reserved only for unrepentant hotel heiresses convicted of multiple DUIs. It got its butt kicked at almost every press outlet. Instead of being ATI’s great hope for DX10 performance titles in the future, it’s been relegated to the bargain-card bin.
  • "That boy was our last hope."
    • -Ben Kenobi
  • "No. There is another."
    • -Yoda
What other hope? Surely I must be shamelessly quoting Yoda hypothetically, in a cheap pop-culture reference to some future card that ATI PR is now saying will make .5 past lightspeed. It’s obvious that ATI lost this round. Isn’t it?
What if ATI won a round, and not even ATI noticed?
This is of course, impossible. If ATI had a winner product their PR department would be crowing about it to the heavens. That’s what PR does, over-hypes wins and spins losses. Even if every other part of a company has problems, PR still manages to fill the world with FUD. You know PR guys - they’re full of it. [Disclaimer: I do most of the PR for Falcon] There’s just no conceivable way the ATI Radeon HD 2900XT card was anything but a disappointment or we would’ve heard about it somewhere.
Did I read those benchmarks right?

Yes. This is Falcon’s fastest ATI Crossfire based system put up against our own fastest SLI based system that PC Magazine had reviewed less than two months earlier. The testing was done under Windows Vista 32-bit. And yes, our ATI based system destroyed the NVIDIA based system. Even given that NVIDIA had made driver improvements in between the time these two systems were benchmarked, these scores aren’t even close. Prey at 2560 resolution was almost twice as fast on ATI than it was on NVIDIA. Company of Heroes at 2560 was more than twice as fast. And we couldn’t clock the CPU on the Intel/ATI combo as high as we could on the Nvidia motherboard. Want to see the full story for yourself? Here’s the link to the PC Magazine review.    

So how did ATI win, exactly? And why are PC Magazine and Falcon Northwest the only ones that know it?

It's complicated, but I can boil it down to 3 main reasons:
  1. Super-secret hardware with dastardly effective codenames.
  2. Great Windows Vista drivers.
  3. Someone at ATI mistakenly sending "the good cards" out to Falcon Northwest.
ATI, despite being purchased by AMD, is still essentially a Canadian company. As such, I believe they name their products in metric. It’s the only excuse I can invent for them. For instance, the code names of their R6xx series products: R600 is the fastest. R610 is the slowest. R630 is in the middle. Okay so maybe this isn’t metric so much as just unhelpful. The U.S. tried to go metric in the ‘70s. I found that also unhelpful, but I digress.

To graduate from unhelpful idea to shooting yourself in the foot: Make 3 different Radeon HD2900XT cards that are vastly different performers but all have the same name. Then do not publicly acknowledge the existence of the fastest card. Continue denying all knowledge of the faster card even when the press eviscerates the slower version. This is what ATI did.     
So here are the Cliff’s Notes on the 3 different cards so you can identify them in the wild:
  1. ATI Radeon HD2900 XT - 9" long, 512 Megs of GDDR3 memory. This is the card that’s getting all the bad press. Deservedly so? Possibly. Suffice to say it’s not the high end product that Falcon Northwest’s clients would be most interested in.

  2. ATI Radeon HD2900 XT - 12" long, 1024 Megs of GDDR4 memory. Huh? You probably would’ve noticed the world’s first GDDR4 based graphics card. Especially if it was a gigantic 12" monstrosity that would have a hard time fitting in most PCs, right? This card was only sampled to system builders, and to my knowledge none of them picked it up as a product. It’s too big, WAY too hot, has clearance issues - forget I brought it up.

  3. ATI Radeon HD2900 XT - 9" long, 1024 megs of GDDR4. This is what we’re talking about! Two of these are what walked all over the SLI’d NVIDIA 8800 GTX cards at PC Magazine. Ask for it by name. Wait, that won’t really help. Some places have called this the "XTX". While technically incorrect, I’ll go with it. Anything to differentiate the winning version from the losing. For now, this card will only be available from Falcon Northwest and a handful of other boutiques in new systems.
Adding to the unhelpful metric nomenclature, the 9" GDDR3 512MB card appears to be physically identical to the 9" 1024MB GDDR4 version. If you had one of each in your hands you would not be able to tell them apart. Come on ATI, even TV executives by the late ‘60s had figured out how to identify the good version and the bad version of otherwise identical-looking things. At the very least ATI could’ve painted goatees on the slow cards.
It's the drivers.
That this article needs to exist is cause for a bit of retroactive chastising of NVIDIA's drivers. I say retroactive because NVIDIA and ATI are constantly improving drivers, and some benchmarks I ran just days ago have NVIDIA catching up on some of the scores PC Magazine saw. But of course, ATI is always working on driver improvements of their own. It’s always tough to draw a line in time and say accurately how fast each card is at that moment. But that’s what all the press reviewers did in their write-ups of the Radeon HD 2900XT: they benchmarked them against NVIDIA’s cards at the same moment. How come PC Magazine’s results are so different?

Every review we found online was only testing single graphics cards, no SLI vs. Crossfire dual card setups. And all were under Windows XP. And we only saw one review that even got a hold of a GDDR4 version (that was leaked against ATI's wishes), and none with two GDDR4s run in Crossfire. So the vast majority of reviews we saw were pitting a single GDDR3 Radeon against NVIDIA's offerings, under Windows XP. And in that environment, I do not disagree with their findings.

But isn’t the big selling point of this entire new generation of cards from both ATI and NVIDIA their DirectX10 ability? Sure, there’s very little DirectX10 content to test right now, but Windows XP isn’t going to support DirectX10, so why not at least test it on the new OS?

And Falcon Northwest’s customers typically buy high end setups. All of our tower systems are dual-card ready, and most of our clients buy two cards with their new systems. If you were buying a new high end PC, wouldn’t your goal be for the fastest DX10 performance you could prepare for? Crysis IS coming people.
  • "Twins Basil! Twiiins!"
    • -Austin Powers
PC Magazine’s benchmarks show how two similar Falcon machines did with dual card setups under Vista. And the Crossfire setup easily beat the SLI setup. It’s no secret NVIDIA has had huge Vista driver issues, despite being first to market with a Vista-ready DX10 card by 5 months. Under XP, where both NVIDIA and ATI have established and polished drivers, NVIDIA's faster hardware wins. For any of our customers running Windows XP, I’d recommend NVIDIA's cards over any of the new Radeons. When you move to Vista, scores start to even out. And when you move to dual card setups under Vista, SLI encounters significant scaling problems. Crossfire scales up performance well. The end result is what you see on that benchmark chart.
  • Caveat Emptor Magnus Conclamatio - 'Buyer Beware of Loud Noises'
Sure, I made that up with Google’s dubious translating help, but there is one downside to the Radeon card that carries more weight when it’s in questionable Latin: the Radeon cards are loud. Two of them are very loud. The loudest cards ever made? No. But NVIDIA has been lowering the decibel bar lately, and our clients tend to really appreciate that. The NVIDIA 8800 Ultra, with its oversize fan, is the much quieter choice at the highest end. It’s something ATI should have considered, and, depending on how much you care about your system’s noise, something you may need to consider as well.
  • "There are three kind of lies; lies, damned lies, and benchmarks."
    • -Loyd Case      
I believe I’m stealing Loyd Case’s own twist on Mark Twain’s famous saying about statistics here, but it fits. Benchmarks, like statistics, can be used selectively to prove a point. The benchmark table from PC Magazine above pits two of our own PCs against each other. My point? Falcon’s Swiss-like neutrality. Falcon Northwest has no vested interest in which brand of graphics card you purchase in your system from us, as long as we get you the right brand for your needs. We sell both ATI and NVIDIA cards, and enjoy working with both companies. This article is not about selling you an ATI card. I just want Falcon’s visitors to realize there is finally another viable option in graphics cards for them to look at. I felt the need to tell you, as no one else, not even ATI, did.

Epilogue: Graphics card guys, you've had 5 months since Vista shipped. The DX10 drivers need to be ready. Now.

In the past couple of weeks some very important developments have occurred in Vista gaming. The first two games that require Vista to run shipped. Not surprisingly, both are published by Microsoft: Halo 2 and Shadowrun. Whether these games are your cup of tea or not, expect more of this trend. Microsoft has a lot of muscle and wants you to move to Vista. That’s not such a bad thing. No one’s enjoying Windows 95 anymore. You will move to Vista, it’s just a question of when. Microsoft making its games Vista-only can best be termed the "stick" approach.

I prefer the "carrot" approach, which recently shipped in the form of the first 'real' DirectX10 game: Call of Juarez. I say "real", because the also recently available DirectX10 patch for Company of Heroes really doesn’t seem to add much graphical goodness, and it cripples the framerate. Call of Juarez ships with both a DirectX9 version and a DirectX10 version, which looks significantly better. And when you run the DX10 version, it warns you of all the extra graphical goodies it’s going to enable and that this will require a lot more PC power. As a system builder, that’s music to my ears. As a gamer, it’s the reason why I love this hobby.



Oh boy
By Griswold on 6/28/2007 3:01:16 AM , Rating: 2
Can somebody filter out the unsuccessful attempts of being funny and provide a distilled version of this article with the essential parts? I got a headache after the first few paragraphs...

RE: Oh boy
By CrazyBernie on 6/28/2007 4:10:22 AM , Rating: 5
Umm... the 2900 doesn't suck as much as people think? I think that pretty much sums it up.

RE: Oh boy
By Lightning III on 6/28/2007 8:39:44 AM , Rating: 5

what I got is that NVIDIA's SLI Vista drivers suck and ATI's Crossfire drivers are a little less sucky

ooh, the good card is coming out soon, at least Diamond's version of it - saw a review

he left off one of the main selling points: cost.
these are Ultras they're going up against, after all - there's a $400 price difference ($499 MSRP for the 1GB GDDR4 model)

also go look at the Computex FiringSquad invitational, where NVIDIA did come with a system actually assembled and tuned by them and got the big no-prize

and a machine with an ATI chipset and video cards was the winner

(personally I think the cooler-running ATI chipset allowed a higher overclock [on air] than the 680i heat beast)

so there's a second source essentially saying the same thing

the 8800 is evolutionary; the new architecture of R600 is revolutionary. unfortunately it requires individual optimization for each game - that's why the gap is slowly but surely closing

that and the real DX10 battle is just starting - that's where it's supposed to shine

RE: Oh boy
By christojojo on 6/28/2007 3:58:24 PM , Rating: 2
I don't know if it's my office computer or the site, but it slows my PC to a crawl (3000+, 512MB DDR, NV GT 6600).

For anyone that is interested: the article reads like one big commercial. It took a long time to get to the meat of it. Sapphire received kudos for its OC'ed watercooled 2900XT.

Firingsquad seemed to have interesting articles, but every page chugged my system to a jerky crawl. (Not good web design.)

RE: Oh boy
By wordsworm on 6/29/2007 6:05:54 AM , Rating: 2
The guy is definitely drawing attention to his company, and I don't think he hid it, as it's pretty obvious. But even if that's true, I don't think it makes the article illegitimate. If anything, it solidifies the position he took: one of the few companies still independent from the big conglomerates (which absorbed Alien and that other one...) has offered a different perspective on the cards. It's clear there's a great deal of turmoil going on at ATI/AMD.

I suspect that the company is heavily investing R&D into convergence chips, and that's why they're not focussed on selling their current generation of cards. Of course, that's an uneducated guess. I wonder if an engineer who reads these posts could offer an idea as to the change that would take place if the CPU and GPU were coupled together. I think the bus speed between CPU and GPU would be the first, biggest thing that would jump astronomically. Currently, the architecture built around having the CPU and GPU communicate with each other seems to be the biggest burden on current boards. It just seems to be the next big leap in computer technology, and AMD seems to have it in mind. If they achieve this marriage, I can't help but think that they could bring woe back into the Intel camp, not to mention NVIDIA.

RE: Oh boy
By Regs on 7/1/2007 3:31:43 AM , Rating: 2
The director wants to go here, and only the engineer knows how.

The only problem is that the director only sees the future, with a complete lack of regard for the present problems (Hector).

Even in AMD's slide show of how they plan to merge processes together, it will take time, and a lot of it. AMD does not have a lot of time to play with. The first Fusion product is not expected until 2008-2009, and that's being optimistic. Even then AMD does not plan much for the first products other than saving... yet... more power.

The true benefits will come later, when they are truly integrated, but we won't be seeing that until 2010+. It does not matter, because any sooner and the industry won't be ready for it.

If you want an answer to how the processor will perform, just take a look at the Cell processor. The Cell processor is hard to program as of right now, and it's not x86.

RE: Oh boy
By soydios on 6/28/2007 11:50:25 AM , Rating: 5
Correction: the 9-inch-long HD2900XT with 1GB DDR4 memory is better than anyone knows about.

RE: Oh boy
By Crusader on 6/28/2007 9:16:01 AM , Rating: 2
Graphics card guys, you've had 5 months since Vista shipped. The DX10 drivers need to be ready. Now.

That is the summary.

To reiterate-
ATI "beats" NV in multi-GPU mode under Vista. Not a surprise; while ATI drivers have been consistently the worst for years, NV is now behind in its Vista drivers, with the multi-GPU portion showing significant losses in some cases.
It is a temporary loss until NV stops being lazy (or diverting resources to new/hidden projects instead of focusing on its existing product line).
Simple truth is, though, Crossfire/SLI are for fools anyway. Most of the wise run a single 8800GTX, call it a day, and don't deal with these multi-GPU issues.

RE: Oh boy
By Spoelie on 7/4/2007 9:05:51 AM , Rating: 2
Not a surprise, while ATI drivers have been consistently the worst for years..
Welcome to the real world: ever since ATI started the Catalyst program, their drivers have been at least on par with NVIDIA's.

RE: Oh boy
By Spyvie on 6/28/2007 1:31:33 PM , Rating: 2
Can somebody filter out the unsuccessful attempts of being funny

Actually, I rather enjoyed the humor, I'd say this column was a home run.

Falcon just shot up a couple of spots on my radar.

RE: Oh boy
By VooDooAddict on 6/28/2007 4:23:52 PM , Rating: 2

Especially after that VooDooPC blog.

Heading over to price out a few laptops now.

RE: Oh boy
By just4U on 6/29/2007 4:39:16 AM , Rating: 3
I enjoyed this article and thought it was well written!

RE: Oh boy
By Sceptor on 6/30/2007 6:32:16 PM , Rating: 2
They want you to keep your eyes open for this card...ATI Radeon HD2900 XT - 9" long, 1024 megs of GDDR4. It's not supposed to suck like the regular HD2900 XT.

The ATI equivalent of a unicorn AND they want you to run 2 of them in Crossfire.

RE: Oh boy
By jmke on 7/4/2007 7:19:26 AM , Rating: 2
What this finding boils down to is that Crossfire under Windows Vista works better than SLI under Vista.

With 99% of people out there using Windows XP, this article is catering to a very small crowd:)

RE: Oh boy
By Spoelie on 7/4/2007 8:57:12 AM , Rating: 2
It's also factually incorrect. ATI had the better card during the nine-month reign of the R580. The 7900GTX was slower - not by much, but slower nonetheless.

Also, it would be about time to retest the cards in a single card system as well, ATi should have improved the driver situation since launch time quite a bit.

By AmberClad on 6/28/2007 7:27:02 AM , Rating: 4
From what I gather from the article, only Falcon-NW and a select few other system builders have access to the so-called "good" version of the 2900 XT. So, in practice, this is not too useful to the vast majority of the AT target audience just yet. I doubt most ATers (many of whom are either DIYers and/or quite budget-conscious) are going to shell out the money for a Falcon-NW system solely to get the 2900 XTX. But if and when it becomes available to the mainstream public, I think it'd definitely be worth taking a look at.

By AthlonBoy on 6/28/2007 7:51:25 AM , Rating: 2
Pointless IMO... by the time it comes to mass market, the 65nm shrink will be imminent. Now THAT'S gonna be a good chip. With the improved thermals they should be able to crank up the MHz, and we've already seen how the current 80nm chip will scale if you cool it. It should have the 1GB of DDR4 as well, and - hopefully - the drivers will be truly sorted by then.

That chip might/should defeat an 8800GTX, but nVidia have been very quiet lately, leading me to believe they've got an 8900 card ready. But the specs on that just ain't clear. Then there's the whole GX2 dual-GPU-card issue, which might be attempted if they use a die shrink.

By AmberClad on 6/28/2007 8:05:16 AM , Rating: 2
Agreed. I'm looking forward to the GPU version of what Conroe did for CPUs -- more performance, less power and less heat. The current trend towards more and more massive GPU cooling solutions is kind of akin to the upward heat/frequency trend with the P4s, and it'd be nice to see it reversed. I still have my old GeForce 2 MX lying around somewhere, and the tiny little fan on it is a huge contrast to the massive dual slot fan+fin coolers on the current top end models.

By omnicronx on 6/28/2007 1:27:18 PM , Rating: 2
i wouldn't count on a new 'conroe' of gpus... i think the days of leapfrogging every 6 months for the performance title are over. it's all about scalability nowadays.

By abhaxus on 6/29/2007 11:25:51 PM , Rating: 2
eh... isn't that what [B]AMD[/B] did for CPUs? More performance, less power, less heat was definitely an AMD trend long before Intel brought C2D to market.

By jedisoulfly on 7/2/2007 2:33:29 PM , Rating: 2
here, the card is already for sale on Newegg - they rock

By oTAL on 6/28/2007 5:22:31 AM , Rating: 2
A shadowy flight into the dangerous world of a man... who does not exist.

Great reference.

By therealnickdanger on 6/28/2007 9:23:59 AM , Rating: 3
Actually, while in the same vein, his reference isn't for Knight Rider's "Garth", as seen below:

... but rather it is a reference to the original Star Trek episode "Mirror Mirror" where a transporter malfunction sends part of the crew to an alternate, "evil" parallel universe where the women are more scantily clad and the men all have goatees. Observe Spock rockin' the goat:

By christojojo on 6/28/2007 4:03:46 PM , Rating: 3
"evil" parallel universe where the women are more scantily clad and the men all have goatees. Observe Spock rockin' the goat:

You mean he went to the 21st century ;-)

Denny Crane

By Lightning III on 6/28/2007 7:08:08 PM , Rating: 2

but that would mean that they are evil alternate-universe cards

aaaargh as I run screaming from the room

actually my favorite episode

I always wanted the evil alternate-universe geek uniform

Just those games?
By AthlonBoy on 6/28/2007 7:19:56 AM , Rating: 4
Prey and CoH? That's the best they can throw at them? What about Oblivion outdoors, what about STALKER? Those are today's really intensive games.

And what about antialiasing? That Radeon suffers massive performance hits thanks to AA, especially in scenes with lots of alpha blending to be done (think grass). Show me a wider selection of games with 4x and 8x AA on that system, then we'll talk.

I don't know whether it's better CrossFire scaling or that 1GB of GDDR4 giving the card the edge at higher resolutions - maybe it's both. What I do want to know is: is this advantage enough when you enable antialiasing, even given SLI's apparent shortcomings in Vista?

We need more tests!

Oh, by the way, R610 and R630 are the HD 2600 and 2300 cards. All versions of the HD 2900XT use the R600. He might have actually implied that, and was just using the names to make a point, but it wasn't very clear.

RE: Just those games?
By titan7 on 7/1/2007 5:11:18 PM , Rating: 3
Prey and CoH? That's the best they can throw at them? What about Oblivion outdoors, what about STALKER? Those are today's really intensive games.

Sure, Oblivion outdoors does have LOD problems and is therefore intensive, but what about CoH? The 8800GTX SLI setup could only manage 32fps at 2560x1600! 32 fps on a pair of 8800s sounds pretty intense to me! And it's not bad code or anything, because the more powerful system got up to 194fps! CoH should be there, and it's nice to see benchmarks that aren't all FPS games :)

I like to see Prey in there as it is an OpenGL game. It's good to see that drivers work on multiple APIs, if they don't it's a sign that by the time they get their drivers figured out you'll be considering a card upgrade anyway.

By Visual on 6/28/2007 12:28:31 PM , Rating: 2
what i don't get here is why the "unbelievable" chart compares the newest of ati with age-old quad-6800 from nvidia... or were those some 8800gtx-es in disguise? or typo?

oh and another thing i don't get here is who was anakin's father? and when will we see episode 7? or episode 0, for that matter?

RE: nuh-uh
By Visual on 6/28/2007 12:31:47 PM , Rating: 1
hmm, disregard that first bit... maybe if i had spent more than two seconds reading that chart... but then again, it doesn't help that it was put in an awkward popup...

and what's with that "You must wait a few more minutes before you can post again." BS? this forum sucks more and more every day...

RE: nuh-uh
By KristopherKubicki on 6/28/2007 12:52:59 PM , Rating: 4
It's to prevent spam.

If ATI Won a Round, Would Anyone Notice?
By Kougar on 6/28/2007 12:34:03 PM , Rating: 2
I think Steam's survey sums it up. 8446 of 650583 users run dual GPU configurations, that is only 1.3% of those surveyed. For a game that is ATI friendly and designed on ATI hardware, of that meager 1.3% or 8446 users, only 346 used any kind of Crossfire configuration at all. That is 4.1% of all multiGPU users.

So I would say the answer is probably not, considering that the only people who can take advantage of the performance would be the extremely thin sliver of the market that can afford, let alone buys, $800+ Crossfire configurations, a Crossfire motherboard, and a $200 power supply to run one of the most inefficient architectures designed to date.

Sure, even if they win that battle, they still lost the war. ATI can't win by catering to only the 1-2% of buyers that want performance at all costs and can afford to do so. Did I mention the heat and power draw again? Two rotten eggs will only make a stinking omelet.
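The survey percentages quoted above can be verified directly; here's a minimal sketch, using only the figures given in this comment (650,583 surveyed, 8,446 multi-GPU, 346 Crossfire):

```python
# Sanity-check the Steam survey shares quoted above.
total_surveyed = 650_583   # users surveyed
multi_gpu = 8_446          # users running dual-GPU configurations
crossfire = 346            # of those, users running any Crossfire setup

multi_gpu_share = multi_gpu / total_surveyed * 100
crossfire_share = crossfire / multi_gpu * 100

print(f"Multi-GPU share of all surveyed users: {multi_gpu_share:.1f}%")
print(f"Crossfire share of multi-GPU users:    {crossfire_share:.1f}%")
```

Both rounded figures come out as stated: about 1.3% of surveyed users run multi-GPU, and about 4.1% of those run Crossfire.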

RE: If ATI Won a Round, Would Anyone Notice?
By christojojo on 6/28/2007 4:14:33 PM , Rating: 2
$499 * 8446 multi GPU = $4,214,554

346 Crossfire * $499 = $172,654

Bragging rights = priceless

The hard part of comparing multi GPU segments is that ATI really has never had a good enough product to make a dent into that segment.

But if they do...

It is possible the multi GPU segment will continue to grow.

By Targon on 7/16/2007 8:10:20 AM , Rating: 2
A big factor in this is the price of the board and the memory on video cards as much as the price of the GPUs in the first place. If video cards came with slots for the memory that used a standard slot type, we would probably see more people going to a SLI or Crossfire setup. If the GPU costs $250 out of the $400-$500 price we see for the higher end video cards, that just goes to show why more don't jump on that bandwagon.

The reason the PC really took off was because resellers and then end-users were empowered to build their own systems, swapping out the CPU and memory based on what WE wanted/needed. We didn't have to buy a whole new computer for upgrades, and we loved that level of control. Graphics chips never got the same level of control. Sure, we can buy a new video card (though Dell and others left the AGP slot off lower-end systems for a while), but we have never had the option to just buy an AGP or PCI Express card with a socket for the GPU and slots for the memory. If we had that available, most of us would probably have jumped at the chance to have dual or quad GPUs by now.

AMD is sort of addressing this with Torrenza, but there hasn't been word of this technology being looked at for video. Picture being able to run one quad-core CPU with 3 GPUs. Since each CPU socket can have its own bank of memory, I can see this working fairly well with proper support on the motherboard for connecting the GPUs to dual DVI and HDMI outputs. It would even make it easier to cool the latest GPUs, since the CPU socket would be standard.

Vista can suck it.
By DEVGRU on 7/3/2007 11:32:04 AM , Rating: 2
"Microsoft has a lot of muscle and wants you to move to Vista. That’s not such a bad thing. No one’s enjoying Windows 95 anymore. You will move to Vista, it’s just a question of when."

Not such a bad thing, to whom??? Maybe, maybe not. No one's enjoying Windows ME either, and everyone knows Vista = the new ME. I'll move to Vista when I get a free copy, stripped of all DRM, with SP3 and a EULA that has a shred of common sense. Maybe.

RE: Vista can suck it.
By Master Kenobi on 7/6/2007 2:09:04 PM , Rating: 1
I'm on Vista and I do not find your FUD to have any credibility. Please provide evidence of the following.

everyone knows Vista = The new ME

I do not believe Vista to be the new ME. In fact, I regularly state that Vista is to XP what Win95 was to Win3.1. And I've been working in the trenches of IT for years.

I'll move to Vista when I get a free copy

You could get one if you were Microsoft Certified, or worked for a credible company that deals with Microsoft. You could also get one if you had submitted a bug to Microsoft for Vista when it was being publically tested.

stripped of all DRM

DRM? What DRM? There is absolutely nothing I can't do on Vista that I could do on XP. Stop spreading FUD.

with SP3

SP3? Why would this be necessary? Win2K had a grand total of 4 service packs. XP has a grand total of 2 (a 3rd in the works, according to rumor). Vista has 1 in the works. Vista is a highly polished product; the need for a service pack isn't there right now. The bullshit of using service packs to "gauge" when to adopt a product is straight up false.

and a EULA that has a shred of common sense

The Vista EULA is identical to the XP EULA, so what's the problem?

RE: Vista can suck it.
By Targon on 7/16/2007 7:51:31 AM , Rating: 2
It seems that many people equate a lack of good drivers to a fault in the operating system. They also look at HD-DVD and Blu-Ray as something important for a computer, so the whole DRM issue is bothering them, even when they wouldn't spend the money on a player for at least another few years.

Both Windows XP and Vista have a shortage of mature 64 bit drivers. It's not limited to Vista.

Windows Vista is still new, and even though hardware companies have had access to Vista for a LONG time now, many drivers still need a lot of work. Many companies have abandoned older, discontinued products when it comes to fully capable drivers, though base functionality may be there.

The whole sound acceleration issue, and EAX support is an area where people do have a reason to complain right now. If you have a fairly modern computer, a lack of hardware acceleration probably won't bother you too much as long as the drivers provide decent quality, but some people feel they MUST have EAX support under Vista, which only Creative Labs is doing anything about at this point.

With a new API in Vista for audio, it will take another few years, but we will probably see hardware acceleration of some sort show up again. We have DirectX acceleration in video cards, so OpenAL support on sound cards may show up in time.

Vista comes with loads of extra bells and whistles that slow things down compared to Windows XP. Most of these features can be turned off, and if you have a computer with Intel graphics, you get what you deserve if performance is lower.

People also are complaining that Vista isn't worth UPGRADING to. There is a big difference between something not being worth an upgrade and something that is bad. If you get a new computer, and it isn't something at the bottom end of the spectrum, then Vista won't be that bad. It may feel a bit slower due to things like AeroGlass sucking up a lot of GPU power compared to Windows XP, but it's not BAD, and you can always turn it off. When you buy a car, you don't expect to upgrade the engine in that car after the fact. If you have XP, there is NO reason to upgrade to Vista.

When Windows XP came out, it took people a while to figure out where things were, and there was a learning curve. The only reason it really took off is because it was a huge improvement over Windows 98/ME. It wasn't a huge improvement over Windows 2000 though, so many people stuck with 2000. Now that Vista is here, people are going through a learning curve, and they don't like it because XP was around for quite a while.

So, all these kids who say how bad Vista is without even using the thing should be ignored. These are the same ones who feel that ATI is in so much trouble just because their highest end parts can't beat the highest end NVIDIA parts. The money is made in the mid-range of the market, where ATI IS competitive.

<no subject>
By Scabies on 6/28/2007 2:50:30 AM , Rating: 2
That's very interesting... What would have been nice is to see those two systems running on their own single cards, one 2900 vs. one 8800. I know we know who the winner will be (orly?), but it would be nice to see how performance scales with that second GPU. In any event, I guess it comes as no surprise that AMD hasn't been saying "grab two for the REAL ass-kicking," as everyone will see the one-card benchmarks and immediately look elsewhere (followed by a second wave of abandonment as would-be 2900 CrossFire buyers look at the price of a capable PSU).
Imagine if it were a fact that two PS3s could be linked for an experience incredibly better than that of a single one. Who on earth would believe Sony when they told you to get another? It could be as concrete as fact can be, but it's still the PS3.
[Don't get me wrong, I have one (and an Elite and a Wii).]

RE: <no subject>
By danrien on 6/29/2007 12:51:25 AM , Rating: 2
I hear that two PS3s linked over a network provide phenomenal Folding@home performance.

I have to admit...
By blckgrffn on 6/28/2007 8:59:27 AM , Rating: 2
I got a huge kick out of this :)

The first real computer that I owned was an $800 Falcon Talon. All of the support and awesome cable management at an excellent price. What happened to the affordable Falcons? I suppose you don't need them anymore...

RE: I have to admit...
By Targon on 7/16/2007 8:18:13 AM , Rating: 2
The mid-level computer market is HARSH right now. With HP and Dell sending out dual-core machines in the $500-$600 range, it's REALLY tough to compete with that. Once you get into the $800 range, companies can compete a bit better for price/features.

Take a look at the HP 6000-series desktop machines, and see if you can manage to get the parts for the price one of those machines costs (AMD or Intel based). I would have a hard time getting the parts for the $530 you can buy an a6000n for. For even a medium-sized company, the profit margin would be eaten by support costs.

Going to the $800-and-up range, the margins are quite a bit better, and a company can stand out with custom case designs and other features you don't see from the big names (like thermal sensors with LCD displays), water cooling already set up, and things like that.

WTF Benchmark
By gramboh on 6/28/2007 6:16:34 AM , Rating: 2
12,300 in 3DMark06 on a C2Q @ 3.6GHz with 8800GTX SLI? Hahaha, give me a break; that's a single-GTX score.

I get 10,650 on a C2D @ 3.2GHz with a single 8800GTS @ 610/1900.

RE: WTF Benchmark
By Lightning III on 6/28/07, Rating: -1
RE: WTF Benchmark
By Lightning III on 6/28/07, Rating: -1
RE: WTF Benchmark
By Lightning III on 6/28/07, Rating: 0
RE: WTF Benchmark
By ricera10 on 6/28/2007 2:29:18 PM , Rating: 2
Those were Vista benchmarks...

By dm0r on 6/28/2007 8:58:40 PM , Rating: 2
I loved the "the good cards" part :D

Not only ATi...
By TGoP on 7/3/2007 7:00:00 PM , Rating: 2
The article explains a lot, but ATi isn't the only one that should learn from it. Although they failed horribly here, nearly all IT companies make the same mistake over and over again. Graphics cards, CPUs, hard disks, mainboards... whatever part of the PC you look at, the names for them are just stupid. It gets even worse when the product name doesn't even reflect what you actually get (take a Pentium 4 661, for example). Whoever thinks this is what customers want should be fired.

By the way, maybe I'm just one of a few people, but I switched from ATi to NVIDIA after ATi's driver forced me to install .NET. I know I could use just the bare driver without any GUI and configure my card with a third-party tool, but is that really what they want me to do? I don't need .NET for anything else, and I won't install it just for my graphics card driver. Creative makes the same mistake: their drivers just suck, and now they even want money for them (ALchemy). Sooner or later they will pay for such behaviour.

Sorry for my bad English, guys.

By Unknown Soldier on 7/6/2007 1:36:03 PM , Rating: 2
Have to say that the 3DMark06 score for that GTX SLI must be wrong.

My Q6600 @ 3GHz and GTS 320MB at 675/936 got 12273, and the same CPU at 3.2GHz and GTS 320MB at 648/918 got 12104 3DMark06 points.

At 3.7GHz and with GTX SLI, that system should get a much, much higher score.


By Anderson21 on 7/6/2007 11:42:58 PM , Rating: 2
These cards are on sale for like $529 CDN in Canada. I looked on ATI's website here:

and I found that Extreme PC sells them to regular people like us, without complete systems. I'm not kidding:

So it's not like we have to buy $10,000 computers to get these. Also, I waited for the XTX for so long; I just want the cheapest variant they've got so I can overclock the crap out of it. Hehehe...
