quote: and the cooling solution which now exhausts heat into the case.
quote: At the rear side of that rounded curve on the card we see two air intakes. The card is designed in such a manner that it will take in air from inside your PC and exhaust the heated air outside the PC.
quote: Whether it's 100fps vs 120fps is moot.
quote: the man is clearly a genius
quote: since it seems many of you have a problem
quote: Yeah, I must admit he is a genius, because he successfully tricked a lot of customers into buying new (but actually old) products. And he's also a genius at getting his ass kicked by his competitors. Hmm... A truly genius man indeed... very, very genius... a superman, maybe... Tell you what, who cares?
quote: You're the one who doesn't get it, because clearly we don't have any problem with those benchmarks; you're the one who talks too much about the 5870 being inferior to the 295. Those are only preliminary benchmarks, so who cares? And you're comparing a dual-GPU card to a single-GPU card, which makes it unfair. Get over it dude, if you don't like it, fine, that's your choice, just stfu because you're clearly biased toward nvidia.
quote: fact their parts are consistently superior
quote: 5870 didn't live up to the hype
quote: I guess all you can do to avoid this sad bit of news is cover your ears and scream "LALALALALALALALa!!!!"
quote: Yeah, just like the recent fuss about their G84/G86 chips being defective? Better luck next time.. Oh and that's a fact.
quote: What didn't live up? All I see is a perfectly made product with a good price-to-performance ratio and a lower power draw; that alone would be enough for me to consider buying it.
quote: Sad news? I think you're the one who's afraid of ATI kicking nvidia's butt again, man. That's why you're trolling here, saying anything you can to discredit ATI. You really don't get why your posts mostly get downrated, do you? It's because you're obviously a silly little fanboi trying to do anything to make ATI's new offering look bad. Better luck next time, man.
quote: I bet you can't even bring yourself to admit the GTX 295 is still the faster part, can you? :D
quote: If you're going to bring up defective chips, be sure to mention that 1 billion fiasco fireball of a chip AMD sold to Microsoft for their 360.
quote: by Chocobollz: Who said I won't admit that the GTX 295 is the fastest gc on earth? The fact is, it is, so I must admit it. The problem is, I believe most people would choose the best bang for the buck over an overpriced gc that's 10% faster. That's why your argument about the GTX 295 being a must-buy over the 5870 is moot: it's more expensive, more power hungry, and behind the 5870 in terms of DirectX support. Can you also admit that the 5870 is currently the better buy over the GTX 295? I'd say you wouldn't, because you're clearly a fanboi.
quote: by chizow: 5870 does have quite a few new features that make it the better choice over last-gen parts, like DX11 support, HDMI HD bitstream, angle-independent AF, and Supersample AA. Still, it won't be competing with these old parts for long; it'll have to deal with Nvidia's next-gen offering, GT300, over the long term.
quote: by Chocobollz: That's AMD, not ATI, mind you. And that's not solely AMD's fault, because Microsoft could also be blamed for the problem; they're the ones who should do the final QC. On the other hand, the problem with the G84/G86 parts from NVIDIA is a problem NVIDIA itself already knew about before launch, but they remained silent. Would you say that's good ethics? At least AMD or ATI, as far as I know, has never tried to milk their customers like that.
quote: were using *INDUSTRY STANDARD* soldering techniques
quote: only saw RED
quote: If it was industry standard soldering techniques, then why don't we see similar problems with other products from other manufacturers as well? Is it just blind luck, I guess, huh? If *INDUSTRY STANDARD* soldering techniques are bad enough to cause parts to malfunction in such a short time, then why do they keep using them? And why keep silent?
quote: http://blog.seattlepi.com/digitaljoystick/archives... Q: Let's go over some of the rumored reasons for RROD. Could you tell how close each theory is? Overheating CPU/GPU due to the lead-free solder? A: They don't overheat due to Pb-free. They overheat due to too much power dissipated in too small an area, without a sufficient thermal management design to take the heat away from the junction of the transistors on the chips, the packages themselves, and the mobo. And the overheating is on the GPU. When the CPU heatsink is applied right, it does not overheat.
quote: Saw RED? Hah, you're joking, right? What gc do you think I'm currently using? Guess what... its color is green, and it's a Geforce. I don't have to tell you who made it, right? And you, have you ever once bought an ATI gc? I guess not? Then you're clearly a fanboi who just makes things up to discredit ATI. Me, on the other hand, I'd choose whatever is best for me, and for now, the Radeon 5870 is clearly the winner. I think you've just run out of luck.
quote: since G80, there has been ZERO reason whatsoever
quote: I guess I'm just "lucky" Nvidia has been kicking ATI's ass over the last few years?
quote: I'd say only an ignorant person would say something like that....plus some other nonsense
quote: Admit it dude, it's the other way around this time. Guess who made NVIDIA drop its prices so much overnight, like crazy, and lose a lot of market share? And who trumps NVIDIA's flagship (and rebadged) gc? Who kicked NVIDIA in the butt when they said they would "open a can of whoop-ass"? (yeah, that's just as stupid as it sounds, but hey, that's a fact) Wishing you lots of luck again, dude.
quote: do NOT cloud my ability
quote: Completely accurate statements based on the REVIEWS in this news bit
quote: If you don't think its a big deal
quote: and some fanboiiiiiiiiis
quote: Completely accurate statements based on a few preliminary benchmarks, even if they're accurate,
quote: that's why I said it was no problem for me if 5870 fell a little short of my expectations
quote: If you're calling me a fanboiiiiiiiiis, well, I'm sorry I disappointed you this time, Mr. Huang fanbot, because as I already said in another reply:
quote: Exactly, so why are you trying to make this insignificant problem bigger than it is, by your own words?
quote: So go enjoy your 5870!
quote: You lose at the intarweb.
quote: Sourthings: It's not worth it with this fellow. Whenever the comparison of 4870/4870X2 vs GTX260/280 comes up, he (chizow) does something akin to a toddler placing their hands over their ears and going la-la-la-la-la. The majority of reviews show the 4870 against the 260, both at stock clocks, in the majority of games, with the 4870 the victor at 1920x1200 and down. If you OC the 260 it pulls ahead in some titles; if you OC the 4870, it is again the faster card. In all reviews where the 4870X2 is on any chipset but Skulltrail vs the GTX 280, in almost every game the 4870X2 is faster than the GTX 280, and GTX 280 SLI as well. The only real exception is Crysis. In before FUD, and la-la-la-la.
quote: cmdrdredd: You know, you're hopeless (chizow). There's no reason for you to completely bash a product. You seem to ignore other facts and only see what you want, and read what helps you prove to yourself that your decision was good. It's the same as when I left. Some people never grow up. /end
quote: I know I'm just feeding the troll here..
quote: Sure I will enjoy it, and many others will do too.
quote: Me? I'm not losing anything. And for you, I've seen you lose quite a lot over trolling on ATI's forums at AnandTech.
quote: Ya, 1 out of every 3, same as the last 3-4 years. Of course that 1 will yell and cry and bitch and moan and lie and make himself sound like 10/3 but you get the point.
quote: Hahah that guy ended up getting a GTX 280 from a 4870 and thought it was the greatest thing since sliced bread
quote: That would be the same as one of my friends, who happens to be an nvidia fanboi: every time he buys an nvidia gc, the card fails in like 3-6 months, but he keeps talking about how superior nvidia gcs are. And then he buys another nvidia gc just to have it fail again and again. So what's the point?
quote: Then you've just implied that I was right. If he's considering switching to nvidia, then he's clearly NOT A FANBOI, so I have reason to trust what he says. And I'd say those 2 guys are only a few examples; I'm sure there are many more. It's just that I don't have time to read your rants, man.
quote: Now the real question is, why do you feel the need to make all these "little nonsense issues" into more and more replies?
quote: LMAO. This is exactly my point, because that idiot insisted *I* was the fanboi similar to your idiotic rants here claiming *I* was the one spreading misinformation as you quoted.
quote: Was hilarious really, all the ATI fanboys like you just ate him up so fast... didn't even realize he used to be one of 'em. Reminded me of that scene in Lost in Space where all the little shit-eating spiders ate each other if they were injured.
quote: Because its fun and at least I'm learning English LOL.
quote: You're both fanbots for the same company, and yet you call him an idiot, so I guess the ATI fanboys are better than the nvidia equivalent? (because they aren't cannibals like nvidia's fanbots are?) That's even more hilarious LOL.
quote: some uninformed idiot,
quote: An uninformed idiot is better than some idiot who pretends to know everything while in fact they're blinded by their own fanboinism.
quote: Yes its obvious you've been feeding yourself.
quote: Completely accurate statements based on a few preliminary benchmarks; even if they're accurate, they're based on a premature conclusion, and that makes it a non-issue for so many people here
quote: *sigh* You know why I said that?
quote: You get it now? Well, if you want to believe in a premature conclusion, then I suggest you do whatever you want now, dude, because we're all going to die in 2012! <- a premature conclusion based on an insignificant fact, the end of the Mayan calendar. That's a fact, but would you trust it? Judging by your past behaviour, I'd say you would, LOL.
quote: Premature? Oh we're supposed to wait a year for AMD to figure out what's wrong with their GPU before we come to a conclusion?
quote: I guess if you were trying to determine relative performance, benchmarks indicating just that would be irrelevant
quote: Wow.. you just took my words and exaggerated them to the point that even their meaning changed, LOL.
quote: Relative performance?
quote: I think it's useless to talk to you here; you will never change. Why don't you look at these articles and see what the others think about the 5870:
quote: Fact: those leaked benches claimed the 5870 was faster than the 295. Fact: actual reviews are showing the 5870 is slower than the GTX 295.
quote: Rofl, again, I'm not disappointed at all, it just further validates my decision to wait for GT300 before making any buying decision.
quote: Once again, prior to the actual reviews, the leaked bench performance seemed absolutely attainable given the specs and expectations of being a complete doubling of the 4890.
quote: Like them, I'm absolutely shocked the 5870 scales so poorly compared to 4870X2
quote: Waiting for something that most likely will be big and power hungry and cost you an arm and a leg? And even if it does come out, there's a 50-50 chance it would beat the 5870 in terms of tech and power consumption. If you're a gambler, then that wouldn't stop you, but if you want something that will surely give you the best value right now, you would choose the 5870. If you want the GT300 that badly, then fine, because we all know you wouldn't support ATI no matter what, even if ATI were the only company producing gcs. Most people will just choose what's best for them, and at this time, the 5870 is the clear winner.
quote: People don't buy products based on the results of leaked benchmarks. Why? Because the product isn't even in the store yet (gosh... that's so obvious, and yet you don't know it).
quote: Why don't you just admit it that you're not shocked
quote: you talk way too much shit for someone who knows so little.
quote: Fact: Nvidia parts have consistently drawn less power than their ATI performance counterparts on the same process. Compare 8800GTS to 2900XT, 8800GT to 3870, 260 to 4870, 285 to 4890, 295 to 4870X2....Nvidia parts are faster, draw less power and run cooler due to better coolers.
quote: Also, why would I rush out and buy a 5870 today instead of waiting for GT300, when it's slower than what I already have?
quote: It offers no immediate benefit
quote: and if I were actually anticipating the 5870 and planning to buy it, I would've been absolutely gutted.
quote: it's a complete doubling of the 280 in specs and transistors, as it's expected to be
quote: You're clearly smarter than me but your fanboinism has blinded you so much ;-)
quote: Uhh.. that's why I said it has a 50-50 chance, because you can only make assumptions based on previous cards, not the GT300 itself..
quote: So the 5870 is only aimed at those who already have an NVIDIA GTX 280 (or above)? It's not, is it? So stop talking about it losing to the GTX 295, because it doesn't matter; it's a good buy whatever you say.
quote: For you, maybe, but not for those who are looking for a new gc at the $350+ price point; they will surely find the 5870 an interesting piece of tech, especially with Windows 7 around and dozens of game devs already planning DX11 games.
quote: I think that's just you then because I can't see any other guy here who's gutted like you, not even myself. I'm pretty happy with what the 5870 offers.
quote: I think you already know that 2x the transistors doesn't always mean 2x the performance, so why the fuss? Anything over 10% is significant already, and hey, at least they're not charging US$600+ for it (like some greedy green company does), so you should be happy with that. I'm sure many people would be drooling over its current price LOL.
quote: Actually looks like there were a few others who started down that path, but then the ratedowns and haterade started raining down and they probably figured, fuck it, not worth dealing with these idiots to get my point across. I just find it funny though so here I am! :)
quote: LMAO, ya, if you meant 100-0% using historical references, sure. I'll give ya the benefit of the doubt though and say 99-1%. Here's a hint: AMD already overclocks the piss out of their parts to remain competitive, which is why their overclockability sucks, they run hot, and they draw a lot of power. It's basic principles of semiconductor behavior. When you're begging that little ATITool slider to go up more than 1 measly notch on your 5870 and it won't, you'll know what I'm talking about.
quote: There WAS a time ATI charged $600+ for their cards (X850XTPE, X1950XTX etc etc)
quote: Look at how damaging the 48x0 release was....$200 and $300, now they try to charge $300 and $400 and people are furious
quote: I'd say that's a good decision, dude, because it's obvious that most of the guys here also think it's not worth arguing with the idiots who say the 5870 is not worth buying. As you can see, I'm the one who replied to most of your posts, and I know that makes me look like an idiot too, because I'm obviously feeding a fanbot here, but oh well, I guess someone has to do it LOL.
quote: Oh, and also, the guys you said started down that path are obviously only a mere fraction of the other side; they're the minority. And hey, why make it such a big deal? If you don't like it, don't buy it. It's always that simple, right? Just like you said. So why the fuss? LOL
quote: Err.. about the ratedowns.. Have you ever thought that maybe it's because they (and you) are mostly nvidia fanbois trying to troll in ATI threads? I'd say that's not funny; those are more like dumb moves, really. Don't they have any better job than trolling? Now that's funny. If you want to be a hero for nvidia, why don't you start posting exclusively on nvidia threads? Talking about how good nvidia is, not the other way around, talking about how crap their competitors are. Then everyone will start to have some respect for you. What do you think? It's a good idea, right?
quote: Now, you already said that NVIDIA and ATI gcs are both manufactured by TSMC. Then, by your own definition, that means the GT300 will also suffer from the same weaknesses. And considering that its die area is roughly 30% larger than the RV870's (452 mm² vs. 334 mm²) on the same process tech, I'd say 99-1% that it'll be a lot more power hungry, with far fewer dies per wafer. That makes me wonder: why does nvidia keep its large monolithic die when the others have started to move to a more modular design? Are they that ignorant?
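[editor's note] The dies-per-wafer point above can be roughly quantified. The sketch below is illustrative only: it assumes a 300 mm wafer and uses a common first-order approximation for gross die candidates, ignoring defect yield, scribe lanes, and edge exclusion; the die areas are the ones quoted in the post.

```python
import math

def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    """First-order estimate of gross die candidates on a round wafer:
    wafer area / die area, minus a correction for partial dies at the edge."""
    radius = wafer_diameter_mm / 2.0
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

rv870_candidates = gross_dies_per_wafer(334.0)  # RV870-sized die
gt300_candidates = gross_dies_per_wafer(452.0)  # rumored GT300-sized die
print(rv870_candidates, gt300_candidates)       # roughly 175 vs 125
```

By this rough estimate, the larger die yields on the order of 30% fewer candidates per wafer before defect yield is even considered, and a larger die also loses a higher fraction of candidates at any given defect density.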
quote: At least they try to innovate and improve their architecture, not rebadge the same products over and over again until everyone has a hole in their pocket, only to find out they've been robbed LOL.
quote: And you must also already know that the 5870 achieves a thrilling 14.7W, 25.8W, and 107W power draw at idle, peak 2D, and peak 3D respectively (source: http://www.xbitlabs.com/articles/video/display/rad... ). That's quite a feat, and I'm 99% sure the GT300 wouldn't even come close to it (judging from the X-bit labs articles).
quote: What damage? I don't see their prices as ridiculous. Even I myself was once really tempted to buy one (but I didn't, and I chose an NVIDIA gc, as you already know). A lot of my friends also bought one immediately when it became available. So what are you talking about, dude? And I don't see anyone here who is furious except you, dude (your first post, the one that quickly spotted a little insignificant weakness in ATI's new product). And to think that you are furious while not even considering buying it makes us wonder: why are you even here? What are your motives? (I know it's obvious judging by your past behaviour, but hey, it's better to ask the person directly ;-)
quote: Here's a hint: AMD already overclocks the piss out of their parts to remain competitive, which is why their overclockability sucks, they run hot, and they draw a lot of power.
quote: So, the power draw is not higher than 107 watts in 3DMark06 which is amazing for a graphics card whose GPU consists of over 2 billion transistors. Nvidia’s G200b can’t even dream of such power efficiency.
quote: On the other hand, FurMark is far more intensive than real-life games, so the practical power draw is going to be closer to the first number than to the second. The Radeon HD 5870 is also extremely economical in 2D mode, setting a new record for its class. The cooling system did well despite our apprehensions.
quote: Wow.. just wow.. where did you make that up, dude? In your basement? I just said you're smart and you're already getting a big head LOL. You want pure facts and not made-up facts? Read here (again LOL):
quote: Yeah, the overclockability may suck, but it's not like nvidia's gcs always do well in overclocking either.
quote: You do realize you're referencing a 5 year old demo run at 1280 as a reference for power consumption right? (and some other half-baked arguments)
quote: Following our standard procedure, the 3D load was created by the first SM3.0/HDR test from 3DMark06 running in a loop at 1600x1200 with forced 4x FSAA and 16x AF. Additionally, we used OpenGL FurMark. The 2D load was emulated by the 2D Transparent Windows test from PCMark05. We've got the following results..
quote: Might as well run Prime95 LOL. And Furmark...
quote: you do know AMD throttles performance via driver right? Even if he was able to run it (most 48x0 just crash), it was probably running at lower power draw due to the driver throttling. Name it UT3.exe and that story changes heh.
quote: Also, stop referencing insignificant parts no one in this price and performance bracket is interested in. I could reference a review I saw where they overclocked an IGP 8400 to 2x the clockspeed but no one gives a shit. LOL.
quote: Wow.. where did you get that? In a toilet somewhere in your basement again? Did you even read the article? Or are you just too busy with your hobby of making things up to discredit ATI? Read again, carefully:
quote: LOL what? Did your tongue get stuck? Is that all you've got?
quote: You seem to know everything, huh, dude? So why don't you open up a new website called "Chizow's NVIDIA labs" and start posting whatever benchmark results you want? I'm sure you'd get a lot of visitors, well, maybe in like.. 100 years?
quote: And about the 5-year-old demo, so what exactly do you want? Do you want them to run the benchmarks on every game ever made in history? Or do you want them to run a game that has nvidia's TWIMTBP branding? I'm sure you would choose the latter just because it runs better on nvidia's hardware?
quote: And it seems you know a lot about AMD/ATI. Are you one of AMD's insiders, dude? If so, let's start promoting ATI gcs and not bashing them.
quote: Nope, they could've run a screensaver LMAO.
quote: It's simple enough to test: running a fast GPU at 1280 and a slow GPU at 1280 with a slow CPU might produce similar FPS, yet the faster GPU would have a lower utilization rate than the slower GPU, which would be closer to 100%.
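[editor's note] The bottleneck reasoning in that quote can be sketched as a toy model. The numbers below are hypothetical and real frame rates don't cap this cleanly, but the logic (frame rate limited by the slower component, utilization revealing which one) is the same:

```python
def observed(cpu_fps_cap, gpu_max_fps):
    """Toy bottleneck model: frame rate is capped by the slower component;
    GPU 'utilization' is the fraction of GPU headroom actually used."""
    fps = min(cpu_fps_cap, gpu_max_fps)
    return fps, fps / gpu_max_fps

# Same slow CPU (caps the scene at 100 fps), two GPUs at a low resolution:
fast_gpu = observed(100.0, 250.0)  # same 100 fps, but only 40% GPU utilization
slow_gpu = observed(100.0, 110.0)  # same 100 fps, ~91% utilization, near GPU-bound
```

Both setups report similar FPS, but the utilization figure shows the fast GPU is idling while the slow one is nearly saturated, which is the point being argued.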
quote: PS. No one gives a shit how your IGP overclocks. Its useless in the very applications overclocking would be useful - gaming - so its irrelevant. The end.
quote: Again, you only need to go back to January to see their benches claiming the 4870X2 was faster than the GTX 295 (yet it's somehow 40%+ slower now?)
quote: There's no need for me to run benches for everyone to see, we have a dozen or so linked at the top of this page to back my points, remember?
quote: Some advice: if you've clearly been pwned, just stfu so you don't make yourself look even more embarrassing LOL.
quote: Do you own a benchmark site? You seem to think you're better than them, so why don't you make your own and let's see if you can become a trusted source like them. If not, then stfu, because you're not in a position to tell us what to do. We have far better people than you to decide that.
quote: Wow, IGP is irrelevant, how ignorant could you be?
quote: OMFG! Why did it take you so long to get what I (and the others) have been saying to you all this time! LOL That's what I've been yelling at you all this time, and yet you're like a deaf person who hears nothing except the superiority of nvidia and the inferiority of ATI. You still don't get it? OK, let's go back. My original point is basically that the 5870 (which was just released yesterday) could improve over time, so making a judgement based on some early benchmarks is meaningless. Just like what you said about the GTX 295 being slower than the 4870X2 last January (an early benchmark). Now, 8 months later, the GTX 295 is 40% faster. So, if you were an ATI fanbot, you surely would have made posts bashing the GTX 295 last January, just to get yourself pwned 8 months later. You don't want to get pwned again, do you? So why the fuss? And did you realize you just got pwned again? If only you had stayed silent and not been stubborn, you wouldn't have gotten pwned these three or so times today LOL.
quote: Jan 2009 - GTX 295 is 5-10% slower than the 4870X2.* Sep 2009 - 4870X2 is 40% slower than the 5870, but the GTX 295 is faster than the 5870.** (* and ** - no other review site comes to this conclusion)
quote: But then, they could all have just been running a screensaver, like you said above, so what evidence backs you up? Nothing, right? So now you're telling us that all your talk here is actually a bunch of nonsense? That's a very good way to describe yourself.
quote: By Chocobollz on 9/25/2009 2:55:03 AM: Completely accurate statements based on a few preliminary benchmarks, even if they're accurate
quote: closer to a full doubling of the GTX 285
quote: As for 5870X2 easily beating the 295...there's no doubt about that, but the 5870X2 (and the 5870 for that matter) won't be competing against the GTX 295 for much longer. They'll be going up against the GT300-based GTX 380 and soon after, the GX2 version. The 295 again being significant because that's what GT300 will be compared against relative to how well the new generation scales compared to the old.....
quote: The case can always be made that the next big thing is right around the corner, but you can do that forever and never end up buying. You don't know that the GT300 will be out any time soon, or even that it will beat the 5870 when it arrives. I think it will, but I know that if it does, nvidia will charge whatever premium they can get away with. Right now, today, the 5870 is the clear choice for an enthusiast level card.
quote: Say what you will about the GTX 295, the fact is that it's now irrelevant. No matter how much they drop the price by, it doesn't offer DirectX 11. Who would spend $500, or even $200, for a card that will be severely limited in running the next generation of games?
quote: You are saying you would purchase a GTX295 over a 5870?
quote: Nvidia is not known for under hyping its products. If GT300 was just a month or two away you can bet they would be making announcements.
quote: So yes, for this generation, we can reasonably expect Nvidia to follow with their next-gen part in the next few months
quote: How is it irrelevant if it's still the fastest single card in current games?
quote: Sure it doesn't fully support DX11, but I'm willing to bet you couldn't even tell me what features DX11 supports over DX10, or give explicit examples of how those supported features would improve performance in the 5870's favor.
quote: If anything, it'll result in lower performance as the addition of unsupported features would have zero impact on the GTX 295, as those features would be excluded.
quote: Well, if GT300 is really due in the next few months, Nvidia has been awfully silent about it . . . and the rumors all point to a delay. I hope you're right, but I wouldn't put money on it.
quote: It's irrelevant in the same way that any $200+ DX10 card is now irrelevant. Because it won't give you a better experience than the HD5870 in any current title, and it will be seriously handicapped on future titles. I don't mean that everyone has to upgrade their existing cards, mind you, but you would be foolish to buy a GTX295 today.
quote: It doesn't fully support DX11??? It doesn't support it at all!!! It doesn't even support DX10.1.
quote: You're actually arguing the point that DX11 support is somehow bad??? Should we all check the box before we buy a video card to make sure it doesn't support DX11? How much more should I be willing to pay to make sure my card doesn't come with that terrible feature??? Come on . . . that's fanboy talk.
quote: ... it'll have to deal with Nvidia's next-gen offering GT300 over the long-term.
quote: Reactions and reviews are certainly split; while it's clearly the fastest single GPU right now, it still loses overall to the GTX 295, and even to its own predecessor, the 4870X2, as the fastest single-card solution. What is painfully clear, however, is that the 5870 does *NOT* live up to the hype generated over the last few weeks by various "leaked" marketing slides and web outlets.
quote: Taken from the X-bit labs article: So, the classic single-chip architecture has returned with new capabilities and a new level of performance, making the Radeon HD 4870 X2 obsolete. Nvidia finds itself lagging behind once again.
quote: by jonmcc33 on September 24, 2009 at 10:19 AM There's no such thing as "future proofing". Those that got the first DX10 cards from nVIDIA (8800) sure couldn't handle much DX10 gaming at all and don't compare to the GTX 260 cards that can actually handle DX10 games. Not to mention DX10.1 that came out later on. No future proofing by getting an initial DX10 card. My old Radeon 9700 Pro (first DX9 graphics card) couldn't handle some of the DX9 games of today like COD4 could it? Please, stop using the words "future proof" because they will never apply.