106 comment(s) - last by Tyranties on Aug 14 at 10:56 PM

Starcraft II is overheating some users' GPUs, but Blizzard has released a temporary fix.  (Source: Doupe)
Starcraft II fans beware, your graphics card may get Zerg rushed

StarCraft II: Wings of Liberty, the first game in Blizzard's highly anticipated real-time strategy sequel trilogy, launched on Tuesday.  Unfortunately, the blockbuster PC title -- which is expected to sell 10 million copies or more -- had some bumps during its launch.

There were a number of minor bugs, but nothing show-stopping at first.  Then the reports of melting GPUs hit.

Among those affected was Adam Biessener of Game Informer whose card melted while he was live blogging about his game experience.  He bemoaned, "Three hours of cursing later, I'm posting this from my wife's laptop because both my graphics card and my work laptop appear to be fried."

The problem appears to be located in the main menu, where an uncapped frame rate maxes out the GPU, in some cases pushing it to overheating and potentially permanent failure.
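A frame-rate cap is the standard guard against exactly this failure mode. As a minimal sketch (hypothetical Python, with a do-nothing callable standing in for the real render call, not Blizzard's actual code), a capped menu loop idles out the remainder of each frame's time budget instead of redrawing flat-out:

```python
import time

def run_menu_loop(render_frame, max_fps=60.0, duration=0.25):
    """Render repeatedly, but sleep out the rest of each frame's
    time budget so the loop never exceeds max_fps. Without the
    sleep, a trivial menu scene redraws thousands of times per
    second and pins the GPU at 100% load."""
    frame_budget = 1.0 / max_fps
    frames = 0
    start = time.monotonic()
    while time.monotonic() - start < duration:
        frame_start = time.monotonic()
        render_frame()                      # draw the (static) menu
        frames += 1
        spare = frame_budget - (time.monotonic() - frame_start)
        if spare > 0:                       # finished early: idle, don't re-render
            time.sleep(spare)
    return frames
```

Run with a no-op render for a quarter of a second, this produces roughly 15 frames; the same loop without the sleep would manage tens of thousands.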

Blizzard has issued a response on its support site, acknowledging that it was aware of the issue, and offering a quick fix.  The company writes:

Certain screens make your hardware work pretty hard

Screens that are light on detail may make your system overheat if cooling is overall insufficient. This is because the game has nothing to do so it is primarily just working on drawing the screen very quickly. A temporary workaround is to go to your Documents\StarCraft II Beta\variables.txt file and add these lines:



You may replace these numbers if you want to.
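The snippet itself is missing from this copy of the article. The two lines that circulated at the time are reproduced below from memory, so treat the exact names and values as an assumption rather than a quote of Blizzard's post:

```
frameratecapglue=30
frameratecap=60
```

Reportedly, frameratecapglue caps the menu ("glue") screens and frameratecap caps in-game rendering, which matches the "you may replace these numbers" remark.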

For eager customers who already lost a graphics card, though, that fix may prove too late.  Blizzard has not announced any plans to replace the lost hardware of victims who experienced the bug.

Many customers are outraged at this.  Writes one victim, Lorsaire:

Why was this not addressed already before release, and why were there no breaking news warnings or updates to fix this before people started having damage done to their hardware?  My Nvidia GeForce cost me more than $300 to get a good card that was great for gaming...  Blizzard are you doing anything or have plans to compensate people for the damage you've created?

Of course some of the cards may be covered by manufacturer warranties.  And while it does appear a bug (uncapped framerates) is partially to blame for killing off the cards, a card pushed to the max would generally not die instantly were it not for poorly engineered and/or defective cooling.  It appears that the cards ultimately were done in by the double blow of both a software bug (in SC II) and hardware issues.

The game features intensely addictive multiplayer gaming between three diverse races -- the Zerg, the Protoss, and the Terrans.  It also features a single player campaign in which you play a Terran rebel.  Future titles -- Heart of the Swarm and Legacy of the Void -- will include Protoss and Zerg campaigns, and possibly deliver new multiplayer features as well.  Just beware the uncapped framerates.


So who's fault is this?
By plewis00 on 8/1/2010 1:37:42 PM , Rating: 5
I don't know if it's just me who thinks this but surely a piece of software cannot be responsible for a graphics card failure (unless that piece of software stops the fan or something similar)?

If a game is running at an uncapped framerate (I'm sure quite a few games do this anyway) why is that causing a card to burn out? These cards aren't running beyond spec are they - unless we later find out that these people who reported dead cards also had been overclocking them and have insufficient cooling.

While this is annoying, unless Blizzard does something stupid like auto-overclocks graphics card, modifies fan settings or does something else on a hardware level to modify the card, I don't see how they can have any real element of blame pinned on them...

RE: So who's fault is this?
By kaosstar on 8/1/2010 2:32:01 PM , Rating: 5
Agreed. One should be able to have his GPU maxed out pretty much indefinitely without it overheating. As much as I'd like to bash Blizzard, the blame lies with the card manufacturers for providing insufficient cooling, or the end users for having inadequate airflow.

RE: So who's fault is this?
By Qapa on 8/1/2010 6:40:45 PM , Rating: 2
2 possibilities only:
- card running normal and fried => card company is to blame! they must have sensors to detect overheating and throttle down or even stop;
- user also overclocked the card or did something else to the card, then the user might be at fault;
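The safeguard being described (sensors that detect overheating, then throttle down or even stop) amounts to a simple control loop. A hypothetical sketch, not any vendor's actual firmware, with made-up thresholds:

```python
def throttle_step(temp_c, clock_mhz, base_clock_mhz=600,
                  throttle_at=95.0, shutdown_at=110.0):
    """One tick of a thermal-protection loop: returns the new
    clock and a shutdown flag. Above throttle_at the clock is
    cut 10% per tick; above shutdown_at the card powers off
    rather than cook itself."""
    if temp_c >= shutdown_at:
        return 0, True                      # emergency shutdown
    if temp_c >= throttle_at:
        # never throttle below half the base clock
        return max(int(clock_mhz * 0.9), base_clock_mhz // 2), False
    return clock_mhz, False
```

Real cards implement this in hardware or the driver; the point of the sketch is that a card which fries instead of taking the throttle branch is missing exactly this loop.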

RE: So who's fault is this?
By MonkeyPaw on 8/1/2010 8:40:57 PM , Rating: 1
What might be going on is that cards are running at 2D fan speeds at the game's menu prompt (possibly for noise reduction), but the game is actually maxing out the GPU anyway. If the game isn't using the driver correctly, it might be on the developer. Sounds unlikely though.

RE: So who's fault is this?
By afkrotch on 8/1/2010 10:23:23 PM , Rating: 5
That's manufacturer's problem there. The fans should be spinning up after a certain threshold is hit.

While Blizzard is at fault for not correctly implementing a framerate cap on their menus, they aren't responsible for the flaws in your hardware.

My GTS 250 and 8800 GTS run it fine. Course I have adequate cooling on my card and in my case.

RE: So who's fault is this?
By Omega215D on 8/2/2010 3:43:26 AM , Rating: 2
Unfortunately my GTX 260 216 by EVGA doesn't appear to speed up the fans when I have a game going but will throttle the clocks. Before every game I have to set the fan settings manually in the console.

RE: So who's fault is this?
By hrah20 on 8/2/2010 4:02:43 PM , Rating: 2
I think it's the user's fault. I have a Radeon HD4890, factory overclocked, and never had a problem with StarCraft II even though the ATI 4 series runs hotter than the new 5 series.

I always check my card temps on CCC & I have a very good ventilation system on my case

RE: So who's fault is this?
By afkrotch on 8/2/2010 8:39:08 PM , Rating: 2
I don't see how it can be the user's fault. The cards are meant to run and the drivers should have thresholds built in. Unless you are voiding the warranty in some way, it's the manufacturer's fault.

Unless new TOS state that you need to be constantly monitoring your card and your case temps need to be at X temp or lower.

RE: So who's fault is this?
By AntDX316 on 8/3/10, Rating: 0
RE: So who's fault is this?
By Omega215D on 8/9/2010 7:27:36 AM , Rating: 2
Users fault that when the fan speed is set on Auto and it stays in the default speed when a game is fired up? Kind of dense don't you think?

RE: So who's fault is this?
By Thalyn on 8/2/2010 5:28:21 AM , Rating: 4
I believe every ATI card since the 9000-series (non-inclusive) has had split clocks - 2D and 3D modes. It wasn't until the 3000-series that the decision on which to use was made based on graphics load rather than surface mode (windowed modes, even fullscreen, would use 2D clocks on 2000-series cards and earlier), but the temperature of the graphics card was always used to determine what speed the fan should spin. Even if the earlier cards were still in 2D mode, if they got hot then the fan would speed up accordingly.

Unless it was a non-reference design which lacked thermal control or you manually told it to run at a different speed (eg My 4870 is fixed to 34%).

I can't say with any certainty about GeForce cards, though I believe they implemented this same thermal fan control with the 5000-series (load-based speed control wasn't until the 8000-series from memory). Again, though, non-reference cooling solutions and manual overrides could bypass the thermal control.

In short, everyone experiencing graphics hardware failure has inadequate cooling. Whether this is because the cooling they have is blocked (dust or otherwise), inadequate for the power draw of the card or fixed to an inadequate level is largely irrelevant - SC2 is about as dangerous to graphics hardware as 3DMark 01.

RE: So who's fault is this?
By michael67 on 8/2/2010 5:22:11 AM , Rating: 3
There is a third, and most likely, cause of overheating cards/components: dust.

I blow out my case every 3~6 months, and after cleaning it I usually see a +3C drop in temps.

But there are lots of people that never clean their PCs 0_o

I have a compressor in the garage and take my PCs there to blow them out; it's amazing how much dust they collect, especially my HTPC that's on 24/7.

The one thing that's really important when doing this is holding the fans in place, otherwise you spin them to death.

Did that once as a test with an old fan: 100psi/7bar made it spin at about 12,000 rpm before it died, according to the mobo ;-)

RE: So who's fault is this?
By tallcool1 on 8/2/2010 9:05:27 AM , Rating: 2
That is why I only buy computer cases that have integrated dust filters. Preferably ones that are easy to remove. Take the filters out regularly for cleaning.

RE: So who's fault is this?
By afkrotch on 8/2/2010 8:45:43 PM , Rating: 2
Humidifier and table. Why clean out your case, just set up your living space so you don't have to.

The humidifier will keep the dust/dirt down on the ground. The table will keep your desktop off the ground, so it doesn't suck in the dust/dirt. I only have to clean my case out about once a year. By then, I'm usually upgrading and it would have gotten cleaned out regardless.

RE: So who's fault is this?
By mindless1 on 8/1/2010 10:33:42 PM , Rating: 2
Overclockers that do voltmods can cause the GPU to operate beyond safe limits no matter what normal measure of heatsinking is used, but I can't blame end users for inadequate airflow, because the card should throttle back or cease functioning if it gets hot enough to damage itself.

Blizzard has done nothing wrong; the IDEAL we all strive for is that our GPUs do as much work as possible, as maxed out as they can get, up to the point where it causes stuttering or a bottleneck to the framerate.

Otherwise, you are either gaming at lower FPS or have wasted money on a GPU you never fully utilize.

RE: So who's fault is this?
By 0ldman on 8/2/2010 10:19:55 AM , Rating: 2
I am tempted to vote you down just because you said exactly what I was going to say before I got to say it.


RE: So who's fault is this?
By Proxes on 8/2/2010 2:28:09 PM , Rating: 3
I agree. Saying a game is burning out video cards is retarded. Almost every game out there will max a video card.

When I first started playing WotLK my computer would lock up and crash. I noticed the fan on my video card (8800 GT) was freezing. So I bought a 3rd party fan/heat sink. Temps dropped almost 10c and my video card doesn't go over 55c while playing any game, including hours of SC2.

If it was the games' fault then Crysis would be eating video cards left and right.

RE: So who's fault is this?
By Alpha4 on 8/2/2010 7:22:05 PM , Rating: 5
World of Warcraft crashed your GPU?? Windows Aero Glass looks more demanding than that motley collection of jagged edges. Crysis would have set your house on fire.

RE: So who's fault is this?
By walk2k on 8/5/2010 2:03:30 PM , Rating: 2
Actually most games will not max out the GPU; the bottleneck for most games is the CPU or another subsystem (or the code itself.. believe it or not, most game code just isn't very efficient). It's only when the game was doing nothing and basically running the video refresh in a tiny tight little loop that it stressed out the video card. Just normally playing the game does not.

It's the same with the CPU (or any other PU): try playing a game and monitoring the temps. Now run a stress test like Prime95 or Orthos and watch the temps climb another 5-10C, maybe more with poor cooling. These programs run tight little loops that max out the CPU. This is a situation you almost never find in a real-world application. Even the most intensive number-crunching apps don't run in small loops like that.

RE: So who's fault is this?
By Lerianis on 8/11/2010 4:34:10 PM , Rating: 2
This falls on the manufacturers also for allowing the graphics cards to 'shoot up' in memory clock and other things.

Hell, the laptop card in my Gateway laptop has to be put on 'constant performance' in Rivatuner so that crap doesn't happen, otherwise it is running at 1600/800/600 speeds in 'graphically intensive games'.

These things should be LIMITED unless the person who owns the computer changes it otherwise to the STOCK SPEEDS of the graphics cards.

RE: So who's fault is this?
By Lerianis on 8/11/2010 4:36:55 PM , Rating: 2
I should also say that the graphics drivers should have a 'heat limit' of (at most) 70C before they start underclocking the graphics card all on its lonesome.

RE: So who's fault is this?
By kingmotley on 8/1/2010 2:35:50 PM , Rating: 5
I totally agree. I would hardly call what Blizzard did even a bug. They draw the screen as fast as they can. Perhaps limiting the refresh rate would be a nice additional feature, but that shouldn't be absolutely required. Most really old games never did it either.

Beyond that, asking the graphics card to draw stuff should never cause it to overheat to the point of burning itself out. The card should have detected it was overheating and either shut down or underclocked itself to keep from getting to that point. Poor engineering on the graphics card's part.

RE: So who's fault is this?
By BruceLeet on 8/1/2010 2:51:11 PM , Rating: 2
He bemoaned, "Three hours of cursing later, I'm posting this from my wife's laptop because both my graphics card and my work laptop appear to be fried."

I know first hand that laptops generally have insufficient cooling; I've had laptops that sat on my lap for 10 minutes and caused heat discomfort on my thighs. Although it is a bug, and obviously unintentional, it could be considered next to stress testing.

RE: So who's fault is this?
By MastermindX on 8/1/2010 3:00:32 PM , Rating: 3
Last time I checked, FPSes (first person shooters) do NOT limit frame rate in any way. They just render non-stop at the maximum frame rate possible. So... unless there is some subtlety between rendering a simple menu vs. rendering a complex 3D environment that would make the menu more likely to overheat, any FPS would have fried those graphics cards too.

That reminds me of the Civilization II CPU utilization bug, where at the end of a turn the CPU would go to 100% utilization.

But like many have said already: yes, it's a bug... but it's not a hardware-killing bug. It's nothing like the bug in nVidia's drivers that affected the speed control of the fan and fried cards in the process.

Is Blizzard legally liable for the cards burning down? If they are, we are in a way more retarded civilization than I could imagine. Hardware should have no problem running at maximum capacity for YEARS before blowing up. If it can't, the hardware maker made its hardware rating too aggressive.

RE: So who's fault is this?
By Assimilator87 on 8/1/2010 3:14:31 PM , Rating: 5
SC II Menu vs. Furmark

Who shall be the victor!? Please test Anand =P

RE: So who's fault is this?
By HostileEffect on 8/1/2010 3:38:34 PM , Rating: 2
I thought I recall hearing about this lack of FPS cap bug in the beta a very long time ago... then again it may be a different game.

RE: So who's fault is this?
By Zehar on 8/1/10, Rating: 0
RE: So who's fault is this?
By kyp275 on 8/1/2010 5:23:23 PM , Rating: 4
Whatever you may think of the uncapped FPS issue, the fact remains that what killed those cards was either poor engineering by the card's manufacturer or the maintenance/cooling setup on the end user's part. I certainly see what you're saying when it comes to Blizzard's response, but ultimately the cause of these failures lies elsewhere.

RE: So who's fault is this?
By nic4rest on 8/3/2010 3:21:55 PM , Rating: 1
How can you blame the card when all of a sudden thousands of people's cards died? lol, are you serious? All of a sudden they all got dirty cards during the SC2 beta and release? Wow, you should run for government, you're good.

RE: So who's fault is this?
By kyp275 on 8/4/2010 2:24:08 PM , Rating: 2
Where are the "thousands" of cards that died? No, seriously, please link me to the source where you got that information.

Not that the number matters in any case, nor is dirty cards the only possible cause. It could have been poorly designed cards, or a poorly set-up case with limited airflow, which in turn limited the performance of the card's cooling system.

Ultimately, the software did nothing other than push the hardware hard, and that's no excuse for the hardware to fail, especially when it's operating within spec. SC2 may have been the trigger for the time bombs that were in some systems, but it certainly did not put them there.

RE: So who's fault is this?
By Reclaimer77 on 8/1/10, Rating: 0
RE: So who's fault is this?
By CptTripps on 8/2/10, Rating: 0
RE: So who's fault is this?
By CptTripps on 8/2/2010 8:30:32 PM , Rating: 2
Forgot my last bit. Blizz has no liability here, hit up your card manufacturer for replacement. I still think it's crap that they left it in when reported in the beta but... still not their problem in the long run.

RE: So who's fault is this?
By afkrotch on 8/1/2010 10:26:13 PM , Rating: 3
Last time I checked, you are wrong about FPSes not having limited frame rates. A huge number of FPSes have their framerates capped at 60 fps or 100 fps. It's done not to prevent frying hardware, but to provide smoother gameplay.

That way your fps isn't jumping around so much, like going from 60 fps to 100 fps, to 150 fps, to 40 fps, etc.

RE: So who's fault is this?
By Reclaimer77 on 8/2/2010 12:55:50 PM , Rating: 2
Capping FPS to 60 is to prevent "tearing", not to make your game smoother. Higher FPS never makes a game "less smooth"; I have no idea what you are talking about.

Read closer: this is about unlimited FPS in the menus, not the playable game field itself. I doubt uncapped menus are that rare in PC gaming. Which is what he was saying.

I did some personal testing and using the in-game FPS meter (ctrl+alt+F) my FPS in the playable gamefield was 60, as it should be. But, for example, on the bridge when I clicked on the Mission Archive console, my FPS was a steady 450 FPS. Because it's just a static image, no animations no nothing, being rendered as fast as it could be by my machine.

Just to tempt fate and brag about my self built system cooling, I left it on that screen for about an hour. No heat issues whatsoever, woot :)

I'm going to issue the commands to cap the menu FPS, but from what I can tell this is no more stressful on a well cooled system than any other type of hard gaming or system benchmarking torture test. I feel bad for those who had to learn lessons the hard way, but I hope they learned them at least.
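For reference, the frame-time arithmetic behind the 60 FPS and 450 FPS figures reported above:

```python
def frame_time_ms(fps):
    """Milliseconds the GPU spends on each frame at a given FPS."""
    return 1000.0 / fps

# In play at 60 FPS each frame gets ~16.7 ms; on the static
# 450 FPS menu screen each frame takes only ~2.2 ms, so the
# card is redrawing the same image 7.5x as often.
```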

RE: So who's fault is this?
By afkrotch on 8/2/2010 9:19:32 PM , Rating: 2
vsync is for tearing.

Higher fps rates give you a smoother game; having your fps rates jump all over the place does not. If your framerates are jumping around from, like, 60 fps to 100 fps, to 400 fps, to 50 fps, to 70 fps, guess what? There is going to be a noticeable choppiness to the game. To minimize this you just cap the fps at 60/100.

If you have crap hardware and it has trouble even hitting the fps cap, then an fps cap isn't going to do much for you.

SC2, doesn't bother me that it has uncapped menus. I have a quad 120mm heatercore watercooling system. Bring it on.

RE: So who's fault is this?
By MatthiasF on 8/1/2010 4:49:37 PM , Rating: 2
It was mentioned that it's happening primarily on Nvidia GPUs, so some people were blaming Physx code in the game.

RE: So who's fault is this?
By Akrovah on 8/4/2010 2:33:39 PM , Rating: 3
Unless I'm mistaken, SCII doesn't use Physx, it uses Havok, which does not make use of hardware acceleration.

RE: So who's fault is this?
By Phoque on 8/1/2010 5:28:24 PM , Rating: 1
"I don't see how they can have any real element of blame pinned on them... "

Especially if it's an Nvidia card, considering their bumpgate and fermi (gf100, gf104 is fine) fiascos.

RE: So who's fault is this?
By Reclaimer77 on 8/1/2010 7:20:27 PM , Rating: 2
Yeah unless something has drastically changed without anyone knowing it, software cannot be responsible for hardware failures such as this.

Also in some cases graphics card manufacturers drivers too severely restricts fan speeds to counter noise, which leads to overheating. Like my Radeon Catalyst drivers, where I had to go in and manually set the fan speed parameters because default was way too conservative.

RE: So who's fault is this?
By hadifa on 8/1/2010 7:24:02 PM , Rating: 2
Well, it's not necessarily a bug, nor, in all fairness, is it Blizzard's fault. This issue has come up for other games as well, and there are various solutions. As Blizzard's statement goes, this happens on systems where the cooling is insufficient. These systems' VGAs were never pressured in the past, so everything seemed to be alright, or they have gathered some dust on the heatsink so the cooling efficiency has decreased, and then some pressure causes the failure.

The problem is partially because in normal gaming the graphics component has some breathing time while waiting for, say, the CPU and other components, but on a menu screen that's not the case. Usually games get around this issue by either capping the frame rate or putting a small idle time between frames (say 1 microsecond) to prevent this problem.

Galactic Civilizations II had this issue for people who were creating very complex and involved ship designs. They fixed it by putting a fixed idle interval at the end of each frame.

No it's not Blizzard's responsibility, but knowing their installed base, it's an oversight IMHO.

Starcraft is the new Furmark ;-)

By Divide Overflow on 8/1/2010 9:00:28 PM , Rating: 2
Agreed. This is not Blizzard's problem to fix. The game engine software is working as intended to maximize FPS. If a video card fries itself under load, there is a serious problem with the design of that hardware.

RE: So who's fault is this?
By Galcobar on 8/2/2010 7:42:57 AM , Rating: 2
Does anyone else recall the 196.75 drivers issued by Nvidia which managed to fry cards by effectively disabling the fan controllers?

Blizzard was among the first companies to issue an alert on the issue, but they rely on their massive user-base for this sort of reporting.

Who knows, SC2 could well instruct the card in such a manner that it replicates the issue the 196.75 drivers produced, or interferes with its temperature sensor/throttling protocols.

RE: So who's fault is this?
By EricMartello on 8/2/2010 5:33:45 PM , Rating: 2
I'm in agreement with Plewis00 here...this is not a Blizzard issue (and I'm pretty sure that SC2 isn't overclocking the graphics card without the user knowing it). The GPUs that are overheating are probably in poorly designed systems with cramped cases and insufficient airflow. The blame falls squarely on whoever built the system - most modern graphics cards do include a cooling system that provides sufficient cooling for the card to operate at its full capacity without a time limit.

Also, redrawing a simple scene such as the main menu shouldn't really require the GPU to be maxing out. I would say that a lot of those operations can sit in a cache somewhere since the screen does not need to be updated so frequently, thus leaving the GPU in a mostly idle state.

RE: So who's fault is this?
By callmeroy on 8/3/2010 8:57:14 AM , Rating: 3
As an avid PC gamer (who used to be one of those obsessive tech geeks who not only put together his own gaming rigs but experimented with overclocking anything possible, back before modern BIOSes and GUI apps made it much simpler... I gave that up years ago though, btw)... Anyway, in all those years of gaming and tweaking hardware to get the max out of games: first, I've never heard a claim of a game being blamed for my hardware going volcanic on me. Second, I never had any issues of this type to begin with (I've had parts overheat before, but it had NOTHING to do with a game).

Also, I've been pouring all my gaming time for the past week into playing SCII on two different systems with different hardware on each... until I read this story I didn't even know there WAS a problem!


RE: So who's fault is this?
By Tyranties on 8/14/2010 10:56:29 PM , Rating: 2
The SCII software bug either stops or tricks the GPU into NOT spinning its fan up to cool the card.

Play Crysis maxed out and really stress the GPU, and the fan spins faster. Play SCII's menu and the fan idles at a very low speed. The fried GPUs have all been able to cool themselves adequately, even if overclocked. But the SCII software bug has stopped them from functioning properly, and thus they are frying.

My GTX280 fried; it was not OC'd, and my PC has so much cooling it's insane. Yet within 2-3 days of SCII I began to get strange artefacts and crashes, and now the card is completely dead.

I read that during the beta this occurred and blame was placed on 196.xx nvidia drivers. Well my card fried with 256.xx drivers. And after the first initial problems I rolled back to 185.xx drivers and the fan still did NOT spin up during the menu.

From my experience it is not the drivers, but instead SCII is bugging or convincing the GPU that it does not need to spin the fan any faster, when really the unlocked frames are stressing the card dramatically.

End result, fried GPU.

Obviously the hotter-running cards (GTX280, 8800GT, etc.) will fry first as they already ran hot, although not hot enough to warrant frying. So I would suggest all users apply the solution in this article. For all you know, your card is slowly cooking too.

I read one report of a 5870 user who claimed his card ran at 70 degrees in the SCII menu, he manually cranked his GPU fan to 100% and then he achieved 38 degrees while in the menu of SCII.
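The behaviour being described, fan speed tracking GPU temperature, is normally just an interpolated curve in the driver or firmware. A hypothetical sketch with made-up curve points (no vendor's real defaults):

```python
def fan_duty(temp_c, curve=((40, 30), (70, 60), (90, 100))):
    """Linearly interpolate fan duty (%) from a (temp_C, duty_%)
    curve. Below the first point the fan idles at the first duty;
    above the last point it runs flat out."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
```

A fan stuck at its idle duty while the GPU sits at 70C, as in the posts above, behaves as if this curve were clamped to its first point.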

Damn hot
By starwhat on 8/1/2010 4:29:27 PM , Rating: 2
Well it's running bloody hot here, and the "fix" has no effect.

I don't know anything about cooling, or overclocking or anything. I just bought a computer which I expect to be able to play a game without blowing up!

RE: Damn hot
By kyp275 on 8/1/2010 5:26:47 PM , Rating: 3
I don't know anything about cooling, or overclocking or anything. I just bought a computer which I expect to be able to play a game without blowing up!

Reasonable expectation, and also one which the manufacturer of your PC may not have cared about :P

RE: Damn hot
By Reclaimer77 on 8/2/2010 1:44:03 PM , Rating: 5
How does this get a 5 on Daily Tech??? Give me a break.

Hey, did you actually SAVE the changes on notepad before closing it? Did you hit ctrl+alt+F in game to make sure it has "no effect"?

I want answers to these questions. I'm not taking your word for it. No offense, you seem highly computer illiterate. How about telling us what computer you bought?

Shame on everyone who rated this up to 5 without even the slightest information provided to substantiate his claims. Come on, this is a tech site???

RE: Damn hot
By clovell on 8/2/2010 3:54:14 PM , Rating: 2
Plus - folks who just buy a pre-built computer for gaming typically aren't aware that Graphics Cards run hot - 60C is bad for a CPU, but a GPU can take about 80C before you get anywhere close to having heat issues.

RE: Damn hot
By Lerianis on 8/11/2010 4:38:27 PM , Rating: 2
Guess again! My laptop GPU only gets up to 75C and it's black-screening the ENTIRE computer.

RE: Damn hot
By SlyNine on 8/4/2010 2:40:49 PM , Rating: 2
Make sure you're not forcing Vsync to be disabled. I really have no idea how the "fix" is supposed to work, but I bet they just turned on vsync.

Also, it would run hot regardless. It's a game. It just wouldn't run as hot. So you need to define just how hot "hot" is before anyone can tell you if it's a problem or not.

RE: Damn hot
By walk2k on 8/5/2010 2:07:22 PM , Rating: 2
Sorry but 70-80c IS "damn hot", as in you wouldn't want to touch it, and yet that's a perfectly normal temp for a modern GPU. If it's not crashing or burning out your hardware, what's the problem?

Or just turn on Vsync
By DesertCat on 8/1/2010 5:50:35 PM , Rating: 3
I have a GTX295 and noticed that the fans were blowing pretty hard after an SC2 session. I don't see much point in going above 60 fps in something like SC2 so I just edited a custom profile for the game to turn Vsync on (turned on 4xAA there too). Seems to work like a charm.

RE: Or just turn on Vsync
By DesertCat on 8/1/2010 5:55:16 PM , Rating: 3
Oh yeah, and capping my fps to 60 like that also seemed to improve the synchronization between the audio and mouth animations between missions.

RE: Or just turn on Vsync
By SlyNine on 8/4/2010 2:33:09 PM , Rating: 2
It never makes sense to have your FPS above your display's Hz. But since few games use triple buffering, there is often a penalty for using Vsync even when FPS is below the display's.

RE: Or just turn on Vsync
By SlyNine on 8/4/2010 2:42:44 PM , Rating: 2
Just reviewed the blog. I guess it is just a frame rate cap. So make sure your driver's control panel isn't forcing vsync to not be enabled.

RE: Or just turn on Vsync
By SlyNine on 8/4/2010 2:43:39 PM , Rating: 2
So make sure your driver's control panel isn't forcing vsync to not be *disabled*.

RE: Or just turn on Vsync
By SlyNine on 8/4/2010 2:43:59 PM , Rating: 2
So make sure your driver's control panel isn't forcing vsync to be *disabled*.

Same Story Different Day
By mindless1 on 8/1/2010 10:50:06 PM , Rating: 2
Someone likes their video card, maybe they even overclock it, and if it stays stable over 70C they think to themselves, "video cards are ok running hotter than other chips".

NO! GPUs are soldered to the PCB, making them LESS capable of handling large repetitive thermal swings.

A GPU manufacturer may spec a high temp to increase yields and allow a smaller heatsink, because we're already stretched to the limits using single-height 'sinks for cost reasons, or double-height with limited airflow from one or two low-profile fans. But if your CPU wears a heatsink twice the size and capacity of your video card's, and the video card is not merely half the max TDP, something has to give!

Blizzard does have a bug in their game, needlessly running a menu at max FPS with no purpose in doing so, BUT to claim they are to blame for melted cards is just wrong. The video card has to be designed to withstand any commands sent to it that it officially supports. Otherwise we are being tricked as to the true safe performance of a product; if it can't run at max load, it is under-engineered and as such defective by design. A replacement of the same card is NOT a solution; rather, the buyer is entitled to a redesigned product or their money back.

RE: Same Story Different Day
By Kurz on 8/2/2010 10:55:43 AM , Rating: 2
I always thought they should have a spring retention system.

The fact the GPU goes over 70C is the main reason I water cool my system. I want my components to last as long as possible.

RE: Same Story Different Day
By afkrotch on 8/2/2010 9:24:12 PM , Rating: 2
GPU heatsinks do have springs. Usually a bolt with a spring.

RE: Same Story Different Day
By Kurz on 8/3/2010 10:04:15 AM , Rating: 2
I am talking about the actual GPU.

I want a system just like AMD and Intel CPUs have: the chip attaches to the motherboard with a spring-retention socket, and the heatsink then goes on top of it to cool it.

That way there is less risk of the PCB and the solder getting too hot and weakening the connections.

RE: Same Story Different Day
By SlyNine on 8/4/2010 2:37:11 PM , Rating: 2
I ask for proof of this. How do you know being soldered to the PCB makes GPUs less capable?

RE: Same Story Different Day
By mindless1 on 8/10/2010 11:37:32 PM , Rating: 2
Haha, proof? Study physics and electronics engineering for a few years THEN we will talk.

... I already stated the reason.

RE: Same Story Different Day
By mindless1 on 8/11/2010 6:01:21 PM , Rating: 2
Ok, I should be more patient. Sorry.

The materials have different coefficients of thermal expansion. Each thermal cycle puts stress on the junctions, causing eventual cracking or delamination and oxidation. This is compounded by the fact that cards now carry heatsinks massive enough that the PCB itself bends, supported only by the slot and the rear bracket.
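As a back-of-the-envelope illustration of that expansion mismatch (the coefficients below are typical handbook values, assumed for the sketch, not measurements of any particular card):

```python
# Approximate coefficients of thermal expansion, in ppm per °C.
# Typical handbook values, used here purely for illustration.
CTE_PPM_PER_C = {
    "silicon_die": 2.6,   # the GPU chip itself
    "solder": 25.0,       # SnPb-era solder joints
    "fr4_pcb": 16.0,      # in-plane expansion of the card's PCB
}

def mismatch_ppm(mat_a, mat_b, delta_t_c):
    """Free-expansion mismatch between two bonded materials over a
    temperature swing of delta_t_c degrees, in parts per million."""
    return abs(CTE_PPM_PER_C[mat_a] - CTE_PPM_PER_C[mat_b]) * delta_t_c

# One cycle from a 30 °C idle to a 90 °C gaming load (ΔT = 60 °C):
die_vs_board = mismatch_ppm("silicon_die", "fr4_pcb", 60)
# ≈ 800 ppm of mismatch the solder joints must absorb, every single cycle.
```

Hundreds of ppm per cycle, repeated thousands of times, is exactly the fatigue mechanism the comment describes.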

In some instances the card manufacturer used a reinforcement frame, or secured the heatsink nearer the perimeter of the card instead of only at the four holes adjacent to the GPU; that helps with the latter problem, but not the former.

Seems odd
By Earthmonger on 8/1/2010 1:38:22 PM , Rating: 3
That nobody received hardware warnings from their cards as they overheated. That's just weird. And couldn't they hear the fans revving like jet engines?

Something more heinous must have been at play to negate those two issues.

RE: Seems odd
By StevoLincolnite on 8/1/2010 1:57:37 PM , Rating: 2
I have an overclocked, passively cooled Radeon 4670... I heard about this issue yesterday, and since then I've zip-tied an 80mm case fan to the card (I play StarCraft 2 a lot!). So no, not everyone can hear their GPU fan spinning up.

RE: Seems odd
By imaheadcase on 8/1/2010 2:06:11 PM , Rating: 2
Most video cards don't have a GPU warning sound, especially in consumer PCs that you buy off the shelf.

RE: Seems odd
By Etsp on 8/2/2010 12:13:28 AM , Rating: 2
I have a Sapphire Radeon 4850, and I did have a game that caused it to overheat a couple times. Aion Online. Turns out dust was clogging up the heatsink, so after some canned air, the game ran fine.

Negative effects/warnings: My PC would shut down instantly once the temp hit a certain threshold. That's all.

Wasn't sure what was happening at first, so I thought it was the game's fault. Didn't take too long to figure out the real problem, though.

That was almost a year ago, and I'm still running fine on that card.

If these people have their cards melting on them, it's pretty obvious that something wasn't stock, whether it was clock speeds, or something else.

RE: Seems odd
By keith524 on 8/2/2010 10:58:00 AM , Rating: 2
That was my thought as well. My ATI graphics card warns me when it is overheating. Actually, I get a noise and a pop-up, quickly followed by an uncontrollable system shutdown.

NVidia problem?
By smitty3268 on 8/1/2010 7:25:41 PM , Rating: 2
Well, I've had zero problems myself, running an ATI 3870 here.

Even though it probably makes sense for Blizzard to enable VSync by default, this is clearly a problem with the hardware and not the game. Are all the problems occurring on laptops with bad cooling, or is it more general? Seems to be some kind of bug in Nvidia's drivers.

RE: NVidia problem?
By afkrotch on 8/1/10, Rating: 0
RE: NVidia problem?
By geekgrrle on 8/2/2010 4:19:25 AM , Rating: 3
Well I suppose I'm one of those "stupid ppl with laptops".. however I had no problems with running the game. This is probably because I bought a system that was actually made for gaming and has proper cooling and a chill-pad. (Sager NP8760 /w ATI 5870)

I think this issue has nothing to do with laptops. It has to do with a few things that apply to both laptops and desktops:
1) not having the proper equipment
2) not having proper cooling
3) not enabling hardware heat sensors to power down your system if it runs into problems
4) people overclocking their systems and disabling some of the sensors that help protect them, while also not providing enough cooling.

This is not Blizzard's fault and could have happened with any game at high graphics settings. I mean, seriously... it's not like they're Apple, putting an antenna where people normally hold the phone :)

RE: NVidia problem?
By GoodBytes on 8/3/2010 10:07:45 AM , Rating: 2
Actually, any laptop of good quality will provide great cooling. My laptop has a Quadro NVS 160M (equivalent to a GeForce 9400M, but with 256MB of dedicated memory instead of using system RAM). Not a real gaming GPU, but hey, StarCraft 2 runs fine on it. It also ran just as well when I overclocked the GPU to nearly double speed (except for the RAM, which doesn't have a heatsink on it).

I have the Latitude E6400.

I think it all comes down to: you get what you pay for. If you buy a $600 laptop, don't expect great cooling. But if you shell out $1,500+ (Canadian), then it's a different story.

RE: NVidia problem?
By mindless1 on 8/3/2010 4:40:06 PM , Rating: 2
It depends on how you define "of good quality".

Many laptops use one fan whose RPM is controlled by the CPU temperature alone. In a scenario where the GPU is heavily loaded but the CPU is not, the GPU can end up much hotter than it would otherwise be. A menu like the one Blizzard shipped looks like exactly that scenario.
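A toy sketch of the flaw being described, using a made-up fan curve (the thresholds and duty cycles are invented for illustration, not taken from any real embedded controller):

```python
def fan_duty_percent(cpu_temp_c, gpu_temp_c=None):
    """Map the hottest temperature we actually look at to a fan duty cycle.
    Many laptop fan curves consider only cpu_temp_c; also passing
    gpu_temp_c is the behaviour this comment argues laptops should have."""
    hottest = cpu_temp_c if gpu_temp_c is None else max(cpu_temp_c, gpu_temp_c)
    if hottest < 50:
        return 30    # near-silent idle speed
    if hottest < 75:
        return 60
    return 100       # full blast

# GPU pinned at 85 °C by an uncapped menu while the CPU idles at 45 °C:
cpu_only = fan_duty_percent(45)       # fan stays at idle speed, GPU cooks
gpu_aware = fan_duty_percent(45, 85)  # GPU temperature drives the fan
```

With a CPU-only curve the fan never speeds up, which is why a loaded GPU in an idle-CPU scenario can silently overheat.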

No matter what you pay, you can still end up with a product that runs hot because the maker tried to make it smaller or quieter. I will agree that if you pay the least amount possible, more corners may be cut in the design, but the opposite is not always true.

Most likely a driver issue
By Mombasa69 on 8/1/2010 3:26:22 PM , Rating: 2
I'm guessing the problems occur on GeForce cards? I found my two GTX 470s were running hotter than usual on the Nvidia 258.96 drivers, so I installed the previous 197.75 drivers and have no problem now.

Starcraft II all maxed out in 1920x1080+ will make your GPU/s work hard, very hard.

RE: Most likely a driver issue
By Mombasa69 on 8/1/2010 3:28:20 PM , Rating: 2
Doh, 297.75 drivers I meant. Sounds like geeky, boring info, but if it helps you enjoy SCII more, give it a go.

RE: Most likely a driver issue
By Anubis on 8/2/2010 8:48:38 AM , Rating: 2
I'm running 185.xx-series drivers on my GTX 285 and haven't had an issue. And yes, I mean the 185 series; I had to revert back that far to fix a random driver issue I was having with every new release.

PC/Graphics makers fault.
By Acanthus on 8/1/2010 2:24:30 PM , Rating: 3
It is impossible that this is Blizzard's fault. I'm not a Blizzard fanboi; I don't even have SC2...

Much like people blame FurMark for killing cards, it is the fault of the GPU makers for cheaping out on cooling solutions relative to the maximum theoretical thermal output of their GPUs.

It'll be interesting to see which models are dying.

RE: PC/Graphics makers fault.
By SlyNine on 8/4/2010 2:51:06 PM , Rating: 2
Well, I think it's reckless to run code designed to open every circuit in... anything. GPUs are more dynamic than CPUs.

But now that this kind of "power virus" has occurred in *at least* two places in useful code, I have to agree that it is the video card makers' fault.

I wonder if this affects 4xxx cards, because those were commonly damaged by FurMark and software like it. The 5xxx series added hardware protection against such scenarios; I wonder if that was successful in protecting them against this kind of thing.

The bug is not in Starcraft
By kroker on 8/2/2010 8:15:42 AM , Rating: 2
This is definitely not a software bug in StarCraft; it's either a software bug in the video card drivers or in the hardware itself. Unless the card is overclocked by the user, no rendering engine should be able to burn a card to death, be it FurMark or anything else. I don't care what the GPU manufacturers deem an "unrealistic load" on the GPU; the cooling should be able to handle any load an application could throw at it. I wouldn't expect this from an integrated GPU, as those are only meant to run light loads, but heavy loads are the reason people buy discrete GPUs in the first place.

If the GPU manufacturers are desperate enough to push their GPUs to higher frequencies than the cooling can handle at max load, in order to eke out a few more FPS than the competition, then they should at least put some safety nets in there; it's not hard to do at all. The drivers should monitor the temperature and slow down the chip if it detects overheating, ideally alerting the user to the situation. This should also override any manual fan speed control and revert any overclocking.
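What such a driver-side safety net could look like, sketched as a hypothetical throttling step with hysteresis (the clocks and trip points are invented for illustration):

```python
def thermal_governor_step(temp_c, current_mhz,
                          base_mhz=300, boost_mhz=700,
                          trip_c=95, clear_c=85):
    """One tick of a hypothetical driver thermal governor: clamp to the
    base clock above trip_c, restore the boost clock below clear_c, and
    hold the current clock in between (hysteresis stops oscillation)."""
    if temp_c >= trip_c:
        return base_mhz
    if temp_c <= clear_c:
        return boost_mhz
    return current_mhz

clock = 700
history = []
for temp in (80, 92, 96, 90, 84):  # simulated sensor readings
    clock = thermal_governor_step(temp, clock)
    history.append(clock)
# the clock drops once the 95 °C trip is crossed and only
# recovers after the card cools back below 85 °C
```

The hysteresis band is the important design choice: without it, a card sitting right at the trip point would flap between clocks every tick.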

Anyway, it's absurd to even suggest that Blizzard is responsible for this and should replace the damaged hardware. Unless the burned cards were overclocked by the user, this is the GPU manufacturers' fault and they are the ones who should pay the bill.

RE: The bug is not in Starcraft
By SlyNine on 8/4/2010 2:59:53 PM , Rating: 2
The drivers do this already. This has been debated, with ATI calling FurMark a "power virus". The software protection kicked in too slowly.

The 5xxx series added hardware protection. Hardware should be able to handle what it was meant to handle. Take a car: you cannot isolate the engine (FurMark isolates the GPU) and push it at max RPM for very long without it blowing. Being responsible is still needed.

However, in this case the fault is on the video card maker, as this is real code and it is not reasonable to expect the end user to avoid it. If FurMark were the only case, then maybe I could agree with ATI/AMD.

Im stuck
By shaidorsai on 8/2/10, Rating: 0
RE: Im stuck
By mindless1 on 8/3/2010 4:42:12 PM , Rating: 2
Since you are so happy I won't dare ruin your mood by informing you that ATI cards have fried too.

RE: Im stuck
By SlyNine on 8/4/2010 3:03:01 PM , Rating: 2
Yep, but ATI 5xxx cards have better protection against these kinds of things, i.e. hardware protection. I would be a bit surprised if the 5xxx series fried from this.

By Jonh68 on 8/2/2010 11:53:24 AM , Rating: 2
I have a Gateway laptop with an 8800M-series Nvidia card, and it played just fine with drivers from last fall. I updated to the latest drivers, and the sides of the laptop became too hot to touch and the game would crash. I reloaded the older drivers, updated the game, and it seems to be working fine. I'm not going to reinstall the new drivers, as one of my photo-processing programs that uses GPU acceleration stopped working with the latest Nvidia drivers. I don't think the game is the primary cause of the overheating.

By Reclaimer77 on 8/2/2010 12:42:07 PM , Rating: 2
Well, laptops have it pretty rough because most of the time the CPU and GPU cooling are tied together. Remember, it's not just the GPU that's going flat out in situations like this; it's the CPU too.

False Information...
By eek2121 on 8/2/2010 5:02:48 PM , Rating: 2
I see a lot of false information in this story, as well as in the posts below. Just a bit of clarification from a developer: there is no API in DirectX that lets you control the fan or any other functionality related to heat or noise. Everything is controlled automatically by the chipset vendors' drivers.

If the graphics cards are overheating, it's due to the cards not having sufficient cooling OR a bug in the drivers. Either way, it's impossible for Blizzard to be responsible for this.

This story is both worded poorly and horrendously inaccurate.

RE: False Information...
By walk2k on 8/5/2010 1:49:05 PM , Rating: 2
Actually, the article is pretty clear... the game was basically running a stress test on the video card(s), which it doesn't need to do.

Yes a properly designed/installed/maintained/etc video card should be able to handle a stress test, but the game still didn't need to be doing it.

Most people never run such a stress test on their machines, and a lot of people never clean out the dust and crud that can cause them to run hotter over time.

So you add all that up, and the fact that this game came out in the summer... bam overheating.

It's not entirely Blizzard's fault, but they share some of the blame.
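The "needless stress test" is just a render loop with no frame cap, and the fix is a few lines. A minimal sketch (the sleep-based limiter below is an assumption about how such a cap could work, not Blizzard's actual code):

```python
import time

def render_menu(frames, cap_fps=None):
    """Draw `frames` frames; if cap_fps is set, sleep off the leftover
    frame budget instead of immediately redrawing."""
    budget = 1.0 / cap_fps if cap_fps else 0.0
    start = time.perf_counter()
    for _ in range(frames):
        t0 = time.perf_counter()
        pass  # a menu has almost nothing to draw, so each frame is near-instant
        leftover = budget - (time.perf_counter() - t0)
        if leftover > 0:
            time.sleep(leftover)  # hardware idles here instead of spinning
    return frames / (time.perf_counter() - start)  # achieved FPS

capped_fps = render_menu(30, cap_fps=60)  # at most ~60 FPS, GPU mostly idle
uncapped_fps = render_menu(30)            # thousands of FPS, GPU pegged
```

With nothing to draw, the uncapped loop redraws as fast as the hardware allows, which is precisely the "stress test" behaviour the menu exhibited.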

By Einy0 on 8/2/2010 12:41:15 AM , Rating: 3
This is pure insanity. How can anyone think a software maker should cover hardware failures on their PC?
If you bought from a good manufacturer, the card should come with a 2-3 year warranty. If it's older than that, it's probably not much use anymore anyway; a $150 video card today is way faster than even the top cards from just a few years ago.
This would be like an Xbox 360 owner asking Epic to replace their failed console because it died while playing Gears of War. Insane...

Unfortunately SC2 also killed by graphics card..
By pedrofs2 on 8/8/2010 5:50:47 AM , Rating: 2
I would like to share my experience here as hopefully it will help other people.

I have a 2.5-year-old Dell XPS with an Nvidia 8800GT that had been running other games quite nicely until two days ago (Settlers 7, Battlefield, Crysis, etc.).

After installing StarCraft 2 two days ago and sitting on the main menu for around 30 seconds, my card is now "fried".
It did not permanently fail at that exact moment, but it locked up my PC, and over the next 2-4 reboots it started producing errors in Dell diagnostics (Error Code 5300: 0119), to the point where the card is now effectively dead.

I work in IT and I agree with all the comments posted above that a piece of software/game in **theory** should not be able to fry graphic cards (card and OS should prevent this).

Saying that in this case do we have an exception here??

I wouldn't have believed these reports if I had merely read them on the web, but... it is too much of a coincidence. It happened to me, and it seems many other people are experiencing the same problem.

PS: This will cost me £200 to get a replacement from Dell.

By madman26 on 8/8/2010 7:54:20 PM , Rating: 1
"work in IT" WTF does that mean?

PS: you're a fucking moron.

StarCraft II Beta Folder?
By Suomynona on 8/1/10, Rating: 0
By FoundationII on 8/1/2010 3:16:44 PM , Rating: 2
It actually doesn't (or at least not for me).
The folder in question is just named "StarCraft II"

By ksherman on 8/1/2010 2:24:43 PM , Rating: 2
Makes my MBP run super hot, but if it fries my GPU/logic board, then I will probably be able to have my whole machine swapped out :)

Time for more SC2!

BTW, graphics performance sucks with the 9600M GT 512MB in OS X, can't wait to get back to my 5850 wielding desktop!

not blizzard's fault
By chromal on 8/2/2010 1:54:36 PM , Rating: 2
Well, this sounds familiar. The same concerns were/are being raised by players of the new Valve Source-engine port of Alien Swarm, some of whose menus apparently run at uncapped frame rates, too.

I'll say again what I said then: Software cannot make hardware exceed 100% load. If hardware can't stay within safe operating temperatures at 100% load, it is, by definition, hardware that is inadequately cooled. Folks outraged that Blizzard 'destroyed' their GPU/laptop/pet-dog/whatever need to look in the mirror and assign blame there, for either picking a crappy GPU/laptop/pet-dog/whatever, or modding/overclocking it with inadequate cooling. Either way, I don't own a violin small enough to properly express my sympathy.

Who to lay the blame on?
By Yangorang on 8/2/2010 7:07:05 PM , Rating: 2
From what I've seen so far, this issue really doesn't appear to be 100% Blizzard's fault. The cards that have been fried seem to be 8800s in particular, plus other notoriously hot cards such as the GTX 480, GTX 295, and HD 4870X2, among others.

I have yet to hear of someone with a liquid cooled HD5870 or some other really well cooled card fried by Starcraft II. This would lead me to believe that Starcraft II is simply a relatively stressful game, and those without adequate cooling are just finding their hardware finally crapping out on them.

Granted, it is hard to tell people it's not the game's fault when they have been happily gaming for years and their hardware only craps out on one particular game; however, I played StarCraft II for 14 hours straight on my HD 5770 (stock cooler) perfectly fine, without any issues. My card did get a bit toasty at 80C even with the fan set at 70%, but it still worked perfectly fine. (I ran HWMon in the background to monitor max temps; also, my system is on the second floor, where room temp is above average. And yes, I am nerd enough to play through the whole SC2 campaign in one run.)

its the game
By nic4rest on 8/3/2010 3:15:27 PM , Rating: 2
I have played every MMO at high settings, high FPS, etc. I've got two 8800 GTs, and right when I close this game my card dies? Yeah, and my PC is always cleaned and runs perfectly... Now I've had to go out and buy a new $300 card. Thanks, Blizzard; that's the last time I buy your product. I know you're sad.

By Icewind31 on 8/6/2010 11:12:31 AM , Rating: 2
Despite the fact that StarCraft 2 has in fact caused two of my video cards to die, I don't think Blizzard is to blame. Yes, it does push your card to 100% like FurMark, but it's up to the video card manufacturers to make sure their cards can withstand that kind of load.

Cooling was not an issue for me because I had my GPUs on water and they never went over 50C.

I believe the SC2 menu (like FurMark) will now be my burn-in testing tool. It seems able to weed out bad hardware. The first card that died on me was an Asus Matrix 4850 running at stock clocks. The next was an eVGA 8800 (though it was an RMA card I'd received less than a month earlier).

While it's a pain to deal with the RMA process all over again, I'm glad I was able to surface the problems with these "on the fence" cards before the warranty expired. Although with the downtime, SC2 is starting to look like SC1 as I move down the chain of spare video cards, LOL.

By tedrodai on 8/1/2010 7:05:08 PM , Rating: 1
The zerg snuck around me and laid an egg in my PC while I was attacking the hive! I don't see any way to beat this strategy!! Zerg is UNBALANCED!!!

By Shadowmaster625 on 8/2/2010 1:31:53 PM , Rating: 1
The question is, how much money did M$/Intel/AMD/Nvidia pay them to insert this little "feature" in their game so as to force thousands of idiots to upgrade? (What, you think they just talk about the weather when they're out on the golf course?)

Zerg rushed?
By Trisped on 8/2/2010 4:57:42 PM , Rating: 1
Starcraft II fans beware, your graphics card may get Zerg rushed

I think "Starcraft II fans beware, your graphics card may get swarmed!" is much better. It has a pun which is related to the subject, but doesn't go so far as to sound forced.
