
Starcraft II is overheating some users' GPUs, but Blizzard has released a temporary fix.  (Source: Doupe)
Starcraft II fans beware, your graphics card may get Zerg rushed

StarCraft II: Wings of Liberty, the first game in Blizzard's highly anticipated real-time strategy sequel trilogy, launched on Tuesday.  Unfortunately, the blockbuster PC title -- which is expected to sell 10 million copies or more -- had some bumps during its launch.

There were a number of minor bugs, but nothing show-stopping at first.  Then the reports of melting GPUs hit.

Among those affected was Adam Biessener of Game Informer whose card melted while he was live blogging about his game experience.  He bemoaned, "Three hours of cursing later, I'm posting this from my wife's laptop because both my graphics card and my work laptop appear to be fried."

The problem appears to be located in the main menu, where an uncapped frame rate maxes out the GPU, in some cases pushing it to overheating and potentially permanent failure.

Blizzard has issued a response on its support site, acknowledging that it was aware of the issue, and offering a quick fix.  The company writes:

Certain screens make your hardware work pretty hard

Screens that are light on detail may make your system overheat if cooling is overall insufficient. This is because the game has nothing to do so it is primarily just working on drawing the screen very quickly. A temporary workaround is to go to your Documents\StarCraft II Beta\variables.txt file and add these lines:



You may replace these numbers if you want to.
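The mechanics behind such a cap are straightforward: after each frame, the game sleeps off whatever remains of that frame's time budget. A minimal sketch in Python (illustrative only -- the render callback and numbers are hypothetical, not Blizzard's code):

```python
import time

def run_capped(render, cap_fps, duration_s):
    """Render frames, sleeping so the loop never exceeds cap_fps."""
    frame_budget = 1.0 / cap_fps
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration_s:
        t0 = time.perf_counter()
        render()                        # stand-in for drawing the menu
        frames += 1
        # Sleep off the unused part of this frame's time budget, so a
        # trivial scene (like a static menu) can't spin the GPU flat out.
        elapsed = time.perf_counter() - t0
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)
    return frames
```

Because the sleep absorbs whatever time the scene doesn't use, a near-empty menu does no more work per second than a capped in-game scene.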

For eager customers who already lost a graphics card, though, that fix may prove too late.  Blizzard has not announced any plans to replace the lost hardware of victims who experienced the bug.

Many customers are outraged at this.  Writes one victim, Lorsaire:

Why was this not addressed already before release, and why were there no breaking news warnings or updates to fix this before people started having damage done to their hardware?  My Nvidia GeForce cost me more than $300 to get a good card that was great for gaming...  Blizzard are you doing anything or have plans to compensate people for the damage you've created?

Of course some of the cards may be covered by manufacturer warranties.  And while it does appear a bug (uncapped framerates) is partially to blame for killing off the cards, a card pushed to the max would generally not die instantly were it not for poorly engineered and/or defective cooling.  It appears that the cards ultimately were done in by the double blow of both a software bug (in SC II) and hardware issues.

The game features intensely addictive multiplayer gaming between three diverse races -- the Zerg, the Protoss, and the Terrans.  It also features a single-player campaign in which you play a Terran rebel.  Future titles -- Heart of the Swarm and Legacy of the Void -- will include Protoss and Zerg campaigns, and possibly deliver new multiplayer features as well.  Just beware the uncapped framerates.


So who's fault is this?
By plewis00 on 8/1/2010 1:37:42 PM , Rating: 5
I don't know if it's just me who thinks this but surely a piece of software cannot be responsible for a graphics card failure (unless that piece of software stops the fan or something similar)?

If a game is running at an uncapped framerate (I'm sure quite a few games do this anyway), why is that causing a card to burn out? These cards aren't running beyond spec, are they - unless we later find out that the people who reported dead cards had also been overclocking them and had insufficient cooling.

While this is annoying, unless Blizzard does something stupid like auto-overclock the graphics card, modify fan settings or do something else at a hardware level to modify the card, I don't see how they can have any real element of blame pinned on them...

RE: So who's fault is this?
By kaosstar on 8/1/2010 2:32:01 PM , Rating: 5
Agreed. One should be able to have his GPU maxed out pretty much indefinitely without it overheating. As much as I'd like to bash Blizzard, the blame lies with the card manufacturers for providing insufficient cooling, or the end users for having inadequate airflow.

RE: So who's fault is this?
By Qapa on 8/1/2010 6:40:45 PM , Rating: 2
2 possibilities only:
- card running at stock and fried => the card company is to blame! They must have sensors to detect overheating and throttle down, or even stop;
- user overclocked the card or did something else to it, in which case the user might be at fault.

RE: So who's fault is this?
By MonkeyPaw on 8/1/2010 8:40:57 PM , Rating: 1
What might be going on is that cards are running at 2D fan speeds at the game's menu prompt (possibly for noise reduction), but the game is actually maxing out the GPU anyway. If the game isn't using the driver correctly, it might be on the developer. Sounds unlikely though.

RE: So who's fault is this?
By afkrotch on 8/1/2010 10:23:23 PM , Rating: 5
That's the manufacturer's problem there. The fans should be spinning up after a certain threshold is hit.

While Blizzard is at fault for not correctly implementing a framerate cap on their menus, they aren't responsible for the flaws in your hardware.

My GTS 250 and 8800 GTS run it fine. Course I have adequate cooling on my card and in my case.

RE: So who's fault is this?
By Omega215D on 8/2/2010 3:43:26 AM , Rating: 2
Unfortunately my GTX 260 216 by EVGA doesn't appear to speed up the fans when I have a game going but will throttle the clocks. Before every game I have to set the fan settings manually in the console.

RE: So who's fault is this?
By hrah20 on 8/2/2010 4:02:43 PM , Rating: 2
I think it's the user's fault. I have a Radeon HD 4890, factory overclocked, and never had a problem with StarCraft II, even though the ATI 4 series runs hotter than the new 5 series.

I always check my card temps on CCC & I have a very good ventilation system on my case

RE: So who's fault is this?
By afkrotch on 8/2/2010 8:39:08 PM , Rating: 2
I don't see how it can be the user's fault. The cards are meant to run and the drivers should have thresholds built in. Unless you are voiding the warranty in some way, it's the manufacturer's fault.

Unless new TOS state that you need to be constantly monitoring your card and your case temps need to be at X temp or lower.

RE: So who's fault is this?
By AntDX316 on 8/3/10, Rating: 0
RE: So who's fault is this?
By Omega215D on 8/9/2010 7:27:36 AM , Rating: 2
The user's fault, when the fan speed is set to Auto and it stays at the default speed when a game is fired up? Kind of dense, don't you think?

RE: So who's fault is this?
By Thalyn on 8/2/2010 5:28:21 AM , Rating: 4
I believe every ATI card since the 9000-series (non-inclusive) has had split clocks - 2D and 3D modes. It wasn't until the 3000-series that the decision on which to use was based on graphics load rather than surface mode (windowed modes, even fullscreen, would use 2D clocks on 2000-series cards and earlier), but the temperature of the graphics card was always used to determine how fast the fan should spin. Even if the earlier cards were still in 2D mode, if they got hot then the fan would speed up accordingly.

Unless it was a non-reference design which lacked thermal control, or you manually told it to run at a different speed (e.g. my 4870 is fixed to 34%).

I can't say with any certainty about GeForce cards, though I believe they implemented this same thermal fan control with the 5000-series (load-based speed control wasn't until the 8000-series from memory). Again, though, non-reference cooling solutions and manual overrides could bypass the thermal control.

In short, everyone experiencing graphics hardware failure has inadequate cooling. Whether this is because the cooling they have is blocked (dust or otherwise), inadequate for the power draw of the card or fixed to an inadequate level is largely irrelevant - SC2 is about as dangerous to graphics hardware as 3DMark 01.

RE: So who's fault is this?
By michael67 on 8/2/2010 5:22:11 AM , Rating: 3
There is a third, and most likely, cause of overheating cards/components: dust buildup.

I blow out my case every 3~6 months, and after cleaning it I usually see a ~3C drop in temps.

But there are lots of people that never clean their PCs 0_o

I have a compressor in the garage and take my PCs there to blow them out; it's amazing how much dust they collect, especially my HTPC that's on 24/7.

One thing that's really important when doing this is holding the fans in place, otherwise you'll spin them to death.

I did that once as a test with an old fan: 100 psi / 7 bar made it spin at about 12,000 rpm before it died, according to the mobo ;-)

RE: So who's fault is this?
By tallcool1 on 8/2/2010 9:05:27 AM , Rating: 2
That is why I only buy computer cases that have integrated dust filters. Preferably ones that are easy to remove. Take the filters out regularly for cleaning.

RE: So who's fault is this?
By afkrotch on 8/2/2010 8:45:43 PM , Rating: 2
Humidifier and table. Why clean out your case, just set up your living space so you don't have to.

The humidifier will keep the dust/dirt down on the ground. The table will keep your desktop off the ground, so it doesn't suck in the dust/dirt. I only have to clean my case out about once a year. By then, I'm usually upgrading and it would have gotten cleaned out regardless.

RE: So who's fault is this?
By mindless1 on 8/1/2010 10:33:42 PM , Rating: 2
Overclockers that do voltmods can push the GPU beyond safe limits no matter what normal measure of heatsinking is used, but I can't blame end users for inadequate airflow, because the card should throttle back or cease functioning before it gets hot enough to damage itself.

Blizzard has done nothing wrong; the IDEAL we all strive for is that our GPUs do as much work as possible, as maxed out as they can get, up to the point where it causes stuttering or a bottleneck to the framerate.

Otherwise, you are either gaming at lower FPS or have wasted money on a GPU you never fully utilize.

RE: So who's fault is this?
By 0ldman on 8/2/2010 10:19:55 AM , Rating: 2
I am tempted to vote you down just because you said exactly what I was going to say before I got to say it.


RE: So who's fault is this?
By Proxes on 8/2/2010 2:28:09 PM , Rating: 3
I agree. Saying a game is burning out video cards is retarded. Almost every game out there will max a video card.

When I first started playing WotLK my computer would lock up and crash. I noticed the fan on my video card (8800 GT) was freezing. So I bought a 3rd party fan/heat sink. Temps dropped almost 10c and my video card doesn't go over 55c while playing any game, including hours of SC2.

If it was the game's fault then Crysis would be eating video cards left and right.

RE: So who's fault is this?
By Alpha4 on 8/2/2010 7:22:05 PM , Rating: 5
World of Warcraft crashed your GPU?? Windows Aero Glass looks more demanding than that motley collection of jagged edges. Crysis would have set your house on fire.

RE: So who's fault is this?
By walk2k on 8/5/2010 2:03:30 PM , Rating: 2
Actually most games will not max out the GPU; the bottleneck for most games is the CPU or another subsystem (or the code itself... believe it or not, most game code just isn't very efficient). It's only when the game was doing nothing, basically running the video refresh in a tiny tight little loop, that it stressed out the video card. Just normally playing the game does not.

It's the same with the CPU (or any other PU): try playing a game and monitoring the temps, then run a stress test like Prime95 or Orthos and watch the temps climb another 5-10C, maybe more with poor cooling. These programs run tight little loops that max out the CPU - a situation you almost never find in a real-world application. Even the most intensive number-crunching apps don't run in small loops like that.

RE: So who's fault is this?
By Lerianis on 8/11/2010 4:34:10 PM , Rating: 2
This also falls on the manufacturers, for allowing the graphics cards to 'shoot up' in memory clock and other things.

Hell, the card in my Gateway laptop has to be put on 'constant performance' in RivaTuner so that crap doesn't happen; otherwise it runs at 1600/800/600 speeds in 'graphically intensive games'.

These things should be LIMITED to the STOCK SPEEDS of the graphics cards unless the person who owns the computer changes them otherwise.

RE: So who's fault is this?
By Lerianis on 8/11/2010 4:36:55 PM , Rating: 2
I should also say that the graphics drivers should have a 'heat limit' of (at most) 70C before the driver starts underclocking the graphics card all on its lonesome.
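What such a safeguard might look like, as a hypothetical sketch (the 70C limit echoes the comment above; clock values and step sizes are invented for illustration):

```python
def throttle_step(temp_c, clock_mhz, base_clock=600, limit_c=70, step=50, floor=200):
    """One tick of a hypothetical driver-side thermal governor:
    above the limit, shed clock speed; below it, climb back toward base."""
    if temp_c >= limit_c:
        return max(floor, clock_mhz - step)   # never drop below a safe floor
    return min(base_clock, clock_mhz + step)  # never exceed the stock clock
```

Called once per sampling interval, this converges the clock downward while the card is hot and restores it once temperatures recover.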

RE: So who's fault is this?
By kingmotley on 8/1/2010 2:35:50 PM , Rating: 5
I totally agree. I would hardly call what Blizzard did even a bug. They draw the screen as fast as they can. Perhaps limiting the refresh rate would be a nice additional feature, but it shouldn't be absolutely required. Most really old games never did that either.

Beyond that, asking the graphics card to draw stuff should never cause it to overheat to the point of burning itself out. The card should have detected it was overheating and either shut down or underclocked itself to keep from getting to that point. Poor engineering on the graphics card's part.

RE: So who's fault is this?
By BruceLeet on 8/1/2010 2:51:11 PM , Rating: 2
He bemoaned, "Three hours of cursing later, I'm posting this from my wife's laptop because both my graphics card and my work laptop appear to be fried."

I know first hand that laptops generally have insufficient cooling - I've had laptops that caused heat discomfort on my thighs after sitting on my lap for just 10 minutes. It is a bug, obviously unintentional, but it amounts to something next to a stress test.

RE: So who's fault is this?
By MastermindX on 8/1/2010 3:00:32 PM , Rating: 3
Last time I checked, every FPS (first-person shooter) does NOT limit frame rate in any way; it just renders non-stop at the maximum frame rate possible. So... unless there is some subtlety between rendering a simple menu vs. a complex 3D environment that would make the menu more likely to overheat, any FPS would have fried those graphics cards too.

That reminds me of the Civilization II CPU utilization bug, where at the end of a turn the CPU would go to 100% utilization.

But like many have said already. Yes it's a bug... But it's not a hardware killing bug. It's nothing like the bug in nVidia's drivers that affected the speed control of the fan and fried the card in the process.

Is Blizzard legally liable for the cards burning down? If they are, we are in a way more backwards civilization than I could imagine. Hardware should have no problem running at maximum capacity for YEARS before blowing up. If it doesn't, the hardware maker made its hardware ratings too aggressive.

RE: So who's fault is this?
By Assimilator87 on 8/1/2010 3:14:31 PM , Rating: 5
SC II Menu vs. Furmark

Who shall be the victor!? Please test Anand =P

RE: So who's fault is this?
By HostileEffect on 8/1/2010 3:38:34 PM , Rating: 2
I thought I recall hearing about this lack of FPS cap bug in the beta a very long time ago... then again it may be a different game.

RE: So who's fault is this?
By Zehar on 8/1/10, Rating: 0
RE: So who's fault is this?
By kyp275 on 8/1/2010 5:23:23 PM , Rating: 4
Whatever you may think of the uncapped FPS issue, the fact remains that what killed those cards was either poor engineering by the card's manufacturer or the maintenance/cooling setup on the end user's part. I certainly see what you're saying when it comes to Blizzard's response, but ultimately the cause of these failures lies elsewhere.

RE: So who's fault is this?
By nic4rest on 8/3/2010 3:21:55 PM , Rating: 1
How can you blame the cards when, all of a sudden, thousands of people's cards died? lol, are you serious? They all suddenly got dirty cards during the SC2 beta and release? Wow, you should run for government, you're good.

RE: So who's fault is this?
By kyp275 on 8/4/2010 2:24:08 PM , Rating: 2
Where are the "thousands" of cards that died? No, seriously, please link me to the source where you got that information.

Not that the number matters in any case, nor is dirty cards the only possible cause. It could have been poorly designed cards, or a poorly set-up case with limited airflow, which in turn limited the performance of the card's cooling system.

Ultimately, the software did nothing other than push the hardware hard, and that's no excuse for the hardware to fail, especially when it's operating within spec. SC2 may have been the trigger for the time bombs that were in some systems, but it certainly did not put them there.

RE: So who's fault is this?
By Reclaimer77 on 8/1/10, Rating: 0
RE: So who's fault is this?
By CptTripps on 8/2/10, Rating: 0
RE: So who's fault is this?
By CptTripps on 8/2/2010 8:30:32 PM , Rating: 2
Forgot my last bit. Blizz has no liability here; hit up your card manufacturer for a replacement. I still think it's crap that they left it in when it was reported in the beta, but... still not their problem in the long run.

RE: So who's fault is this?
By afkrotch on 8/1/2010 10:26:13 PM , Rating: 3
Last time I checked, you are wrong about FPSes not having limited frame rates. A huge number of FPSes have their framerates capped at 60 fps or 100 fps. It's done not to avoid frying hardware, but to provide smoother gameplay.

That way your fps isn't jumping around so much - like going from 60 fps to 100 fps, to 150 fps, to 40 fps, etc.

RE: So who's fault is this?
By Reclaimer77 on 8/2/2010 12:55:50 PM , Rating: 2
Capping FPS to 60 is to prevent "tearing", not to make your game smoother. Higher FPS never makes a game "less smooth"; I have no idea what you are talking about.

Read closer: this is about unlimited FPS in the menus, not the playable game field itself. I doubt uncapped menus are that rare in PC gaming. Which is what he was saying.

I did some personal testing, and using the in-game FPS meter (Ctrl+Alt+F) my FPS in the playable gamefield was 60, as it should be. But, for example, on the bridge when I clicked on the Mission Archive console, my FPS was a steady 450, because it's just a static image - no animations, nothing - being rendered as fast as my machine could manage.

Just to tempt fate and brag about my self built system cooling, I left it on that screen for about an hour. No heat issues whatsoever, woot :)

I'm going to issue the commands to cap the menu FPS, but from what I can tell this is no more stressful on a well-cooled system than any other type of hard gaming or benchmarking torture test. I feel bad for those who had to learn lessons the hard way, but I hope they at least learned them.

RE: So who's fault is this?
By afkrotch on 8/2/2010 9:19:32 PM , Rating: 2
Vsync is for tearing.

Higher fps gives you a smoother game; having your fps jump all over the place does not. If your framerates bounce from 60 fps, to 100 fps, to 400 fps, to 50 fps, to 70 fps, guess what? There is going to be noticeable choppiness. To minimize this you cap the fps at 60/100.

If you have crap hardware that has trouble even hitting the fps cap, then a cap isn't going to do much for you.

SC2, doesn't bother me that it has uncapped menus. I have a quad 120mm heatercore watercooling system. Bring it on.

RE: So who's fault is this?
By MatthiasF on 8/1/2010 4:49:37 PM , Rating: 2
It was mentioned that it's happening primarily on Nvidia GPUs, so some people were blaming Physx code in the game.

RE: So who's fault is this?
By Akrovah on 8/4/2010 2:33:39 PM , Rating: 3
Unless I'm mistaken, SCII doesn't use Physx, it uses Havok, which does not make use of hardware acceleration.

RE: So who's fault is this?
By Phoque on 8/1/2010 5:28:24 PM , Rating: 1
"I don't see how they can have any real element of blame pinned on them... "

Especially if it's an Nvidia card, considering their bumpgate and fermi (gf100, gf104 is fine) fiascos.

RE: So who's fault is this?
By Reclaimer77 on 8/1/2010 7:20:27 PM , Rating: 2
Yeah unless something has drastically changed without anyone knowing it, software cannot be responsible for hardware failures such as this.

Also, in some cases graphics card manufacturers' drivers restrict fan speeds too severely to counter noise, which leads to overheating - like my Radeon Catalyst drivers, where I had to go in and manually set the fan speed parameters because the default was way too conservative.

RE: So who's fault is this?
By hadifa on 8/1/2010 7:24:02 PM , Rating: 2
Well, it's not necessarily a bug, nor, in all fairness, is it Blizzard's fault. This issue has come up for other games as well, and there are various solutions. As Blizzard's statement goes, it happens on systems where the cooling is insufficient. These systems' VGAs were never pressured in the past, so everything seemed to be alright, or they have gathered some dust on the heatsink so cooling efficiency has decreased - and then comes some pressure that causes the failure.

The problem is partially that in normal gaming the graphics component has some breathing time while waiting for, say, the CPU and other components, but on the menu screen that's not the case. Usually games get around this by either capping the frame rate or putting a small idle time between frames (say 1 microsecond) to prevent the problem.

Galactic Civilization 2 had this issue for people who were creating very complex and involved ship designs. They fixed it by putting some fixed idle interval at the end of each frame.
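The fixed-idle-interval fix described above can be sketched in a few lines (illustrative Python; the names and timings are hypothetical):

```python
import time

def render_loop(render, idle_s=0.002, duration_s=0.25):
    """Render as usual, but sleep a fixed interval after every frame.
    The sleep implicitly caps the frame rate at roughly 1/idle_s, even
    when the scene itself (e.g. a static menu) costs almost nothing."""
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration_s:
        render()            # stand-in for drawing the frame
        time.sleep(idle_s)  # fixed breather at the end of each frame
        frames += 1
    return frames
```

Unlike a target-rate cap, the interval here is constant, so heavy scenes still pay the same small tax - which is why it works as a simple retrofit.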

No, it's not Blizzard's responsibility, but knowing their installed base, it's an oversight IMHO.

Starcraft is the new Furmark ;-)

By Divide Overflow on 8/1/2010 9:00:28 PM , Rating: 2
Agreed. This is not Blizzard's problem to fix. The game engine software is working as intended to maximize FPS. If a video card fries itself under load, there is a serious problem with the design of that hardware.

RE: So who's fault is this?
By Galcobar on 8/2/2010 7:42:57 AM , Rating: 2
Does anyone else recall the 196.75 drivers issued by Nvidia which managed to fry cards by effectively disabling the fan controllers?

Blizzard was among the first companies to issue an alert on the issue, but they rely on their massive user-base for this sort of reporting.

Who knows, SC2 could well instruct the card in such a manner that it replicates the issue the 196.75 drivers produced, or interferes with its temperature sensor/throttling protocols.

RE: So who's fault is this?
By EricMartello on 8/2/2010 5:33:45 PM , Rating: 2
I'm in agreement with Plewis00 here...this is not a Blizzard issue (and I'm pretty sure that SC2 isn't overclocking the graphics card without the user knowing it). The GPUs that are overheating are probably in poorly designed systems with cramped cases and insufficient airflow. The blame falls squarely on whoever built the system - most modern graphics cards do include a cooling system that provides sufficient cooling for the card to operate at its full capacity without a time limit.

Also, redrawing a simple scene such as the main menu shouldn't really require the GPU to max out. A lot of those operations could sit in a cache somewhere, since the screen does not need to be updated so frequently, leaving the GPU in a mostly idle state.
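That redraw-only-when-needed idea can be sketched as a dirty flag (illustrative Python; the class and names are hypothetical, not from any real engine):

```python
class MenuRenderer:
    """Redraw a static screen only when its contents actually change."""

    def __init__(self):
        self.dirty = True       # first frame must always be drawn
        self.draw_calls = 0

    def invalidate(self):
        """Mark the screen as needing a redraw (e.g. on a button hover)."""
        self.dirty = True

    def frame(self):
        """Called once per display refresh; skips work if nothing changed."""
        if self.dirty:
            self.draw_calls += 1   # stand-in for the actual GPU submit
            self.dirty = False
```

With this pattern a menu sitting untouched for an hour issues one draw call, not hundreds per second.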

RE: So who's fault is this?
By callmeroy on 8/3/2010 8:57:14 AM , Rating: 3
As an avid PC gamer (who used to be one of those obsessive tech geeks who not only put together his own gaming rigs but experimented with overclocking anything possible, back before modern BIOS and GUI systems/apps simplified it - I gave that up years ago, though)... Anyway, in all those years of gaming and tweaking hardware to get the max out of games: first, I've never heard a claim of a game being blamed for my hardware going volcanic on me. Second, I never had any issues of this type to begin with (I've had parts overheat before, but it had NOTHING to do with a game).

Also I've been pouring all my gaming time for the past week into playing SCII on two different systems with different hardware on each...until I read this story I didn't even know there WAS a problem!


RE: So who's fault is this?
By Tyranties on 8/14/2010 10:56:29 PM , Rating: 2
The SCII software bug either stops or tricks the GPU into NOT spinning its fan up to cool the card.

Play Crysis maxed out, really stressing the GPU, and the fan spins faster. Play SCII's menu and the fan idles at a very low speed. The fried GPUs had all been able to cool themselves adequately, even when overclocked, but the SCII software bug stopped them from functioning properly, and thus they are frying.

My GTX280 fried; it was not OC'd and my PC has so much cooling it's insane. Yet within 2-3 days of SCII I began to get strange artefacts and crashes, and now the card is completely dead.

I read that during the beta this occurred and blame was placed on 196.xx nvidia drivers. Well my card fried with 256.xx drivers. And after the first initial problems I rolled back to 185.xx drivers and the fan still did NOT spin up during the menu.

From my experience it is not the drivers; instead, SCII is bugging or convincing the GPU that it does not need to spin the fan any faster, when really the unlocked frames are stressing the card dramatically.

End result, fried GPU.

Obviously the hotter-running cards - GTX280, 8800GT, etc. - will fry first, as they already ran hot, although not hot enough to warrant frying. So I would suggest all users apply the workaround in this article. For all you know, your card is slowly cooking too.

I read one report of a 5870 user who claimed his card ran at 70 degrees in the SCII menu; he manually cranked his GPU fan to 100% and then achieved 38 degrees in the menu.

"When an individual makes a copy of a song for himself, I suppose we can say he stole a song." -- Sony BMG attorney Jennifer Pariser
Related Articles

Copyright 2016 DailyTech LLC.