Study: Generating Capacity of Large-Scale Wind Farms Lower Than Previous Estimates
February 27, 2013 9:41 AM
But there are limits that could hold wind back from growing
A new study from Harvard University's School of Engineering and Applied Sciences says that the generating capacity of large-scale wind farms isn't quite as high as scientists previously thought.
The study was led by Harvard applied physicist David Keith, an internationally renowned expert on climate science.
According to Keith's study, each individual wind turbine creates a "wind shadow," a region where air is slowed by the drag on the turbine's blades. Wind farms are laid out to pack as many turbines into an area as possible while leaving just enough spacing between them to minimize this drag.
However, the larger these wind farms become, the more they interact with one another, and regional-scale wind patterns grow even more important. Keith said previous estimates of the generating capacity of large-scale wind farms ignored both these drag effects and these wind patterns.
Keith's study said that the generating capacity of wind farms larger than 100 square kilometers could peak anywhere between 0.5 and 1 watt per square meter. Prior estimates put the figure at 2 to 7 watts per square meter.
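To put those power densities in perspective, here is a quick back-of-the-envelope calculation (purely illustrative, using the figures quoted above) of the land area a one-terawatt wind build-out would require under the old best-case estimate versus the study's worst case:

```python
# Area needed to generate a target average power at a given
# areal power density: area = power / density.

def area_km2(target_watts, density_w_per_m2):
    """Return required area in square kilometers."""
    return target_watts / density_w_per_m2 / 1e6  # m^2 -> km^2

TERAWATT = 1e12

# Prior estimates: 2-7 W/m^2; Keith's study: 0.5-1 W/m^2.
old_best = area_km2(TERAWATT, 7.0)    # best case under old estimates
new_worst = area_km2(TERAWATT, 0.5)   # worst case under the new study

print(f"1 TW at 7 W/m^2:   {old_best:,.0f} km^2")
print(f"1 TW at 0.5 W/m^2: {new_worst:,.0f} km^2")
```

At 7 W/m² a terawatt takes roughly 143,000 km²; at 0.5 W/m² it takes about 2,000,000 km², a fourteen-fold difference, which is why the revised estimates matter for the terawatt-scale ambitions Keith describes below.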
“If wind power’s going to make a contribution to global energy requirements that’s serious, 10 or 20 percent or more, then it really has to contribute on the scale of terawatts in the next half-century or less,” said Keith.
But there are limits that could hold wind back from growing. Keith said that if wind power were to exceed 100 terawatts, it would have a huge impact on global winds and, eventually, the climate -- potentially affecting the climate more than a doubling of atmospheric CO2 would.
“Our findings don't mean that we shouldn’t pursue wind power—wind is much better for the environment than conventional coal—but these geophysical limits may be meaningful if we really want to scale wind power up to supply a third, let’s say, of our primary energy,” said Keith.
“It’s clear the theoretical upper limit to wind power is huge, if you don't care about the impacts of covering the whole world with wind turbines. What’s not clear—and this is a topic for future research—is what the practical limit to wind power would be if you consider all of the real-world constraints. You'd have to assume that wind turbines need to be located relatively close to where people actually live and where there's a fairly constant wind supply, and that they have to deal with environmental constraints. You can’t just put them everywhere.”
Keith concluded that we'll need to find sources for tens of terawatts of carbon-free power "within a human lifetime" in order to stabilize the Earth's climate.
“It’s worth asking about the scalability of each potential energy source—whether it can supply, say, 3 terawatts, which would be 10 percent of our global energy need, or whether it’s more like 0.3 terawatts and 1 percent," said Keith.
RE: Yet another reason to switch to nuclear
2/28/2013 1:57:06 AM
Ahhh, something I very much enjoy talking about: why the Chernobyl disaster happened.
As this guy said.
The Soviet RBMK-1000 reactors, like those used at Chernobyl, were actually quite impressive in many ways. They were cheap to build and easy to operate compared to most other reactors of the time. However, they had some major flaws that, coupled with operators who weren't aware of those flaws because the information was kept hidden, made the disaster inevitable.
Prior to the accident, the operators of the Chernobyl NPP were planning to run a test on Unit #4 to see how long the spinning turbines would continue to generate enough power to drive the reactor's main coolant pumps before the diesel generators kicked in. For the test, they wanted the reactor in a certain power range. As they decreased power by inserting the control rods, a side effect of the reactor's design meant that at lower power, a neutron-absorbing fission product (xenon-135) built up and "poisoned" the core, further lowering output.
After a time, the reactor reached a dangerously low ~30MW output, practically a shutdown state. To compensate for this, bring it back up to the roughly 700MW they needed for the test, and combat the core poisoning, they fully retracted almost all of the 200+ control rods. The reactor's design was unusual: it used graphite as a neutron moderator, along with light water, to keep the reaction under control. It is similar to a boiling water reactor in operation, but the steam doesn't all go through the generator; some steam and water from the steam separator go back through the main coolant pumps and into the reactor.
The temperature of the coolant inlet greatly affected the power output of the reactor. So at the time the test began, the reactor was very unstable: the control rods were fully retracted, and the reaction was being kept under control by the cooler inlet water while the steam and heat were being used to drive the turbines.
When they began the test and shut down the turbine, heat was no longer being dissipated by the turbines, and the coolant inlet temperature of the reactor rose dramatically, which in turn increased the reactivity. As this happened, the water in the core started to boil, generating steam bubbles, or "voids."
This is where the Soviet design was quite bad. The RBMK reactors had a very, very high positive void coefficient, unlike pretty much all reactors elsewhere, including our own here in the States. What that means is that as bubbles, or "voids," form, the rate of reaction increases; in a reactor with a negative void coefficient, voids DECREASE the reaction. So as the water boiled, the reaction increased, which increased steam generation: a positive feedback loop.
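The difference between the two coefficients can be shown with a toy numerical sketch. This is not real reactor physics -- the constants are arbitrary and chosen only to make the feedback direction visible -- but it captures the point above: with a positive void coefficient a small power excursion amplifies itself, while with a negative one it dies out.

```python
# Toy model of void-coefficient feedback (NOT real reactor physics).
# Voids scale with power; reactivity feedback is proportional to the
# void excess over the equilibrium level at power = 1.0.

def simulate(void_coeff, steps=20, power=1.02):
    """Start slightly above equilibrium power 1.0 and iterate."""
    for _ in range(steps):
        voids = 0.05 * power                       # more power -> more boiling
        power *= 1.0 + void_coeff * (voids - 0.05)  # feedback on reactivity
    return power

print(simulate(+5.0))  # positive coefficient: the excursion grows
print(simulate(-5.0))  # negative coefficient: power settles back near 1.0
```

With the positive coefficient the small initial perturbation compounds every step, which is the runaway the commenter describes; with the negative coefficient the same perturbation is damped back toward equilibrium.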
The power output of the reactor increased to ridiculous levels, and the unit operators SCRAMed the reactor, inserting the control rods. This is where the other major flaw of the RBMK reactors lay: the control rods had graphite tips, which displaced water from the channels as they were inserted. This had the unintended effect of actually increasing the reaction rate in the bottom of the core, which caused the initial steam explosion inside the core.
That explosion damaged and cracked the control rods, leaving them only partly inserted and preventing them from going in fully. The reaction continued to increase as the water boiled away. The pressure inside the vessel grew and grew until the 1,000-ton reactor lid literally blew off, damaging the building and exposing the core to the outside. In the explosion, tons of highly radioactive fuel and graphite moderator were scattered and caught fire, sending radiation into the atmosphere.
If they had figured out what happened sooner, and had evacuated sooner, it wouldn't have been so bad. But the dosimeters/Geiger counters they had on hand only measured up to 3.6 Roentgens/hr. So they assumed that was as high as the radiation got and chose not to inform the public. It wasn't until later, after firemen and others had died from radiation exposure while trying to put out the fire, that they found out the radiation levels were actually over 10,000 Roentgens/hr.
And thus you have the Chernobyl disaster. I'm not a nuclear engineer myself, but I have a good ability to understand and comprehend how things work by reading about them. It wouldn't have been nearly so bad if the reactor operators had known about the flaws, but they were told there couldn't be any design flaws. The RBMK-1000 was perfect, as far as they knew.