

The image of a hurricane-spawning smokestack was used to promote the film, An Inconvenient Truth.
Author of the theory that global warming breeds stronger hurricanes recants his view

Noted hurricane expert Kerry Emanuel has publicly reversed his stance on the impact of global warming on hurricanes. Saying "The models are telling us something quite different from what nature seems to be telling us," Emanuel has released new research indicating that even in a rapidly warming world, hurricane frequency and intensity will not be substantially affected.

"The results surprised me," says Emanuel, one of the media's most quoted figures on the topic.

The view that global warming has limited impact on hurricane strength has been previously reported in numerous DailyTech articles.

Emanuel, professor of Atmospheric Science at MIT, is the author of numerous books and research papers on climate change. For over twenty years, he has argued that global warming breeds more frequent and stronger storms.  In fact, his 1987 paper is often cited as the first appearance of the theory itself.

His 2005 research -- published just one month before Hurricane Katrina struck -- made world headlines and was heralded as the "final proof" that global warming was already having severe impacts on daily lives. Overnight, Emanuel became a media darling. The following year, Time magazine named him to its "100 People Who Shape Our World" list.

In 2006, Al Gore used an image of a smokestack spawning a hurricane to promote his movie, An Inconvenient Truth.

Emanuel's newest work, co-authored with two other researchers, simulates hurricane conditions nearly 200 years in the future. The research -- the first to mesh global climate models with small-scale high-resolution simulations of individual storms -- found that while storm strength rises slightly in some areas, it falls in others -- and the total number of worldwide storms actually declines slightly.

Emanuel's reversal is certain to reverberate through political circles as well; many politicians and candidates are using the hurricane threat to compel action on climate change.



Comments

RE: Causes
By MozeeToby on 4/14/2008 10:44:37 AM , Rating: 1
Maybe you can answer this for me: why on earth would you hardcode a number to be two decimal digits in length? The reason usually given is to save memory/storage space, but that just doesn't make sense.

The smallest data size that would be used is an 8-bit integer, which would give a year range of 0-255 (0-127 if you are lazy and leave it signed). If I were the coder, I would have used one of those and displayed only the last two digits, which would not cause any problems for processing, since the full number exists whether it is displayed or not.
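For concreteness, here is a minimal C sketch of that suggestion (the 1900 base year and the variable names are assumptions for illustration, not anything from an actual legacy system): the year is stored as an unsigned 8-bit offset, only the last two digits are displayed, and the full value survives the rollover into 2000.

#include <stdio.h>

int main(void)
{
    unsigned char year_offset = 99;        /* stored as an offset from 1900, i.e. 1999 */

    year_offset++;                         /* rolls into 2000: offset is now 100 */

    int full_year = 1900 + year_offset;    /* full year is intact: 2000 */
    printf("full year: %d\n", full_year);
    printf("displayed: %02d\n", full_year % 100);   /* prints "00" */
    return 0;
}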


RE: Causes
By Michael01x on 4/14/2008 11:33:31 AM , Rating: 4
Initially it was about memory, storage, and cost. Every digit or character of data to be tracked had to be accounted for and justified. The upside was that you had very efficient, fast, and disciplined code. Memory was used responsibly and cleared as soon as it was no longer needed. The elimination of the two-digit century was justified by the belief that it was extraneous information and that any code written would undoubtedly be replaced long before it became an issue.

Then two things happened, both due to human nature. One, tracking a two-digit year became a standard convention (we humans are creatures of habit), so the problem continued even in systems where a four-digit year could have been accommodated easily. Two, older systems kept hanging around and not being replaced ("if it ain't broke, don't fix it," combined with the human packrat mentality).

Thus, the Y2K problem.

The downside to the cheap and abundant memory we have nowadays is lazy programmers and bloatware: slow, inefficient code that is sloppily written and poor at cleaning up after itself.

The bottom line is that the Y2K problem was very real and very avoidable. But the foundational blame goes to yet another human trait, one prevalent when discussing global warming: shortsightedness.


RE: Causes
By MozeeToby on 4/14/2008 11:59:05 AM , Rating: 1
So you're saying that somewhere in the code there exists something like this...

year++;
if (year >= 100)
    year = 0;

because if there isn't (and why would there be?) and you were using any standard numeric data type, incrementing past year 100 wouldn't break anything. Holding the number 127 takes exactly the same number of bits as holding the number 100. I understand that at the time memory was expensive and therefore well-managed, but limiting the year to two decimal digits adds complexity to the code without producing any gains in memory management.


RE: Causes
By masher2 (blog) on 4/14/2008 12:54:27 PM , Rating: 4
Mozee,

Most systems stored date values as a single unit (e.g. a certain number of seconds past a given date), not as its separate components of years, months, and days. This simplifies date arithmetic, but requires the value to be converted for textual display on the screen or for user input.

That's where most of the problems arose, as those "packing and unpacking" routines assumed a two-digit textual year with an implied "19" century. A value input as Jan 1, 2000 would be interpreted as Jan 1, 1900... and a system-generated value for the year 2000 would, once displayed on the screen and then reparsed, be 'clipped' to the previous century.

Additionally, there were some primitive systems that stored dates as the actual text values, rather than any numeric form. These of course were space-limited within the data representation itself.
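As an illustration of the round-trip failure masher2 describes, here is a hypothetical C sketch (the function names and the two-digit text format are invented for the example, not taken from any real system): the internal value is correct, but both conversion routines bake in the "19" century, so formatting the year 2000 and reparsing the text clips it back to 1900.

#include <stdio.h>

/* Formatting keeps only the last two digits of the year. */
static void format_year(int year, char *buf, size_t len)
{
    snprintf(buf, len, "%02d", year % 100);
}

/* Parsing assumes every two-digit year belongs to the 1900s. */
static int parse_year(const char *text)
{
    int yy = 0;
    sscanf(text, "%2d", &yy);
    return 1900 + yy;                       /* the buggy assumption */
}

int main(void)
{
    char text[3];
    format_year(2000, text, sizeof text);                       /* produces "00" */
    printf("2000 -> \"%s\" -> %d\n", text, parse_year(text));   /* reparsed as 1900 */
    return 0;
}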


RE: Causes
By IGoodwin on 4/14/2008 1:28:04 PM , Rating: 3
Looking at the 'date' issue through the lens of modern languages and data types gives a misleading impression of the situation. A standard 'date' data type was not readily available. The priority was to have human-readable data, closely followed by space considerations.

While binary data types were available, having a date in human-readable form was often a higher priority, meaning no conversion between database and presentation; that conversion would have been a large overhead given the power of the systems at the time. So the date would very often be stored as a character string in the local standard form, meaning every application, location, or programmer whim had its own date logic. Most databases were not externally described and were little more than flat files interpreted through a structure defined, perhaps differently, in each program that used them.

Please also note that on IBM equipment, for sure, there was hardware support for human-readable numbers, meaning a single byte for each digit.
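To make both points concrete, here is a hypothetical C sketch (the six-character record layout and the two reader functions are invented for illustration): the same one-byte-per-digit string yields different dates depending on which program's hard-coded layout reads it, and both layouts also bake in the "19" century.

#include <stdio.h>

static void read_as_mmddyy(const char *s)   /* one program's assumed layout */
{
    printf("month %.2s, day %.2s, year 19%.2s\n", s, s + 2, s + 4);
}

static void read_as_ddmmyy(const char *s)   /* another program's assumed layout */
{
    printf("day %.2s, month %.2s, year 19%.2s\n", s, s + 2, s + 4);
}

int main(void)
{
    const char record[] = "010299";   /* six bytes, one character per digit */
    read_as_mmddyy(record);           /* reads it as January 2, 1999 */
    read_as_ddmmyy(record);           /* reads it as February 1, 1999 */
    return 0;
}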

Not an excuse, but an explanation. There was a case, too many years ago, where I had to work on a routine that needed to display the last 12 payroll records for an employee. The programming language did not support reading backwards through a file, so some godawful code was required to remember up to 12 records in an array as they were read. Necessary at the time, but to anyone looking back after reading a file backwards was introduced, it would seem idiotic.
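That workaround is essentially a circular buffer: scan the file forward, overwrite the oldest of 12 slots, and replay the survivors in order. A minimal C sketch, with the record layout and the 40 fake records standing in for the real payroll file:

#include <stdio.h>

#define KEEP 12

struct payroll { long amount_cents; };      /* invented stand-in record */

int main(void)
{
    struct payroll ring[KEEP];
    long total = 0;

    /* Stand-in for the forward-only sequential file read: 40 fake records. */
    for (long i = 0; i < 40; i++) {
        struct payroll rec = { 100000 + i };
        ring[total % KEEP] = rec;           /* overwrite the oldest slot */
        total++;
    }

    /* Replay the last KEEP records, oldest first. */
    long n = total < KEEP ? total : KEEP;
    for (long i = total - n; i < total; i++)
        printf("record %ld: %ld cents\n", i + 1, ring[i % KEEP].amount_cents);
    return 0;
}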


RE: Causes
By Earl E on 4/14/2008 12:25:44 PM , Rating: 2
The hardcoding was done in 1970; I started working at the bank in 1999. I don't care why they did what they did. I just fixed it so you wouldn't be mad at the bank on 1-1-2000. And that is what humans do: resolve problems before they become emergencies.


"There is a single light of science, and to brighten it anywhere is to brighten it everywhere." -- Isaac Asimov

Related Articles
















botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki