As an update to my story earlier this month on the discovery of global warming on Mars, I thought it appropriate to survey the rest of the solar system. Global warming was detected on Jupiter last year, and the warming is apparently behind the formation of a second red spot. Global warming on Neptune's moon Triton has also been noted, with severe atmospheric changes as a result. And even tiny Pluto has experienced moderate warming in recent years, with temperatures rising a full 3.5 degrees.
The common denominator in all these cases, the Earth included, is of course the Sun, which is currently in the middle of an extremely active period. The last time it was this active was during the Medieval Warm Period roughly 700 years ago, a time when the Earth was warmer than it is today. Interestingly enough, the period in which the Sun was least active (the Maunder Minimum) corresponds to the Little Ice Age the Earth experienced in the 17th century.
Such correlations are leading many scientists to consider the Sun the primary driver of terrestrial climate change. The initial problem with this theory was that the changes in solar flux appeared too small to account for the observed warming.
However, the research of Henrik Svensmark of the Danish Space Research Institute may have provided the missing link. Increased solar activity not only warms the Earth directly, it also strengthens the solar wind. A stronger solar wind reduces the amount of cosmic radiation striking the Earth, which in turn reduces the rate of cloud formation. Fewer clouds = more warming.
Astrophysicist Nir Shaviv reconstructed 550 million years of Earth's climate history. He found that two-thirds of the temperature variance could be explained by changes in cosmic-ray flux alone, without even considering the direct influence of solar heating.
This has always been a weak point of CO2-based models, which have never been able to successfully explain these warming and cooling trends in our past.