Whether or not you believe the hype about global warming, new research out this week is pretty convincing. Two papers published in Nature show that extreme weather is becoming more common. The research was impossible just a few years ago for lack of computing power, but with modern distributed computing the researchers were able to get a clearer view of what’s going on with our planet.
One paper’s key idea is analyzing 50 years of weather data to identify single-day and five-day extreme precipitation events. Using this data, the researchers compared the historical record with eight climate models run under three different conditions: a stable climate, natural disruptions, and man-made disruptions. The comparisons showed that natural disruptions alone, such as solar flares and volcanoes, should have led to a decline in extreme precipitation under current trends. Instead, the data show an upward trend in these events.
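To give a feel for what “single-day and five-day extreme precipitation” means, here’s a minimal sketch of computing those indices from a daily rainfall series. The data here is synthetic and purely illustrative; the actual study used decades of real observations.

```python
import random

# Illustrative daily precipitation series in mm/day (synthetic, for
# demonstration only -- the real study used 50 years of observations).
random.seed(0)
daily_precip = [max(0.0, random.gauss(2.0, 5.0)) for _ in range(365)]

def max_1day(precip):
    """Largest single-day precipitation total in the series."""
    return max(precip)

def max_5day(precip):
    """Largest precipitation total over any 5 consecutive days."""
    return max(sum(precip[i:i + 5]) for i in range(len(precip) - 4))

rx1 = max_1day(daily_precip)
rx5 = max_5day(daily_precip)
print(f"Max 1-day: {rx1:.1f} mm, max 5-day: {rx5:.1f} mm")
```

Computing these maxima for each year of record gives a time series whose trend can then be compared against what the climate models predict.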
A second paper looked at one specific event: a series of extreme floods that hit England and Wales in 2000. Using a single model of the northern climate, including ocean temperatures and ice cover, the researchers were able to recreate conditions from 1900. Those outputs were then run through another model to figure out when other events similar to the UK’s floods occurred.
To run all this data through the different models, the researchers turned to climateprediction.net, a distributed computing project similar to SETI@home. By borrowing people’s idle computer time through a screensaver, the researchers got the processing power they needed.
All scenarios showed flooding. With the models, however, the scientists were able to compare how the weather would have behaved with and without human influence. In the runs that included humans, the risk of flooding went up 20% or more, and in two out of three cases the risk went up 90%. The researchers conclude that climate change very likely contributed to the floods in 2000. Now that we have these scary results, it might be time to run to the store and grab a few gallons of water.
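The with-versus-without comparison boils down to counting how often flooding occurs in each ensemble of model runs. Here’s a minimal sketch of that arithmetic; the ensemble sizes and flood counts below are made up for illustration and are not the paper’s actual numbers.

```python
# Hypothetical counts from two model ensembles: one including human
# greenhouse-gas emissions, one with only natural forcing. All numbers
# here are invented for illustration.
floods_with_humans = 310    # runs where flooding occurred
runs_with_humans = 2000
floods_natural_only = 200
runs_natural_only = 2000

p_with = floods_with_humans / runs_with_humans       # flood risk with human influence
p_without = floods_natural_only / runs_natural_only  # flood risk, natural forcing only

# Relative increase in risk attributable to human influence.
risk_increase = (p_with - p_without) / p_without

print(f"Flood risk increased by {risk_increase:.0%}")
```

Repeating this comparison across many plausible model setups is what lets the researchers state the risk increase as a range ("20% or more", "up 90%") rather than a single number.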
[via ars technica]