As the threat of global warming grows by the day, a large number of people still question the seriousness and legitimacy of the issue. One frequent attack the scientific community has had to face concerns the accuracy of climate models.
Sceptics have long argued that climate models overestimate temperature rises and other projections, and scientists have long tried to establish that the models are indeed reliable. A recent study of climate models, including models dating back to the 1970s, might just put an end to the reliability concerns surrounding them.
Zeke Hausfather of UC Berkeley, Henri Drake and Tristan Abbott of MIT, and Gavin Schmidt of the NASA Goddard Institute for Space Studies have performed a review of these models and published it in “Geophysical Research Letters”. They found that the “climate models published over the past five decades were generally quite accurate in predicting global warming in the years after publication.”
When trying to predict the climate, we need to create a model that can simulate our Earth. But this is a very complicated affair, and we do not have a duplicate Earth to use as an experimental control group. Instead, we rely on two things: historical data and climate physics. The resulting model is then fed new inputs (such as carbon emissions) to predict the rise in temperature.
Most of the older climate models are now out of date, having been replaced by more sophisticated scientific principles and computational capabilities. What is remarkable about the current study is that even these older models predicted temperature rises fairly accurately.
The accuracy of any given model is governed by two contributing factors. One is how physical systems such as the oceans and atmosphere react to external radiative forcings (like carbon dioxide and other greenhouse gases). The other is the amount of those radiative forcings.
While the reaction of physical systems comes under the purview of standard physics, the amount of radiative forcing depends on factors such as demographics, sociology and economics, which do not quite follow the standard laws of physics.
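The two factors can be pictured with a toy calculation. This is only a sketch of the idea with made-up numbers, not the study's method; real climate models are far more complex than a single sensitivity parameter.

```python
# Toy sketch: warming estimated as a sensitivity parameter times the
# radiative forcing, dT = sensitivity * dF. Illustrative numbers only.

def predicted_warming(sensitivity, forcing):
    """Warming (deg C) = sensitivity (deg C per W/m^2) * forcing (W/m^2)."""
    return sensitivity * forcing

sensitivity = 0.5       # deg C per W/m^2 -- the "physics" factor
actual_forcing = 2.0    # W/m^2 -- the forcing that actually occurred
assumed_forcing = 3.0   # W/m^2 -- an overestimated emissions scenario

print(predicted_warming(sensitivity, actual_forcing))   # 1.0
print(predicted_warming(sensitivity, assumed_forcing))  # 1.5
```

Even with the physics factor exactly right, an overestimated forcing scenario inflates the forecast.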
Historically, it was James Hansen's climate model in the 1980s that helped garner public interest in global warming. Ironically, sceptics use the same model to question the validity of climate models, as the temperature rise Hansen predicted overshot actual recorded values by around fifty percent.
Hausfather and his team’s new analysis tells us that James Hansen’s model wasn’t wrong. He got the physics part of the equation right, but his assumptions about greenhouse gases such as methane and chlorofluorocarbons were significantly higher than the actual future values.
When rerun with the corrected figures, his model predicted the rise in global temperature far more accurately.
The researchers reached the above findings by testing the models against two different metrics.
1. Temperature vs. time
2. Temperature vs. change in radiative forcings (implied TCR)
In the study, both metrics are evaluated against observed Global Mean Surface Temperatures (GMST).
To sum it all up, the review found that 10 of the 17 models were consistent with observed GMST, and 14 of the 17 were consistent with the observed relationship between forcings and temperature change; two had an implied TCR that was too high, and one too low. All of this shows that the models are indeed accurate.
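The second metric can be sketched in code. The following is a simplified reading of an implied TCR (transient climate response) calculation: the least-squares slope of temperature change against radiative forcing, scaled by the forcing from a doubling of CO2, roughly 3.7 W/m². The data points are hypothetical, and this is not the study's exact procedure.

```python
# Simplified "implied TCR" sketch: slope of temperature change vs.
# radiative forcing, scaled to a CO2 doubling. Illustrative only.

F_2XCO2 = 3.7  # W/m^2, approximate forcing from a doubling of CO2

def implied_tcr(forcings, temps):
    """TCR (deg C) = F_2XCO2 * least-squares slope of temps vs. forcings."""
    n = len(forcings)
    mean_f = sum(forcings) / n
    mean_t = sum(temps) / n
    slope = (sum((f - mean_f) * (t - mean_t) for f, t in zip(forcings, temps))
             / sum((f - mean_f) ** 2 for f in forcings))
    return F_2XCO2 * slope

# Hypothetical model output: 0.45 deg C of warming per W/m^2 of forcing.
forcings = [0.0, 0.5, 1.0, 1.5, 2.0]
temps = [0.45 * f for f in forcings]
print(implied_tcr(forcings, temps))  # about 1.67 deg C per CO2 doubling
```

Comparing this implied TCR, rather than raw temperature forecasts, separates a model's physics from its emissions assumptions, which is how the review could credit Hansen's physics despite his overestimated forcing scenario.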
Having accurate models is by no means a cause for relief. The results from today's more sophisticated models paint a grim future for life on this planet.
Hopefully, the increased trust in climate models will enable scientists and environmentalists to counter the allegations raised by sceptics and establish the gravity of the climate problem on a wider scale, prompting government agencies to take the much-needed steadfast corrective measures.