Historical models shed light on global warming
The uncertainty associated with projections of end-of-century global warming by Earth System Models (ESMs) can be understood in terms of two components, according to research by NOC scientist Tom Anderson (with co-workers Ed Hawkins and Phil Jones).
One component is a quantitatively robust baseline warming that occurs as a result of the core physics of the “greenhouse effect”, as investigated by Svante Arrhenius and Guy Stewart Callendar. The second component is due to system feedbacks which, although uncertain in magnitude, cause additional warming beyond the baseline.
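The baseline component can be illustrated with the textbook logarithmic forcing relation that descends from Arrhenius's work. The coefficients below are standard modern approximations used for illustration, not values taken from the paper:

```python
import math

def radiative_forcing(c, c0, alpha=5.35):
    """Approximate CO2 radiative forcing in W/m^2.

    Uses the widely cited logarithmic fit dF = alpha * ln(C/C0),
    with alpha ~ 5.35 W/m^2 (an illustrative textbook value).
    """
    return alpha * math.log(c / c0)

# Planck-only (no-feedback) climate sensitivity parameter,
# roughly 0.30 K per W/m^2 (illustrative textbook value).
LAMBDA_BASELINE = 0.30

# Baseline warming for a doubling of CO2 (280 -> 560 ppm),
# from the core radiative physics alone, before feedbacks.
baseline_warming = LAMBDA_BASELINE * radiative_forcing(560, 280)
print(f"{baseline_warming:.1f} C per CO2 doubling")  # roughly 1.1 C
```

Feedbacks then amplify this baseline by an uncertain factor, which is why model projections agree on continued warming while spreading in magnitude.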
As such, Anderson argues, the ESMs “provide a compelling case that global climate will continue to undergo significant warming in response to ongoing emissions of CO2 and other greenhouse gases to the atmosphere.”
Professor Edward Hill, Executive Director of the NOC, said: “This is an important paper which addresses the frequently asked question of how much we can trust model-based projections of future climate change, given the complexity of the climate system and the uncertainties associated with multiple climate feedback processes. The paper elegantly strips this question back to its bare essentials by revisiting simple models of the 19th and early 20th centuries, which predict warming based on the core physics of radiative transfer in the atmosphere. The result is a quantitatively robust baseline warming, a startlingly straightforward conclusion.
The complex models run on supercomputers today also include climate feedbacks that add to the baseline. There is uncertainty as to just how much extra warming these feedbacks cause, but not in the sign of their impact: amplification of warming. Model projections for ongoing global warming during the 21st century are therefore essentially trustworthy, albeit with a spread of results due to feedback uncertainties.”
Read the paper, published this week in Endeavour.