Inside Science >> Modelling the future
Defra scientists Professor Robert Watson and Dr Rupert Lewis examine the science behind climate modelling
In climate science it’s often said that “the past is no guide to the future.” The inherent, chaotic complexity of the climate system, coupled with the recent and unprecedented increases in the volume of greenhouse gases (GHGs) in the atmosphere, puts us in uncharted territory where past patterns cannot simply be extrapolated forward.
This has created a demand for climate models: mathematical simulations of how the climate works, based on what is known about the physics of the climate system. These tell us what happens to the energy received from the Sun – how it moves between the atmosphere, oceans, land and ice – and how perturbations to the climate system, for example through changes in GHG emissions, affect this energy balance and hence the future climate.
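To see the energy balance idea in miniature, the sketch below implements a “zero-dimensional” energy balance model – a standard textbook toy, not anything a national modelling centre would actually run. The parameter values are conventional textbook figures, and the effective emissivity is used here as a crude stand-in for the strength of the greenhouse effect.

```python
# A toy zero-dimensional energy balance model: the planet warms until
# outgoing thermal radiation balances the absorbed sunlight.
# Illustrative only; values are standard textbook figures.

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0       # solar constant, W m^-2
ALBEDO = 0.3      # fraction of sunlight reflected straight back to space

def equilibrium_temperature(emissivity):
    """Surface temperature (K) at which energy out equals energy in.

    A lower effective emissivity crudely mimics a stronger greenhouse
    effect: less thermal radiation escapes at a given temperature.
    """
    absorbed = S0 * (1 - ALBEDO) / 4           # averaged over the sphere
    return (absorbed / (emissivity * SIGMA)) ** 0.25

# More GHGs (lower effective emissivity) -> warmer equilibrium.
for eps in (1.0, 0.62, 0.60):
    print(f"emissivity {eps:.2f} -> {equilibrium_temperature(eps):.1f} K")
```

With an emissivity of 1 the model gives about 255 K, the familiar “no greenhouse effect” figure; lowering it towards 0.6 brings the equilibrium up towards the observed global mean of roughly 288 K.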
The 1960s saw a significant breakthrough that has set the path for climate modelling ever since: the creation of “general circulation models”. These divide the Earth into a three-dimensional grid and “run” the basic climate equations for each cell of the grid.
If we were to divide the atmosphere into a one-degree grid (cells about 110km across at the Equator) with about 20 vertical layers, this gives more than one million grid cells, each with its own set of climate equations that must all be solved to move the whole model forward one time-step.
If such a time-step is half an hour, and we want to look forward to the end of the century, each of these million sets of calculations needs to be repeated about 1.5 million times – hence the close links between progress in modelling and developments in high-performance computing.
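The arithmetic behind those figures is easy to verify; the short sketch below assumes a 90-year horizon for “now to the end of the century”.

```python
# Back-of-envelope cost of a general circulation model run, using the
# figures quoted above. The 90-year horizon is an assumption.

lon_cells = 360        # one-degree spacing around lines of latitude
lat_cells = 180        # one-degree spacing pole to pole
levels = 20            # vertical layers

grid_cells = lon_cells * lat_cells * levels
print(f"equation sets per time-step: {grid_cells:,}")      # 1,296,000

steps_per_day = 48     # one half-hour time-step
years = 90             # roughly "now" to the end of the century
time_steps = steps_per_day * 365 * years
print(f"time-steps to run: {time_steps:,}")                # 1,576,800
```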
Model teams test their approaches using various “hind-casting” techniques: looking back to see how well they can reproduce the observed historical and palaeoclimate records. It has been more difficult to assess the uncertainty in models’ forward projections.
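As a flavour of what “seeing how well they reproduce the record” means in practice, one common skill measure is the root-mean-square error between the hind-cast and the observations; both series in this sketch are made up purely for illustration.

```python
# A minimal hind-cast skill check: compare a model's reconstruction of
# the past with the observed record. Both series here are made up.

observed = [0.10, 0.20, 0.15, 0.30, 0.45]   # e.g. temperature anomalies, degC
hindcast = [0.12, 0.18, 0.20, 0.28, 0.40]   # the model's attempt at the same period

rmse = (sum((o - h) ** 2 for o, h in zip(observed, hindcast)) / len(observed)) ** 0.5
print(f"root-mean-square error: {rmse:.3f} degC")
```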
Now, “ensemble” techniques – where the same model is run thousands of times, with its parameters varied within plausible ranges – are used to estimate the probability of particular outcomes from the spread across a large range of results.
An analogy would be sampling 10 playing cards (a single model run) from a full pack. There’s no immediate way of knowing whether, say, the two red and eight black cards picked by chance are a good representation of the pack. If, however, we sampled 10 cards thousands of times, we would get a very good idea of the risk of betting on this 8:2 outcome.
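That card experiment takes only a few lines to simulate; a minimal sketch using just Python’s standard library:

```python
import random

# Repeat the ten-card draw many times and count how often we see exactly
# two red cards, estimating how likely the 8:2 split actually is.
pack = ["red"] * 26 + ["black"] * 26        # a standard 52-card pack
trials = 10_000
hits = sum(random.sample(pack, 10).count("red") == 2 for _ in range(trials))
print(f"P(exactly 2 red in a 10-card draw) ~ {hits / trials:.3f}")
```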
The UKCP09 climate projections use exactly this approach – taking thousands of samples (model runs) to create “probabilistic projections”. This allows decision makers to get a feel for how likely particular model outcomes or ranges of outcomes are.
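In the same spirit, a perturbed-parameter ensemble can be sketched as below. The `toy_model` function, the parameter range and the forcing figure are all invented for illustration and bear no relation to the actual UKCP09 models; the point is only the shape of the method – sample uncertain parameters, run the model many times, and read probabilities off the spread of outcomes.

```python
import random

# A toy perturbed-parameter ensemble. Each "run" draws an uncertain
# sensitivity parameter from a plausible range; the spread of outcomes
# becomes a probabilistic projection. All numbers here are invented.

def toy_model(sensitivity, forcing=3.7):
    """Warming (degC) for a given sensitivity and radiative forcing."""
    return sensitivity * forcing

runs = sorted(toy_model(random.uniform(0.5, 1.2)) for _ in range(10_000))

# Read off percentiles: the range within which 80% of members fall.
lo, mid, hi = (runs[int(p * len(runs))] for p in (0.10, 0.50, 0.90))
print(f"10th/50th/90th percentile warming: {lo:.1f} / {mid:.1f} / {hi:.1f} degC")
```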