Can we trust climate models?

Scientists use models like the Community Climate System Model (CCSM, shown here) to increase their understanding of the world’s climate patterns and learn how they may affect regions around the globe. Credit: PNNL

Computers crash, freeze, corrupt documents, and otherwise make us swear at them every day. At such moments I briefly blow my own fuse, and my computer becomes my enemy – until I remember it’s revolutionised how I work, communicate and access information. But knowing how easily computers can go wrong – and how easily a small, overlooked mistake in a piece of software can cause unexpected problems later – makes me cautious. That caution extends to writing this blog, where I often wonder just how much we can rely on the computer models used so widely by scientists studying global warming. So this year I’ve been asking researchers questions like: Why even use models? How can we trust that they’re accurate? And how should we understand what they come up with?

These questions go deep into how science works: using evidence from what people see, or from experiments we conduct, to build or knock down ideas. The best evidence is directly measured, in as much detail as possible. Today that’s available in some cases, but not all, and we can’t go back in time to get data over the long periods that might be ideal. For example, this has previously limited our understanding of global warming’s effect on tropical cyclones, Bruno Chatenoux from the Global Change and Vulnerability Unit at the United Nations Environment Programme in Geneva, Switzerland, told me in February. “Formal detection of trends in the existing records is challenged by data quality issues and record length,” he said. “Model projections suffer less from this, but have other challenges, such as whether they are accurately representing all of the relevant physical processes.”

And while there are a lot of processes to represent, researchers have worked hard to establish them, underlined Xuefeng Cui from Beijing Normal University, China, in July. “Climate models have been developed by groups of scientists to include atmosphere, oceanography, land, biology, chemistry, physics, computing science for about 40 years,” he said. “They have a solid scientific foundation and model the climate system in reasonable resolution.”

This kind of model’s not about beauty – or is it?

In February 2012, simulations by researchers from Princeton University and the Massachusetts Institute of Technology said increases in sea level and storm intensity brought on by climate change could make devastating storm surges more frequent. Using the New York City area as a model, the researchers found that floods experienced every century could instead occur every one or two decades. The worst simulated flood (left) was a 15.5-foot storm surge at Manhattan’s Battery (black star) that stemmed from a high-intensity storm (black line) moving northeast and very close to the city. A weaker but larger northwest-bound storm (right) that was further from the city would result in floodwater nearly 15 feet deep as its strongest winds pushed water toward the Battery. The colored contours represent the maximum surge height, from 0 (blue) to 5 (violet) meters. Credit: Ning Lin

Richard Turco from the University of California, Los Angeles, who I spoke to for an environmentalresearchweb article published in March, worked with early atmospheric chemistry simulations when doing his PhD in the late 1960s. “The typical computer code might consist of one or two big boxes of punch cards that had to be lugged around to the computer centre,” he recalled. “Occasionally, unfortunately, we would drop the boxes and scramble the programs up, so we had to reassemble them. You’d drop off your cards one day and maybe come back the next day to pick up the output. Things today are much faster, obviously. You want to run a program, you push a button. And nowadays the chemistry, dynamics, and aerosol microphysics, have been packaged together, making an atmosphere coupled with an ocean and maybe a biosphere model as well. The advances have been astounding during my career.”

Such advances also bring with them a need for caution, to avoid the modern equivalent of a scrambled set of cards, Richard added. “There’s a danger in that people are running models in the same way you would use a complex instrument to make a measurement, where you don’t really understand its innards,” he said. “The models have become more complex and the users have got more remote from the details. You need a team of engineers and technicians who are constantly testing and maintaining the model.” He underlined that this is being done, for example by comparing different models and making sure they get similar results from the same inputs, as well as through direct maintenance.

The most important of such efforts to “cross-calibrate” models are the Coupled Model Intercomparison Projects that have fed into the major Intergovernmental Panel on Climate Change (IPCC) assessment reports. But scientists across the world also often check how well models simulate past climate data, like temperature, when run with actual historical conditions, such as measured greenhouse gas concentrations. Simulating the past can also test how well models represent individual parts of the climate system, such as shrinking Arctic ice. That’s what Muyin Wang from the University of Washington did in September, finding clear evidence that human greenhouse gas emissions are speeding the ice’s loss. “It’s important to be able to reproduce past climate and variations to be confident in models’ predictions,” Muyin told me. “If you are interviewing someone for a job, you look at their resumé, to see if they did a good job in the past. Then you know that they can do the job going forward. It’s a similar idea here, if models can simulate the past climate, then they’re the models we want to use in the projection.”
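
To see what that resumé check looks like in practice, here’s a minimal sketch in Python – not any modelling group’s actual pipeline, and with invented toy numbers standing in for real model output and observations – of scoring a hindcast against measurements:

```python
import numpy as np

# Invented toy data standing in for a real hindcast: annual global
# mean temperature anomalies (deg C) over a century-long period.
rng = np.random.default_rng(0)
years = np.arange(1900, 2001)
observed = 0.007 * (years - 1900) + rng.normal(0, 0.10, years.size)
hindcast = 0.007 * (years - 1900) + rng.normal(0, 0.12, years.size)

# Two common skill scores: root-mean-square error and correlation.
rmse = np.sqrt(np.mean((hindcast - observed) ** 2))
corr = np.corrcoef(hindcast, observed)[0, 1]
print(f"RMSE: {rmse:.3f} deg C, correlation: {corr:.2f}")
```

A model that scores well on tests like these earns more trust for its projections – though, as with the resumé, past performance is evidence rather than a guarantee.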

The uncertain facts of life

The Community Atmosphere Model–Spectral Element simulates long-term global climate. Improved atmospheric modeling with the US Oak Ridge National Laboratory’s new supercomputer Titan will help researchers better understand future air quality as well as the effect of particles suspended in the air. The new Titan system will be able to simulate from one to five years per day of computing time, up from the three months or so that its predecessor Jaguar was able to churn through in a day. Simulations on supercomputers can tell us about infrastructure investments needed to deal with the consequences of global climate change. Credit: Oak Ridge National Laboratory

At other times scientists go on from modelling historic climate to try to understand the “uncertainty” range of future climate projections, as Dan Rowlands from Oxford University, UK, did. In March, he told me how his team ran 10,000 simulations of the past in a single climate model, each treating basic processes slightly differently. The scientists then threw out the simulations that didn’t match history, before simulating the future and finding a wider range of possible warming than previous estimates, with slightly higher temperatures on average. To do this, they used spare time donated by the general public on 30,000 computers in the climateprediction.net scheme. We can only speculate whether the owners were on good terms with their machines at the time.
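
For a rough feel of that procedure – emphatically not the team’s actual code; the one-line toy “model”, forcings and tolerance below are all invented for illustration – the logic runs something like this:

```python
import numpy as np

rng = np.random.default_rng(1)

def toy_model(sensitivity, forcing):
    # Stand-in for a full climate model: warming grows with an
    # uncertain "climate sensitivity" parameter.
    return sensitivity * forcing

observed_warming = 0.8                   # invented "historical" warming (deg C)
past_forcing, future_forcing = 1.0, 3.0  # invented forcings
tolerance = 0.1                          # how closely a run must match history

# Perturbed-parameter ensemble: 10,000 runs, each treating the
# uncertain basic process slightly differently.
sensitivities = rng.uniform(0.2, 2.0, 10_000)

# Noise stands in for natural variability, so runs with somewhat
# different sensitivities can still produce similar past warming.
hindcasts = toy_model(sensitivities, past_forcing) + rng.normal(0, 0.15, 10_000)

# Throw out runs whose simulated past doesn't match observations...
keep = np.abs(hindcasts - observed_warming) < tolerance

# ...then project the future with the survivors and report the spread.
projections = toy_model(sensitivities[keep], future_forcing)
print(f"kept {keep.sum()} of {sensitivities.size} runs")
print(f"projected warming: {projections.min():.2f} to {projections.max():.2f} deg C")
```

Because variability lets runs with quite different sensitivities match the same history, the surviving projections still span a range – which is roughly why constraining models on the past narrows, but never eliminates, the uncertainty in their futures.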

Together these efforts help ensure models’ accuracy, establish trust in their output and improve our understanding of the basic science. They also provide projections like those in the IPCC’s last report, which covered the range of temperatures judged likely under climate change. In one example, where we use a mixture of renewable and fossil fuels, models project that by the end of the 21st century temperatures will be between 1.7 and 4.1°C higher than a century earlier. You might like a narrower uncertainty range – and it’s possible scientists might be able to give us one. But Xuefeng Cui stressed that these model projections always have uncertainties, and shouldn’t be regarded as exact predictions.

Last month, two other researchers also told me how uncertainty in climate models is a “fact of life”. In one interview, Clara Deser from the US National Center for Atmospheric Research (NCAR) in Boulder, Colorado, even told me where the accuracy limit might lie. In the other, Paul Higgins from the American Meteorological Society in Washington, DC, found that the ‘carbon cycle’ of carbon moving through living creatures and the environment could increase model uncertainty. But that shouldn’t stop us from acting to fight climate change, he emphasised. “If you go out driving, the very fact that what’s going to happen to you while you’re out driving is uncertain is why you buckle your seatbelt,” he said. “If you waited until you were sure you were going to be in an accident, it would be too late and you would have managed the risk very poorly. Quantifying uncertainty helps us to understand risk.”

Journal References:

Peduzzi, P., Chatenoux, B., Dao, H., De Bono, A., Herold, C., Kossin, J., Mouton, F., & Nordbeck, O. (2012). Global trends in tropical cyclone risk. Nature Climate Change, 2 (4), 289-294. DOI: 10.1038/nclimate1410
Wei, T., Yang, S., Moore, J., Shi, P., Cui, X., Duan, Q., Xu, B., Dai, Y., Yuan, W., Wei, X., Yang, Z., Wen, T., Teng, F., Gao, Y., Chou, J., Yan, X., Wei, Z., Guo, Y., Jiang, Y., Gao, X., Wang, K., Zheng, X., Ren, F., Lv, S., Yu, Y., Liu, B., Luo, Y., Li, W., Ji, D., Feng, J., Wu, Q., Cheng, H., He, J., Fu, C., Ye, D., Xu, G., & Dong, W. (2012). Developed and developing world responsibilities for historical climate change and CO2 mitigation. Proceedings of the National Academy of Sciences, 109 (32), 12911-12915. DOI: 10.1073/pnas.1203282109
Wang, M., & Overland, J. E. (2012). A sea ice free summer Arctic within 30 years: An update from CMIP5 models. Geophysical Research Letters. DOI: 10.1029/2012GL052868
Rowlands, D., Frame, D., Ackerley, D., Aina, T., Booth, B., Christensen, C., Collins, M., Faull, N., Forest, C., Grandey, B., Gryspeerdt, E., Highwood, E., Ingram, W., Knight, S., Lopez, A., Massey, N., McNamara, F., Meinshausen, N., Piani, C., Rosier, S., Sanderson, B., Smith, L., Stone, D., Thurston, M., Yamazaki, K., Yamazaki, Y. H., & Allen, M. (2012). Broad range of 2050 warming from an observationally constrained large climate model ensemble. Nature Geoscience, 5 (4), 256-260. DOI: 10.1038/ngeo1430
Deser, C., Knutti, R., Solomon, S., & Phillips, A. (2012). Communication of the role of natural variability in future North American climate. Nature Climate Change, 2 (11), 775-779. DOI: 10.1038/nclimate1562
Higgins, P., & Harte, J. (2012). Carbon Cycle Uncertainty Increases Climate Change Risks and Mitigation Challenges. Journal of Climate, 25 (21), 7660-7668. DOI: 10.1175/JCLI-D-12-00089.1

4 Responses to “Can we trust climate models?”

  1. andyextance Says:

    As well as his great seatbelt analogy, Paul Higgins also sent me this comment about climate models:

    For what it’s worth, I think climate models are extremely valuable tools. We’ve learned an enormous amount from them and I think it’s fair to say both that they are terrific tools for learning about climate and that they incorporate our best understanding. It’s also worth noting that they have been extensively validated and found to be accurate in many respects. Of course, that doesn’t mean they can do everything equally well.

  2. jyyh Says:

    I’m not sure about the following, but…
    Isn’t there a rule of complex systems that says something like “when not sure about what drives a complex system, it’s better to include as many variables as possible”? In the climate context, many of the variables have already been ordered by their relevance, like
    1) CO2 is currently more of a greenhouse gas than (free) methane
    2) Solar variations cannot drive the recent warming
    3) ENSO is the primary modifier of tropical weather

    As these are quantified to a degree, shouldn’t there be a model with the additional parametrizations and adjustable datasets for the various relevant factors, to get the correlation better, like
    1) local variation in the uptake of CO2 by plants
    2) parametrization of clouds (however crudely gridded)
    3) methane feedback (crude estimates for tropical, agricultural, permafrost and hydrate sources)
    4) uncertainties in anthropogenic CO2 emissions (over the whole Holocene)

    though these would have to be limited to a maximum effect no larger than the uncertainty in the correlation between the model and measurements (in order to keep the model close to reality). This would of course give only the minimum effect of these a-bit-unknown factors. Then, once the maximum correlation with the known unknown factors has been reached, the scientists could cautiously modify the model to explore the effects of, let’s say, explosive methane bursts in the Arctic, which would push the model towards the extreme scenarios that (by paleoclimatology) are possible.

    I guess what I’m trying to say is that I prefer the most complex model possible, never mind the grid scale. They’re supposed to be Climate models, not weather models. Yes, it’s fancy to explore the possible outcomes for hurricane frequency when the tropical Atlantic Ocean warms, but as there is still very much to be explored in even larger-scale phenomena like the
    1) location and behavior of jet streams,
    2) full extent of the Arctic amplification
    3) the double ITCZ problem
    4) the effect of changing storm tracks on WAIS
    it may be asked how accurately the models perform in predicting various phenomena around the world, like
    1) monsoon/ITCZ movements/intensity
    2) snowpack duration in spring
    3) glacier outflow
    4) effect of Gulf of Mexico oil leaks on the US drought (tongue-in-cheek)

    I mean, the specialized models might do well in reconstructing the past behavior of a complex system, but if (let’s say) the jet stream shifts north over the past area of hurricane tracks, will a change in this larger part of the whole system alter the result, and how? The things known for certain (to a chosen accuracy) should be fixed in place in the models, and the additional known unknowns should be adjustable. One has to have an idea of the house (blueprints) before cutting the crossbeams to length. Now it begins to feel like I’m repeating myself, and probably this has been said elsewhere too, and better.

    Thanks for a clear picture of the validation process – I guess I couldn’t have done better myself (haven’t tried).
