CMIP Phase 6 (CMIP6)

CMIP6 modellers, data managers, and data users can find the answers to most of their questions in one of the three specialized guides available at the PCMDI CMIP6 website.

Overview of the CMIP6 Experimental Design and Organization

The overview paper on the CMIP6 experimental design and organization is published in GMD (Eyring et al., 2016). It presents the background and rationale for the new structure of CMIP, provides a detailed description of the CMIP Diagnostic, Evaluation and Characterization of Klima (DECK) experiments and the CMIP6 historical simulations, and includes a brief introduction to the 23 CMIP6-Endorsed MIPs.

A brief summary can be found in this overview presentation and below. After an extensive community consultation, a new and more federated structure was put in place, consisting of three major elements:

  1. A handful of common experiments, the DECK (Diagnostic, Evaluation and Characterization of Klima) and CMIP historical simulations (1850 – near-present) that will maintain continuity and help document basic characteristics of models across different phases of CMIP,
  2. Common standards, coordination, infrastructure and documentation that will facilitate the distribution of model outputs and the characterization of the model ensemble, and
  3. An ensemble of CMIP6-Endorsed Model Intercomparison Projects (MIPs) that are specific to CMIP6 and that will build on the DECK and CMIP historical simulations to address a large range of specific questions and fill the scientific gaps of the previous CMIP phases.

References:

Eyring, V., Bony, S., Meehl, G. A., Senior, C. A., Stevens, B., Stouffer, R. J., and Taylor, K. E.: Overview of the Coupled Model Intercomparison Project Phase 6 (CMIP6) experimental design and organization, Geosci. Model Dev., 9, 1937–1958, https://doi.org/10.5194/gmd-9-1937-2016, 2016.

CMIP6 Special Issue

A CMIP6 Special Issue is published in GMD (see here). This special issue describes the new design and organization of CMIP and the suite of experiments of its next phase (i.e., CMIP6) in a series of invited contributions. Together, the descriptions of the experiments and forcing data sets define CMIP6 in detail. The papers provide the information required to produce a consistent set of climate model simulations that can be scientifically exploited to address the three broad scientific questions of CMIP6: (1) How does the Earth system respond to forcing? (2) What are the origins and consequences of systematic model biases? (3) How can we assess future climate changes given climate variability, predictability and uncertainties in scenarios? The special issue includes an overview paper on the CMIP6 design and organization, contributions from the CMIP6-Endorsed MIPs, and descriptions of the forcing data sets.

CMIP6 Data Request

Contacts: Martin Juckes

The CMIP6 Data Request consolidates the data requirements from 23 Model Intercomparison Projects (MIPs) into a single database (Juckes et al., 2020). The data request is available as an XML database and through a Python package (also available as source code), and it can be browsed online. For more details, see the CMIP6 Data Request landing page.
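As a rough illustration of how an XML form of the request can be queried with standard tools, the sketch below parses a simplified stand-in fragment; the element and attribute names here (`dataRequest`, `variable`, `mipTable`, etc.) are illustrative placeholders, not the actual data-request schema, and in practice the Python package described above provides the supported interface:

```python
import xml.etree.ElementTree as ET

# Simplified, hypothetical stand-in for a fragment of the data-request XML.
# Real element and attribute names differ; this only shows the querying pattern.
SAMPLE = """
<dataRequest>
  <variable label="tas" mipTable="Amon" units="K" priority="1"/>
  <variable label="pr" mipTable="Amon" units="kg m-2 s-1" priority="1"/>
  <variable label="zg" mipTable="day" units="m" priority="2"/>
</dataRequest>
"""

def variables_for_table(xml_text, table):
    """Return (label, units) pairs of requested variables for one MIP table."""
    root = ET.fromstring(xml_text)
    return [(v.get("label"), v.get("units"))
            for v in root.iter("variable")
            if v.get("mipTable") == table]

print(variables_for_table(SAMPLE, "Amon"))
# → [('tas', 'K'), ('pr', 'kg m-2 s-1')]
```

The same filter-by-attribute pattern extends to priorities or experiment groups, which is essentially what browsing the request online does interactively.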

References:

Juckes, M., Taylor, K. E., Durack, P. J., Lawrence, B., Mizielinski, M. S., Pamment, A., Peterschmitt, J.-Y., Rixen, M., and Sénési, S.: The CMIP6 Data Request (DREQ, version 01.00.31), Geosci. Model Dev., 13, 201–224, https://doi.org/10.5194/gmd-13-201-2020, 2020.

CMIP6 Model Evaluation System

Contacts: Veronika Eyring and Peter Gleckler

Over recent decades, significant progress has been made in model evaluation. The CMIP community has now reached a critical juncture at which many baseline aspects of model evaluation need to be performed far more efficiently, to enable a systematic and rapid performance assessment of the large number of models participating in CMIP. Such an evaluation system will be implemented for CMIP6. The initial goal is for two capabilities to produce a broad characterization of CMIP DECK and historical simulations as soon as new CMIP6 model experiments are published to the Earth System Grid Federation (ESGF):

At the WGCM meeting, it was decided that the results of these tools can be displayed on a public (rather than a password-restricted) website. The results will initially be watermarked until quality control has been completed. This strategy was supported by the WGCM.

(a) The Earth System Model Evaluation Tool (ESMValTool; Eyring et al., 2016a) is a community-developed diagnostic and performance metrics tool for the evaluation of Earth system models against observations. It incorporates other well-established model evaluation packages, such as the NCAR Climate Variability Diagnostics Package (CVDP; Phillips et al., 2014). Its collection of standard namelists allows, for example, the figures from the climate model evaluation chapter of IPCC AR5 (Chapter 9) and parts of the projections chapter (Chapter 12) to be reproduced. The ESMValTool is available as open-source software on GitHub. The website here shows results produced with the ESMValTool for CMIP5 simulations; it will be updated with CMIP6 results as soon as the model output is submitted to the ESGF. All modelling groups are encouraged to check the results for their model.

(b) The PCMDI Metrics Package (PMP; Gleckler et al., 2016) emphasises a diverse suite of summary statistics that objectively gauge the level of agreement between model simulations and observations across a broad range of space and time scales. It is built on the Python-based Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT), a powerful software toolkit that provides cutting-edge data management, diagnostic and visualisation capabilities. The PMP is available as open-source software on GitHub.
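To make concrete the kind of summary statistic such a package emphasises, the sketch below computes an area-weighted root-mean-square error between a model field and a reference field on a latitude-longitude grid. This is a generic illustration of the technique, not the PMP's actual API; the function name and the toy fields are invented for the example:

```python
import numpy as np

def area_weighted_rmse(model, ref, lats):
    """RMSE between two (lat, lon) fields, weighted by cos(latitude)
    so that each grid cell counts in proportion to its surface area."""
    w = np.cos(np.deg2rad(lats))[:, np.newaxis]   # shape (nlat, 1)
    w = np.broadcast_to(w, model.shape)           # weight every grid cell
    err2 = (model - ref) ** 2
    return float(np.sqrt(np.sum(w * err2) / np.sum(w)))

# Toy example: a model field with a uniform +0.5 K bias against the reference.
lats = np.array([-60.0, 0.0, 60.0])
ref = np.full((3, 4), 288.0)   # hypothetical reference temperature field (K)
model = ref + 0.5
print(area_weighted_rmse(model, ref, lats))   # → 0.5
```

Statistics of this kind, computed per variable, season and region and then collected into portrait plots, are what allow the level of model-observation agreement to be compared objectively across a large multi-model ensemble.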

Since these tools are freely available on GitHub, modelling groups participating in CMIP can additionally make use of these packages. They could choose, for example, to use the tools during the model development process to identify relative strengths and weaknesses of new model versions, also in the context of the performance of other models, or to run the tools locally before publishing the model output to the ESGF. Mechanisms are in place to enable contributions from the broader community. Both tools are designed to work readily across ESGF nodes, with the intent of ultimately expediting routine analysis by alleviating the need for data transfer. We expect the benefits of this activity to become increasingly apparent during the research phase of CMIP6, and we encourage the community to consider contributing additional diagnostics and metrics to these CMIP6 evaluation tools. More details on this approach can be found in Eyring et al. (2016b).

References:

Eyring, V., Righi, M., Lauer, A., et al.: ESMValTool (v1.0) – a community diagnostic and performance metrics tool for routine evaluation of Earth system models in CMIP, Geosci. Model Dev., 9, 1747–1802, https://doi.org/10.5194/gmd-9-1747-2016, 2016a.

Eyring, V., Gleckler, P. J., Heinze, C., et al.: Towards improved and more routine Earth system model evaluation in CMIP, Earth Syst. Dynam., 7, 813–830, https://doi.org/10.5194/esd-7-813-2016, 2016b.

Gleckler, P. J., Doutriaux, C., Durack, P. J., et al.: A more powerful reality test for climate models, Eos, 97, https://doi.org/10.1029/2016EO051663, 2016.

Phillips, A. S., Deser, C., and Fasullo, J.: Evaluating modes of variability in climate models, Eos Trans. AGU, 95, 453–455, https://doi.org/10.1002/2014EO490002, 2014.

CMIP5 Survey

  • CMIP5 Survey – sent out to representatives of the climate community at the end of June 2013
  • Synthesis of CMIP5 Survey (August 2013) (Presentation by Veronika Eyring and Ron Stouffer at the Workshop ‘Next Generation Climate Change Experiments Needed to Advance Knowledge and for Assessment of CMIP6’, 5 August 2013, Aspen, CO, USA)
  • Synthesis of CMIP5 Survey (October 2013) (Presentation by Veronika Eyring and Ron Stouffer at the WGCM 17th Session, 1-3 October 2013, Victoria, Canada)