
A new open-access tool that dramatically speeds up the evaluation of climate models has been launched by an international team of scientists. The Rapid Evaluation Framework (REF) allows researchers to compare model outputs with real-world observations, providing immediate insight into model performance.
The Rapid Evaluation Framework (REF) was launched in March at the CMIP Community Workshop 2026, held in Kyoto, Japan. It was developed by a team of scientists working on the Coupled Model Intercomparison Project (CMIP), a global collaboration that develops, compares, and improves the climate models used in major reports, such as those produced by the Intergovernmental Panel on Climate Change (IPCC). The latest phase of CMIP, CMIP7 (Dunne et al., 2025), is expected to begin delivering data from the latest generation of models imminently.
The REF makes it much faster and easier to assess how well these climate models perform by automatically comparing their outputs against real-world observations. Until now, evaluating climate models could take months and require downloading terabytes of data. The REF automates much of the process, running checks across a wide range of measurements and producing results that are available online for anyone to access.
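At its heart, the comparison the REF automates comes down to computing summary statistics of the mismatch between simulated and observed quantities. As a purely illustrative sketch (this is not the REF's actual implementation, and the temperature values are made up), mean bias and root-mean-square error for a pair of time series can be computed like this:

```python
import math

def bias_and_rmse(model, obs):
    """Mean bias and root-mean-square error between paired series."""
    diffs = [m - o for m, o in zip(model, obs)]
    n = len(diffs)
    bias = sum(diffs) / n                              # average model-minus-observation offset
    rmse = math.sqrt(sum(d * d for d in diffs) / n)    # typical magnitude of the mismatch
    return bias, rmse

# Toy monthly near-surface temperatures (degrees C): model output vs. observations
model_t = [14.2, 14.8, 15.9, 17.1, 19.0, 21.3]
obs_t = [14.0, 14.5, 16.2, 17.0, 18.6, 21.0]

bias, rmse = bias_and_rmse(model_t, obs_t)
print(f"bias = {bias:+.2f} C, rmse = {rmse:.2f} C")
```

In practice the REF computes many such diagnostics, through the established community packages it wraps, over large gridded, multi-dimensional datasets rather than simple lists.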
Dr Ranjini Swaminathan, co-lead of the Model Benchmarking Task Team and scientist at the UK’s National Centre for Earth Observation, University of Reading, said: “This tool brings together climate scientists and Earth observation researchers to quickly check how accurately climate simulations reflect reality. The better we can do that, the more reliable our picture of future climate change becomes, and the better equipped policymakers and communities are to respond to it.”
The REF will initially be used to evaluate results from the CMIP7 Assessment Fast Track, which has been designed to respond to the needs of national and international climate assessments, including the IPCC Seventh Assessment Report. IPCC authors are a key user group for this release of the REF, which is designed to help them rapidly evaluate and assess the latest scientific advances. By accelerating model evaluation, the REF could help ensure that the latest climate science is incorporated more quickly into major international assessments.
The REF is freely available online and can also be deployed locally at modelling centres. Results are delivered through an interactive dashboard, alongside downloadable outputs in formats such as netCDF, CSV and PNG, enabling more bespoke and in-depth analysis. The REF also includes an API, allowing users to run evaluation metrics from established community tools.
The dashboard will be hosted on the next generation of the Earth System Grid Federation (ESGF), the global infrastructure used to distribute CMIP data.

Three community evaluation and benchmarking packages are included in the CMIP Assessment Fast Track REF (ESMValTool, ILAMB/IOMB and PMP). Development of the open-source REF software, led by Climate Resource with contributions from the Netherlands eScience Center, has been funded by the European Space Agency. US contributions came from the teams behind the established community benchmarking packages ILAMB/IOMB, PMP and CMEC, as well as from the work to ensure deployment on ESGF. DLR, the National Centre for Earth Observation (NCEO), the Science and Technology Facilities Council (STFC) and CEDA have all contributed staff time in kind to the delivery team.
The REF is expected to expand beyond CMIP7 to support a wide range of climate modelling activities under the World Climate Research Programme (WCRP). A new governance panel under the WCRP Core Project ESMO is being established to guide its future development.
If you would like to support the maintenance and expansion of the REF, please contact:
Find out more about the REF at https://climate-ref.org/
BACKGROUND AND ALTERNATIVE QUOTES
Quotes may be used with attribution to the named author and organisation.
Birgit Hassler, co-lead of the Model Benchmarking Task Team and research scientist at DLR Oberpfaffenhofen, said: “It is fantastic to see that the community is so excited about the possibilities the REF offers. We had so many engaging conversations with community members who have ideas already on how to expand the REF and its usefulness.”
Ranjini Swaminathan, co-lead of the Model Benchmarking Task Team and Core Scientist at the UK’s National Centre for Earth Observation, University of Reading, said: “This tool brings together climate scientists and Earth observation researchers to quickly check how accurately climate simulations reflect reality. The better we can do that, the more reliable our picture of future climate change becomes, and the better equipped policymakers and communities are to respond to it.”
Forrest Hoffman, co-lead of the Model Benchmarking Task Team and ESGF Executive Committee co-chair, said: “The REF was built by the international community to serve the needs of modelling centres and the research community. Enabling three open-source evaluation packages, spanning different Earth system components, to run in tandem and produce an integrated dashboard of diagnostic outputs is a remarkable technical achievement in such a short period of time. It will be an invaluable tool for identifying model uncertainties and prioritizing model improvements.”
Jared Lewis, Delivery Team Manager and Chief Technical Officer at Climate Resource said: “The availability of these pre-processed results would have saved me months of time over my career. Sharing these common, pre-computed diagnostics gives a great starting point for new science, especially for colleagues in the Global South.”
John Dunne, co-chair of the CMIP Panel and Research Oceanographer at the Geophysical Fluid Dynamics Laboratory, Princeton University said: “This release marks an important milestone advancing global accessibility and usability of CMIP data. We are all very proud of the accomplishment and appreciative of the hard work by this growing community allowing this accomplishment.”
Demiso Daba, Team Member of Model Benchmarking Task Team and Researcher at Arba Minch University, said: “The Rapid Evaluation Framework significantly reduces the time needed to assess climate model performance. By making pre-processed diagnostics openly available, it helps researchers focus more on scientific insights rather than data preparation.”
NOTES FOR EDITORS
The Coupled Model Intercomparison Project (CMIP) is an international climate modelling project, within the World Climate Research Programme. It is designed to better understand past, present and future changes in Earth’s climate. CMIP has been organised in different phases, each with new and improved climate model experiment protocols, standards, and data distribution mechanisms. Since CMIP6, the community has aimed to evaluate the simulations published on the Earth System Grid Federation (ESGF) rapidly, providing near‑real‑time assessment to users.
CMIP’s scientific governing panel, the CMIP Panel, established the CMIP7 Model Benchmarking Task Team, a dedicated team whose objectives included developing a framework for rapid access to, and evaluation of, simulations.
The Earth System Grid Federation (ESGF) is a globally distributed infrastructure that archives and provides access to massive Earth system datasets, sponsored by different agencies across regions and countries. Together with CMIP, ESGF provides the standardised simulation data (such as for CMIP5, CMIP6 and CMIP7) used for analysis, synthesis and community assessments. These archives comprise many petabytes of data available to researchers worldwide.
The CMIP Model Benchmarking Task Team is made up of members with benchmarking and evaluation expertise across land, oceans, atmosphere, land & sea ice, Earth systems, climate adaptation and downscaling. The Task Team members devised the original scope of the Framework; proposed, consulted on and finalised the diagnostics used within it; and provided scientific guidance on standards, dashboard design and the documentation generated for the REF. The Task Team members are listed here: Climate Model Benchmarking – Coupled Model Intercomparison Project