Whenever an extreme weather event occurs nowadays, the question is invariably asked to what extent the event can be attributed to anthropogenic climate change. Was this heat wave less likely in the past? Should such extreme rainfall, in this particular river basin, be expected to intensify in a warmer future climate? These questions are paramount to assessing the impact of climate change, and it is essential to give a substantiated answer that is the result of a transparent scientific analysis of the data. The aim of this project is to develop a protocol for these analyses and lay the foundations of an operational service for extreme event attribution under the umbrella of the European Copernicus program.

The Netherlands eScience Center is working together with KNMI and the University of Oxford to build the digital infrastructure for performing climate attribution studies. We are making climate data available in the Climate Data Store (CDS) and creating the processing and statistical tools in the Copernicus Toolbox. Future analyses using these tools will then benefit from the massive amount of data in the CDS and have the means to construct transparent, reproducible and well-performing workflows to compute and model climate extremes. Ultimately, these data and tools will accelerate and improve attribution studies, so that the general public can be rapidly and reliably informed about the role of climate change during extreme weather events.
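As a simple illustration of the kind of statistical tooling involved, the sketch below fits a generalized extreme value (GEV) distribution to annual temperature maxima for a "past" and a "present" climate and compares how often a given extreme is exceeded. All data and numbers are synthetic placeholders, not project code.

```python
# Minimal sketch of a typical extreme-event attribution calculation:
# fit a GEV distribution to annual temperature maxima for two periods
# and compare exceedance probabilities and return periods. All data are
# synthetic; a real workflow would pull observations/model output from the CDS.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
# Hypothetical annual maximum temperatures (degrees C) for two climates.
past = rng.gumbel(loc=30.0, scale=1.5, size=60)
present = rng.gumbel(loc=31.2, scale=1.5, size=60)

threshold = 35.0  # the observed extreme we want to attribute

for label, sample in [("past", past), ("present", present)]:
    shape, loc, scale = genextreme.fit(sample)
    # Probability that the annual maximum exceeds the threshold.
    p_exceed = genextreme.sf(threshold, shape, loc=loc, scale=scale)
    print(f"{label}: P(exceed {threshold} C) = {p_exceed:.4f}, "
          f"return period ~ {1.0 / p_exceed:.0f} years")
```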

Related links:

Homepage Copernicus

C3S Attribution Workshop

C3S_62 Prototype Extreme Events and Attribution Service

Global Climate Models are a vital source of information on climate change. However, gaining insight from the vast amount of data available is problematic: the data are spread out across the world, and with data sizes in the petabyte range and growing, downloading climate model data is quickly becoming infeasible, let alone analysing it.

Copernicus Climate Change Service

The Copernicus Climate Change Service is still in the development phase and will combine observations of the climate system with the latest science to develop authoritative, quality-assured information about the past, current and future states of the climate in Europe and worldwide. ECMWF operates the Copernicus Climate Change Service on behalf of the European Union and will bring together expertise from across Europe to deliver the service.

The MAGIC project

Within the Copernicus Climate Change Service, the MAGIC project is developing solutions that will help users assess Global Climate Model projections using well-established metrics and manipulation tools, and receive outputs tailored to their needs. In particular, the project aims to provide products that address the needs of the coastal, water, insurance and energy sectors.

The system will allow users to access, visualize and manipulate the large data sets produced by climate models without having to download them to their own machines. It will combine software that has been developed by partners, either individually or within earlier European projects, into a single system. The software will contain modules to calculate standardized metrics and indices for each model, so that the models’ performance can be assessed quickly.
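As an illustration of what such a metric module might compute, the sketch below derives an area-weighted RMSE of several (synthetic) model climatologies against a reference field; it is a toy stand-in, not the project’s actual implementation.

```python
# Illustrative sketch of a standardized model-performance metric:
# area-weighted RMSE of each model's climatology against a reference.
# The fields are synthetic stand-ins for data served by the system.
import numpy as np

rng = np.random.default_rng(0)
lats = np.linspace(-89.5, 89.5, 180)
weights = np.cos(np.deg2rad(lats))[:, None]          # area weighting by latitude

reference = rng.normal(size=(180, 360))              # e.g. observed climatology
models = {f"model_{i}": reference + rng.normal(scale=0.5 + 0.2 * i,
                                               size=(180, 360))
          for i in range(3)}

for name, field in models.items():
    err2 = (field - reference) ** 2
    rmse = np.sqrt(np.average(err2, weights=np.broadcast_to(weights, err2.shape)))
    print(f"{name}: weighted RMSE = {rmse:.3f}")
```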

Users’ benefits are:

  • no need to download and store large data sets
  • data access from anywhere
  • easily performing the same analysis for several datasets
  • automatically generated metrics for indicating data sets’ quality
  • logged commands to make work reproducible
  • pre-defined functionalities for reducing programming work-load
  • easier usability of climate model data by tailored tools for specific sectors (insurance, water, energy, coastal) 

The MAGIC project is funded by the Copernicus European Union Programme. The lead contractor is the Royal Netherlands Meteorological Institute (KNMI). The eScience Center is in charge of the technical work done in the project.

In climate models it will not be possible to capture all relevant processes simply through higher resolution or better process descriptions. Ocean models already run at near eddy-resolving horizontal resolutions (e.g. 0.1°), but many important processes, such as upper-ocean turbulence and sub-mesoscale eddies, are still not adequately captured at this resolution.

To overcome this problem, one can exploit the property that high-frequency components of the flow reach statistical equilibrium much faster than low-frequency components and, moreover, are locally determined by those low-frequency components. This can be accomplished by coupling an implicit low-resolution model to an explicit high-resolution ocean model: the high-resolution model is run alternately with the low-resolution model, for a short and a long time period, respectively.

In fact, we will run an instance of the high-resolution model for each grid cell of the low-resolution model, using initial and boundary values computed at low resolution. This leads to an embarrassingly parallel set of high-resolution models, which makes the approach very suitable for exascale architectures. For each low-resolution grid cell, the statistics (mean, variance) resulting from these computations will be used to define a state-dependent stochastic term in the low-resolution model that parametrizes the behavior of the high-resolution model.

This process is repeated until the model reaches statistical equilibrium. For the coupling of the models, we will extend the eScience tool OMUSE, developed by NLeSC in a recent project, so that one low-resolution model can interact with many high-resolution models.
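The sketch below illustrates the intended coupling loop with toy stand-ins for both models; the model functions and parameters are hypothetical placeholders, not OMUSE code.

```python
# Schematic sketch of the proposed coupling loop, with toy stand-ins for
# the implicit low-resolution model and the explicit high-resolution models.
# Everything named here is a hypothetical placeholder.
import numpy as np

rng = np.random.default_rng(1)
N_LOW = 8           # low-resolution grid cells
N_HIGH_STEPS = 200  # short high-resolution integration per cell

def run_high_res(initial_state, steps=N_HIGH_STEPS):
    """Toy 'high-resolution model': a noisy relaxation around its initial
    state, standing in for one independent high-resolution ocean instance."""
    x = np.full(steps, initial_state, dtype=float)
    for t in range(1, steps):
        x[t] = x[t - 1] - 0.1 * (x[t - 1] - initial_state) + 0.05 * rng.normal()
    return x

low_state = rng.normal(size=N_LOW)  # one value per low-resolution cell

for cycle in range(5):
    # 1) One high-resolution instance per low-resolution cell: this loop is
    #    embarrassingly parallel and could be farmed out on an exascale machine.
    stats = []
    for cell in range(N_LOW):
        traj = run_high_res(low_state[cell])
        stats.append((traj.mean(), traj.var()))

    # 2) Use the high-resolution statistics to build a state-dependent
    #    stochastic forcing term for the long low-resolution step.
    means = np.array([m for m, _ in stats])
    stds = np.sqrt([v for _, v in stats])
    forcing = (means - low_state) + stds * rng.normal(size=N_LOW)
    low_state = low_state + 0.5 * forcing   # long low-resolution update

print("low-resolution state after coupling cycles:", np.round(low_state, 3))
```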

Many recent studies have discussed whether future climate change will be punctuated by abrupt shifts, so-called tipping points. As it would be difficult for societies and ecosystems to cope with such events, it is important to assess the associated risk. However, the list of climate tipping points put forward in scientific studies mainly results from idealized models, qualitative arguments and visual inspection. We will explore the possibility of future abrupt climate change more systematically by using the rapidly increasing amount of climate model data.

As a crucial first step toward this goal, we will explore the potential of change-point detection and edge-detection algorithms to detect and interpret abrupt changes in large model ensembles from existing projects. Moreover, we will scan these datasets for changes in climate variability in order to learn whether such changes can predict if, where and why abrupt shifts will occur.
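As a minimal illustration of this first step, the sketch below applies a simple sliding-window mean-shift detector to a synthetic time series with an abrupt jump; real analyses would use more sophisticated algorithms on large ensembles.

```python
# Minimal sketch of change-point detection on a climate-like time series:
# a sliding-window test for a shift in the mean, applied to synthetic data
# with one abrupt jump of the kind we want to flag.
import numpy as np

rng = np.random.default_rng(7)
n, jump_at = 400, 250
series = rng.normal(size=n)
series[jump_at:] += 2.0          # the abrupt shift

window = 40
scores = np.zeros(n)
for t in range(window, n - window):
    left = series[t - window:t]
    right = series[t:t + window]
    # Two-sample z-like statistic for a mean shift at time t.
    pooled = np.sqrt(left.var() / window + right.var() / window)
    scores[t] = abs(right.mean() - left.mean()) / pooled

detected = scores.argmax()
print(f"largest mean shift detected at t={detected} (true change at t={jump_at})")
```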

Our approach will update our knowledge of abrupt climate change and allow a systematic assessment of the feasibility of data-mining tools. We thereby set the scene for a larger project with new perturbed-physics ensembles that will allow us to quantify the risk of future abrupt climate change in complex models.

Image: Asian Development Bank (FlickrCC)

eScience infrastructure for ecological applications of LiDAR point clouds

The lack of high-resolution measurements of 3D ecosystem structure across broad spatial extents impedes major advances in animal ecology and biodiversity science. We aim to fill this gap by using Light Detection and Ranging (LiDAR) technology to characterize the vertical and horizontal complexity of vegetation and landscapes at high resolution across regional to continental scales.

The newly LiDAR-derived 3D ecosystem structures will be applied in species distribution models for breeding birds in forests and marshlands, for insect pollinators in agricultural landscapes, and for songbirds at stopover sites during migration. This will allow novel insights into the hierarchical structure of animal-habitat associations, into why animal populations decline, and into how they respond to habitat fragmentation and ongoing land-use change.

The processing of these massive amounts of LiDAR point cloud data will be achieved by developing a generic, interactive eScience environment for multi-scale object-based image analysis (OBIA) and interpretation of LiDAR point clouds, including data storage, scalable computing, tools for machine learning and visualization (feature selection, annotation/segmentation, object classification, and evaluation), and a PostGIS spatial database. The classified objects will include trees, forests, vegetation strata, edges, bushes, hedges, reedbeds, etc., with their related metrics, attributes and summary statistics (e.g. vegetation openness, height, density, vertical biomass distribution).
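As a small illustration of the kind of metrics involved, the sketch below bins (synthetic) LiDAR returns onto a raster and computes canopy height and a vegetation-density stratum per cell; file I/O (e.g. with laspy) and the OBIA segmentation itself are omitted.

```python
# Illustrative sketch of deriving vegetation-structure metrics from LiDAR
# returns: bin points onto a grid and compute canopy height and stratum
# density per cell. Points are synthetic here; a real pipeline would read
# LAS/LAZ files and feed such rasters into the OBIA segmentation step.
import numpy as np

rng = np.random.default_rng(3)
n_points = 100_000
x = rng.uniform(0, 100, n_points)      # metres
y = rng.uniform(0, 100, n_points)
z = rng.gamma(2.0, 3.0, n_points)      # heights above ground (m)

cell = 10.0                            # 10 m raster cells
ix = (x // cell).astype(int)
iy = (y // cell).astype(int)
ncell = int(100 // cell)

canopy_height = np.zeros((ncell, ncell))   # e.g. 95th height percentile
density_2to5m = np.zeros((ncell, ncell))   # fraction of returns in a stratum
for i in range(ncell):
    for j in range(ncell):
        pts = z[(ix == i) & (iy == j)]
        if pts.size:
            canopy_height[i, j] = np.percentile(pts, 95)
            density_2to5m[i, j] = np.mean((pts >= 2) & (pts < 5))

print("max canopy height:", canopy_height.max().round(2), "m")
```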

The newly developed eScience tools and data will be available to other disciplines and applications in ecology and the Earth sciences, thereby achieving high impact. The project will foster new multi-disciplinary collaborations between ecologists and eScientists and contribute to training a new generation of geo-ecologists.

Image: Sagar (Flickr CC License) 

Blue-Action is a five-year European project under the Horizon 2020 Blue Growth programme, coordinated by the Danish Meteorological Institute with 41 partners. The objective of the project is to actively improve our ability to describe, model, and predict Arctic climate change and its impact on Northern Hemisphere climate, weather and their extremes, and to deliver valued climate services of societal benefit.

A transdisciplinary approach

Blue-Action will provide fundamental and empirically grounded, executable science that quantifies and explains the role of a changing Arctic in increasing the predictive capability of weather and climate in the Northern Hemisphere. To achieve this, Blue-Action will take a transdisciplinary approach, bridging scientific understanding within Arctic climate, weather and risk management research with key stakeholder knowledge of the impacts of climatic weather extremes and hazardous events, leading to the co-design of better services.

“Bridging scientific understanding within Arctic climate, weather and risk management research.”

This bridge will build on innovative statistical and dynamical approaches to predicting weather and climate extremes. In dialogue with users, Blue-Action will take stock of existing knowledge about cross-sectoral impacts and vulnerabilities with respect to the occurrence of these events when associated with weather and climate predictions. Modeling and prediction capabilities will be enhanced by targeting, firstly, lower-latitude oceanic and atmospheric drivers of regional Arctic changes and, secondly, Arctic impacts on Northern Hemisphere climate and weather extremes. Coordinated multi-model experiments will be key to testing new higher-resolution model configurations, innovative methods to reduce forecast error, and advanced methods to improve the uptake of new Earth observation assets. Blue-Action will thereby demonstrate how such an uptake may assist in creating a better-optimized observation system for various modelling applications.

Improved robust and reliable forecasting

Improved, robust and reliable forecasting can help meteorological and climate services deliver better tailored predictions and advice on sub-seasonal to seasonal time scales, and will take Arctic climate prediction beyond seasons and to teleconnections over the Northern Hemisphere. Through its concerted efforts, Blue-Action will therefore contribute to improving how climate models represent Arctic warming and to addressing its impact on regional and global atmospheric and oceanic circulation.

Project website: http://www.blue-action.eu

Blue-Action is funded by the EU Horizon 2020 Programme and specifically by the Blue-Growth BG-10-2016 call “Impact of Arctic changes on the weather and climate of the Northern Hemisphere”.
The Blue-Action project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement no. 727852.

TWEX-Future will take a novel “Tales of Future Weather” approach, in which scenarios tailored to a specific region and stakeholder are combined with numerical weather prediction models. This approach offers a more realistic picture of what future weather extremes might look like, facilitating adaptation planning and implementation with local, actionable and reliable climate information to support decision-making, while taking various barriers to adaptation into account.

Climate change projections clearly indicate that heavy precipitation events will increase, particularly at higher northern latitudes. Large-scale dynamical processes, such as atmospheric rivers that evolve from extra-tropical cyclones, are known to cause anomalously strong orographic rainfall that can lead to severe flooding events on the west coast of Norway. To capture such events in realistic detail, the current approach of downscaling coarse-resolution global climate model simulations has critical shortcomings. Hence, a seamless chain from global high-resolution climate modelling to regional downscaling to impact assessment is needed.

In TWEX-Future, we use case studies of high-impact flood events selected jointly with Norwegian stakeholders, such as Statkraft, and perform a holistic autopsy of the events (physical hazard, vulnerability, and barriers to adaptation). We will simulate selected events in the present and future using a combination of high-resolution global Earth system models and regional Numerical Weather Prediction models, while maintaining the stakeholder’s operational chain for analyzing the impacts of the event in the future. This will provide a valuable basis to explore, in a Norwegian context, whether the “Tales of Future Weather” approach offers added-value to current practice.

Image: Hønefoss during the 20 year flood in 2007 (CC License) 

The world is rapidly urbanizing. In 1950 around 29% of the global population lived in cities. This share rose to around 52% in 2010, and 67% of the population is expected to live in cities by 2050. This trend requires dedicated management to keep the growing cities and their surrounding countryside liveable, healthy and productive. Urban physical properties, such as high-rise buildings, concrete roads, water canals, or trees, alter the local environment, often in such a way that meteorological and hydrological conditions change. For instance, during heat waves thermal comfort in cities is often poor compared to the surrounding countryside. In addition, the relatively large impervious areas make cities vulnerable to flooding in case of heavy rain.

“How will climate change affect the urban environment?”

This illustrates the need to understand urban hydrometeorology (the transfer of water and energy between the land surface and the lower atmosphere). How will climate change affect the urban environment? What will the consequences be for human thermal comfort, and how should we arrange water management in cities? These are important questions that we have not yet been able to answer, because of limited long-term observations and limited computational capacity for urban-scale simulations.

Developing an environmental re-analysis on the scale of the urban environment

The ERA-URBAN project takes up this eScience challenge and develops an environmental re-analysis on the scale of the urban environment: a long-term archive of urban energy and water balances at very high resolution (100 m). Developing such an archive is now feasible as a direct and immediate extension of the (modelling and observational) infrastructure that has been built within NLeSC’s Summer in the City project.

Summer in the City aims to forecast urban weather and climate at a scale of 100 m using a detailed atmospheric model. It concentrates on two cities, Amsterdam and Wageningen, each studied in detail for one summer period. Generalizing its approach to multiple cities and extending its scope to precipitation and the water balance is the next logical step, as it allows the systematic and consistent study of urban hydrometeorological properties across cities in different climate types and local climate zones. It also allows us to explore different characteristics of the landscape surrounding the urban areas, which is a prerequisite for answering urgent, currently open research questions.

Making the data in the long-term archive meaningful, insightful and useful

The goal of ERA-URBAN is to make the data in the long-term archive meaningful, insightful and useful for scientists, local-scale urban planners, policy makers, (local) companies and individual citizens. This requires a high-performance computing effort involving multi-model simulations, combined with data assimilation of large volumes of multi-source hydrometeorological observations.

“A high performance computing effort by multi-model simulations.”
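As a toy illustration of the data-assimilation idea mentioned above, the sketch below nudges a gridded model temperature field toward a few (hypothetical) station observations; the real ERA-URBAN assimilation is far more sophisticated.

```python
# Toy illustration of a data-assimilation step: nudge a gridded model
# temperature field toward sparse station observations. This only shows
# the principle, not the project's actual assimilation scheme.
import numpy as np

rng = np.random.default_rng(5)
model = 20.0 + rng.normal(scale=0.5, size=(50, 50))   # model temperature grid

# Hypothetical station observations: (row, col, observed value in deg C).
obs = [(10, 12, 22.3), (25, 40, 21.1), (44, 5, 23.0)]

alpha = 0.6   # nudging weight toward the observation
rows, cols = np.indices(model.shape)
for r, c, value in obs:
    increment = alpha * (value - model[r, c])
    # Spread the increment to nearby cells with a Gaussian influence radius.
    dist2 = (rows - r) ** 2 + (cols - c) ** 2
    model += increment * np.exp(-dist2 / (2 * 5.0 ** 2))

print("analysis mean temperature:", model.mean().round(2), "deg C")
```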

A number of cities of contrasting size, on a pan-European scale, will be covered in this project. The ERA-URBAN archive will allow three scientific themes to be explored, each related to a pressing problem in urban hydrometeorological management:

  • trends in human thermal comfort in European cities 
  • trends in extreme precipitation
  • origins of the recently reported hysteresis in the annual cycles of urban climate

Publicly available

Business partners in wind energy and water management will explore the practical applicability of ERA-URBAN, which will be made publicly available for use in science, engineering, and consultancy.

Image: Shutterstock

55% of the Netherlands is below sea level. This area contains 60% of the population and generates 65% of the gross national product. Obviously, the Netherlands requires efficient risk and water management.

Efficient risk assessment requires large-scale, high-precision flood simulations for the case that a dike or dam breaks, continuous monitoring of man-made infrastructure in search of small deviations or breaches, and impact assessment of the re-organization of urban areas.

Increasing precision

For precise and effective risk assessment, we must increase the precision of high-resolution flood simulations, improve the accuracy of semantic trajectory determination for water channel networks, and provide the means to align and compare data sets at different resolutions to study the spatial evolution of an area or structure over time.

“The Netherlands requires efficient risk and water management.”

Unfortunately, current solutions, consisting of PostGIS combined with stand-alone applications and libraries, lack the necessary flexibility and scalability and require extensive preprocessing of the data. The goal of this project is to modernize the generation and manipulation of these datasets by using a Geospatial Database Management System (DBMS). The unique advantage of this approach is that, unlike previous solutions, it stores the raw data sets and transforms, combines and processes them only when needed. This will vastly improve flexibility and performance.
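The sketch below illustrates this “store raw, transform on demand” idea with a PostGIS query issued from Python; the connection details, table names and columns are hypothetical.

```python
# Hedged sketch of querying raw geospatial data inside a PostGIS-enabled
# DBMS instead of preprocessing files. Credentials, tables and columns
# (raw_points, dikes) are hypothetical placeholders.
import psycopg2

conn = psycopg2.connect("dbname=geo user=analyst")  # placeholder credentials
with conn, conn.cursor() as cur:
    # Select raw points within 50 m of a dike segment and compute their
    # heights on the fly -- no pre-built raster or derived product needed.
    cur.execute(
        """
        SELECT p.id, ST_Z(p.geom) AS height
        FROM raw_points AS p, dikes AS d
        WHERE d.id = %s
          AND ST_DWithin(p.geom, d.geom, 50.0)
        """,
        (42,),
    )
    for point_id, height in cur.fetchmany(10):
        print(point_id, height)
conn.close()
```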

3D city models

To achieve this goal, the project team will extend a Geospatial Database Management System (G-DBMS) with a flexible storage schema for 2D/3D geospatial datasets (point cloud, raster, vector, etc.). This schema will be used to store the semantically rich objects needed for the personalization of 3D digital city models (i.e., data re-generation with user-defined parameters). These 3D digital city models form the basis for flow simulations, urban planning and under- and overground formation analysis. They are also very important for automated anomaly detection on man-made structures.

“3D digital city models form the basis for flow simulations.”

In this G-DBMS, topological and geometric functionality for 3D raster manipulation will become first-class citizens. For near real-time 3D model generation and manipulation, some of these operators will be complemented with a GPU version. Application-specific functionality, such as constrained Delaunay triangulation and the marching cubes algorithm for surface reconstruction, will also be added, as these are important tools in the work done by our commercial partners.
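As a minimal, self-contained illustration of the surface-reconstruction step, the sketch below runs marching cubes (via scikit-image) on a synthetic scalar volume; in the project this functionality would live inside the G-DBMS, partly GPU-accelerated.

```python
# Minimal example of the marching-cubes surface-reconstruction step, using
# scikit-image on a synthetic scalar volume (a sphere). This only shows the
# algorithm, not the project's in-database implementation.
import numpy as np
from skimage import measure

# Signed-distance-like volume: negative inside a sphere of radius 20.
grid = np.mgrid[-32:32, -32:32, -32:32]
volume = np.sqrt((grid ** 2).sum(axis=0)) - 20.0

verts, faces, normals, values = measure.marching_cubes(volume, level=0.0)
print(f"reconstructed surface: {len(verts)} vertices, {len(faces)} triangles")
```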

Within this project, spatial analysis tailored to different use-case scenarios is done on demand and fast enough to be used by modern risk management systems to, for example, determine escape-route trajectories. In addition, it will provide the means to identify and quantify deviations in flow patterns, such as wind and water, while modeling under- and overground surfaces. In other words, it addresses the challenges identified by three major companies in the sector: Fugro, Geodan, and Deltares.

On the shoulders of a successful COMMIT/ project

Furthermore, this project stands on the shoulders of a successful COMMIT/ project, “spatiotemporal data warehouses for trajectory exploitation” (P19), and the strategic partnership between CWI Database group, Netherlands eScience Center (NLeSC), TU Delft 3D Geo-information group, VU Geographic Information Systems (GIS) group and Geodan to develop core technology for “Big Data Analytics in the Geo-Spatial Domain”.

Image: Pixabay

Global biodiversity is currently declining at an unusually high rate. National and international initiatives to counteract this decline bring about a pressing demand to quantify current and future human impacts on biodiversity.

A major opportunity

A major opportunity to improve biodiversity modelling lies in the increasing availability of large-scale, multi-species data sets and the ongoing advances in modelling techniques and software. This allows for a species-by-species approach to biodiversity modelling based on so-called species distribution models (SDMs). SDMs are quantitative (regression-based) relationships between the abundance or occurrence of a species on the one hand and a set of environmental factors on the other. Clear advantages of an SDM approach to biodiversity modelling include its great flexibility and the complementary information it provides to aggregated biodiversity measures like overall species richness. On the other hand, the SDM approach to biodiversity modelling has been criticised because it typically ignores biotic interactions, which may significantly modify species distributions. However, a consistent approach to systematically including biotic interactions in SDMs has not yet been developed.

A procedure to account for biotic interactions in species distribution models

In this path-finding project, we aim to establish and test a procedure to account for biotic interactions in species distribution models. To that end, we propose an interacting species distribution modelling (iSDM) approach consisting of multiple simultaneous algorithms that combine species occurrence data with data on environmental conditions and prior knowledge of species habitat preferences and species interactions (e.g., predator-prey relationships, competition, facilitation). We focus on climate and land use, as these are two main determinants of species distributions.

We will test the approach for a restricted set of species with high-quality data available on occurrence and potential interactions, such as birds, mammals and vascular plants, in well-studied ‘simple’ ecosystems with relatively few interacting species, like the Arctic.
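To make the idea concrete, the sketch below fits a toy SDM in which the occurrence of an interacting species enters as an extra covariate alongside climate and land use; the data and model choice (plain logistic regression) are illustrative only, not the project’s iSDM algorithms.

```python
# Minimal sketch of the iSDM idea: a species distribution model (logistic
# regression) where the occurrence of an interacting species is a covariate
# next to climate and land use. All data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(11)
n = 2000
temperature = rng.normal(size=n)        # standardized climate covariate
land_use = rng.uniform(size=n)          # e.g. fraction natural land cover
prey_present = rng.binomial(1, 0.5, n)  # occurrence of an interacting species

# Synthetic "true" occurrence: depends on climate AND on the prey species.
logit = -0.5 + 1.2 * temperature + 0.8 * land_use + 1.5 * prey_present
occurrence = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([temperature, land_use, prey_present])
model = LogisticRegression().fit(X, occurrence)
print("coefficients (climate, land use, biotic):", model.coef_.round(2))
```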

Validation

The approach will be validated in two ways: (i) by cross-validation, and (ii) by stacking the iSDM predictions over the different species and comparing the species richness estimates thus obtained with location-specific species richness observations.
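The second validation route can be sketched as follows: per-species occurrence probabilities are stacked into an expected species-richness estimate and compared with observed richness. All numbers below are synthetic placeholders.

```python
# Sketch of richness-based validation: stack per-species occurrence
# probabilities into expected species richness per site and compare it
# with (here, simulated) observed richness.
import numpy as np

rng = np.random.default_rng(13)
n_sites, n_species = 100, 25
# Hypothetical iSDM occurrence probabilities per site and species.
probs = rng.uniform(size=(n_sites, n_species))

expected_richness = probs.sum(axis=1)                   # stacked iSDM estimate
observed_richness = rng.binomial(1, probs).sum(axis=1)  # "field" observations

corr = np.corrcoef(expected_richness, observed_richness)[0, 1]
print(f"correlation between predicted and observed richness: {corr:.2f}")
```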

A significant stepping stone 

In a later stage, we intend to apply the iSDM approach developed in this path-finding project to a larger set of species and regions, in order to systematically quantify the importance of biotic interactions for various taxonomic groups and at different spatial scales. Thus, we will shed light on the as yet unresolved issue of the scale dependency of biotic interactions. In addition, this path-finding project provides a significant stepping stone towards a global-scale, species-by-species biodiversity assessment model, which is to be developed within the IMAGE-GLOBIO modelling framework of PBL Netherlands Environmental Assessment Agency. This aspect is further explained in section.

Image: Craig ONeal – Pool of Spoonbills (CC License) 
