MPE CDT Student Cohort 2017

Elena Saggioro

Based at: University of Reading
Research project: Causal approaches to climate variability and change
Supervisors: Professor Ted Shepherd (Lead Supervisor, Department of Meteorology, University of Reading), Professor Sebastian Reich (Department of Mathematics and Statistics, University of Reading), Dr Jeff Knight (Met Office)

Project summary: Although there is high confidence in the thermodynamic aspects of global climate change, there is still great uncertainty in its dynamical aspects at the regional scale. This is due to the role of the atmospheric circulation, whose response to external forcing is still poorly understood. Progress is not always helped by climate models, which at present show a great spread in regional simulations. In order to constrain this spread to its physically plausible range, seasonal observational data are employed. But how should one proceed when comparing data and model outputs? Today, the commonly used statistical procedure relies on correlations. However, when it comes to attributing an observed behaviour to some cause, correlation is not sufficient. Correlation does not imply causation, but translating this statement into practice is far from trivial. Nevertheless, recent decades have seen great leaps forward in the understanding and implementation of causal inference, thanks to the artificial intelligence community. Formal statistical methods based on Bayesian Causal Networks (BCNs) have been developed and successfully applied. A BCN translates into the sequential evaluation of conditional probabilities, which allows a causal interpretation of correlated variables. These methods are only just beginning to be considered within climate science. An example is found in Kretschmer et al. (2016), wherein the scheme of winter Arctic circulation is described as a BCN in which the nodes of the network are the stationary time series of the relevant climate variables.
The present project aims to develop a Bayesian causal network to explore the puzzling Southern Hemisphere (SH) late-spring circulation variability. In particular, we aim to understand the causal relationship between the annual late-spring breakdown of the SH stratospheric polar vortex and its tropospheric manifestations. Recent work by Byrne et al. (2017) suggests that tropospheric seasonal variability is better understood as a result of interannual variability in the timing of the breakdown event. To address this, the project will develop a causal network suitable for non-stationary climate phenomena, such as seasonal regime transitions. A particular focus will be quantifying the improvement in signal-to-noise ratio obtained by using a non-stationary rather than a stationary statistical framework.
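
The core idea, that a correlation between two variables can be explained away by conditioning on a mediating variable, can be illustrated with a minimal sketch. The chain structure, variable names and coefficients below are invented for illustration; they are not taken from the project:

```python
import numpy as np

def partial_corr(x, z, y):
    """Correlation of x and z after regressing out y from both."""
    rx = x - np.polyval(np.polyfit(y, x, 1), y)   # residual of x given y
    rz = z - np.polyval(np.polyfit(y, z, 1), y)   # residual of z given y
    return np.corrcoef(rx, rz)[0, 1]

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=n)                     # driver
y = 0.8 * x + 0.3 * rng.normal(size=n)     # mediator: x -> y
z = 0.8 * y + 0.3 * rng.normal(size=n)     # response: y -> z

r_xz = np.corrcoef(x, z)[0, 1]         # large: x and z are strongly correlated
r_xz_given_y = partial_corr(x, z, y)   # near zero: the x-z link is mediated by y
```

Conditional-independence tests of exactly this kind are the building blocks of causal discovery on networks of climate time series.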

References:
Byrne, N.J., Shepherd, T.G., Woollings, T. and Plumb, R.A., 2017: J. Clim., 30, 7125–7139.
Kretschmer, M., Coumou, D., Donges, J.F. and Runge, J., 2016: J. Clim., 29, 4069–4081.

Niraj Agarwal

Based at: Imperial College London
Research project: Data-Driven Reduced Order Modelling of Multiscale Oceanic Variability
Principal Supervisor: Prof. Pavel Berloff (Department of Mathematics, Imperial College London)
Co-advisor: Peter Dueben (ECMWF)

Summary: The oceanic turbulent circulation exhibits multiscale motions on very different space and time scales interacting with each other, e.g. jets, vortices, waves, and large-scale variability. In particular, mesoscale oceanic eddies populate nearly all parts of the ocean and need to be resolved in order to represent their effects on the general ocean and atmosphere circulations. However, capturing the effects of these small-scale flows is highly challenging and requires non-trivial approaches and skills, especially when it comes to representing their effects in non-eddy-resolving ocean circulation models. Therefore, the main goal of my project is to develop data-driven eddy parameterizations for use in both eddy-permitting and non-eddy-resolving ocean models. Dynamical models of reduced complexity will be developed to emulate the spatio-temporal variability of mesoscale eddies as well as their feedbacks across a large range of scales. These can serve as a low-cost oceanic component for climate models; the final aim of this project is therefore to use existing observational data to feed eddy parameterizations in comprehensive ocean circulation and climate models, such as those used in global weather forecasting or in the Coupled Model Intercomparison Project (CMIP), e.g. CMIP7.

We will employ a variety of both common and novel techniques and methods of statistical data analysis and numerical linear algebra to extract the key properties and characteristics of the space-time-correlated eddy field. The key steps involved in this framework are: (a) find the relevant data-adaptive basis functions, i.e. the decomposition of time-evolving datasets into their leading spatio-temporal modes, using, for example, variance-based methods such as Principal Component Analysis (PCA); and (b) once the subspace spanned by these basis functions is obtained, derive evolution equations that emulate the spatio-temporal correlations of the system, using methods such as nonlinear autoregression, artificial neural networks, or Linear Inverse Modelling (LIM).
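
Step (a) can be sketched in a few lines: PCA of a space-time field amounts to an SVD of the anomaly matrix. The field below is synthetic (two travelling waves plus noise), chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_time, n_space = 500, 64

# Synthetic space-time field: two travelling sinusoidal modes plus weak noise
t = np.linspace(0, 20, n_time)[:, None]
s = np.linspace(0, 2 * np.pi, n_space)[None, :]
field = (np.sin(s - 0.5 * t) + 0.3 * np.sin(2 * s + t)
         + 0.05 * rng.normal(size=(n_time, n_space)))

# PCA via SVD of the anomaly matrix (time x space)
anomalies = field - field.mean(axis=0)
U, svals, Vt = np.linalg.svd(anomalies, full_matrices=False)
variance = svals**2 / np.sum(svals**2)   # fraction of variance per mode

pcs = U[:, :4] * svals[:4]   # leading principal components (time series)
eofs = Vt[:4]                # leading spatial patterns (EOFs)
```

Each travelling wave projects onto a pair of standing modes, so the four leading modes capture nearly all of the variance; the reduced model of step (b) would then be built on the `pcs` time series.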

The proposed new science will help develop state-of-the-art data-adaptive models.

Mariana Clare

Based at: Imperial College London
Research project: Advanced numerical techniques to assess erosion/flood risk in the coastal zone
Supervisors: Matthew Piggott (Lead supervisor, Department of Earth Science & Engineering, Imperial College London) and Colin Cotter (Department of Mathematics, Imperial College London). Industry supervisor: Dr Catherine Villaret (East Point Geo Consulting).

Summary: An estimated 250 million people live in regions that are less than 5 metres above sea level. Hence with sea level rise and an increase in both the frequency and severity of storms as a result of climate change, the coastal zone is becoming an ever more critical location for the application of advanced mathematical techniques. Models are currently used to assist in the design of coastal zone engineering projects including flood defences and marine renewable energy arrays. There are many challenges surrounding the development and application of appropriate coupled numerical models because they include both hydrodynamic and sedimentary processes and need to resolve spatial scales ranging from sub-metre to 100s of kilometres.

My project aims to develop and use advanced numerical modelling and statistical tools to improve the understanding of hazards and the quantification and minimisation of erosion and flood risk. Throughout this project, I will consider the hazards in the context of idealised as well as real-world scenarios. The main model I will use in my project is XBeach, which uses simple numerical techniques to compute dune erosion, scour around buildings and overwash. XBeach is also currently used, to a limited degree, with Monte Carlo techniques to generate a large number of storm events with different wave climate parameters. Uncertain atmospheric forcing is very important in erosion/scour processes and flood risk, which are intimately linked in many situations and cannot be considered in isolation. In my project I will explore how the technique of multilevel Monte Carlo (MLMC) simulation can be combined with XBeach to quantify erosion/flood risk. I am interested not only in the effects of extreme events, but also in the cumulative effect of minor storm events, for which Monte Carlo techniques are particularly appropriate. I will also explore how an adaptive mesh approach can be coupled with the statistical approach to assess the risk to coastal areas.
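
The multilevel idea, combining many cheap coarse simulations with a few expensive fine ones through a telescoping sum, can be illustrated on a toy SDE. Geometric Brownian motion is used here only because its exact mean is known; XBeach plays no part in this sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, X0, T = 0.05, 0.2, 1.0, 1.0

def euler_pair(n_samples, n_fine):
    """Coupled Euler paths of geometric Brownian motion at step T/n_fine and
    at the next-coarser step 2T/n_fine, driven by the same Brownian increments."""
    dt = T / n_fine
    dW = rng.normal(scale=np.sqrt(dt), size=(n_samples, n_fine))
    Xf = np.full(n_samples, X0)
    Xc = np.full(n_samples, X0)
    for k in range(n_fine // 2):
        for j in (2 * k, 2 * k + 1):            # two fine steps
            Xf = Xf + mu * Xf * dt + sigma * Xf * dW[:, j]
        dWc = dW[:, 2 * k] + dW[:, 2 * k + 1]   # one coarse step, same noise
        Xc = Xc + mu * Xc * (2 * dt) + sigma * Xc * dWc
    return Xf, Xc

# Telescoping MLMC estimator of E[X_T]: coarsest level plus correction levels
levels = [2, 4, 8, 16]
Xf, _ = euler_pair(100_000, levels[0])
estimate = Xf.mean()
for n_fine in levels[1:]:
    Xf, Xc = euler_pair(20_000, n_fine)
    estimate += (Xf - Xc).mean()   # corrections shrink with level: fewer samples

exact = X0 * np.exp(mu * T)   # analytic mean of geometric Brownian motion
```

The point of the coupling is that `Xf - Xc` has small variance, so the fine levels need far fewer samples than a single-level Monte Carlo estimate of the same accuracy.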

George Chappelle

Based at: Imperial College London
Research project: Rate-induced tipping in nonautonomous random dynamical systems
Supervisors: Martin Rasmussen (Imperial College London, Department of Mathematics), Jochen Broeker (University of Reading, Department of Mathematics and Statistics), Pavel Berloff (Imperial College London, Department of Mathematics)

Summary: The concept of a tipping point (or critical transition) describes a phenomenon in which the behaviour of a physical system changes drastically, and often irreversibly, in response to a small change in its external environment. Relevant examples in climate science are the possible collapse of the Atlantic Meridional Overturning Circulation (AMOC) due to increasing freshwater input, or the sudden release of carbon in peatlands due to an external temperature increase. The aim of this project is to develop the mathematical framework for tipping points and thereby contribute to a deeper understanding of them.

A number of generic mechanisms have been identified which can cause a system to tip. One such mechanism is rate-induced tipping, where the transition is caused by a parameter changing too quickly, rather than by its moving past some critical value. Traditional mathematical bifurcation theory fails to address this phenomenon. The goal of this project is to use and develop the theory of non-autonomous and random dynamical systems to understand rate-induced tipping in the presence of noise. A question of particular practical importance is whether it is possible to develop meaningful early-warning indicators for rate-induced tipping using observational data. We will investigate this question from a theoretical viewpoint and apply the results to more realistic models.
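
A standard one-dimensional prototype from the rate-induced-tipping literature makes the mechanism concrete: the system x' = (x + λ(t))² − 1 tracks its moving stable equilibrium when the ramp in λ is slow, but tips for exactly the same start and end values of λ when the ramp is fast. The parameter values below are chosen for illustration only:

```python
import numpy as np

def integrate(rate, delta=2.5, T=60.0, dt=0.01):
    """Euler integration of x' = (x + lam(t))**2 - 1, where lam ramps from
    0 to delta at the given rate. Returns the final x (capped if it escapes)."""
    x, t = -1.0, 0.0                      # start on the stable branch x = -lam - 1
    while t < T:
        lam = delta * (1.0 - np.exp(-rate * t))
        x += dt * ((x + lam)**2 - 1.0)
        if x > 10.0:                      # left the basin: the system has tipped
            return x
        t += dt
    return x

x_slow = integrate(rate=0.05)   # slow ramp: x tracks the moving equilibrium
x_fast = integrate(rate=50.0)   # fast ramp: same endpoints for lam, but x tips
```

No equilibrium is crossed or destroyed here; only the rate of change of λ differs between the two runs, which is precisely why classical bifurcation theory does not capture the transition.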

Stuart Patching

Based at: Imperial College London
Research project: Analysis of Stochastic Slow-Fast Systems
Supervisors: Xue-Mei Li (Department of Mathematics, Imperial College London, Lead supervisor), Darryl Holm (Department of Mathematics, Imperial College London), Dan Crisan (Department of Mathematics, Imperial College London)

Summary: The Gulf Stream can be thought of as a giant meandering ribbon-like river in the ocean which originates in the Caribbean basin and carries warm water across the Atlantic to the west coast of Europe, keeping the European climate relatively mild. In spite of its significance to weather and climate, the Gulf Stream has remained poorly understood by oceanographers and fluid dynamicists for the past seventy years. This is largely due to the fact that the large-scale flow is significantly affected by multi-scale fluctuations known as mesoscale eddies. It is hypothesised that the mesoscale eddies produce a backscatter effect which is largely responsible for maintaining the eastward jet extensions of the Gulf Stream and other western boundary currents.

The difficulty in modelling such currents lies in the high computational cost associated with running oceanic simulations with sufficient resolution to include the eddy effects. Therefore approaches to this problem have been proposed which involve introducing some form of parameterisation into the numerical model, such that the small scale eddy effects are taken into account in coarse grid simulations.

There are three main approaches we may consider for including this parameterisation: the first is stochastic advection, the second is deterministic roughening and the third is data-driven emulation. These approaches have all been explored for relatively simple quasi-geostrophic ocean models, but we shall attempt to apply them to more comprehensive primitive equation models, which have greater practical applications in oceanography. In particular, we shall apply our parameterisations to the MITgcm and FESOM2 models, run them on low-resolution grids, and compare the results with high-resolution simulations.

Louis Sharrock

Based at: Imperial College London
Research project: Large Scale Inference With Applications to Environmental Monitoring
Supervisors: Nikolas Kantas (Department of Mathematics, Imperial College London, Lead supervisor), Professor Alistair Forbes (NPL)

Summary: This project aims to develop new methodology for performing statistical inference in environmental modelling applications. These applications require the use of a large number of sensors that collect data frequently and are distributed over a large region in space. This motivates the use of a space-time-varying stochastic dynamical model, defined in continuous time via a (linear or non-linear) stochastic partial differential equation, to model quantities such as air quality, pollution level, and temperature. We are naturally interested in fitting this model to real data and, in addition, in improving the statistical inference through a carefully chosen frequency for collecting observations, an optimal sensor placement, and an automatic calibration of sensor biases. From a statistical perspective, these problems can be formulated using a Bayesian framework that combines posterior inference with optimal design.

Performing Bayesian inference or optimal design for the chosen statistical model may be intractable, in which case the use of simulation based numerical methods will be necessary. We aim to consider computational methods that are principled but intensive, and given the additional challenges relating to the high dimensionality of the data and the model, must pay close attention to the statistical model at hand when designing algorithms to be used in practice. In particular, popular methods such as (Recursive) Maximum Likelihood, Markov Chain Monte Carlo, and Sequential Monte Carlo, will need to be carefully adapted and extended for this purpose.
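
In the linear-Gaussian special case, the posterior inference referred to above is tractable in closed form via the Kalman filter, which the sequential Monte Carlo methods generalise to nonlinear settings. A scalar sketch, in which the AR(1) "pollutant" dynamics and the noise variances are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
a, q, r = 0.95, 0.1, 0.5    # transition coefficient, model- and obs-noise variances
n_steps = 200

# Simulate a latent state (e.g. a pollutant level) and noisy sensor readings
x_true = np.zeros(n_steps)
for k in range(1, n_steps):
    x_true[k] = a * x_true[k - 1] + rng.normal(scale=np.sqrt(q))
y = x_true + rng.normal(scale=np.sqrt(r), size=n_steps)

# Kalman filter: predict with the model, correct with each observation
m, P = 0.0, 1.0                              # prior mean and variance
est = np.zeros(n_steps)
for k in range(n_steps):
    m, P = a * m, a * a * P + q              # predict
    K = P / (P + r)                          # Kalman gain
    m, P = m + K * (y[k] - m), (1 - K) * P   # update
    est[k] = m

rmse_filter = np.sqrt(np.mean((est - x_true)**2))
rmse_raw = np.sqrt(np.mean((y - x_true)**2))   # raw sensor error, for comparison
```

Combining the model prediction with each reading reduces the error well below that of the raw sensor, which is the basic payoff that the project's high-dimensional, nonlinear methods aim to retain at scale.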

Adriaan Hilbers

Based at: Imperial College London
Research project: Understanding climate-based uncertainty in power system design
Supervisors: Prof Axel Gandy (Statistics Section, Department of Mathematics, Imperial College London), Dr David Brayshaw (Department of Meteorology, University of Reading)

Summary: In the face of climate change, considerable efforts are being undertaken to reduce carbon emissions. One of the most promising pathways to sustainability is decarbonising electricity generation and electrifying other sources of emissions, such as transport and heating. This requires a near-total decarbonisation of power systems in the next few decades.

Making strategic decisions regarding future power system design (e.g. what power plant to build) is challenging for a number of reasons. The first is their complexity: electricity grids can be immensely complicated, making the effect of e.g. an additional power plant difficult to estimate. The second is the considerable uncertainty about future technologies, fuel prices and grid improvements. Finally, especially as more weather-dependent renewables are added, there is climate-based uncertainty: we simply don’t know what the future weather will be, or how well times of high demand will line up with times of high renewable output.

This project aims to both understand the effect of climate-based uncertainty on power system planning problems and develop methodologies for robust decision-making under these unknowns. This will be done in the language of statistics, using techniques such as uncertainty quantification, data reduction and decision-making under uncertainty. Furthermore, this investigation will employ power system models, computer programs simulating the operation of an electricity grid.

Georgios Sialounas

Based at: University of Reading
Research project: Hierarchical Model Adaptivity
Supervisors: Tristan Pryer (University of Reading, Department of Mathematics and Statistics, Lead supervisor), Jennifer Scott (University of Reading, Department of Mathematics and Statistics) and Oliver Sutton (University of Reading, Department of Mathematics and Statistics)

Summary: The goal of this project is to develop a model adaptive approximation strategy that in real time is able to choose the appropriate PDE model to approximate over portions of the domain. Thus the scheme is able to couple hierarchical models for use in, for example, atmospheric approximation.

Alexander Alecio

Based at: Imperial College London
Research project: Uncertainty quantification, linear response theory and predictability for nonequilibrium systems near phase transitions
Summary: When modelling complicated physical systems such as the ocean/atmosphere system with relatively simple mathematical models based on (ordinary/partial, deterministic/stochastic) differential equations, we expect some discrepancy between the mathematical model and the actual physical system. It is by now well understood that model error plays an important role in the fidelity of the mathematical model and in its predictive capabilities. Model uncertainty, together with additional sources of randomness due, e.g., to incomplete knowledge of the current state of the system, sensitive dependence on initial conditions, parameterization of the small scales etc., should be taken into account when making predictions about the system under investigation.

In addition, many climatological models exhibit 'tipping points' - critical transitions where the output of the model changes disproportionately compared to the change in a parameter. [LHK+08] documents several, the most pertinent to British weather being the Stommel-Cessi box model for Atlantic thermohaline circulation, which suggests the collapse of the Atlantic Meridional Overturning Circulation, upon small changes in freshwater input.

Weather forecasting bodies overcome these inherent difficulties using ensemble techniques (or probabilistic forecasting), running multiple simulations accounting for the range of possible scenarios. A forecast should then skilfully indicate the confidence the forecaster can have in their prediction, by accurately representing uncertainty [AMP13]. Clearly, model uncertainty can have a dramatic effect on the predictive capabilities of our mathematical model when we are close to a noise-induced transition, a tipping point or a phase transition. This poses an important mathematical question: how can we systematically quantify the propagation of uncertainty through the model, from model parameters and initial conditions to model output, even in cases of 'tipping'?

[LHK+08] Timothy M. Lenton, Hermann Held, Elmar Kriegler, Jim W. Hall, Wolfgang Lucht, Stefan Rahmstorf, and Hans Joachim Schellnhuber. Tipping elements in the Earth's climate system. Proceedings of the National Academy of Sciences, 105(6):1786–1793, 2008.

[AMP13] H. M. Arnold, I. M. Moroz, and T. N. Palmer. Stochastic parametrizations and model uncertainty in the Lorenz '96 system. Philosophical Transactions of the Royal Society of London Series A, 371:20110479, April 2013.

Supervisors: G.A. Pavliotis (Imperial College London) and V. Lucarini (University of Reading)

Rhys Leighton Thompson

Based at: University of Reading
Research project: Diffusion models of Earth’s Outer Radiation Belt using Stochastic Parameterisations
Supervisors: Clare Watt (Lead supervisor, Department of Meteorology, University of Reading), Paul Williams (Co-supervisor, Department of Meteorology, University of Reading)

Abstract: Space Weather is the name given to the natural variability of the plasma and magnetic field conditions in near-Earth Space. 21st Century technology is increasingly reliant on space-based assets and infrastructure that are vulnerable to extreme space weather events. Due to the sparse nature of in-situ measurements, and the relative infancy of numerical space plasma physics models, we lack the ability to predict the timing and severity of space weather disruptions to either mitigate their effects, or adequately plan for their consequences.

In this project, we focus on important improvements to the numerical modelling of the Earth’s Outer Radiation Belt; a highly-variable region of energetic electrons in near-Earth space. In the Outer Van Allen Radiation Belt, electrons are trapped by the Earth's magnetic field and can be accelerated to a significant fraction of the speed of light. At such high energies, they pose significant hazards to spacecraft hardware. Most importantly for mankind’s reliance on space-based systems, the Outer Radiation Belt encompasses orbital paths that are of great use to society (e.g. geosynchronous orbit, and Medium Earth Orbits that contain global positioning and navigation systems).

The student will construct idealised numerical models of simple 1D diffusion problems with Dirichlet or Neumann boundary conditions and investigate their behaviour when appropriate stochastic parameterisations of diffusion coefficients are chosen. Initial and boundary values will be chosen to mimic realistic values in near-Earth space, and the solutions from the stochastic model will be compared with solutions from a traditional deterministic model. Given the novel nature of stochastic parameterisations in the field of space plasma physics modelling, the results from the MRes project will provide an important demonstration of the differences between stochastic and deterministic modelling and offer ideas of how to shape space weather models moving forward.
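
As a flavour of the comparison described above, the sketch below solves a 1D diffusion equation with homogeneous Dirichlet boundaries by explicit finite differences, once with a fixed diffusion coefficient and once with a crudely perturbed one. The grid, time step and noise amplitude are invented for illustration and are not values from the project:

```python
import numpy as np

rng = np.random.default_rng(4)
nx, nt = 101, 2000
dx, dt = 1.0 / (nx - 1), 1e-5     # dt keeps D*dt/dx**2 below the 0.5 stability limit
x = np.linspace(0.0, 1.0, nx)
u0 = np.exp(-100 * (x - 0.5)**2)  # initial bump in the modelled quantity

def diffuse(stochastic):
    """Explicit finite differences for u_t = D u_xx with u = 0 Dirichlet BCs.
    If stochastic, the diffusion coefficient is randomly perturbed each step."""
    u = u0.copy()
    for _ in range(nt):
        D = 1.0
        if stochastic:
            D = 1.0 + 0.5 * rng.normal()   # crude stochastic parameterisation
            D = max(D, 0.1)                # keep the problem well-posed
        u[1:-1] += dt * D * (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
        u[0] = u[-1] = 0.0                 # Dirichlet boundaries
    return u

u_det = diffuse(stochastic=False)
u_mean = np.mean([diffuse(stochastic=True) for _ in range(20)], axis=0)
```

Comparing `u_det` with the ensemble mean `u_mean` (and with the ensemble spread) is the simplest version of the stochastic-versus-deterministic comparison the project proposes for radiation-belt diffusion models.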

Manuel Santos

Based at: University of Reading
Research project: Transfer Operator and Linear Response in GFD
Supervisors: Valerio Lucarini (Department of Mathematics & Statistics, University of Reading, lead supervisor), Jochen Broecker (Department of Mathematics & Statistics, University of Reading) Tobias Kuna (Department of Mathematics & Statistics, University of Reading)

Summary: Climate is a complex, forced, non-equilibrium dissipative system that can be understood as a high-dimensional dynamical system. Moreover, climate is subject to different kinds of forcing that create fluctuations in the governing dynamics. This project will address the problem of dimensionality reduction, towards constructing a suitable mathematical framework that deals with the response of climate to forcing and eventually achieving a better understanding of climate sensitivity.

The goal of the PhD project is to investigate the problem of dimensionality reduction in the transfer operator approach to climate modelling. Transfer operators govern the global evolution of density functions of the system but, for practical reasons, projected spaces are considered. This dimensionality reduction of the dynamical process affects the predictive/operative potential of the transfer operator. Attempting to solve this problem entails the development of new theoretical tools for dealing with the representation of the evolution of probability in several types of projected space, and an accurate analysis of the transfer operator. Further, response theories exist for general transfer operators; however, how do we construct a response theory in which the non-Markovianity is taken into account?
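
A concrete way to see a transfer operator at work is Ulam's method: discretise the operator as a transition matrix between bins and read the invariant density off its leading eigenvector. The sketch below does this for the logistic map, whose invariant density is known in closed form; the map and discretisation are illustrative choices, not part of the project:

```python
import numpy as np

# Ulam's method: discretise the transfer operator of the logistic map
# f(x) = 4x(1-x) on n bins by tracking where points from each bin are mapped.
n, samples = 200, 1000
P = np.zeros((n, n))
edges = np.linspace(0.0, 1.0, n + 1)
for i in range(n):
    pts = np.linspace(edges[i], edges[i + 1], samples + 2)[1:-1]  # interior points
    img = 4.0 * pts * (1.0 - pts)
    idx = np.minimum((img * n).astype(int), n - 1)
    np.add.at(P[i], idx, 1.0 / samples)

# The invariant measure is the eigenvector of P^T with eigenvalue 1
vals, vecs = np.linalg.eig(P.T)
k = np.argmin(np.abs(vals - 1.0))
density = np.real(vecs[:, k])
density = density / density.sum() * n   # normalise to a probability density

# Compare with the exact invariant density 1/(pi*sqrt(x(1-x))) at bin centres
centres = (edges[:-1] + edges[1:]) / 2
exact = 1.0 / (np.pi * np.sqrt(centres * (1.0 - centres)))
```

The finite matrix `P` is precisely a projected transfer operator, so this toy also illustrates the project's central issue: everything one computes depends on the choice and resolution of the projection.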

Leonardo Ripoli

Based at: University of Reading
Research project: Constructing Parameterisations for GFD systems – a comparative approach
Supervisor: Valerio Lucarini (Department of Mathematics and Statistics, University of Reading)
Co-advisor: Paul Williams (Department of Meteorology, University of Reading), Niklas Boers (Grantham Institute - Climate Change and the Environment, Imperial College London)

Description: The construction of parameterisations for multi-scale systems is a key research area for GFD, because the dynamics of the atmosphere and of the ocean cover a wide range of temporal and spatial scales of motion (Berner et al. 2017). Additionally, the variability of the geophysical fluids is characterized by a spectral continuum, so that it is not possible to define unambiguously a spectral gap separating slow from fast motions. As a result, the usual mathematical methods based on homogenisation techniques cannot be readily applied to perform the operation of coarse graining. As shown in recent literature (Chekroun et al. 2015, Wouters and Lucarini 2012, 2013, Demaeyer and Vannitsem 2017, Vissio and Lucarini 2017), the lack of time-scale separation unavoidably leads to the presence of non-Markovian terms when constructing the effective equations for the slower modes of variability (those we want to explicitly represent) that surrogate the effect of the faster scales (those we want to parameterise).
Two methods have been proposed to deal effectively and rigorously with this problem:
1) The direct derivation of effective evolution equations for the variables of interest, obtained through a perturbative expansion of the Mori-Zwanzig operator (Wouters & Lucarini 2012, 2013);
2) The reconstruction of the effective evolution equations for the variables of interest through an optimisation procedure due to Kondrashov et al. (2015) and Chekroun et al. (2017).
Both methods (which we refer to as top-down and bottom-up, respectively) lead to parameterisations including a deterministic, a stochastic, and a non-Markovian (memory effects) component. The two methods are conceptually analogous, but have never been compared on a specific case study of interest. The MSc project proposed here builds upon the earlier results of Vissio and Lucarini (2017) and deals with constructing and comparing the two parameterisations for the 2-level Lorenz '96 system, which provides a classic benchmark for testing new theories in GFD. The goal will be to understand the merits and limits of both parameterisations and to appreciate their differences in terms of precision, adaptivity, and flexibility.
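
The benchmark itself is compact enough to state in code. Below is a minimal integration of the 2-level Lorenz '96 system; the parameter values follow common choices in the literature (h = 1, b = c = 10) but are not taken from the project:

```python
import numpy as np

# Two-level Lorenz '96: K slow variables X_k, each coupled to J fast variables Y_j
K, J = 8, 32
F, h, b, c = 10.0, 1.0, 10.0, 10.0

def tendencies(X, Y):
    """Right-hand sides of the two-level Lorenz '96 equations."""
    Ysum = Y.reshape(K, J).sum(axis=1)
    dX = ((np.roll(X, -1) - np.roll(X, 2)) * np.roll(X, 1)
          - X + F - (h * c / b) * Ysum)
    dY = (-c * b * np.roll(Y, -1) * (np.roll(Y, -2) - np.roll(Y, 1))
          - c * Y + (h * c / b) * np.repeat(X, J))
    return dX, dY

# Short midpoint (RK2) integration from a weakly perturbed state
rng = np.random.default_rng(5)
X = F * np.ones(K) + rng.normal(scale=0.1, size=K)
Y = rng.normal(scale=0.1, size=K * J)
dt = 0.0005
for _ in range(4000):
    dX1, dY1 = tendencies(X, Y)
    dX2, dY2 = tendencies(X + 0.5 * dt * dX1, Y + 0.5 * dt * dY1)
    X, Y = X + dt * dX2, Y + dt * dY2
```

A parameterisation in this setting replaces the coupling term `(h*c/b) * Ysum` by a closed expression in `X` alone, with deterministic, stochastic and memory contributions; the comparison in the project is between the two ways of constructing that closure.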

Ben Ashby

Based at: University of Reading
Research project: Adaptive Finite Element Methods for Landslide Prediction
Supervisors: Tristan Pryer – (Lead supervisor, University of Reading, Department of Mathematics and Statistics), Oliver Sutton – (University of Reading, Department of Mathematics and Statistics), Cassiano Antonio Bortolozo – (CEMADEN: National Centre for Natural Disaster Monitoring and Alert - Brazil)

Summary of the project: The goal of this project is to develop and analyse innovative, cutting edge numerical tools for landslide prediction. These will be tested and eventually used by practitioners who require fast numerical computations to reliably predict these events. We will construct efficient numerical models that are able to simulate these phenomena and ultimately quantify uncertainty associated to them.

Ieva Dauzickaite

Based at: University of Reading
Research project: Solving the optimisation problem of ensembles of weak-constraint 4DVar with fixed initial conditions
MRes project supervisors: Peter Jan van Leeuwen (lead supervisor, Department of Meteorology, University of Reading), Jennifer Scott (Department of Mathematics and Statistics, University of Reading), Amos Lawless (Department of Mathematics and Statistics, University of Reading).

Summary of the MRes project: This project is at the boundary of numerical optimisation and nonlinear data assimilation in geophysical systems, such as the atmosphere and ocean. In numerical weather prediction (NWP), predictions from a numerical model are combined with observations to obtain a better description of the system, and hence a better prediction of the future of that system, including uncertainties. The best NWPs to date come from the ECMWF system, which explores an ensemble of variational optimisation problems. Two major issues prevent further improvement of this approach: the Gaussian (and hence linear) assumption on the predicted state before assimilation, and the fact that errors in the model equations are (largely) ignored in the data-assimilation problem. Furthermore, by employing so-called perturbed observations in their ensemble, they effectively include extra linearity assumptions. Finally, in more detail, each ensemble member does have a different model error, but, inconsistently, these errors are ignored in the data assimilation. The objective of this project is to make serious headway with both of these issues, potentially leading to a breakthrough in NWP.

The two problems highlighted above will be addressed as follows. Firstly, we can identify the ensemble of optimised model trajectories as particles in a particle filter, in which perturbed observations are not needed, and the data-assimilation method is immediately fully nonlinear. Secondly, this particle-filter interpretation allows for a reformulation of the optimisation problem to a simpler one in which the so-called background term is absent, and the model error term becomes dominant. The emphasis of the project will be on finding ways to solve this new optimisation problem that arises when the ensemble of variational problems is viewed as an ensemble of particles in a particle filter. This has two aspects, namely, the optimisation problem itself, and the fact that several similar optimisation problems need to be solved, one for each particle, allowing for exchange of information during the optimisations. We will start with the 1-dimensional Lorenz 96 model. This will enable the student to become familiar with the problem and the potential solution methods. Then we will consider intermediate complexity models, specifically the shallow-water equation system. This will allow the student to explore the issues that will be encountered in high-dimensional systems, for which numerical efficiency becomes important.
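
The structure of the reformulated optimisation problem, an observation misfit plus a model-error penalty with no background term, can be sketched on a scalar toy model. The AR(1) dynamics and error variances below are invented for illustration; the project itself targets Lorenz 96 and shallow-water models:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
a, Q, R, K = 0.9, 0.05, 0.4, 40   # model coefficient, model/obs error variances

# Truth evolves with model error; observations are noisy
x_true = np.zeros(K)
for k in range(1, K):
    x_true[k] = a * x_true[k - 1] + rng.normal(scale=np.sqrt(Q))
y = x_true + rng.normal(scale=np.sqrt(R), size=K)

def cost(x):
    """Weak-constraint cost: observation misfit plus model-error penalty,
    with no background term."""
    J_obs = np.sum((y - x)**2) / R
    J_model = np.sum((x[1:] - a * x[:-1])**2) / Q
    return J_obs + J_model

res = minimize(cost, y, method="L-BFGS-B")   # first guess: the observations
x_opt = res.x

rmse_opt = np.sqrt(np.mean((x_opt - x_true)**2))
rmse_obs = np.sqrt(np.mean((y - x_true)**2))   # unassimilated error
```

In the project each particle solves a problem of this shape for its own trajectory, and the interest lies in solving the ensemble of such problems efficiently and with exchange of information between them.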

Sebastiano Roncoroni

Based at: University of Reading
Research project: Non-linear transient adjustment of the Southern Ocean to wind changes
Supervisors: David Ferreira (Lead supervisor, University of Reading, Meteorology), Maarten Ambaum (University of Reading, Meteorology)

Abstract: Despite its remote location, the Southern Ocean plays an important role in the global climate. As an example, two current systems in the Southern Ocean, the Antarctic Circumpolar Current and the circumpolar Meridional Overturning Circulation, control the exchange of heat, salt and carbon. These properties are especially relevant in the context of climate change, as recent estimates suggest that the Southern Ocean has absorbed about 40% of anthropogenic CO2 and 75% of anthropogenic heat over the last 150 years.
It is therefore important to ascertain whether the Southern Ocean will keep on absorbing heat and carbon at the same rate in the near future. To this end, the purpose of my MRes project is to investigate how the Southern Ocean current systems respond to changes in surface wind forcing.
This question is motivated by the observation that the Southern Hemisphere jet stream (which drives the Southern Ocean circulation) has strengthened and shifted poleward over recent decades, mostly in response to the Antarctic ozone depletion.
Most previous studies on the subject focus on the long-term, equilibrium response of the Southern Ocean to wind stress changes. In this case the eddy-compensation regime is the standard theoretical framework, and it is thought that the transport properties of the Southern Ocean should not be strongly influenced by surface stress changes.
However, to predict climate change on shorter timescales (inter-annual to decadal) it is important to understand transient adjustment processes. Previous research has addressed this issue by formulating a linear theory describing the relation between wind input, potential energy and eddy kinetic energy; the goal of my project is to extend this treatment by developing a fully non-linear mathematical model of the adjustment mechanism.
Specifically, the task is to adapt a non-linear dynamical-systems model of storm track variability (that is, a model originally designed for atmospheric phenomena) to the oceanic case.
In particular, we are interested in characterising the relationship between the timescales of the forced system and those of the unforced jet-eddy interaction. This will also include comparison of theoretical results with the output of eddy-resolving numerical models.


Marco Cucchi

Based at: University of Reading
Research project: Sensitivity of Extremes in Simplified Models of the Mid-latitude Atmospheric Circulation
MPE CDT Aligned student

Supervisors: Valerio Lucarini (lead supervisor) and Tobias Kuna

Project Abstract: In this project I’m going to investigate extreme events in simplified atmospheric models of the mid-latitudes from the point of view of Extreme Value Theory (EVT; Coles 2001). The idea is to extend the work of Felici et al. (2007a, 2007b), where it was first shown that EVT can be used to look at extremes generated by an atmospheric model, going beyond the diagnostic analysis and taking advantage of the theoretical framework presented in Lucarini et al. (2016). I’m going to investigate the properties of extremes of observables under different levels of spatial and temporal coarse graining, so as to understand the effect of averaging on our estimates of extremes. Additionally, statistics of extremes of coarse-grained fields will be compared with those obtained by running models at coarser resolution. Finally, I will investigate the response of the extremes to both time-independent and time-dependent perturbations affecting the dynamics, using response theory and pullback attractors. Throughout this work both deterministic and stochastic perturbations will be investigated, and the results will be used for model error assessment and analysis of multiscale effects.
As a practical application, this work will lead to the definition of functions describing the societal and economic impact of extreme climatic events, along with financial and insurance tools for time-dependent risk assessment.
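
The basic EVT workflow referred to above, fitting a Generalised Extreme Value (GEV) distribution to block maxima and reading off return levels, looks like this in practice. The synthetic "rainfall" data are used only because the limiting shape parameter is known to be zero (Gumbel):

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(7)

# Block maxima of exponential daily "rainfall": their limit law is GEV with
# shape 0 (the Gumbel class)
daily = rng.exponential(scale=10.0, size=(1000, 365))   # 1000 years x 365 days
annual_max = daily.max(axis=1)

# Fit a GEV by maximum likelihood (note: scipy's c is minus the usual shape xi)
c, loc, scale = genextreme.fit(annual_max)

# 100-year return level: the level exceeded with probability 1/100 per block
rl_100 = genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale)
```

The same fit applied to coarse-grained model output, and to output of models run at coarser resolution, gives exactly the comparisons of extreme-value statistics the project describes.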

Jennifer Israelsson

Based at: University of Reading
Research project: Developing novel methods for early warning of high-impact weather in Africa
Supervisors: Dr Emily Black (Lead supervisor, Department of Meteorology, University of Reading), Dr Claudia Neves (Department of Mathematics and Statistics, University of Reading)

Project summary: Farmers in Africa are highly vulnerable to variability in the weather. Robust and timely information on risk can enable farmers to take action to improve yield. Ultimately, access to effective early warning improves global food security. Monitoring weather conditions is, however, difficult in Africa because of the heterogeneity of the climate and the sparsity of the ground-observing network. Remotely sensed data are an alternative to ground observations – but only if the algorithms have skill across the whole rainfall distribution, and if the rainfall estimates are integrated into effective decision support frameworks.

The recent development of a novel agricultural decision support system, TAMSAT-ALERT, addresses the question: given the state of the land surface, the climatology, the stage in the period of interest, and the meteorological forecast, what is the likelihood of some adverse event?
TAMSAT-ALERT works by driving an impact model with multiple possible realisations of the weather, and then interpreting the resulting ensemble in terms of risk. Since the likelihood of an adverse event may depend both on weather in the past, and in the future, the realisations of the weather are derived by splicing together historical observations with possible weather futures.
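
The splicing step described above can be sketched directly. All numbers here are hypothetical: the rainfall statistics, season length and threshold are invented, and no actual TAMSAT-ALERT code or impact model is used:

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical setup: a 120-day growing season, of which the first 60 days have
# been observed; the remainder is filled in with weather from past years.
n_years, season_len, today = 30, 120, 60
historical = rng.gamma(shape=0.6, scale=8.0, size=(n_years, season_len))  # past seasons
observed = rng.gamma(shape=0.6, scale=8.0, size=today)      # this season so far

# Splice the observed past with each historical year's remaining days
ensemble = np.array([
    np.concatenate([observed, historical[yr, today:]]) for yr in range(n_years)
])

# Risk of an adverse event, here "seasonal rainfall total below a threshold"
seasonal_totals = ensemble.sum(axis=1)
threshold = 500.0
risk = np.mean(seasonal_totals < threshold)
```

As the season progresses, the observed segment grows and the ensemble spread shrinks, so the risk estimate sharpens; the project's concern is that the historical segment may misrepresent a changing climate, particularly for extremes.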
A limitation of TAMSAT-ALERT is the implicit assumption that the observed historical climatology accurately represents the actual current climatology. This assumption is open to challenge, especially for events that are strongly affected by meteorological extremes.

My MRes project will carry out initial assessments of the effect of climate change on the likelihood of extreme rainfall/temperature events in Africa, and subsequently of adverse agricultural outcomes. The project will also assess the representation of extreme events in TAMSATv3.