MPE CDT Student Cohort 2017

Elena Saggioro

Based at: University of Reading
Research project: Causal approaches to climate variability and change
Supervisors: Professor Ted Shepherd (Lead Supervisor, Department of Meteorology, University of Reading), Professor Sebastian Reich (Department of Mathematics and Statistics, University of Reading), Dr Jeff Knight (Met Office)

Project summary: Although there is high confidence in the thermodynamic aspects of global climate change, there is still great uncertainty in its dynamical aspects at the regional scale. This is due to the role of the atmospheric circulation, whose response to external forcing is still poorly understood. Progress is not always helped by climate models, which at present show great spread in regional simulations. In order to constrain this spread to its physically plausible range, seasonal observational data are employed. But how should one proceed when comparing data and model outputs? Today, the commonly used statistical procedure relies on correlations. However, when it comes to attributing an observed behaviour to some cause, correlation is not sufficient. Correlation does not imply causation, but translating this statement into practice is far from trivial. Nevertheless, recent decades have seen great leaps forward in the understanding and implementation of causal inference, thanks to the artificial intelligence community. Formal statistical methods based on Bayesian Causal Networks (BCNs) have been developed and successfully applied. A BCN translates into the sequential evaluation of conditional probabilities, which allows a causal interpretation of correlated variables. These methods are only just beginning to be considered within climate science. An example is found in Kretschmer et al. (2016), wherein the winter Arctic circulation is described as a BCN whose nodes are the stationary time series of the relevant climate variables.
The present project aims to develop a Bayesian causal network to explore the puzzling Southern Hemisphere (SH) late-spring circulation variability. In particular, we aim to understand the causal relationship between the annual late-spring breakdown of the SH stratospheric polar vortex and its tropospheric manifestations. Recent work by Byrne et al. (2017) suggests that tropospheric seasonal variability is better understood as a result of interannual variability in the timing of the breakdown event. To address this, the project will develop a causal network suitable for non-stationary climate phenomena, such as seasonal regime transitions. A particular focus will be quantifying the improvement in signal-to-noise ratio obtained by using a non-stationary rather than a stationary statistical framework.
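The core statistical operation behind such networks, testing whether a correlation between two variables survives conditioning on a third, can be sketched in a few lines. This is an illustrative toy example only, not the method of Kretschmer et al.; the variable names (driver, mediator, target) are hypothetical:

```python
import numpy as np

def partial_corr(x, y, z):
    """Partial correlation of x and y given a list of conditioning series z:
    residualise x and y on z by least squares, then correlate residuals."""
    Z = np.column_stack([np.ones(len(x))] + list(z))
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(0)
# Toy causal chain: driver -> mediator -> target (stationary time series)
driver = rng.standard_normal(2000)
mediator = 0.8 * driver + 0.3 * rng.standard_normal(2000)
target = 0.8 * mediator + 0.3 * rng.standard_normal(2000)

# driver and target are strongly correlated...
r_uncond = np.corrcoef(driver, target)[0, 1]
# ...but nearly uncorrelated once the mediator is conditioned on, so a
# causal network would keep no direct driver -> target link.
r_cond = partial_corr(driver, target, [mediator])
```

Iterating such conditional-independence tests over lagged climate time series is what lets a network distinguish direct causal links from correlations mediated by other variables.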

Byrne, N.J., Shepherd, T.G., Woollings, T. and Plumb, R.A., 2017. J. Clim., 30, 7125–7139.
Kretschmer, M., Coumou, D., Donges, J.F. and Runge, J., 2016. J. Clim., 29, 4069–4081.

Niraj Agarwal

Based at: Imperial College London
Research project: Data-driven reduced order modelling of ocean variability
Principal Supervisor: Prof. Pavel Berloff (Department of Mathematics, Imperial College London)
Co-advisor: Peter Dueben, (ECMWF)

Project summary: The large-scale dynamical system of ocean currents exhibits numerous types of complex motions that co-exist on very different spatio-temporal scales, without clear scale separation between them. Therefore, along with the high computational cost of simulating them at high resolution naturally comes the need to develop a reduced-order modelling methodology, in which a significant fraction of turbulent oceanic motions can be statistically reproduced by a model of much reduced complexity. The project aims at developing such reduced-order stochastic models using state-of-the-art statistical data-driven reduction methods that describe the evolution of a few tens or hundreds of spatio-temporal modes capturing the essential statistical properties and correlations of the underlying flow. The data-driven and numerically inexpensive methodology can also be applied to other complex high-dimensional systems in science and engineering. If the techniques are successful, this project will improve our understanding of climate variability, provide essential knowledge to build better coupled ocean-climate models, and ultimately help with predictions of climate change, one of the major challenges our society is facing.
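As a rough illustration of the kind of reduction involved, a Proper Orthogonal Decomposition (POD/EOF analysis) of a snapshot matrix keeps only the few modes that carry most of the variance. This is a minimal sketch on artificial data, not the project's actual methodology:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic snapshot matrix: 500 "time samples" of a 200-point field that is
# really driven by 3 spatial modes of decreasing amplitude, plus noise.
x = np.linspace(0.0, 2.0 * np.pi, 200)
modes = np.stack([np.sin(x), np.sin(2 * x), np.cos(3 * x)])
amps = rng.standard_normal((500, 3)) * np.array([5.0, 2.0, 1.0])
data = amps @ modes + 0.1 * rng.standard_normal((500, 200))

# POD/EOF analysis: SVD of the mean-removed snapshot matrix.
anom = data - data.mean(axis=0)
U, s, Vt = np.linalg.svd(anom, full_matrices=False)

# Number of modes needed to capture 99% of the variance, and the reduced
# time coefficients that a low-order stochastic model would then evolve.
var_frac = np.cumsum(s ** 2) / np.sum(s ** 2)
r = int(np.searchsorted(var_frac, 0.99)) + 1
reduced = U[:, :r] * s[:r]
```

A reduced-order stochastic model then evolves only the few columns of `reduced` rather than the full 200-point field.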

Mariana Clare

Based at: Imperial College London
Research project: Assessing extreme events in the coastal zone
Supervisors: Matthew Piggott (Lead Supervisor, Department of Earth Science & Engineering, Imperial College London) and Colin Cotter (Department of Mathematics, Imperial College London)

Summary: An estimated 250 million people live in regions that are less than 5 metres above sea level. Hence with sea level rise and an increase in both the frequency and severity of storms as a result of climate change, the coastal zone is becoming an ever more critical location for the application of advanced mathematical techniques. Models are currently used to assist in the design of coastal zone engineering projects including flood defences and marine renewable energy arrays. There are many challenges surrounding the development and application of appropriate coupled numerical models, as they include hydrodynamic as well as sedimentary processes and need to resolve spatial scales ranging from sub-metre to 100s of kilometres.
My project aims to develop and use advanced numerical modelling and statistical tools to improve the understanding of hazards and the quantification and minimisation of risk. I will focus on the issue of scour around offshore wind farm foundations and arrays, initially using a 2D depth-averaged model in which a sediment transport/scour module will be implemented. The issue of scouring is particularly important as new arrays will need to be designed to withstand potentially more extreme oceanographic conditions, due both to the exploitation of new sites and to climate change. My MRes will also consider how extreme events, such as tsunamis, storm surges and extreme rainfall, can affect the sediment flow and hence the scouring pattern. The Thames estuary has been chosen as a motivating case study area. This is an important region to assess because of the existing Thames Barrier (and the design of a new one) and the presence of the world’s largest offshore wind farm in the outer estuary.

George Chappelle

Based at: Imperial College London
Research project: Transit times and mean ages for non-autonomous and random compartmental systems with application to the terrestrial carbon cycle
Supervisors: Martin Rasmussen (Imperial College London, Department of Mathematics) and Valerio Lucarini (University of Reading, Department of Mathematics and Statistics)

Summary of project: Compartmental models play an important role in the modelling of many biological systems, ranging from pharmacokinetics to ecology. Key quantities for understanding the dynamics of these systems are the transit time (the mean time a particle spends in the compartmental system) and the mean age (the mean age of the particles still in the system). This project is motivated by an interest in studying the dynamics of the terrestrial carbon cycle, which is typically modelled as a number of discrete pools of carbon in plant biomass, litter and soil organic matter. Many of the best-studied models of carbon dynamics are linear, which reflects the fact that changes in carbon pools are proportional to the pool size. Perhaps the most well-known examples are studies of how terrestrial carbon dynamics respond to climate change. In these, it is often assumed that the specific rates (per unit carbon) of carbon inputs and losses from the system change over time as a function of changes in climate, such as temperature. For example, increases in temperature are normally assumed to increase the rates of soil decomposition. As a consequence, the compartmental models of interest are nonautonomous, i.e. they depend explicitly on time. Nonautonomous compartmental systems are special cases of linear nonautonomous differential equations, which, in contrast to the linear autonomous case, cannot in general be solved analytically. Yet both the mean age of particles in the system and the transit time remain of great interest for these time-dependent systems, as both quantities can potentially be measured in the actual systems being modelled.
The first part of the project aims at extending the current results in this area to discrete time. The second part of the project aims at the analysis of transit times and mean ages of randomly perturbed nonautonomous compartmental systems. Here the transfer rates will be perturbed by bounded multiplicative noise, and an analysis of the structure and stability of the corresponding (random) mean age system will be achieved. Finally, the impact of noise on the crucial quantities will be studied theoretically and by means of a modified version of the Carnegie–Ames–Stanford approach (CASA) model, which is a nine-dimensional model for the terrestrial carbon cycle.
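In the linear autonomous case the two key quantities have closed-form steady-state expressions, which a short numerical sketch can illustrate. The two-pool system and its rates below are made up for illustration; they are not the CASA model:

```python
import numpy as np

# Two-pool linear autonomous compartmental system dx/dt = B x + u:
# pool 1 (fast, e.g. litter) feeds pool 2 (slow, e.g. soil organic matter),
# and both lose carbon by respiration. Rates are illustrative only.
B = np.array([[-1.0, 0.0],     # pool 1 turns over once per year
              [0.25, -0.1]])   # 25% of pool-1 losses enter pool 2
u = np.array([1.0, 0.0])       # external input into pool 1 only

x_star = np.linalg.solve(-B, u)          # steady-state pool sizes

# Mean transit time (expected time from entry to exit): at steady state it
# equals total stock divided by total input flux.
transit_time = x_star.sum() / u.sum()

# Mean system age (average age of particles currently in the system):
# E[a] = 1^T (-B)^{-1} x* / 1^T x* at steady state.
mean_age = np.linalg.solve(-B, x_star).sum() / x_star.sum()
```

For these rates the steady-state stocks are (1, 2.5), the mean transit time is 3.5 years, and the mean age is about 8.1 years; the mean age exceeds the transit time because the slowly cycling pool accumulates old carbon. The project then asks what happens to such quantities when B becomes time-dependent and randomly perturbed.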

Stuart Patching

Based at: Imperial College London
Research project: Analysis of Stochastic Slow-Fast Systems
Supervisors: Xue-Mei Li (Department of Mathematics, Imperial College London, Lead supervisor), Darryl Holm (Department of Mathematics, Imperial College London), Dan Crisan (Department of Mathematics, Imperial College London)

Project summary: A key issue in weather prediction is the interaction between (slow) Rossby waves and (fast) gravity waves. The appearance of gravity waves in numerical weather simulations is responsible for significant inaccuracies in forecasts, as there is a large discrepancy between their effect in the real atmosphere and in numerical models. In order to gain insight into this problem, it is helpful to study the rotating shallow water equations and a simplified version of them, the Lorenz ’86 model.

I will consider versions of these equations modified to include stochastic effects, which represent the uncertainties present in any atmospheric model. I will apply techniques from stochastic averaging to study the properties of these equations and related fluid equations. In particular I will make use of the Hamiltonian nature of the Lorenz ’86 model to develop understanding of the stochastic rotating shallow water equations.

Louis Sharrock

Based at: Imperial College London
Research project: Large Scale Inference With Applications to Environmental Monitoring
Supervisors: Nikolas Kantas (Department of Mathematics, Imperial College London, Lead supervisor), Professor Alistair Forbes (NPL)

Abstract / Summary of MRes Project: This project looks at developing methodology for performing statistical inference in environmental modelling applications. These applications require the use of a large number of sensors that collect data frequently and are distributed over a large region in space. This motivates the use of stochastic dynamical models varying in space and time to model environmental quantities such as air quality, pollution levels and temperature. We are interested in fitting these models to real collected data and, in addition, in improving the statistical inference through a carefully chosen frequency for collecting observations, optimal sensor placement, or automatic calibration of sensor biases. From a statistical perspective, these problems can be formulated in a Bayesian framework that combines posterior inference with optimal design. Performing Bayesian inference or optimal design for these models is typically intractable, so we need to rely on simulation-based numerical methods. We will be looking at computational methods that are principled but intensive; given the additional challenge of the high dimensionality of the models and data, attention must be paid to the statistical model(s) at hand when designing algorithms to be used in practice. In particular, popular methods such as Sequential Monte Carlo (SMC) or Markov chain Monte Carlo (MCMC) must be carefully extended to accommodate the particular models in the application.

Adriaan Hilbers

Based at: Imperial College London
Research project: Representing climate uncertainty in power systems planning
Supervisors: David Brayshaw (Department of Meteorology, University of Reading, Main Supervisor), Axel Gandy (Department of Mathematics, Imperial College London)

Description: Both policymakers and private bodies continuously make strategic decisions on future power systems design and operation, with the aim of making electricity supply affordable, secure and environmentally sustainable. To make informed decisions, they frequently employ power systems models (PSMs). For example, researchers could use a PSM to test the effect of installing a new wind farm versus a new power plant on electricity prices, security and emissions, and relay their findings to policymakers before investment decisions are made.

Since electricity networks are complex, realistic PSMs are complicated and require a lot of computing power. As a result, they rarely use more than a year’s worth of weather & demand data, and often less. However, there is evidence (e.g. Bloomfield et al, 2016) that inter-year weather variability leads to significantly different optimal power system design or operation. This effect is amplified as the share of electricity generated by weather-dependent renewables (particularly wind and solar) increases. Choosing the “wrong year” of weather data can push users of PSMs into making inefficient or incorrect policy decisions. Ideally, a PSM would be run for multiple decades of weather and demand data, but this is computationally unfeasible in realistic models.

This project attempts to use statistical methods to understand the effect of multi-year weather variability on model outputs, and to design a framework that ensures PSMs give the “right” answer (i.e. the one based on many years of weather data) even if not all the weather data can be input into the model.

Georgios Sialounas

Based at: University of Reading
Research project: High performance linear algebra for multi-scale phenomena
Supervisors: Tristan Pryer (University of Reading, Department of Mathematics and Statistics, Lead supervisor), Jennifer Scott (University of Reading, Department of Mathematics and Statistics) and Oliver Sutton (University of Reading, Department of Mathematics and Statistics)

Summary: Mathematically accurate descriptions of physical phenomena produce (systems of) PDEs that are often complicated and expensive to solve computationally. Such models are often reduced, using physical reasoning, to simpler, more manageable models which contain less information than the original but are deemed satisfactory approximations, leading to a hierarchy of increasingly simplified models. This strategy of simplification and model reduction is particularly prevalent in the area of climate and weather prediction, where the models (and even their reductions) are often very complicated and computationally expensive. If we reduce a model and apply it to the whole domain, we risk failing to capture localized physical effects whose description is not contained in the simplified model to start with. Such information may be relevant only in a subset of the domain, but is nonetheless important when it comes to arriving at a physically relevant solution. The connection with weather prediction is that, as we try to improve the reliability of our model in time, we will inevitably need to model more physical processes accurately. This will necessitate the use of complicated and computationally expensive models further up the model hierarchy. However, the cost of solving the complicated model throughout the domain is prohibitive, and may not yield very different results from the simpler models. In order to improve upon this approach, we would like to solve locally the model which is most physically relevant to the effects present in a particular part of the domain. At the same time, we would like to solve the more complicated model optimally: that is, in as small a part of the domain as physical correctness allows, so as not to waste computational resources. There are various mathematical challenges in this regard.
For example, we need to decide when, where and how to choose between models in real time during a computation. We also need to couple the models in such a way that no numerical artefacts are introduced into the solution. The area of research which aims to address these questions is hierarchical model adaptivity, whereby the hierarchy of models is brought about by the reduction in complexity from model to model using physically based assumptions. In this project we will examine linear flow governed by the Stokes problem. The Stokes equation will be reduced to the vectorial Laplacian in real time on the basis of local a posteriori error estimation. The aim will be to devise a coupling of the two models that is robust and does not produce numerical artefacts. Furthermore, a coupling that is not specific to this problem can potentially be generalized to problems arising in climate model simulations, where model reduction is common.

Alexander Alecio

Based at: Imperial College London

Rhys Leighton Thompson

Based at: University of Reading
Research project: Diffusion models of Earth’s Outer Radiation Belt using Stochastic Parameterisations
Supervisors: Clare Watt, Department of Meteorology, Reading (Main), Paul Williams, Department of Meteorology, Reading (Co-Supervisor)

Abstract: Space Weather is the name given to the natural variability of the plasma and magnetic field conditions in near-Earth Space. 21st Century technology is increasingly reliant on space-based assets and infrastructure that are vulnerable to extreme space weather events. Due to the sparse nature of in-situ measurements, and the relative infancy of numerical space plasma physics models, we lack the ability to predict the timing and severity of space weather disruptions to either mitigate their effects, or adequately plan for their consequences.

In this project, we focus on important improvements to the numerical modelling of the Earth’s Outer Radiation Belt; a highly-variable region of energetic electrons in near-Earth space. In the Outer Van Allen Radiation Belt, electrons are trapped by the Earth's magnetic field and can be accelerated to a significant fraction of the speed of light. At such high energies, they pose significant hazards to spacecraft hardware. Most importantly for mankind’s reliance on space-based systems, the Outer Radiation Belt encompasses orbital paths that are of great use to society (e.g. geosynchronous orbit, and Medium Earth Orbits that contain global positioning and navigation systems).

The student will construct idealised numerical models of simple 1D diffusion problems with Dirichlet or Neumann boundary conditions and investigate their behaviour when appropriate stochastic parameterisations of diffusion coefficients are chosen. Initial and boundary values will be chosen to mimic realistic values in near-Earth space, and the solutions from the stochastic model will be compared with solutions from a traditional deterministic model. Given the novel nature of stochastic parameterisations in the field of space plasma physics modelling, the results from the MRes project will provide an important demonstration of the differences between stochastic and deterministic modelling and offer ideas of how to shape space weather models moving forward.
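A minimal version of the kind of experiment described, a deterministic 1D diffusion run compared against an ensemble with a multiplicatively perturbed diffusion coefficient, could be sketched as follows. The grid, coefficients and perturbation amplitude here are illustrative placeholders, not realistic radiation-belt values:

```python
import numpy as np

def diffuse_1d(D, u0, dt, steps):
    """Explicit finite-difference solver for u_t = (D u_x)_x on a unit grid
    (dx = 1) with homogeneous Dirichlet boundaries, u = 0 at both ends."""
    u = u0.copy()
    for _ in range(steps):
        Dm = 0.5 * (D[1:] + D[:-1])      # diffusion coefficient at interfaces
        flux = Dm * np.diff(u)           # F = D du/dx
        u[1:-1] += dt * np.diff(flux)
        u[0] = u[-1] = 0.0               # Dirichlet boundary conditions
    return u

n = 101
x = np.arange(n, dtype=float)
u0 = np.exp(-0.5 * ((x - 50.0) / 5.0) ** 2)    # initial density bump

# Deterministic run with a constant diffusion coefficient.
u_det = diffuse_1d(np.full(n, 1.0), u0, dt=0.1, steps=500)

# Toy stochastic parameterisation: each ensemble member draws a
# multiplicatively perturbed (lognormal, mean near 1) diffusion coefficient.
rng = np.random.default_rng(2)
ens = [diffuse_1d(np.full(n, rng.lognormal(0.0, 0.3)), u0, dt=0.1, steps=500)
       for _ in range(20)]
u_mean = np.mean(ens, axis=0)
```

Because the solution depends nonlinearly on the diffusion coefficient, the ensemble mean `u_mean` differs from the deterministic run `u_det`, which is exactly the kind of difference the project sets out to quantify.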

Manuel Santos

Based at: University of Reading
Research project: Transfer Operator and Linear Response in GFD
Supervisors: Valerio Lucarini (Department of Mathematics & Statistics, University of Reading, lead supervisor), Jochen Broecker (Department of Mathematics & Statistics, University of Reading) Tobias Kuna (Department of Mathematics & Statistics, University of Reading)

The need for ensemble methods in geophysical flows, weather and climate arises from their sensitive dependence on initial conditions. In a very natural manner, introducing small perturbations to the initial conditions induces a collection of trajectories in phase space about which one would like to make statistical statements, such as a forecast. Analogously, perturbing the governing dynamics affects the behaviour of ensembles as well as the invariant measure of the system, therefore changing the way we do statistics.

My project is concerned with calculating the linear response of geophysical dynamical systems to perturbations. In order to perform this analysis, we shall take a transfer operator approach. The transfer operator is a map between spaces of probability measures that can give information about the response of the system. By working in phase space one can construct discretised versions of the transfer operator to obtain estimates and approximations of the response. My MRes aims to deal with the mathematical formulation of transfer operator techniques and to apply them to geophysical models, starting with the classical Lorenz-84 atmospheric circulation model and the Lorenz-96 model.
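A standard way to discretise the transfer operator is Ulam's method: partition phase space into bins and estimate the probability of moving from each bin to every other. A sketch for a one-dimensional chaotic map, the logistic map, used here purely as a stand-in for the Lorenz models:

```python
import numpy as np

def ulam_matrix(f, n_bins=50, samples_per_bin=2000, seed=3):
    """Ulam discretisation of the transfer operator of a map f on [0, 1]:
    P[i, j] = fraction of sample points in bin i that f sends into bin j."""
    rng = np.random.default_rng(seed)
    P = np.zeros((n_bins, n_bins))
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    for i in range(n_bins):
        pts = rng.uniform(edges[i], edges[i + 1], samples_per_bin)
        img = np.clip(f(pts), 0.0, 1.0 - 1e-12)
        j, counts = np.unique((img * n_bins).astype(int), return_counts=True)
        P[i, j] = counts / samples_per_bin
    return P

# Chaotic logistic map x -> 4x(1-x); its invariant density is known
# analytically: rho(x) = 1 / (pi * sqrt(x(1-x))).
P = ulam_matrix(lambda x: 4.0 * x * (1.0 - x))

# The invariant measure is the left eigenvector of P with eigenvalue 1,
# approximated here by iterating the row-stochastic matrix.
pi = np.full(P.shape[0], 1.0 / P.shape[0])
for _ in range(2000):
    pi = pi @ P
pi /= pi.sum()
```

The computed `pi` reproduces the heavy weight of the analytic density near the endpoints, and perturbing the map and recomputing `P` gives a finite-dimensional handle on the response of the invariant measure.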

Leonardo Ripoli

Based at: University of Reading
Research project: Constructing Parameterisations for GFD systems – a comparative approach
Supervisor: Valerio Lucarini (Department of Mathematics and Statistics, University of Reading)
Co-advisor: Paul Williams (Department of Meteorology, University of Reading), Niklas Boers (Grantham Institute - Climate Change and the Environment, Imperial College London)

Description: The construction of parameterisations for multi-scale systems is a key research area for GFD, because the dynamics of the atmosphere and the ocean cover a wide range of temporal and spatial scales of motion (Berner et al. 2017). Additionally, the variability of geophysical fluids is characterized by a spectral continuum, so that it is not possible to define unambiguously a spectral gap separating slow from fast motions. As a result, the usual mathematical methods based on homogenization techniques cannot be readily applied to perform the operation of coarse graining. As shown in recent literature (Chekroun et al. 2015, Wouters and Lucarini 2012, 2013, Demaeyer and Vannitsem 2017, Vissio and Lucarini 2017), the lack of time-scale separation leads unavoidably to the presence of non-Markovian terms when constructing the effective equations for the slower modes of variability (which are those we want to explicitly represent) that surrogate the effect of the faster scales (which are, instead, those we want to parameterise).
Two methods have been proposed to deal effectively and rigorously with this problem:
1) The direct derivation of effective evolution equations for the variables of interest, obtained through a perturbative expansion of the Mori-Zwanzig operator (Wouters & Lucarini 2012, 2013);
2) The reconstruction of the effective evolution equations for the variables of interest through an optimization procedure due to Kondrashov et al. (2015) and Chekroun et al. (2017).
Both methods (which we refer to as top-down and bottom-up, respectively) lead to parameterisations including a deterministic, a stochastic, and a non-Markovian (memory effects) component. The two methods are conceptually analogous, but have never been compared on a specific case study of interest. The MRes project proposed here builds upon the earlier results of Vissio and Lucarini (2017) and deals with constructing and comparing the two parameterisations for the 2-level Lorenz ’96 system, which provides a classic benchmark for testing new theories in GFD. The goal will be to understand the merits and limits of both parameterisations and to appreciate their differences in terms of precision, adaptivity, and flexibility.
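The two-level Lorenz '96 benchmark couples K slow variables to J fast variables each. Its tendencies, in the standard formulation of Lorenz (1996) with commonly used parameter values as defaults, can be written compactly as:

```python
import numpy as np

def l96_two_level(X, Y, F=10.0, h=1.0, b=10.0, c=10.0):
    """Tendencies of the two-level Lorenz '96 system: K slow variables X_k,
    each coupled to J fast variables Y (stored with shape (K, J))."""
    K, J = Y.shape
    Yf = Y.reshape(K * J)                      # fast variables on one ring
    dX = (np.roll(X, 1) * (np.roll(X, -1) - np.roll(X, 2))
          - X + F - (h * c / b) * Y.sum(axis=1))
    dY = (-c * b * np.roll(Yf, -1) * (np.roll(Yf, -2) - np.roll(Yf, 1))
          - c * Yf + (h * c / b) * np.repeat(X, J))
    return dX, dY.reshape(K, J)

# Sanity check: a uniform slow state with Y = 0 and coupling switched off
# (h = 0) is a fixed point when the forcing F balances the damping.
dX, dY = l96_two_level(np.full(8, 10.0), np.zeros((8, 4)), F=10.0, h=0.0)
```

A parameterisation in the sense above replaces the explicit Y equations with deterministic, stochastic and memory terms acting on X alone.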

Ben Ashby

Based at: University of Reading
Research project: Adaptive Finite Element Methods for Landslide Prediction
Supervisors: Tristan Pryer – (Lead supervisor, University of Reading, Department of Mathematics and Statistics), Oliver Sutton – (University of Reading, Department of Mathematics and Statistics), Cassiano Antonio Bortolozo – (CEMADEN: National Centre for Natural Disaster Monitoring and Alert - Brazil)

Summary of the project: Many areas of Brazil are particularly vulnerable to landslides, especially those with mountainous terrain and a heavy rainy season. These events affect both human populations and infrastructure; there have been very large death tolls when landslides occur near major population centres, and long-distance power transmission can be disrupted by damage to power lines, resulting in large-scale power cuts.

Currently, real-time monitoring of the moisture content is available, but the potential for early warnings of landslide events is limited by the lack of means to predict, rather than just observe.

This project will attempt to address this problem by developing numerical methods to solve the equations that govern groundwater flow, predict dangerous conditions in the soil, and hopefully give the potential to issue warnings. Being able to perform computations quickly as well as accurately is therefore crucial if these simulations are to be of value.

For the MRes project, I will focus on a simple model for groundwater flow known as Darcy flow. In this case, one of the major difficulties is the structure of the soil, which can be highly irregular. The flow of water through the soil depends on its permeability, which can vary by orders of magnitude over short distances. The numerical approximation schemes must therefore be robust and flexible to deal with this sort of data. I plan to use the discontinuous Galerkin class of finite element methods with mesh adaptivity that can provide high resolution of the data where necessary while still remaining computationally efficient.
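As a simplified illustration of the difficulty, consider a one-dimensional Darcy-type solve with a permeability that jumps by two orders of magnitude mid-domain. This is a plain finite-difference sketch with harmonic interface averaging, not the discontinuous Galerkin method the project will use, and the coefficients are made up:

```python
import numpy as np

def darcy_1d(kf, f):
    """Finite-difference solve of -(k(x) p')' = f on (0, 1), p(0) = p(1) = 0.

    kf holds the permeability at the n+1 midpoints between the n interior
    nodes (and the two boundaries); f holds the source at the nodes.
    """
    n = len(f)
    h = 1.0 / (n + 1)
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = (kf[i] + kf[i + 1]) / h**2
        if i > 0:
            A[i, i - 1] = -kf[i] / h**2
        if i < n - 1:
            A[i, i + 1] = -kf[i + 1] / h**2
    return np.linalg.solve(A, f)

n = 99
x = np.linspace(0.0, 1.0, n + 2)[1:-1]             # interior nodes
p_const = darcy_1d(np.ones(n + 1), np.ones(n))     # uniform soil, k = 1

# Permeability jumping by two orders of magnitude mid-domain, the kind of
# sharp contrast a robust scheme must handle without spurious oscillations.
kf = np.where(np.arange(n + 1) < (n + 1) // 2, 1.0, 100.0)
p_jump = darcy_1d(kf, np.ones(n))
```

With uniform permeability the discrete solution reproduces the parabolic profile x(1-x)/2 exactly; with the jump, the pressure peak shifts into the low-permeability half, and resolving such interfaces sharply is where adaptive, locally refined methods pay off.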

Ieva Dauzickaite

Based at: University of Reading
Research project: Solving the optimisation problem of ensembles of weak-constraint 4DVar with fixed initial conditions
MRes project supervisors: Peter Jan van Leeuwen (lead supervisor, Department of Meteorology, University of Reading), Jennifer Scott (Department of Mathematics and Statistics, University of Reading), Amos Lawless (Department of Mathematics and Statistics, University of Reading).

Summary of the MRes project: This project is at the boundary of numerical optimisation and nonlinear data assimilation in geophysical systems, such as the atmosphere and ocean. In numerical weather prediction (NWP), predictions from a numerical model are combined with observations to obtain a better description of the system, and hence a better prediction of its future, including uncertainties. The best NWPs to date come from the ECMWF system, which explores an ensemble of variational optimisation problems. Two major issues prevent further improvement of this approach: the Gaussian (and hence linear) assumption on the predicted state before assimilation, and the fact that errors in the model equations are largely ignored in the data-assimilation problem. Furthermore, by using so-called perturbed observations in the ensemble, extra linearity assumptions are effectively introduced. Finally, in more detail, each ensemble member does have a different model error, but, inconsistently, these errors are ignored in the data assimilation. The objective of this project is to make serious headway with both issues, potentially leading to a breakthrough in NWP.

The two problems highlighted above will be addressed as follows. Firstly, we can identify the ensemble of optimised model trajectories as particles in a particle filter, in which perturbed observations are not needed, and the data-assimilation method is immediately fully nonlinear. Secondly, this particle-filter interpretation allows for a reformulation of the optimisation problem to a simpler one in which the so-called background term is absent, and the model error term becomes dominant. The emphasis of the project will be on finding ways to solve this new optimisation problem that arises when the ensemble of variational problems is viewed as an ensemble of particles in a particle filter. This has two aspects, namely, the optimisation problem itself, and the fact that several similar optimisation problems need to be solved, one for each particle, allowing for exchange of information during the optimisations. We will start with the 1-dimensional Lorenz 96 model. This will enable the student to become familiar with the problem and the potential solution methods. Then we will consider intermediate complexity models, specifically the shallow-water equation system. This will allow the student to explore the issues that will be encountered in high-dimensional systems, for which numerical efficiency becomes important.
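For orientation, the basic propagate-weight-resample cycle of a bootstrap particle filter looks as follows on a toy scalar model. This is purely illustrative: the model below is invented, and the project's reformulated optimisation problem is far richer than this plain filter:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy nonlinear scalar state-space model (a stand-in, not Lorenz 96):
#   x_t = 0.9 x_{t-1} + sin(x_{t-1}) + model error,   y_t = x_t + obs error
T, N = 50, 500                 # time steps, particles
sig_q, sig_r = 0.5, 0.5        # model-error and observation-error std devs

x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = (0.9 * x_true[t-1] + np.sin(x_true[t-1])
                 + sig_q * rng.standard_normal())
y = x_true + sig_r * rng.standard_normal(T)

# Bootstrap particle filter: propagate, weight by likelihood, resample.
# No perturbed observations are needed, and no Gaussianity is assumed.
particles = rng.standard_normal(N)
est = np.zeros(T)
for t in range(T):
    if t > 0:
        particles = (0.9 * particles + np.sin(particles)
                     + sig_q * rng.standard_normal(N))
    logw = -0.5 * ((y[t] - particles) / sig_r) ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    est[t] = np.sum(w * particles)                      # posterior mean
    particles = particles[rng.choice(N, size=N, p=w)]   # resampling
```

Note that model error enters directly through the propagation step, which is exactly the term that becomes dominant in the reformulated optimisation problem described above.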

Sebastiano Roncoroni

Based at: University of Reading
Research project: Non-linear transient adjustment of the Southern Ocean to wind changes
Supervisors: David Ferreira (Lead supervisor, University of Reading, Meteorology), Maarten Ambaum (University of Reading, Meteorology)

Abstract: Despite its remote location, the Southern Ocean plays an important role in global climate. For example, two current systems in the Southern Ocean, the Antarctic Circumpolar Current and the circumpolar Meridional Overturning Circulation, control the exchange of heat, salt and carbon. These properties are especially relevant in the context of climate change, as recent estimates suggest that the Southern Ocean has absorbed about 40% and 75% of anthropogenic CO2 and heat, respectively, over the last 150 years.
It is therefore important to ascertain whether the Southern Ocean will keep on absorbing heat and carbon at the same rate in the near future. To this end, the purpose of my MRes project is to investigate how the Southern Ocean current systems respond to changes in surface wind forcing.
This question is motivated by the observation that the Southern Hemisphere jet stream (which drives the Southern Ocean circulation) has strengthened and shifted poleward over recent decades, mostly in response to the Antarctic ozone depletion.
Most previous studies on the subject focus on the long-term, equilibrium response of the Southern Ocean to wind stress changes. In this case the eddy-compensation regime is the standard theoretical framework, and it is thought that the transport properties of the Southern Ocean should not be strongly influenced by surface stress changes.
However, to predict climate change on shorter timescales (inter-annual to decadal) it is important to understand transient adjustment processes. Previous studies have addressed this issue by formulating a linear theory describing the relation between wind input, potential energy and eddy kinetic energy; the goal of my project is to extend this treatment by developing a fully non-linear mathematical model of the adjustment mechanism.
Specifically, the task is to adapt a non-linear dynamical system model of storm track variability (thus, a model originally designed for atmospheric phenomena) to the oceanic case.
In particular, we are interested in characterising the relationship between the timescales of the forced system and those of the unforced jet-eddy interaction. This will also include comparison of theoretical results with the outputs of numerical eddy resolving models.

Marco Cucchi

Based at: University of Reading
Research project: Sensitivity of Extremes in Simplified Models of the Mid-latitude Atmospheric Circulation
MPE CDT Aligned student

Supervisors: Valerio Lucarini (lead supervisor) and Tobias Kuna

Project Abstract: In this project I’m going to investigate extreme events in simplified atmospheric models of the mid-latitudes from the point of view of Extreme Value Theory (EVT; Coles 2001). The idea is to extend the work of Felici et al. (2007a, 2007b), where it was first shown that EVT can be used to look at extremes generated by an atmospheric model, going beyond the diagnostic analysis and taking advantage of the theoretical framework presented in Lucarini et al. (2016). I’m going to investigate the properties of extremes of observables under different levels of spatial and temporal coarse graining, so as to understand the effect of averaging on our estimates of extremes. Additionally, the statistics of extremes of coarse-grained fields will be compared with those obtained by running models at coarser resolution. Finally, I will investigate the response of the extremes to both time-independent and time-dependent perturbations of the dynamics, using response theory and pullback attractors. Throughout this work both deterministic and stochastic perturbations will be investigated, and the results will be used for model error assessment and the analysis of multiscale effects.
As a practical application, this work will lead to the definition of functions describing societal and economic impact of extreme climatic events, along with financial and insurance tools able to manage time-dependent risk assessment.
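As a minimal illustration of the block-maxima side of EVT, one can fit a Generalised Extreme Value (GEV) distribution to per-block maxima of a surrogate series and read off a return level. The data here are synthetic standard-normal noise standing in for a model observable, fitted with SciPy:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(5)

# Surrogate observable: 100 synthetic "years" of 360 "daily" values.
daily = rng.standard_normal((100, 360))

# Block-maxima approach: one maximum per block, fitted with a GEV.
block_maxima = daily.max(axis=1)
shape, loc, scale = genextreme.fit(block_maxima)

# 100-block return level: the value exceeded on average once per 100 blocks.
return_level = genextreme.ppf(1.0 - 1.0 / 100.0, shape, loc, scale)
```

Coarse graining in space or time changes the distribution of `daily`, and hence the fitted GEV parameters and return levels, which is precisely the sensitivity the project sets out to quantify.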

Jennifer Israelsson

Based at: University of Reading
Research project: Developing novel methods for early warning of high impact weather in Africa
Supervisors: Dr Emily Black (Lead supervisor, Department of Meteorology, University of Reading), Dr Claudia Neves (Department of Mathematics and Statistics, University of Reading)

Project summary: Farmers in Africa are highly vulnerable to variability in the weather. Robust and timely information on risk can enable farmers to take action to improve yield. Ultimately, access to effective early warning improves global food security. Monitoring weather conditions is, however, difficult in Africa because of the heterogeneity of the climate and the sparsity of the ground-observing network. Remotely sensed data are an alternative to ground observations, but only if the algorithms have skill across the whole rainfall distribution, and if the rainfall estimates are integrated into effective decision support frameworks.

The recent development of a novel agricultural decision support system, TAMSAT-ALERT, addresses the question: given the state of the land surface, the climatology, the stage in the period of interest, and the meteorological forecast, what is the likelihood of some adverse event?
TAMSAT-ALERT works by driving an impact model with multiple possible realisations of the weather, and then interpreting the resulting ensemble in terms of risk. Since the likelihood of an adverse event may depend both on weather in the past, and in the future, the realisations of the weather are derived by splicing together historical observations with possible weather futures.
A limitation of TAMSAT-ALERT is the implicit assumption that the observed historical climatology accurately represents the actual current climatology. This assumption is open to challenge, especially for events that are strongly affected by meteorological extremes.

My MRes project will carry out initial assessments of the effect of climate change on the likelihood of extreme rainfall/temperature events in Africa, and subsequently of adverse agricultural outcomes. The project will also assess the representation of extreme events in TAMSATv3.