These are some of the PhD projects offered to prospective students in the Physics department. If you are interested in any of them, please contact Ekkehard Ullner ( You can also contact one of the supervisors of the project you are interested in; you will find their contact details listed at the top of each project.

At the moment, there is no funding available for these projects. However, we will do our best to help interested students apply for fellowships and other sources of funding.

Quantitative Risk Assessment for Food Safety (allergens, microbial pathogens or chemical contaminants)

Main supervisor: Norval Strachan (

An expectation of the society we live in is that the food we eat is safe. That is:

  •  it does not contain harmful microorganisms (e.g. Listeria, Salmonella, Campylobacter, E. coli O157) or microbial toxins (e.g. aflatoxins, botulinum and staphylococcal toxins) that can cause disease.
  •  it does not include allergens (e.g. peanuts, sesame seeds, etc.) unless they are declared on the packaging.
  •  it does not contain levels of chemical contaminants (e.g. pesticides, PCBs, dioxins, plastics, etc.) that are detrimental to health either now or in the future.

But how do you determine whether food is safe? A possible way is to check foodstuffs by microbiological or chemical methods for the contaminants mentioned above. However, it is not practical to test every batch of food for every possible contaminant. Hence, there is a need to assess risk without having complete information about possible contaminants in every piece of food. But what is risk?

Risk comprises two different aspects. The first is the probability that the contaminant is in the food. The second is the severity: how ill it will make you if you eat it (mild illness, hospitalisation, death). Risk assessment is the scientific/technical approach to estimating this risk. The risk assessment approach can be broken into four steps:

  • Identification of the hazard (type of contaminant and foodstuff it may be found in).
  • Characterisation of the hazard (the response of individuals to an ingested dose and subsequent illness).
  • Exposure assessment (estimating likely contamination rates from farm to fork and calculating the dose ingested per meal).
  • Risk characterisation (working out the risk across the exposed population, determining the number of illnesses, hospitalisations and deaths each year).

Together with the supervisors, the student will select the type of hazard to be studied (e.g. from the list above) and the type of food (e.g. dairy, fish, meat, salads, complex foods, etc.).

A quantitative risk assessment model will be built from farm to fork. Key variables at each step of the chain will be identified, and data will be obtained from the literature and the supervisors' own research.

The model can be implemented in various ways depending on the preference and skillset of the student, ranging from using existing software implemented in Excel to developing the model in a high-level language. All training will be provided and there may also be the opportunity to work alongside industry.
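
As an illustration of how such a model might look in a high-level language, the sketch below runs a Monte Carlo over the exposure and dose-response steps. Every parameter value (prevalence, concentration distribution, serving size, the exponential dose-response coefficient) is purely hypothetical and serves only to show the structure of a quantitative risk assessment, not any specific pathogen or food.

```python
import random
import math

random.seed(1)

# Hypothetical farm-to-fork chain: all parameter values are illustrative only.
PREVALENCE = 0.05          # probability a serving is contaminated
MEAN_LOG10_CONC = 1.0      # mean log10 CFU/g in contaminated servings
SD_LOG10_CONC = 0.8
SERVING_G = 100.0          # grams per serving
R_DOSE_RESPONSE = 1e-3     # exponential dose-response parameter

def simulate_serving():
    """Return the probability of illness for one random serving."""
    if random.random() > PREVALENCE:
        return 0.0                                  # serving not contaminated
    log10_conc = random.gauss(MEAN_LOG10_CONC, SD_LOG10_CONC)
    dose = (10 ** log10_conc) * SERVING_G           # ingested CFU per meal
    return 1.0 - math.exp(-R_DOSE_RESPONSE * dose)  # exponential dose-response

n = 100_000
risk = sum(simulate_serving() for _ in range(n)) / n
print(f"estimated risk of illness per serving: {risk:.2e}")
```

Scaling the per-serving risk by the number of servings eaten per year across the population would then give the annual illness burden of the risk characterisation step.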

By the end of the project, the acquired skillset will make the student well placed for employment in risk assessment, not only in the food sector but also in other sectors that value similar expertise (e.g. environmental science, finance, etc.).

How to make molecular machines: modelling ribosome biogenesis

Main supervisor: Maria C Romano (

Ribosomes are arguably the most important biological molecular machines. Cells can be seen as factories of proteins, with ribosomes being the machines producing them. Therefore, understanding how ribosomes themselves are made, and how their production is controlled depending on the environment of the cell, is a fundamental question in cell biology. Despite the extensive information on the process of ribosome biogenesis gathered in recent years, the regulation of ribosome production upon changes in external cellular conditions remains an outstanding open question.

The main aim of this project is to develop a mathematical model of ribosome biogenesis that takes into account the current knowledge about the biochemical pathways. In particular, we will aim at identifying the rate-limiting steps and the most crucial mechanisms determining the production rate of ribosomes, as well as how this production can be finely tuned depending on external cellular resources and environmental conditions. We will also explore the links between ribosome biogenesis, the cell cycle and metabolism. In contrast to a large mathematical model comprising a very high number of components, our objective is to develop a model that is as simple as possible but nevertheless predictive, so that it allows us to gain insight into the fascinating process of ribosome biogenesis. The PhD student will work in a dynamic and interdisciplinary team of researchers working at the interface between physics and biology, integrating theoretical and experimental results.

Noisy translation: modelling of stochastic effects on protein production

Main supervisor: Maria C Romano (

This project will focus on the mathematical modelling of a fundamental process in the cell: translation of the messenger RNA into a protein. A messenger RNA (mRNA) contains the sequence of nucleotides transcribed from the DNA that encode a certain protein. Molecular machines called ribosomes bind to the mRNA sequence and move along the nucleotide sequence, thereby translating the sequence of codons (groups of 3 consecutive nucleotides) into the sequence of amino acids that form the protein. Like cars on a narrow countryside road, ribosomes cannot overtake each other, so that queues of ribosomes can form on mRNAs. In this project we will develop a mathematical model to describe how ribosomes move along the mRNA sequence, thereby predicting protein production rates. This is a fundamental problem in molecular biology, as the amount of different kinds of proteins produced largely determines the behaviour of a cell.
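
The traffic picture described above is commonly formalised as a totally asymmetric simple exclusion process (TASEP), in which ribosomes hop stochastically along a lattice of codons and cannot overtake. The sketch below is a minimal pure-Python version with illustrative initiation and termination rates, not a model of any specific mRNA:

```python
import random

random.seed(2)

L = 200        # codons (lattice sites); all values illustrative
ALPHA = 0.3    # initiation probability per attempt
BETA = 0.8     # termination probability per attempt
STEPS = 200_000

lattice = [0] * L          # 1 = site occupied by a ribosome
produced = 0               # proteins completed

for _ in range(STEPS):
    i = random.randrange(-1, L)       # -1 encodes an initiation attempt
    if i == -1:
        if random.random() < ALPHA and lattice[0] == 0:
            lattice[0] = 1            # a ribosome binds the first codon
    elif i == L - 1:
        if lattice[i] and random.random() < BETA:
            lattice[i] = 0            # ribosome terminates: one protein made
            produced += 1
    elif lattice[i] and lattice[i + 1] == 0:
        lattice[i] = 0                # hop forward; no overtaking possible
        lattice[i + 1] = 1

density = sum(lattice) / L
print(f"steady-state ribosome density ~ {density:.2f}, proteins produced = {produced}")
```

Varying the hopping rate from codon to codon (to represent codon composition or secondary structure) is the natural next step towards the models studied in the project.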

In particular, we will focus on stochastic effects of the process of translation, predicting the extent of fluctuations in protein production expected from different mRNAs, taking into account effects such as codon composition, mRNA secondary structures and global competition for translation resources. Model predictions will be directly compared with experimental results so that a series of rounds of model refinement and validation can be performed. The PhD student will work in a dynamic and interdisciplinary team of researchers working at the interface between physics and biology, integrating theoretical and experimental results.

Chaos and fractals in fluid motion

Main supervisor: Alessandro Moura (

The advection of particles and fields by fluid flows is a problem of great interest for both fundamental physics and engineering applications. This area of research encompasses phenomena such as the dispersion of pollutants in the atmosphere and oceans, the mixing of chemicals in the chemical and pharmaceutical industries, and many others. The dynamics of these flows is characterised by chaotic advection, which means that particles carried by the flow have complex and unpredictable trajectories; this is an example of the phenomenon of chaos. One consequence of chaotic advection is that any given portion of the fluid is deformed by the flow into a complicated scale-invariant shape with fractal geometry. The exotic geometric properties of this fractal set lead to anomalous behaviour in important dynamical properties of the flow, such as its mixing rates and the rates of chemical reactions and other processes taking place in the flow.

The goal of this project is to investigate the mixing and transport properties of open chaotic flows and to develop a general theory capable of predicting and explaining the transport properties of these systems. The theory will be based on the advection-diffusion partial differential equation. The central idea is that the leading eigenvalues and eigenmodes of the advection-diffusion operator describe the long-time transport properties of the system. The scaling and behaviour of the eigenmodes will be estimated by developing approximations based on the fractal geometrical properties of the chaotic advection, and will also be calculated numerically for some simple flows. Mixing and reaction dynamics will then be expressed in terms of the eigenmodes and eigenvalues. To test the theory, we will apply it to the flow configuration describing an experiment performed to study geophysical transport mechanisms, and we will compare the theoretical predictions to the experimental findings.
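
The eigenmode idea can be illustrated in the simplest possible setting: pure diffusion (no advection) in an open domain with absorbing boundaries, where the slowest eigenmode sets the long-time escape rate. The sketch below, with an arbitrary illustrative diffusivity, evolves a generic initial profile and recovers the analytic decay rate D*pi^2 of the slowest mode sin(pi x):

```python
import math

# Pure diffusion on [0, 1] with absorbing ends: the slowest eigenmode
# sin(pi x) decays at rate lambda_1 = D * pi**2, which dominates at long times.
D = 0.1
N = 101                    # grid points
dx = 1.0 / (N - 1)
dt = 0.4 * dx**2 / D       # stable explicit time step

# start from a generic (non-eigenmode) profile
u = [x * (1 - x) ** 2 for x in (i * dx for i in range(N))]

def step(u):
    new = [0.0] * N        # absorbing boundaries stay at zero
    for i in range(1, N - 1):
        new[i] = u[i] + D * dt / dx**2 * (u[i - 1] - 2 * u[i] + u[i + 1])
    return new

# evolve until only the slowest eigenmode survives, then measure the
# decay rate from two snapshots of the total mass
for _ in range(4000):
    u = step(u)
m1 = sum(u)
for _ in range(1000):
    u = step(u)
m2 = sum(u)

rate = -math.log(m2 / m1) / (1000 * dt)
print(f"measured decay rate {rate:.4f} vs D*pi^2 = {D * math.pi**2:.4f}")
```

In the project, the diffusion operator is replaced by the full advection-diffusion operator of a chaotic flow, whose leading eigenmodes inherit the fractal structure of the advection.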

Statistical physics of DNA replication

Main supervisor: Alessandro Moura (

The goal of this project is to apply statistical physics and probability theory to the problem of DNA replication in living cells. The replication of DNA is one of the most important processes in all of biology. DNA encodes all the information that is passed on to the next generation of cells, and it must be rapidly and faithfully copied when the time comes for cells to divide. Dramatic advances in sequencing technology and microscopy in the last two decades have allowed unprecedented experimental access to the inner workings of cells. We now have quantitative measurements of the dynamics of DNA replication in populations and even in individual cells. This means that the approaches of physics and applied mathematics can be applied to the study of DNA replication, and can be used to gain a better understanding of, and new insights into, this crucial phenomenon.

The replication of DNA is executed by molecular machines called DNA polymerases, which travel along the DNA molecule as they replicate it. The DNA polymerases must be assembled from several molecules at the starting points of replication: specific locations on the DNA called replication origins. These precursor molecules move in the nucleus through Brownian motion, making assembly a stochastic process. Once assembled, the polymerases travel along the DNA, “unzipping” it into its component strands and performing the replication as they go. Because the DNA polymerase is a molecular machine, it is subject to thermal fluctuations from the environment, which affect how it moves. To further complicate things, the DNA is a busy place: many processes take place on it at the same time, especially transcription. So the DNA polymerases can collide with other molecules bound to the DNA and get stuck for a while. For all these reasons, DNA replication is expected to be highly stochastic. However, most models assume that DNA polymerases travel at constant speed along the DNA. We will formulate mathematical models that take the stochastic nature of the movement of DNA polymerases into account. We will create numerical simulations to test the predictions of our theory, and compare our results to experimental data available from collaborators. We will also examine the assembly of the DNA polymerases from their component molecules, and model its waiting-time statistics.
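
As a toy illustration of the contrast with constant-speed models, the sketch below simulates a single fork that steps stochastically and occasionally pauses at obstacles. All rates are hypothetical, chosen only to show how pausing broadens the distribution of replication times:

```python
import random
import statistics

random.seed(3)

GENOME = 1000       # lattice sites to replicate (illustrative)
V = 1.0             # attempt rate of forward steps
P_PAUSE = 0.01      # chance per step of hitting an obstacle
PAUSE_TIME = 50.0   # mean pause duration (exponentially distributed)

def replication_time():
    """Time for a single fork to copy the genome, with random pauses."""
    t = 0.0
    for _ in range(GENOME):
        t += random.expovariate(V)              # stochastic stepping
        if random.random() < P_PAUSE:           # collision with a bound molecule
            t += random.expovariate(1.0 / PAUSE_TIME)
    return t

times = [replication_time() for _ in range(500)]
mean_t = statistics.mean(times)
cv = statistics.stdev(times) / mean_t
print(f"mean completion time {mean_t:.0f}, coefficient of variation {cv:.2f}")
# a constant-speed fork would take exactly GENOME / V = 1000 with zero variability
```

The project's models would additionally couple many origins firing at random times, which is where the waiting-time statistics of polymerase assembly enter.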

Boundary Effects in Active Matter Systems

Main supervisor: Francesco Gonelli (

The study of collective properties of active matter is a fast-emerging interdisciplinary research field. Collections of interacting active particles describe the collective motion (or “flocking”) observed in systems as diverse as vertebrate groups (bird flocks, fish schools, mammal herds, etc.), insect swarms, colonies of bacteria, molecular motors, as well as driven granular matter.

So far, active matter has been mainly studied in the bulk, addressing large systems and disregarding the effects of the border. In many finite systems of biological (e.g. bird flocks) or experimental interest (e.g. active colloids), however, boundary effects cannot be easily disregarded and could indeed impact the bulk dynamics.

This theoretical project, at the forefront of active matter research, will investigate the effects of boundaries in finite flocking systems making use of both analytical methods and direct numerical simulations.

Enhancing predictability in chaotic systems

Main supervisor: Francesco Gonelli (

Predictability is the degree to which we are able to correctly forecast the future state of a system given the (imperfect) knowledge of its present state. While our ability to accurately describe many physical systems via a deterministic set of equations may naively give the impression that the future of such systems can be predicted with high accuracy, this is dramatically untrue for a large class of deterministic systems displaying chaos.

The goal of this research programme, which involves theoretical analysis and numerical simulations, is to apply fundamental results and recent developments of dynamical systems theory (to which the supervisors have significantly contributed) to enhance our ability to predict the future evolution of complex and chaotic dynamical systems, such as weather and climate systems.
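
A standard way to quantify this loss of predictability is the Lyapunov exponent, which measures the exponential rate at which nearby trajectories separate; the prediction horizon shrinks only logarithmically as the initial precision improves. The sketch below estimates it for the logistic map at r = 4, where the exact value is ln 2:

```python
import math

# Lyapunov exponent of the logistic map x -> r x (1 - x) at r = 4.
# Small errors grow like exp(lambda * n), so after n ~ (1/lambda) ln(1/eps)
# steps an initial uncertainty eps has grown to order one.
r = 4.0
x = 0.3
lyap = 0.0
n = 100_000
for _ in range(n):
    x = r * x * (1 - x)
    lyap += math.log(abs(r * (1 - 2 * x)))   # log |f'(x)| along the orbit
lyap /= n
print(f"estimated Lyapunov exponent {lyap:.4f}  (exact: ln 2 = {math.log(2):.4f})")
```

For weather-like systems the same quantity is estimated from much higher-dimensional models, but the interpretation is identical.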

Neuronal Dynamics from a complex network perspective

Main supervisor: Ekkehard Ullner (

The human brain is possibly the most intriguing complex system we know. The combination of experimental work and theoretical approaches has yielded many insights in the last century, but we are still far away from a real understanding.

The goal of the project is to combine dynamical-system and statistical-mechanics tools to understand the functioning of neural networks. While most of computational neuroscience focuses on rate models, neurons in fact communicate by emitting spikes, and it is therefore worthwhile, if not necessary, to explore more realistic setups involving pulse-coupled neurons. In such a context, we wish to make use of concepts such as synchronization, phase transitions and response theory to improve our comprehension. On a more specific level, the project unfolds by means of direct numerical simulations of the "microscopic" equations, combined with the analysis of "macroscopic" equations describing suitable probability densities. Depending on the interest of the potential applicants, the focus can be adjusted to be more theoretically or numerically oriented.

The student will learn techniques to characterise dynamical systems by means of measures of chaos (e.g. Lyapunov exponents, entropy, dimension), network measures and universal approaches to analysing large data sets. The project will familiarise the student with model building in a neuronal context and beyond, transferable to other natural and man-made complex networks. This includes understanding different levels of abstraction and making the necessary and meaningful choice of simplifications for model building in the context of a scientific question. Programming competence is welcome.

The mathematics behind the smartness of neural networks

Main supervisor: Murilo Baptista (

Intelligence is one of the pillars that allow animals to master their environment. Scientific approaches proposed in recent years have been capable of simulating networked systems that reproduce emergent manifestations of behaviour similar to those observed in the brain. This big scientific area was coined Artificial Intelligence (AI), and our society is today intrinsically connected to it. Why and how a neural network can be trained to process information and produce a logically intelligent output is still a big mystery, despite the explosive growth in this area. Its success in solving tasks cannot today be fully explained in physical or mathematical terms. Contributing to this challenge is the grand goal of this PhD project: the creation of a general theory that describes the fundamental mathematical rules and physical laws, relevant properties and features behind the “intelligent” functioning of trained neural networks.

To this end, the project will focus on a simpler but also successful type of machine learning approach named Reservoir Computing (RC), although other more popular approaches will also be considered. In RC, the learning phase used to train a dynamical neural network to process information about an input signal only deals with the much easier task of understanding how the neural network needs to be observed, without dealing with the more difficult task of making structural changes to it (in contrast to, e.g., deep learning). We aim to show with mathematical rigour how the configuration and emergent behaviour of a dynamical network contribute to the informational processing of an input signal, leading to an intelligent response to it. We want to show why chaos in neural networks can enhance the smart behaviour of trained neural networks. Another goal will be to determine how “intelligence” depends on the particular way a network is observed to construct the output functions. Today, output functions are constructed based on the randomly chosen observation of some neurons in the network.

The outputs of this project will potentially contribute to a better understanding of how our own brain computes. They will also contribute towards industrial exploitation of neural networks, by developing a mathematical formalism to create simpler but smarter neural networks that can process more information more quickly with fewer computational resources.
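
To give a concrete flavour of RC, the sketch below trains only a linear readout on the states of a small fixed random network driven by a sinusoidal input, using ridge-regularised least squares. Network size, weight scalings and the regularisation strength are illustrative choices, not prescriptions from any particular RC study:

```python
import math
import random

random.seed(4)

# Tiny echo-state reservoir: a fixed random recurrent network is driven by
# the input; only the linear readout (w_out) is trained.
N_RES, WASHOUT, T = 30, 50, 500
w_in = [random.uniform(-0.5, 0.5) for _ in range(N_RES)]
w_res = [[random.uniform(-1, 1) for _ in range(N_RES)] for _ in range(N_RES)]
# crude norm scaling to keep the reservoir in the echo-state (contracting) regime
scale = 0.9 / max(sum(abs(w) for w in row) for row in w_res)
w_res = [[w * scale for w in row] for row in w_res]

u = [math.sin(0.2 * t) for t in range(T + 1)]     # input signal
x = [0.0] * N_RES
states, targets = [], []
for t in range(T):
    x = [math.tanh(w_in[i] * u[t] + sum(w_res[i][j] * x[j] for j in range(N_RES)))
         for i in range(N_RES)]
    if t >= WASHOUT:                              # discard the transient
        states.append(x + [1.0])                  # observed states + bias
        targets.append(u[t + 1])                  # one-step-ahead prediction task

# ridge-regularised least squares for the readout, via the normal equations
D = N_RES + 1
A = [[sum(s[i] * s[j] for s in states) + (1e-2 if i == j else 0.0)
      for j in range(D)] for i in range(D)]
b = [sum(s[i] * y for s, y in zip(states, targets)) for i in range(D)]
for col in range(D):                              # Gaussian elimination, partial pivoting
    piv = max(range(col, D), key=lambda r: abs(A[r][col]))
    A[col], A[piv], b[col], b[piv] = A[piv], A[col], b[piv], b[col]
    for r in range(col + 1, D):
        f = A[r][col] / A[col][col]
        A[r] = [a - f * c for a, c in zip(A[r], A[col])]
        b[r] -= f * b[col]
w_out = [0.0] * D
for r in range(D - 1, -1, -1):                    # back substitution
    w_out[r] = (b[r] - sum(A[r][c] * w_out[c] for c in range(r + 1, D))) / A[r][r]

mse = sum((sum(w * s for w, s in zip(w_out, st)) - y) ** 2
          for st, y in zip(states, targets)) / len(states)
print(f"training MSE of one-step prediction: {mse:.2e}")
```

The point of the project is precisely what this sketch leaves unexplained: why observing such a network linearly suffices, and how the choice of observed neurons shapes the "intelligence" of the readout.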

Fast, reliable and secure wireless IoT chaos-based communication

Main supervisor: Murilo Baptista (

This PhD project tackles scientific issues aimed at creating and laying the mathematical foundations of an innovative wireless IoT communication system that is reliable (information arrives), fast and light (less power, less hardware, less computation, higher bit rates), universal (mainly intended to function underwater, but also appropriate to other wireless media) and secure (not disclosing information to untrusted agents). The project will be separated into two main scientific challenges. The first will be to understand under which conditions and configurations chaotic signals propagating in non-ideal channels can naturally support networked underwater communication systems involving several agents. The innovation here is to extend previous works, which have shown the use of chaos for communication systems involving two agents in ideal channels, to networked communication in non-ideal channels. The second challenge focuses on network communication in non-ideal channels with trusted agents, the knowledge required for the creation of the proposed chaos-based IoT communication system. In a non-ideal channel, the received signal is a composition, by interference, of strongly distorted signals coming from several transmitters and propagating over several paths. The goal is to show that fast and light neural networks can be trained to recover information from a unique trusted transmitter, potentially enabling “smart” data analytics about the information received. Communication can then be carried out only by the trusted agents who know the specifics of the training, knowledge that will provide support for the creation of a secure IoT communication system.

New calcium phosphate bone cements for potential biomaterial applications; bone repair and/or scaffold fabrication

Supervisors: Iain Gibson ( and Jan Skakle (

Calcium phosphate bone cements are widely used as bone-replacement materials due not only to their chemical similarity to natural bone but also to their ability to set and harden in situ and to be injected as a paste. The impetus for such research is the ageing and increasingly active population: the need to maintain quality of life and activity, and to provide materials with longer endurance within the body.

In our biomaterials group the focus is on the synthesis and characterisation of various bioceramics, particularly in the targeted doping of the parent calcium phosphates with the aim of improving the properties, such as solubility, cell interaction, handling, and mechanical properties. Our research extends from basic science and fundamental materials chemistry through to biological testing and commercial application of such materials.

The focus of the project, therefore, is to synthesise and control the doping of different elements into parent calcium phosphates that will form one or more of the reactant phases of the cement formulations. These will be used in developing new calcium phosphate bone cement formulations. These materials have potential application in fabricating 3D scaffolds for bone tissue engineering and regenerative medicine, or as medical devices. The materials developed will be characterised using X-ray diffraction, Raman and IR spectroscopy, solid-state NMR, electron microscopy, surface area and porosity quantification, solubility testing and mechanical testing.

Whilst some chemical knowledge is highly desirable, knowledge of the solid state and a willingness to learn and engage in interdisciplinary research are essential for such work, where chemistry, materials science, physics, engineering, biology and medicine meet.

Modelling the deformation of cells

Main supervisor: Francisco Perez-Reche (

Cells are the elementary building blocks of living organisms. The correct functioning of living organisms therefore depends on the ability of living cells to withstand forces and deformations and to promptly adapt to their mechanical environment. Alteration of the mechanical properties of cells can contribute to diseases such as cancer. It is, therefore, crucial to identify the conditions that compromise the mechanical resilience of cells.

We recently observed that cells poked by a microscopic cantilever respond with sudden avalanche-like events [1]. Avalanche dynamics have been observed in solids (e.g. during fracture or the deformation of shape-memory alloys) but are more surprising for living cells, which are usually regarded as a soft material. The behaviour is, however, not completely unexpected, since cells exhibit both solid- and liquid-like properties.

Another intriguing manifestation of solid-like behaviour of cells is the recently observed degradation of cytoskeleton integrity under cyclic loading [2]. This behaviour is reminiscent of the degradation observed in solids, which has been explained in terms of deformation-induced dislocations [3]. The mechanism behind degradation in cells, however, is unknown.

This project will study avalanches and degradation of cells under mechanical load. The study will be based on network models inspired by models of avalanches in solids [3,4]. Predictions of the models will be validated through comparison with experimental data. After validation, the models will be used to make new predictions that can motivate new experiments.
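
A classic minimal example of an avalanche model of this kind is the equal-load-sharing fibre-bundle model, sketched below (this is a generic textbook model, not the specific network model of the project): elements with random strength thresholds share the load equally, and each failure raises the load on the survivors, which can trigger an avalanche of further failures.

```python
import random

random.seed(5)

# Equal-load-sharing fibre-bundle model under quasi-static loading.
N = 5000
thresholds = sorted(random.random() for _ in range(N))  # random strengths

avalanches = []
k = 0                                   # number of broken fibres
while k < N:
    # raise the external load just enough to break the weakest survivor
    F = thresholds[k] * (N - k)
    size = 0
    # redistribute: survivors fail while their share exceeds their threshold
    while k < N and F / (N - k) >= thresholds[k]:
        k += 1
        size += 1
    avalanches.append(size)

print(f"{len(avalanches)} avalanches; the largest sweeps {max(avalanches)} fibres")
```

The avalanche-size statistics produced by such models are what gets compared against the cantilever measurements of [1].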

Learning from the past to predict and reduce the risk of infectious disease pandemics

Main supervisor: Francisco Perez-Reche (

The World Health Organisation reports that infectious diseases cause 63% of childhood deaths and 48% of premature deaths. There is an ongoing risk of epidemics and pandemics that can cause widespread morbidity and mortality (Spanish flu, Ebola, SARS, E. coli, Listeria, etc.).

Infectious diseases can reach humans in many different ways: they can be transmitted between healthy and infected people, through consumption of contaminated food or water, through contact with animals, etc. The world is massively interconnected, enabling people, animals and food to move rapidly between continents, along with infectious disease agents.

Tracing the origin and spread of infectious diseases has never been more challenging or more important. The spectacular developments in the detection and whole-genome sequencing of disease agents, together with the computational power that enables timely processing of big data, offer the opportunity to tackle this problem.

For example, information on the geographical spread of an infectious disease is generally insufficient to trace back the labyrinth of possible pathways through which humans become infected. However, combining geographical information on disease spread with information on the genetic evolution of the infectious disease agents holds promise [1-3]. Methodologies to achieve this are still in their infancy, and this project is an opportunity to make a significant contribution to tackling this critical problem. The project will:

  1. Use computer workstations to simulate the spread and evolution of infectious disease agents. Simulations will be based on geographical and whole genome sequence datasets of real pathogens. The simulations will generate virtual histories of their spread and evolution.
  2. Use these results to develop methods that explain how epidemics occurred.
  3. Use this knowledge to predict future epidemics and to develop and simulate strategies to reduce infectious disease risk.
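
As a minimal example of the simulation ingredient in step 1, the sketch below runs a stochastic SIR epidemic with the Gillespie algorithm. Population size and rates are illustrative only; the project's simulations would add spatial structure and pathogen genome evolution on top of this kind of core.

```python
import random

random.seed(6)

# Minimal stochastic SIR epidemic via the Gillespie algorithm.
N = 1000                  # population size (illustrative)
BETA, GAMMA = 0.3, 0.1    # transmission and recovery rates (R0 = 3)
S, I, R = N - 1, 1, 0
t = 0.0
while I > 0:
    rate_inf = BETA * S * I / N        # S + I -> 2 I
    rate_rec = GAMMA * I               # I -> R
    total = rate_inf + rate_rec
    t += random.expovariate(total)     # exponential waiting time to next event
    if random.random() < rate_inf / total:
        S, I = S - 1, I + 1            # infection event
    else:
        I, R = I - 1, R + 1            # recovery event

print(f"outbreak over at t = {t:.1f}: final size {R} of {N}")
```

Running many such virtual epidemics, each tagged with simulated pathogen genealogies, generates the "virtual histories" against which inference methods can be tested.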

Heat conduction in classical one-dimensional systems

Main supervisor: Antonio Politi (

Heat transport in classical one-dimensional systems is a long-standing problem, which goes back to the beginning of the 19th century, when Fourier formulated his famous law of heat conduction. In the last two decades, much progress has been made thanks to numerical simulations and, more recently, to the application of fluctuating-hydrodynamics arguments [1,2]. The resulting message is that whenever a one-dimensional system has only internal forces (i.e. momentum is conserved), the heat conductivity diverges in the thermodynamic limit (the limit of infinitely long chains). On the other hand, numerical simulations suggest that the above scenario does not arise in some setups. Is this evidence of strong finite-size corrections, or even of the need to revisit the theory? The starting point of the project is a simple model in which hard-core collisions combine with harmonic interactions. As a result, it is found that the heat conductivity remains, unexpectedly, finite in the thermodynamic limit. The plan consists of generalizing the model to test the robustness of the observed anomaly and to eventually understand its origin. The model is simple enough to allow for extensive simulations. Furthermore, I have some ideas for a novel computational approach, which would further help in performing simulations. Finally, more realistic systems will be explored in a second stage, with the goal of testing the degree of universality of the “anomaly”.
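
As a flavour of the kind of simulation involved, the sketch below couples a purely harmonic chain (without the hard-core collisions that the project's model adds) to Langevin heat baths at different temperatures and measures the kinetic temperature of the two boundary particles; all parameters are illustrative.

```python
import math
import random

random.seed(7)

# Harmonic chain between fixed walls, with Langevin baths on the end particles.
N, K = 16, 1.0                      # particles and spring constant
T_HOT, T_COLD, GAMMA = 2.0, 0.5, 0.5
DT, STEPS, BURN = 0.02, 100_000, 20_000

q = [0.0] * N
p = [0.0] * N
ksum = [0.0] * N                    # accumulated p^2 for kinetic temperature
for step in range(STEPS):
    # harmonic forces (fixed walls at both ends)
    f = [0.0] * N
    for i in range(N):
        left = q[i - 1] if i > 0 else 0.0
        right = q[i + 1] if i < N - 1 else 0.0
        f[i] = K * (left - 2 * q[i] + right)
    for i in range(N):
        p[i] += f[i] * DT
    # Langevin baths (Euler-Maruyama) on the first and last particle
    for i, T in ((0, T_HOT), (N - 1, T_COLD)):
        p[i] += -GAMMA * p[i] * DT + math.sqrt(2 * GAMMA * T * DT) * random.gauss(0, 1)
    for i in range(N):
        q[i] += p[i] * DT
    if step >= BURN:
        for i in range(N):
            ksum[i] += p[i] * p[i]

temps = [s / (STEPS - BURN) for s in ksum]
print(f"kinetic temperature: hot end {temps[0]:.2f}, cold end {temps[-1]:.2f}")
```

Measuring how the steady-state heat flux scales with the chain length N is then the standard numerical route to detecting anomalous (diverging) conductivity.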

Altogether, the research project will allow the student to become familiar with different numerical methods that can be used outside the specific field addressed by the Thesis work. Additionally, the supervisor is connected with many scientists all over the world, thus offering the chance to get in contact with other approaches and research environments. Last but not least, the selected topic is very challenging: meaningful progress would be very welcome in the community.

Statistical Data Analysis of medical health records

Main supervisor: Bjorn Schelter (

Massive amounts of data related to long-term health conditions are routinely collected by professionals. The analysis of these data sets is challenging not only because of the volume of the data, but especially because of its variety. Standard techniques fail to combine and exploit all the information contained therein in one coherent framework that is amenable to statistical analysis.

In this project we will build on our previous research to devise such a framework. Approaches such as mixed models for repeated measures, generalised linear models, state space modelling, and non-parametric statistics will provide the basis for this demanding project.
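
As a small example of the state-space ingredient, the sketch below filters a simulated random-walk signal observed in noise with a scalar Kalman filter (all variances are illustrative); the filtered estimate should track the latent state much more closely than the raw observations do.

```python
import random

random.seed(8)

# One-dimensional linear state-space model: a latent level drifts as a
# random walk and is observed in noise; a scalar Kalman filter recovers it.
Q, R_OBS = 0.01, 1.0      # process and observation noise variances
T = 500

truth, obs = [], []
x = 0.0
for _ in range(T):
    x += random.gauss(0, Q ** 0.5)            # latent random walk
    truth.append(x)
    obs.append(x + random.gauss(0, R_OBS ** 0.5))

est, P = 0.0, 1.0
filtered = []
for y in obs:
    P += Q                       # predict: uncertainty grows by Q
    gain = P / (P + R_OBS)       # Kalman gain
    est += gain * (y - est)      # update with the new observation
    P *= (1 - gain)
    filtered.append(est)

mse_obs = sum((o - t) ** 2 for o, t in zip(obs, truth)) / T
mse_filt = sum((f - t) ** 2 for f, t in zip(filtered, truth)) / T
print(f"raw-observation MSE {mse_obs:.3f} vs filtered MSE {mse_filt:.3f}")
```

Real health records would require the multivariate, irregularly sampled generalisations of this filter, which is where the project's methodological work lies.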

Efficient visualisation and presentation of the results will be used to communicate the findings to medical professionals. Involvement of various stakeholders will guarantee a tailored solution maximising the benefit for patients.

Young researchers will be embedded in an active and thriving environment, with state-of-the-art facilities and strong connections to local stakeholders and industry.

Machine learning approaches to advance clinical decision making

Main supervisor: Bjorn Schelter (

Massive amounts of data related to long-term health conditions are routinely collected by professionals. The analysis of these data sets is challenging not only because of the volume of the data, but especially because of its variety. Standard techniques fail to combine and exploit all the information contained therein in one coherent framework that is amenable to statistical analysis.

In this project we will build on our previous research to devise such a framework. Approaches such as machine learning (artificial neural networks, support vector machines, decision trees, etc) and state space modelling will provide the basis for this demanding project.

Efficient visualisation and presentation of the results will be used to communicate the findings to medical professionals. Involvement of various stakeholders will guarantee a tailored solution maximising the benefit for patients.

Young researchers will be embedded in an active and thriving environment, with state-of-the-art facilities and strong connections to local stakeholders and industry.

Loop quantum gravity with conformal and scale invariance

Main supervisor: Charles Wang (

A new theoretical framework of loop quantum gravity that incorporates conformal and scale invariance has recently been constructed. It supports a large class of theories of gravity and gravity-matter systems, including general relativity and scale-invariant scalar-tensor and dilaton theories. Its consistent matter coupling must also be conformally or scale invariant, including, importantly, standard-model-type systems. The aim of this PhD project is to further develop the new theory in two significant directions, as follows.

  •  Mathematical foundation. The new loop quantization is based on an extended conformal or scaling symmetry carried by scale fields. Under this project, new polymer structures will be developed to provide a rigorous mathematical foundation for the new theory.
  •  Loop quantum cosmology. In standard loop quantum cosmology, the Big Bang scenario is replaced by the so-called Big Bounce. However, the new theory is expected to lead to a radically new quantum cosmological model. Mathematical modelling with computer simulations of the new model will be performed, including the very early Universe stage, where different scenarios in connection with the Big Bang or Big Bounce are expected.