UCL Department of Space and Climate Physics


EPSRC DTP PhD Project 2022

One EPSRC DTP studentship starting in September 2022 is open for applications for the following research project until 31st January 2022.

For more information on the application process, please visit the UCL Graduate Degrees pages and read the "guidelines for research programmes" carefully. To apply, please visit the Online Application page, select the department "Space & Climate Physics" and the programme type "Postgraduate Research". After clicking the "Search Now" button, select "RRDSPSSING01: Research Degree: Space and Climate Physics" in either Full-time or Part-time mode. Please clearly indicate in your application that you are applying for the EPSRC DTP PhD project.

Entry requirements

An upper second-class Bachelor’s degree, or a second-class Bachelor’s degree together with a Master's degree from a UK university in a relevant subject, or an equivalent overseas qualification.

Additional eligibility requirements

The Engineering and Physical Sciences Research Council (EPSRC) Doctoral Training Programme (DTP) studentship will pay your full tuition fees and a maintenance allowance for 4 years (subject to the PhD upgrade review).

Geometric deep learning for likelihood-free statistical inference

Supervisor Prof. Jason McEwen

Deep learning has been remarkably successful in the interpretation of standard (Euclidean) data, such as 1D time series data, 2D image data, and 3D video or volumetric data, now exceeding human accuracy in many cases. However, standard deep learning techniques fail catastrophically when applied to data defined on other domains, such as data defined over networks, graphs, 3D objects, or other manifolds such as the sphere. This has given rise to the field of geometric deep learning (Bronstein et al. 2017, arXiv:1611.08097; Bronstein et al. 2021, arXiv:2104.13478).
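As a concrete illustration of the kind of operation geometric deep learning generalises, consider a single convolution-style layer on a graph: each node's features are averaged over its neighbourhood, then passed through a learned linear map and nonlinearity. The sketch below is purely illustrative (the mean-aggregation layer, the toy 4-node path graph, and all names are assumptions for demonstration, not part of this project); a key property it inherits from the graph structure is equivariance to node relabelling.

```python
import numpy as np

def graph_conv(A, X, W):
    """One illustrative graph convolution layer: relu(D^-1 (A + I) X W).

    A : (n, n) adjacency matrix, X : (n, f) node features,
    W : (f, f') weight matrix. Self-loops are added so each node
    retains its own features; D^-1 row-normalises the aggregation.
    """
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))  # inverse degree matrix
    return np.maximum(D_inv @ A_hat @ X @ W, 0.0)  # aggregate, map, relu

rng = np.random.default_rng(0)

# Toy graph: 4 nodes on a path 0-1-2-3, with 3 input features per node.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 3))   # node features
W = rng.normal(size=(3, 2))   # weights (random here; learned in practice)

H = graph_conv(A, X, W)       # new (4, 2) node representations
```

Because aggregation depends only on graph connectivity, permuting the node ordering permutes the output in exactly the same way, which is the basic symmetry such layers are built to respect.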

The bedrock of much scientific analysis is statistical inference, in particular Bayesian approaches. Recently, simulation-based inference techniques (cf. likelihood-free inference) have emerged, and are rapidly evolving, for scenarios where an explicit likelihood is not available or simply to speed up inference in time-critical applications (e.g. in gravitational wave detection for rapid electromagnetic follow-up). For a brief review see Cranmer et al. 2020 (arXiv:1911.01429). These techniques build on powerful machine learning models for probability distributions (e.g. Papamakarios et al. 2021, arXiv:1912.02762).
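To make the likelihood-free idea concrete, the simplest simulation-based inference scheme is rejection approximate Bayesian computation (ABC): draw parameters from the prior, run the simulator, and keep only draws whose simulated summary statistics fall close to the observed ones, never evaluating a likelihood. The toy Gaussian model, prior, and tolerance below are illustrative assumptions for demonstration only (modern methods such as those reviewed by Cranmer et al. replace this with learned density estimators).

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta, n=100):
    """Forward model: generate n data points given parameter theta.
    (Toy Gaussian example; in practice any black-box simulator.)"""
    return rng.normal(theta, 1.0, size=n)

def summary(x):
    """Summary statistic used to compare simulations to observations."""
    return x.mean()

# "Observed" data, generated at a known true parameter for the demo.
theta_true = 2.0
s_obs = summary(simulator(theta_true))

# Rejection ABC: sample the prior, simulate, accept draws whose
# simulated summary lands within epsilon of the observed summary.
epsilon = 0.05
prior_draws = rng.uniform(-5.0, 5.0, size=20000)
accepted = [t for t in prior_draws
            if abs(summary(simulator(t)) - s_obs) < epsilon]

posterior_mean = np.mean(accepted)  # close to theta_true
```

The accepted draws approximate samples from the posterior; the scheme's cost (most simulations are rejected) is precisely what the neural density-estimation approaches cited above are designed to overcome.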

The focus of the current project is to develop integrated geometric deep learning and simulation-based inference techniques (cf. likelihood-free inference) for data defined over complex domains, such as spherical manifolds and graphs. This will involve developing geometric emulation, imaging, and inference techniques as part of an overarching inference pipeline. A key component of such a pipeline will be geometric scattering network representations (Mallat 2012, arXiv:1101.2286; McEwen et al. 2021, arXiv:2102.02828). The techniques developed will have applications in cosmology, medical imaging, geophysics and beyond; we will collaborate with others to apply them in these fields.
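The scattering representation mentioned above can be sketched in its simplest one-dimensional form: convolve the signal with a bank of band-pass wavelets, take the complex modulus, and average. The Gabor-style filters, scales, and parameter values below are illustrative assumptions, not the project's actual (geometric) construction; the point is that the resulting coefficients are stable, non-negative, and invariant to (circular) translations of the input.

```python
import numpy as np

def gabor_filters(n, scales):
    """Illustrative band-pass filter bank in the Fourier domain:
    Gaussian bumps centred at frequency 0.25 / 2^j for each scale j."""
    freqs = np.fft.fftfreq(n)
    return [np.exp(-(freqs - 0.25 / 2**j) ** 2 / (2 * (0.1 / 2**j) ** 2))
            for j in scales]

def scattering_order1(x, scales=(0, 1, 2)):
    """First-order scattering coefficients: average of |x * psi_j|
    over the whole signal, one coefficient per wavelet scale."""
    X = np.fft.fft(x)
    filts = gabor_filters(len(x), scales)
    # convolve in Fourier domain, take modulus, then global average
    return np.array([np.abs(np.fft.ifft(X * f)).mean() for f in filts])

x = np.random.default_rng(1).normal(size=128)
coeffs = scattering_order1(x)   # three non-negative coefficients
```

Because the final averaging is over the whole signal, the coefficients are unchanged under circular shifts of the input, a toy analogue of the invariance properties that make scattering representations useful inside inference pipelines.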

The student should have a strong mathematical background and be proficient in coding, particularly in Python. Experience in deep learning is advantageous. The expertise gained in both geometric deep learning and likelihood-free inference will prepare the student well for a future research career either in academia or industry. In particular, both geometric deep learning and likelihood-free inference are specialities highly sought after in industry by companies such as Google, Twitter, Facebook, Amazon and many others.

Other PhD opportunities at UCL

There are similar PhD projects available in other UCL departments, which are sometimes (co-)supervised by our academic staff. Such opportunities are listed below. Please note that if you are also applying for one of these projects, you will need to submit a separate application to the relevant department or Centre for Doctoral Training.