Cosmoparticle Initiative


Stephen Feeney appointed to Lectureship in Astrophysics

20 July 2020

The Cosmoparticle Initiative congratulates core staff member Stephen Feeney on his appointment

Stephen Feeney

After graduating from Cambridge University, Stephen worked as a software engineer for five years before joining UCL for his PhD, supervised by Prof Hiranya Peiris. He was appointed to a post-doctoral research post at Imperial College London before gaining a Research Fellowship at the Flatiron Institute’s Center for Computational Astrophysics in New York City. He returned to UCL last year as a Royal Society University Research Fellow, becoming a Cosmoparticle Initiative core member in January 2020.

Q. What were your research interests before joining UCL?

All sorts! My PhD focussed on searching for hints of new fundamental physics (colliding bubble universes and topological defects) in observations of the cosmic microwave background. In the last few years I've worked a lot on determining the (highly controversial) expansion rate of the Universe using both standard distance-ladder techniques and new methods based on gravitational-wave standard sirens. Fundamentally, I'm interested in any hard inference problem in astrophysics, and have also worked on transient detection, modelling of stellar spectra, astrochemistry, likelihood-free inference and more.

Q. What do you think is different or special about the Cosmoparticle Initiative? How would you describe the research opportunities it offers?

The Cosmoparticle Initiative is special because it makes a concrete commitment to bringing researchers from different groups together. In my experience, co-locating researchers is critical to breaking down field-specific language barriers and discovering common interests and complementary skills.

Q. What are you currently working on in the Cosmoparticle Hub?

How observations of mergers between black holes and neutron stars might tell us exactly how fast the Universe is expanding, and whether we can determine that expansion speed using simulations alone. Traditional inference methods rely on our ability to calculate the likelihood: the probability of obtaining our data given some physical parameters. In many cases, however, the likelihood is unknown or very hard or expensive to calculate. In those cases, we can instead try to learn an approximate likelihood by simulating the data repeatedly for many different parameter values. Using this likelihood to determine the most probable parameters given the data is called likelihood-free inference. It’s particularly useful for figuring out parameters (such as the Universe’s expansion rate) from populations of objects (such as binary neutron stars) because the way we select populations is often hard to include in a likelihood, but typically easy to include in simulations.
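The simulate-and-compare idea described above can be illustrated with the simplest flavour of likelihood-free inference, rejection-based approximate Bayesian computation. This is a hypothetical toy sketch, not Stephen's actual pipeline: the "rate" parameter, prior range, noise level, and tolerance are all invented for illustration, standing in for a quantity like an expansion rate inferred from a population of objects.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate(rate, n=100):
    """Hypothetical forward model: n noisy measurements scattered
    around the underlying rate. We pretend no likelihood is available
    and only this simulator can be run."""
    return rate + rng.normal(0.0, 5.0, size=n)

# Illustrative "truth" and mock observed data.
true_rate = 70.0
observed = simulate(true_rate)
obs_summary = observed.mean()  # compress the data to a summary statistic

# Rejection ABC: draw parameters from a broad prior, simulate mock data
# for each draw, and keep only draws whose simulated summary lands close
# to the observed one. The accepted draws approximate the posterior.
prior_draws = rng.uniform(50.0, 90.0, size=20000)
accepted = []
for rate in prior_draws:
    sim_summary = simulate(rate).mean()
    if abs(sim_summary - obs_summary) < 0.5:  # tolerance threshold
        accepted.append(rate)

accepted = np.array(accepted)
print(f"posterior mean ~ {accepted.mean():.1f}, "
      f"posterior sd ~ {accepted.std():.1f}")
```

The accepted draws cluster around the true rate, with a spread set by the data's noise and the tolerance. Selection effects of the kind mentioned above (which objects make it into the observed population) would simply be coded into `simulate`, which is exactly why this approach suits population-level inference where the likelihood is intractable.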

Q. What further areas or research programmes are you interested in?

I'm really interested in applying the inference tools I've helped develop to problems in particle physics. Particle physicists face many of the same challenges as we do in astronomy: trying to tease out subtle information from very large and complex datasets, in the presence of complicated potential systematics. In principle, these problems could be tackled with a common toolset.

Q. What are your plans for the next few years?

That would be telling! But it will involve probabilistic models, likelihood-free inference, and a wide range of astrophysical data and phenomena. For example, I’m keen to develop statistical tools to determine how well probes of distance in the local Universe (namely Cepheid variables and stars at the tip of the red giant branch) agree, and identify any as-yet undiagnosed systematic errors they might have.

Stephen Feeney’s website