Gatsby Computational Neuroscience Unit


Our Research

The Unit's research encompasses theoretical and computational neuroscience, computational statistics, machine learning and AI.

We aim to uncover the mathematical basis of intelligent behaviour in both natural and artificial systems.  In neuroscience, we work closely with experimentalists—most notably in the Sainsbury Wellcome Centre—to understand computational principles that emerge from consideration of measured neural activity, in tandem with the pursuit of theories that connect principles of learning and computation to the substrates of neural circuits.  Our work in machine learning is similarly directed to understanding fundamental computational principles, elaborating the mathematics that underlies data-based discovery of structure, predictability and causality.

This page provides an overview of some common themes in the Unit's research.  For more details please visit the pages of individual members.

Theoretical Neuroscience


How activity in neural populations reflects properties of stimuli, actions and internal cognitive variables is one of the most fundamental questions in neuroscience.  We tackle this question in many ways, on the one hand working with empirical data, particularly from large populations, to understand, process and formalise the information available within them; and on the other, addressing theoretical issues associated with sophisticated versions of population codes.  A common thread in much of our work is the documented robustness of perceptual and motor systems in the face of unexpected noise, non-stationary environments and the concomitant uncertainty, a robustness that sets them apart from even the best artificial systems.  A significant thrust of our theoretical work has looked at how neural representations may account for uncertainty in internal variables to achieve such robustness.


Biological neural networks exhibit rich dynamical behaviours, whose importance for computation is under constant debate.  We study the computations achieved by recurrent dynamical systems with varying degrees of biological realism, looking for the general principles of computation-through-dynamics.  These include data-driven models of motor cortex, dynamics in coupled excitatory-inhibitory systems, models of olfactory processing, and others.  We also study the dynamical properties of active membrane processes associated with spiking.
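One way to study computation-through-dynamics is to simulate simple rate-based recurrent networks.  The sketch below is purely illustrative (the network size, gain and dynamics are our choices, not a specific model from the Unit's work): it integrates the standard rate equation x' = -x + W tanh(x) with Euler steps.

```python
import numpy as np

# Minimal illustrative sketch of a rate-based recurrent network,
#   x' = -x + W tanh(x),
# simulated with Euler steps.  Size, gain and seed are invented for
# the example, not taken from any particular published model.
rng = np.random.default_rng(0)
n, dt = 50, 0.01
W = rng.normal(0.0, 1.2 / np.sqrt(n), size=(n, n))  # random recurrent weights
x = rng.normal(size=n)                               # random initial state

for _ in range(1000):
    x = x + dt * (-x + W @ np.tanh(x))               # Euler integration step

print(float(np.linalg.norm(x)))
```

Because tanh is bounded, the leak term keeps the state in a bounded region, so such networks can sustain structured activity without diverging; varying the weight gain moves the system between decaying and richly fluctuating regimes.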


Neural systems are remarkable in their ability to adapt to and learn from experience.  We seek to understand the principles that guide this learning in many settings: from sparse reinforcement, from rich teaching signals, or from the structure of the environment alone.  Behavioural investigations help to identify the capabilities and limits of biological learning.  Theoretical work, cross-referenced to experimental data, tackles difficult problems in learning such as credit assignment (which synapse should adapt to improve prediction) and structure identification (how is the environment best parsed into its constituent causal components) by looking for biologically plausible algorithmic solutions.  At the circuit level, learning has measurable physiological correlates in terms of changes at individual synapses, as well as in resulting modifications of the stimulus-response properties of individual neurons. We study the theoretical significance of these changes at a number of levels, including the interpretation of spike-timing update rules for synaptic strength, the interaction of reinforcement and neuromodulation with receptive field plasticity, and the consequences of plastic changes on perceptual learning.
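As a concrete illustration of a spike-timing update rule, the sketch below implements the classic pairwise STDP window: pre-before-post spike pairs potentiate a synapse, post-before-pre pairs depress it, with exponentially decaying dependence on the timing difference.  The amplitudes and time constant are illustrative values, not parameters from any specific study.

```python
import math

# Pairwise spike-timing-dependent plasticity (STDP) sketch.
# Constants (a_plus, a_minus, tau_ms) are illustrative, not fitted values.
def stdp(dt_ms, a_plus=0.01, a_minus=0.012, tau_ms=20.0):
    """Weight change for one spike pair; dt_ms = t_post - t_pre."""
    if dt_ms >= 0:
        return a_plus * math.exp(-dt_ms / tau_ms)   # pre before post: potentiate
    return -a_minus * math.exp(dt_ms / tau_ms)      # post before pre: depress

print(stdp(10.0), stdp(-10.0))
```

The slight asymmetry (a_minus > a_plus) is a common modelling choice that keeps net weight change negative for uncorrelated spiking.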


Although the principles of neural computation may apply broadly, theories can only be evaluated experimentally by considering specific neural systems.  We develop theory and data analysis methods to investigate the organisational and computational principles that lie behind physiological, anatomical, and psychophysical observations in many different subsystems of the brain.  These range from sensory or perceptual systems including vision, audition and olfaction, control systems underlying motor action, systems that effect choices and learning from reinforcement signals, as well as the systems that underlie more elaborate cognition such as context-driven decision making, mapping and contextual awareness, attention and planning.

Machine Learning

Graphical Models

Realistic models often require representing the dependencies between many random variables. Graphical models provide an elegant formalism for representing these dependencies and for implementing efficient probabilistic inference and decision making. We study novel algorithms for approximate inference and methods for learning both parameters and the structure of graphical models from data.
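The basic efficiency gain of graphical-model inference can be seen in a minimal sketch (our example, with invented probability tables): on a three-node chain A → B → C, variable elimination computes the marginal P(C) with two small matrix products rather than a sum over all joint configurations.

```python
import numpy as np

# Exact marginal inference by variable elimination on a chain A -> B -> C.
# The probability tables below are invented for illustration.
p_a = np.array([0.6, 0.4])              # P(A)
p_b_given_a = np.array([[0.7, 0.3],     # P(B | A=0)
                        [0.2, 0.8]])    # P(B | A=1)
p_c_given_b = np.array([[0.9, 0.1],     # P(C | B=0)
                        [0.5, 0.5]])    # P(C | B=1)

# Eliminate A, then B:  P(C) = sum_B P(C|B) * sum_A P(A) P(B|A)
p_b = p_a @ p_b_given_a
p_c = p_b @ p_c_given_b
print(p_c)  # a valid distribution over C
```

The same sum-then-multiply pattern, ordered along the graph structure, is what algorithms such as belief propagation exploit in larger models.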

Kernel Methods

Difficult real-world pattern recognition and function learning problems require that the learning system be highly flexible. Kernel methods such as Gaussian processes and support vector machines are one way of defining highly flexible non-parametric models based on similarities between data points. Gaussian processes, which correspond to neural networks with infinitely many hidden neurons, have proved powerful at avoiding some of the common pitfalls of learning such as 'overfitting'. We focus on how to make kernel methods even more flexible and efficient, how to learn the kernel from data, and how to use them in a variety of applications. 
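Gaussian-process regression can be sketched in a few lines.  The toy data and squared-exponential kernel below are our illustrative choices: the posterior mean at a test point is the standard k*ᵀ (K + σ²I)⁻¹ y formula.

```python
import numpy as np

def rbf(x1, x2, lengthscale=1.0):
    # Squared-exponential kernel matrix between two sets of 1-D inputs.
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

# Toy training data (invented for illustration): noisy-free samples of sin.
x_train = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y_train = np.sin(x_train)
noise = 1e-2

K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
x_test = np.array([0.5])
k_star = rbf(x_test, x_train)

# GP posterior mean: k*^T (K + sigma^2 I)^{-1} y
mean = k_star @ np.linalg.solve(K, y_train)
print(mean)  # close to sin(0.5)
```

Note that the only trace of the five training points in the prediction is through kernel evaluations, which is what makes the model non-parametric: flexibility grows with the data rather than being fixed in advance.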

Bayesian Statistics

Bayesian statistics is a framework for doing inference by combining prior knowledge and data, and as such has been influential in the understanding of intelligent learning systems. We work on many areas of Bayesian statistics, including using variational methods to do inference efficiently in complex domains, model selection and non-parametric modelling, novel Markov chain methods, semi-supervised learning and modelling temporal sequences.
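The combination of prior knowledge and data is most transparent in a conjugate model.  The sketch below (with invented data) is the Beta-Bernoulli update: a Beta(a, b) prior over a coin's bias, updated by counting successes and failures.

```python
# Beta-Bernoulli conjugate update: prior Beta(a, b) over a success
# probability, observed data are binary outcomes (invented here).
a, b = 1.0, 1.0                       # uniform prior
data = [1, 1, 0, 1, 0, 1, 1, 1]
k, n = sum(data), len(data)

a_post, b_post = a + k, b + n - k     # posterior is Beta(a + k, b + n - k)
posterior_mean = a_post / (a_post + b_post)
print(a_post, b_post, posterior_mean)
```

In complex models no such closed form exists, which is where the variational and Markov chain methods mentioned above come in: they approximate exactly this kind of posterior when it cannot be written down directly.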

Reinforcement Learning

Reinforcement learning studies how systems can actively learn about the transition and reward structure of their environments and come to choose appropriate actions. Apart from the links with conditioning and neuromodulation, we have studied various aspects of the trade-off between exploration and exploitation, the effects of approximation and the divination of hierarchical structure.
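The exploration-exploitation trade-off can be illustrated with an epsilon-greedy bandit (a deliberately minimal sketch; the arm reward probabilities are invented): with probability epsilon the agent tries a random arm, otherwise it picks the arm with the highest running value estimate.

```python
import random

# Epsilon-greedy bandit sketch of the exploration/exploitation trade-off.
# True arm reward probabilities are invented for the example.
random.seed(0)
true_means = [0.2, 0.5, 0.8]
q = [0.0, 0.0, 0.0]     # running value estimates
counts = [0, 0, 0]
epsilon = 0.1

for t in range(5000):
    if random.random() < epsilon:
        arm = random.randrange(3)                  # explore
    else:
        arm = max(range(3), key=lambda a: q[a])    # exploit
    reward = 1.0 if random.random() < true_means[arm] else 0.0
    counts[arm] += 1
    q[arm] += (reward - q[arm]) / counts[arm]      # incremental mean update

print(q, counts)
```

After enough steps the best arm dominates the pull counts, but the fixed epsilon means the agent never stops paying an exploration cost — one of many variants of the trade-off studied in the field.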

Network and Relational Data

Someone hands you a data set that represents a small part of a large network - a social network or synaptic network, say. What can you learn from the data about the network as a whole? How should sample data be selected from a network in the first place, in order to be informative? Such questions are fundamental, but much harder than one might expect, and where we have answers, they are often far from obvious. They lead to a rich nexus at the intersection of machine learning, statistics, and probability. Ingredients range from Bayesian modelling and empirical risk minimisation, through old favourites like sufficient statistics and convex analysis, to symmetry properties and dynamical systems.
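A small taste of why such inference is possible at all: under uniform vertex sampling, each vertex pair survives into the induced subgraph with a known probability, so inverse-weighting the observed edges gives an unbiased estimate of the total edge count.  The sketch below is our illustration on a synthetic random graph, not a specific method from the Unit's work.

```python
import random

random.seed(1)

# Horvitz-Thompson-style sketch: estimate the edge count of a large graph
# from the induced subgraph on a uniform vertex sample.  A vertex pair is
# observed with probability n(n-1) / (N(N-1)), so observed edges are
# inverse-weighted by that probability.  The graph itself is synthetic.
N = 200
edges = {tuple(sorted(random.sample(range(N), 2))) for _ in range(1500)}

n = 80
sample = set(random.sample(range(N), n))
observed = [e for e in edges if e[0] in sample and e[1] in sample]

estimate = len(observed) * (N * (N - 1)) / (n * (n - 1))
print(len(edges), round(estimate))  # estimate should land near the truth
```

The hard questions begin when the sampling mechanism is not uniform or not even known — for instance when the observed vertices were chosen because they were easy to record from — which is exactly where the modelling choices above start to matter.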

Neural Data Analysis

The brain is perhaps the most complex subject of empirical investigation in scientific history. The scale is staggering: over 10^11 neurons, each making an average of 10^3 synapses, with computation occurring on scales ranging from a single dendritic spine to an entire cortical area. Slowly, we are beginning to acquire experimental tools that can gather the massive amounts of data needed to characterise this system. However, to understand and interpret these data will also require substantial strides in inferential and statistical techniques. In collaborations with experimental laboratories we have adapted machine learning techniques to characterise data from multiple extracellular electrodes, from identified single cells, as well as from local-field and magnetoencephalographic recordings. These studies have the potential to introduce powerful new theoretically-motivated ways of looking at neural data.
