movement: a Python package that simplifies analysis of animals in motion.
Working with Dr Sepiedeh Keshavarzi and the movement core development team, we added features, examples, and datasets to make it easier for researchers to study how animals (and eyes!) move.

12 May 2025
Written by: Dr Stella Prins
Advances in animal tracking
One important aspect of behavioural studies is quantifying how animals move. Recently developed tools such as DeepLabCut and SLEAP have made this much easier: they use deep learning methods to label videos with “keypoints” on animals, for example their ears, snout, or paws.
movement
There is, however, no standardised way to process and analyse the outputs, or “pose tracks”, of these tools. To address this, the Neuroinformatics Unit (NIU) – a group of research software engineers embedded in the Sainsbury Wellcome Centre (SWC) and Gatsby Computational Neuroscience Unit (GCNU) – has been developing a software package called movement. The aim of movement is to make it easy to process, analyse, and visualise these pose tracks.
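For instance, loading the predictions produced by one of these tools takes just a couple of lines. A minimal sketch (the file path and frame rate here are made up):

```python
from movement.io import load_poses

# Load SLEAP predictions into an xarray Dataset with dimensions
# (time, individuals, keypoints, space); fps converts frame indices
# into seconds. The file path is hypothetical.
ds = load_poses.from_sleap_file("path/to/predictions.analysis.h5", fps=30)
print(ds)  # data variables: position and confidence
```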
One of the researchers interested in using movement for her work is Dr Sepiedeh (Sepi) Keshavarzi, head of the Brain Circuits for Sensation and Cognition lab at Cambridge University. Sepi made Wellcome Trust funding available to implement new movement features and to develop tutorials demonstrating movement in the kinds of workflows her group uses. Will Graham and I, from the ARC Collaborations team, worked closely with Niko, who leads the movement development team, to achieve this.
Working together in a short but focused period enabled us to combine our skills and knowledge to tackle issues, leading to quick and effective development. Will's background in mathematics and familiarity with vector operations was instrumental in tackling the more technical issues, while my background in experimental neuroscience gave me insight into how to handle eye movement data and helped us grasp the needs of users like Sepi. We also learned from each other: Will introduced me to polar-to-Cartesian coordinate conversions, the right-hand rule, and some neat ways to parametrize tests, and from Niko I learned how to easily plot multifaceted data using xarray's plotting features (see the sketch below).
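For the curious, here is a small self-contained sketch of both of those tricks, with made-up data:

```python
import numpy as np
import xarray as xr

# Polar-to-Cartesian conversion: (rho, phi) -> (x, y)
rho, phi = 2.0, np.pi / 4
x, y = rho * np.cos(phi), rho * np.sin(phi)

# xarray's faceted plotting: one subplot per keypoint, in a single call
da = xr.DataArray(
    np.random.default_rng(0).standard_normal((100, 3)),
    dims=("time", "keypoints"),
    coords={
        "time": np.arange(100) / 30,  # seconds, assuming 30 fps
        "keypoints": ["snout", "left_ear", "right_ear"],
    },
)
da.plot.line(x="time", col="keypoints")
```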
In an intense burst of work, we enabled handling of regions of interest (ROIs), made it easier to plot results, and added new sample datasets and examples demonstrating how movement is used in common workflows. Will wrote a comprehensive blog post detailing the work we did and the roadmap we followed.
Regions of Interest
One of the features the team was keen to introduce was support for labelling ROIs within movement. Typically, an experiment takes place in an arena that can be divided into distinct regions, such as the animal's nest, or two chambers separated by a doorway or corridor. The position of an animal relative to these ROIs provides insight into its behaviour. For example, ROIs allow researchers to analyse how much time the animal spends in each region and the orientation of its head relative to labelled objects. Using this information, researchers can deduce whether the animal prefers certain areas and whether it is likely paying attention to visual cues. Such insights are invaluable for understanding perception and cognitive processes, including how the animal navigates its environment, responds to stimuli, and forages.
We introduced functionality for checking whether animal keypoints occupy ROIs at each time point during the experiment, for computing a keypoint's angle or distance relative to an ROI, and a means of visualising ROIs on top of spatial plots. This example shows how these new features can be used in a typical workflow. This work has been an important stepping stone for current development that will allow users to graphically define and interact with ROIs via movement’s napari plugin. Having a convenient representation for ROIs, as well as a simple interface for extracting information from them, has been a core development objective of the project.
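As a taste of what this looks like in practice, here is a minimal sketch. The class and method names (PolygonOfInterest, contains_point, compute_distance_to) reflect my reading of the ROI interface and should be treated as assumptions; the example linked above is the authoritative reference:

```python
import numpy as np
import xarray as xr
from movement.roi import PolygonOfInterest  # name assumed, see note above

# A square "nest" region defined by its corner coordinates (made up)
nest = PolygonOfInterest([(0, 0), (0, 10), (10, 10), (10, 0)], name="nest")

# Synthetic centroid positions standing in for real pose tracks
position = xr.DataArray(
    np.random.default_rng(0).uniform(0, 20, size=(100, 2)),
    dims=("time", "space"),
    coords={"space": ["x", "y"]},
)

in_nest = nest.contains_point(position)        # inside the ROI at each time point?
distance = nest.compute_distance_to(position)  # distance to the ROI at each time point

# Fraction of the recording spent in the nest
print(float(in_nest.mean()))
```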
Eye opening
One of the things Sepi investigates in her research is how the brain combines visual input with motion cues from the head and eyes. To do this, she uses pose tracks of moving eyes rather than of whole animals. While pupillometry is outside movement’s initial design intentions – movement was conceived and developed to analyse animal postures – we soon realised that existing movement features could be used to quantify pupil position, velocity, and diameter. To demonstrate this, we added two sample pupillometry datasets (now available through the movement package) and wrote this example dedicated to analysing eye pose tracks. This widens the potential scope and user base of movement, expanding its use into research areas beyond the one it was originally intended for!
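To give a flavour of how existing features transfer to eye data, here is one plausible approach (not necessarily the exact one taken in the example): treat two keypoints on opposite pupil edges as a pose, then derive position, diameter, and velocity from them. The keypoint names and synthetic data are made up:

```python
import numpy as np
import xarray as xr
from movement.kinematics import compute_velocity

# Synthetic tracks for two keypoints on opposite edges of the pupil
rng = np.random.default_rng(0)
position = xr.DataArray(
    rng.normal(size=(200, 2, 2)),
    dims=("time", "keypoints", "space"),
    coords={
        "time": np.arange(200) / 60,  # seconds, assuming 60 fps
        "keypoints": ["pupil_left", "pupil_right"],
        "space": ["x", "y"],
    },
)
left = position.sel(keypoints="pupil_left")
right = position.sel(keypoints="pupil_right")

centre = (left + right) / 2                             # pupil position
diameter = np.sqrt(((right - left) ** 2).sum("space"))  # pupil diameter
velocity = compute_velocity(centre)                     # pupil velocity
```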
Plotting made easy
Some of the smaller features we worked on during this collaboration involved standardising and simplifying how plots of datasets are produced (this can now be done with one function call rather than a dozen lines of manual code), expanding the existing suite of examples, and adding methods for scaling the units and values in datasets consistently.
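A rough sketch of what that one-call workflow can look like; the function names match recent movement releases as I understand them, but the file path and pixel-to-centimetre factor are made up, so treat the details as assumptions:

```python
from movement.io import load_poses
from movement.plots import plot_centroid_trajectory
from movement.transforms import scale

# Load pose tracks (hypothetical DeepLabCut output file)
ds = load_poses.from_dlc_file("path/to/tracks.h5", fps=30)

# Rescale positions from pixels to centimetres (factor is made up)
position_cm = scale(ds.position, factor=1 / 12.5, space_unit="cm")

# Plot the centroid trajectory with a single call
plot_centroid_trajectory(position_cm)
```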
Join the movement!
Inspired to learn more about movement and make your own contribution?
The first Neuroinformatics Unit Open Software Week is taking place at the Sainsbury Wellcome Centre in London from 11th to 15th August 2025, and includes an “animals in motion” track for researchers and students interested in learning about the latest open-source tools for tracking and analysing animal movement from videos.
Unable to make it? movement is open-source software and contributions are very much encouraged; see the movement community page for more information on how to get involved.
People
Cambridge University Brain Circuits for Sensation and Cognition lab:
SWC and GCNU NIU: