UCL Spatial Computing Lab
The Lab explores how immersive technologies such as VR and motion capture can support creativity, learning and human connection across disciplines.
The Lab
The UCL Spatial Computing Lab was established in 2025 to bring together pioneering spatial computing research at the UCL Institute of Education. The Lab combines existing expertise in a wide range of applied and theoretical fields, such as:
- music composition
- architecture and urban design
- human-computer interaction (HCI)
- education
- artificial intelligence
- game design and development
- museology
- computer science
- 3D animation
- linguistics.
It is a collaborative hub for researchers from different disciplines who study spatial technologies: how they connect people with digital systems, and how they shape creativity, learning and communication.
Mission
The Lab’s mission focuses on two key areas:
- To advance applied research using emerging technologies such as:
  - extended reality (XR)
  - motion capture (including biosensors)
  - full-body haptics
  - artificial intelligence
  - spatial audio.
- To better understand how people use, interact with, and create through these tools.
The Lab highlights the social and cultural dimensions of spatial computing, placing new technologies in the context of human creativity, experience and education.
Equipment
The Lab uses advanced motion capture, wearable sensing and XR devices. These tools support interdisciplinary projects across art, design and science, connecting creative practice, embodied knowledge and HCI in pursuit of a more human-centred understanding of emerging spatial technologies.
These facilities enable research that links creativity, embodiment and computation, allowing researchers to model, measure and imagine how people create, move and play in spatially intelligent environments.
Our digital tools
- Custom Desktops (RTX 5090, 64GB RAM): High-end workstation for real-time simulation, spatial audio rendering, and VR development.
- 3XS Vengeance 18" Laptops (RTX 5090, 64GB RAM): Portable powerhouse for field-based XR demonstrations and data processing.
- Mac Studio M4 Max and MacBook Pro M4: For creative production, media design, and software development in macOS environments.
- Meta Quest 3 Headsets: Wireless mixed reality headsets for multi-user and collaborative immersive experiences.
- Meta Quest Pro Headsets: High-resolution MR devices for advanced spatial interaction studies.
- Bobo VR S3 Headstraps: Comfort and stability enhancements for extended VR sessions.
- Ray-Ban Meta Wayfarer Smart Glasses: Wearable mixed reality eyewear enabling spatial video and sensor-based social interaction research.
- Samsung Galaxy Tab S9 Tablets: Touch and stylus-based mobile interfaces for spatial annotation and creative workflows.
- Kat Walk C 2+ VR Treadmill: Full locomotion platform allowing safe, room-scale navigation in virtual environments.
- Delsys Trigno Centro EMG System: Advanced electromyography (EMG) platform with 8 Avanti sensors, wireless base station, and full EMGWorks software suite. Enables high-fidelity muscle activity and gesture tracking for embodied interaction research.
- Max 9 Software Licenses: Visual programming environment for sound, movement, and interactive media design.
- Unreal Engine, Unity and other development platforms: Game engines and development environments with which the equipment above can be integrated.
- Custom in-house software for bridging cutting-edge biosensors with game engines (Unity); a simplified sketch of this kind of sensor-to-engine bridge follows this list.
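The Lab's in-house bridging software is not published on this page. As a rough illustration of the general pattern only, the sketch below streams simulated sensor samples as JSON datagrams over UDP to a listener such as a Unity scene. The host, port, sample rate and message fields are illustrative assumptions, not the Lab's actual protocol.

```python
import json
import math
import socket
import time

# Illustrative sketch only: streams simulated EMG-style envelope values to a
# game engine (e.g. a Unity scene with a UDP listener). Host, port and message
# fields are assumptions, not the Lab's actual bridge protocol.
ENGINE_HOST = "127.0.0.1"   # machine running the game engine
ENGINE_PORT = 9000          # port the engine-side listener is assumed to use
SAMPLE_RATE_HZ = 60         # demo send rate (real EMG is sampled far faster)


def simulated_emg_sample(t: float, channels: int = 8) -> list[float]:
    """Return one fake envelope value per sensor channel (stand-in for real data)."""
    return [abs(math.sin(t * (i + 1))) for i in range(channels)]


def stream_samples(duration_s: float = 5.0) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    start = time.time()
    while time.time() - start < duration_s:
        t = time.time() - start
        message = {
            "timestamp": t,
            "channels": simulated_emg_sample(t),
        }
        # One JSON datagram per frame; the engine side parses it and maps
        # channel values onto avatar or interaction parameters.
        sock.sendto(json.dumps(message).encode("utf-8"), (ENGINE_HOST, ENGINE_PORT))
        time.sleep(1.0 / SAMPLE_RATE_HZ)
    sock.close()


if __name__ == "__main__":
    stream_samples()
```

A plain UDP datagram keeps latency low and is trivial to parse on the engine side; a real deployment would more likely read from the sensor vendor's SDK and use an established transport such as OSC or Lab Streaming Layer rather than simulating values.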
Get in touch
Please contact us for enquiries or collaborative opportunities.