UCLIC seminar: 'Designing Multisensory Interfaces' by Dr. Jin Ryong Kim
Designing Multisensory Interfaces: From Perception to Human-Centered Intelligence
Abstract
Our digital experiences remain largely visual and auditory, leaving much of our sensory potential underexplored. My research explores how multisensory feedback, including touch, temperature, moisture, and other sensory cues, can make interactions with technology more expressive, embodied, and empathetic. At the Multimodal Interaction Lab, we take a perception-driven and illusion-based approach, studying how people perceive and integrate sensory cues and translating these insights into compact yet compelling XR experiences. By treating perception as a design material, we transform psychophysical insights into expressive forms of human–computer interaction, including interfaces that convey social touch and subtle affective signals through multisensory feedback.
In this talk, I will describe how thermal–tactile integration evokes compelling sensations in XR environments and introduce thermal masking, a perceptual phenomenon we discovered in which vibration alters temperature perception. Vibration can redirect sensations of warmth or coolness, creating vivid thermal illusions while reducing hardware complexity and power demands. Building on this principle, we developed systems such as HeatFlow and Fiery Hands that deliver dynamic thermal feedback for expressive and energy-efficient interaction. Extending this approach, our work explores multisensory interfaces that combine touch, temperature, moisture, and motion to communicate information, emotion, and intent, and points toward future AI-powered agents that interact through rich sensory channels to enable more natural and emotionally intelligent human–computer interaction.
Short Bio
Dr. Jin Ryong Kim is an Assistant Professor in the Department of Computer Science at The University of Texas at Dallas, where he directs the Multimodal Interaction Lab (https://mi-lab.io). His research lies at the intersection of human–computer interaction, multisensory interfaces, and extended reality, with a focus on designing systems that integrate touch, temperature, and other sensory cues to create more immersive and expressive interactions. By combining psychophysics, interface design, and engineering, his work explores how human perception can inform the design of compact and energy-efficient multisensory systems for virtual and augmented reality.
Dr. Kim received his Ph.D. in Electrical and Computer Engineering from Purdue University and previously worked as a researcher at Alibaba Group and the Electronics and Telecommunications Research Institute (ETRI). His work has received several recognitions, including the NSF CAREER Award, a CHI Best Paper Honorable Mention, a UIST Best Paper Honorable Mention, and multiple Best Demo Awards at venues such as the IEEE World Haptics Conference, ISMAR, and SIGGRAPH Asia. His research has been published at leading venues including CHI, UIST, ISMAR, IEEE VR, and IEEE Transactions on Haptics.
Getting to 66–72 Gower Street
Please go directly to 66–72 Gower Street. The entrance is clearly marked with the street number and opens directly onto Gower Street.
This entrance is step-free and accessible, with a ramp leading up to the main doors.
You can use the UCL map here:
https://maps.ucl.ac.uk/66-72-gower-street
Once inside, you will arrive in a small lobby area. The room we are using is immediately to the right of the lobby, so you should see it as soon as you enter. If you are unsure, someone from our team will be nearby to help guide you.
Further information
Ticketing: Open
Cost: Free
Open to: All
Availability: Yes