OpenVisSim: Using virtual/augmented reality to simulate visual impairments


I am using virtual/augmented reality devices, such as Google Cardboard, to create gaze-contingent simulations of visual impairment. The technology is compatible with all major phones and VR devices, and supports both virtual environments and 'augmented reality' (e.g., via the phone's camera). A selection of screenshots is shown on the right, indicating the sorts of image processing techniques that we have developed.
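To give a flavour of how these simulations work, below is a minimal Python sketch of a gaze-contingent 'scotoma' overlay, in which each frame is attenuated around the current gaze position (illustrative only: the function, its parameters, and all numeric values are hypothetical, and this is not the actual OpenVisSim code):

    # Toy gaze-contingent 'scotoma': darken each frame around the gaze point
    import numpy as np

    def apply_scotoma(frame, gaze_xy, radius_px=60, strength=1.0):
        """Attenuate `frame` (H x W x 3 floats, 0-1) around gaze_xy = (x, y) px."""
        h, w = frame.shape[:2]
        ys, xs = np.mgrid[0:h, 0:w]
        dist2 = (xs - gaze_xy[0]) ** 2 + (ys - gaze_xy[1]) ** 2
        mask = np.exp(-dist2 / (2.0 * radius_px ** 2))   # soft-edged spot
        return frame * (1.0 - strength * mask[..., None])

In practice such an effect would be re-rendered every frame, with the gaze position supplied by the headset's eye-tracker.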

Overview and validation:

Jones et al. (2020). Seeing other perspectives: Evaluating the use of virtual and augmented reality to simulate visual impairments (OpenVisSim), npj Digit. Med., 3:32. link

Technical details:

Jones & Ometto (2018). Degraded Reality: Using VR/AR to simulate visual impairments, 2018 IEEE Workshop: VAR4Good. link pdf

Source code:

https://github.com/petejonze/OpenVisSim

Example application:

Chow-Wing-Bom et al. (2020). The worse eye revisited: Quantifying the everyday impact of unilateral peripheral loss, Vis. Res., 169:49-57. link



EyecatcherHome: Home monitoring of visual field loss for glaucoma


I am studying whether glaucoma patients are willing and able to perform visual field testing at home, using our own custom perimeter:

Jones et al. (2020). Glaucoma home-monitoring using a tablet-based visual field test (Eyecatcher): An assessment of accuracy and adherence over six months, medRxiv (pre-print), doi:10.1101/2020.05.28.20115725. link



Eyecatcher: An open source eye-movement perimeter


Eyecatcher is a button-less 'eye-movement' perimeter, in which the patient simply has to look at lights as they appear on the screen. A near-infrared camera tracks the patient's eyes, and this gaze signal is used both to position each stimulus (relative to the current point of fixation) and to determine whether a light was seen (looked at).
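A minimal sketch of the logic behind a single trial is given below (illustrative only: the get_gaze/show_stimulus callbacks and all numeric values are hypothetical, not taken from the Eyecatcher source):

    # One gaze-contingent trial: place a light relative to current fixation,
    # then score it as 'seen' if the eyes move to it before a timeout
    import math, time

    def run_trial(get_gaze, show_stimulus, offset_deg,
                  timeout_s=2.0, hit_radius_deg=3.0):
        fx, fy = get_gaze()                              # current fixation (deg)
        sx, sy = fx + offset_deg[0], fy + offset_deg[1]  # stimulus location
        show_stimulus(sx, sy)
        t0 = time.time()
        while time.time() - t0 < timeout_s:
            gx, gy = get_gaze()
            if math.hypot(gx - sx, gy - sy) < hit_radius_deg:
                return True                              # fixated: 'seen'
        return False                                     # never fixated: 'not seen'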


I am exploring Eyecatcher as a way of quickly triaging patients on arrival at busy glaucoma clinics:

Jones et al. (2020). Using an open-source tablet perimeter (Eyecatcher) as a rapid triage measure in a glaucoma clinic waiting area, BJO, in press

Jones (2020). An open-source static threshold perimetry test using remote eye-tracking (Eyecatcher): Description, validation, and normative data, TVST, in press

Jones et al. (2019). Portable perimetry using eye-tracking on a tablet computer – a feasibility assessment, TVST, 8(1):17. link pdf

Source code:

https://github.com/petejonze/Eyecatcher



pCSF: A child-friendly CSF measure for amblyopia


The pCSF is a fun, tablet-based test of the Contrast Sensitivity Function (CSF). The child's task is simply to 'pop' (press) Gabor patches as they bounce around the screen. Behind the scenes, a Bayesian adaptive algorithm (QUEST+) is used to estimate detection thresholds at each spatial frequency.
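The core Bayesian idea is sketched below. Note this is a toy version only: the full QUEST+ algorithm additionally selects each stimulus so as to minimise the expected entropy of the posterior, and all numeric values here are illustrative:

    # Posterior update over candidate thresholds, given one trial's outcome
    import numpy as np

    thresholds = np.linspace(-2, 2, 81)       # candidate log-contrast thresholds
    posterior = np.ones_like(thresholds) / len(thresholds)   # flat prior

    def p_pop(stim, thresh, slope=3.5, guess=0.5, lapse=0.02):
        """Weibull-style psychometric function: P(target detected)."""
        return guess + (1 - guess - lapse) * (1 - np.exp(-10 ** (slope * (stim - thresh))))

    def update(posterior, stim, seen):
        like = p_pop(stim, thresholds)
        posterior = posterior * (like if seen else 1 - like)
        return posterior / posterior.sum()

    posterior = update(posterior, -1.0, seen=True)   # child 'popped' at -1.0
    estimate = thresholds[np.argmax(posterior)]      # MAP threshold estimate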

Elfadaly et al. (2020). Can psychophysics be fun? Exploring the feasibility of a gamified Contrast Sensitivity Function measure in amblyopic children aged 4 – 9 years, Front. Med., in press

Supporting papers:

Farahbakhsh et al. (2019). Psychophysics with children: Evaluating the use of maximum likelihood estimators in children aged 4 – 15 years (QUEST+), J. Vis., 19:22. link pdf

Jones (2018). QuestPlus: a MATLAB implementation of the QUEST+ adaptive psychometric method, J. Open Res. Soft., 6(1):27. link pdf

Source code:

https://github.com/petejonze/pCSF



Lapse Tracking: Detecting and accounting for lapses in concentration


Lapses in concentration can result in misleading test data, and are a particular problem in children and certain clinical populations:

Manning et al. (2018). Psychophysics with children: Investigating the effects of attentional lapses on threshold estimates, Atten., Percep., & Psycho., 80(5): 1311–1324. link pdf

Jones et al. (2015). Optimizing the rapid measurement of detection thresholds in infants, J. Vis., 15(11):2. link pdf

I am interested in using affective computing techniques (e.g., machine learning) to detect and adjust for lapses in concentration in real time (a toy illustration is given after the papers below):

Jones et al. (2020). The human touch: Using a webcam to autonomously monitor compliance during visual field assessments, TVST, in press (accepted 17th March 2020).

Jones (2018). Sit still and pay attention: Using the Wii Balance-Board to detect lapses in concentration in children during psychophysical testing, Behav. Res. Methods, 51:28-39. link pdf
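As a toy illustration of the general idea, the sketch below flags probable lapses from response times alone; the published work above uses much richer signals (webcam video, balance-board posture) and machine learning:

    # Flag trials whose response time is an extreme outlier vs recent history
    import numpy as np

    def flag_lapses(rts, window=10, z_crit=3.0):
        rts = np.asarray(rts, dtype=float)
        flags = np.zeros(len(rts), dtype=bool)
        for i in range(window, len(rts)):
            mu, sd = rts[i-window:i].mean(), rts[i-window:i].std()
            if sd > 0 and (rts[i] - mu) / sd > z_crit:
                flags[i] = True          # unusually slow: possible lapse
        return flags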



ACTIVE: An automated preferential looking acuity test for infants


The world's first fully-automated infant acuity test, using a computer monitor and remote eye-tracking to perform preferential looking (a sketch of the basic decision rule is given after the papers below):

Jones et al. (2014). Automated measurement of resolution acuity in infants using remote eye-tracking, IOVS, 55(12):8102-8110. link pdf

Jones et al. (2015). Optimizing the rapid measurement of detection thresholds in infants, J. Vis., 15(11):2. link pdf
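A minimal sketch of the preferential-looking decision rule (illustrative only: the dwell-proportion criterion and variable names are hypothetical):

    # Did the infant look predominantly towards the side containing the grating?
    import numpy as np

    def looked_at_grating(gaze_x, grating_on_left, screen_mid=0.0, min_prop=0.6):
        gaze_x = np.asarray(gaze_x, dtype=float)
        on_left = gaze_x < screen_mid
        prop = np.mean(on_left if grating_on_left else ~on_left)
        return prop >= min_prop          # True -> scored as 'seen'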



Misc: Child-friendly measures of functional vision

I am interested in developing rapid behavioural measures of vision, suitable for use in infants and children. Remote eye-tracking can be used to locate stimuli precisely on the retina, and to record eye-movement responses. Self-calibrating monitors can be used to generate precisely calibrated stimuli. And efficient (Maximum A Posteriori) psychophysical algorithms can be used to rapidly and accurately determine various perceptual thresholds (e.g., the dimmest light that the observer is able to detect/discriminate).


Visual Acuity testing

Acuity refers to the finest spatial detail an observer can resolve. In adults, this is typically measured by asking the patient to read letters of diminishing size. An 'infant friendly' alternative is to find the finest black-and-white grating that can be distinguished from an equiluminant grey background.
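For reference, grating acuity converts straightforwardly to logMAR: a 30 cycles-per-degree grating has 1 arcmin bars, i.e., logMAR 0 (6/6 vision):

    import math

    def cpd_to_logmar(cpd):
        """Minimum angle of resolution (arcmin) for a grating is 30/cpd."""
        return math.log10(30.0 / cpd)

    print(cpd_to_logmar(30))    # 0.0 -> 6/6
    print(cpd_to_logmar(7.5))   # 0.6 -> roughly 6/24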


For more details: Jones et al. (2014). Automated measurement of resolution acuity in infants using remote eye-tracking, IOVS, 55(12):8102-8110. pdf


Visual Field testing

A more advanced use of the eye-tracker is to position stimuli relative to the patient's current point-of-fixation. In this way, the patient's sensitivity to light can be mapped out across their visual field, producing a 'heatmap' of vision loss. We all have one 'blind spot': the area of the retina where the optic nerve exits the eye and there are no photoreceptors. Some patients may exhibit additional 'scotomas', due to illness or injury.



Colour Sensitivity testing

The same basic technology can also be used to measure chromatic sensitivity functions. A background of random luminance noise prevents observers from exploiting small differences in luminance, while large patches allow observers with low acuity to perform the test.
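A toy sketch of such a stimulus is given below (illustrative only: the fixed RGB nudges are only a crude approximation to isoluminance, which in practice requires a calibrated display):

    # Grey luminance-noise field with a large, luminance-matched colour target
    import numpy as np

    rng = np.random.default_rng(0)
    lum = rng.uniform(0.3, 0.7, size=(32, 32))       # random cell luminances
    img = np.stack([lum, lum, lum], axis=-1)         # grey RGB noise field

    # The target reuses the SAME random luminances, so only its colour differs
    img[12:20, 12:20, 0] += 0.1                      # nudge red up...
    img[12:20, 12:20, 1] -= 0.1                      # ...and green down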






Contrast Sensitivity testing (the CSF)

The CSF measures the smallest modulation in contrast (the faintest black-and-white lines) that can be detected for various levels of spatial frequency (fineness of lines). To make the test as rapid and reliable as possible, we are using the new QUEST+ algorithm to efficiently estimate thresholds across multiple spatial scales (the red dashed line).


For more details: Farahbakhsh et al. (2019). Psychophysics with children: Evaluating the use of maximum likelihood estimators in children aged 4 – 15 years (QUEST+), J. Vis., 19:22. link pdf


Flicker Sensitivity testing (the tCSF)

The temporal equivalent of the CSF: the tCSF measures the most rapid modulation in time that the observer can detect (after which point, the light appears to stay constant). We can use Silent Substitution to target particular classes of photoreceptor.
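Silent substitution amounts to solving a small linear system: choose a modulation of the display primaries that changes the excitation of the targeted photoreceptor class while leaving the others constant. A toy example follows (the cone-excitation matrix is made up; in practice it would be derived from measured display primaries and cone fundamentals):

    # Modulate S-cones while 'silencing' L- and M-cones
    import numpy as np

    # rows: L, M, S cone excitation per unit of each primary (cols: R, G, B)
    cones_per_primary = np.array([[0.60, 0.35, 0.05],
                                  [0.30, 0.55, 0.10],
                                  [0.02, 0.08, 0.90]])

    # Solve for the RGB modulation producing cone modulation (0, 0, 1)
    delta_rgb = np.linalg.solve(cones_per_primary, np.array([0.0, 0.0, 1.0]))
    # flicker the background +/- a scaled delta_rgb to target S-cones only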





Misc: Understanding sensory integration (aka "Perceptual Averaging")


I am interested in understanding how different sources of sensory information (e.g., two sounds, or sight and sound) are combined (the textbook prediction is sketched below):

Jones (2016). A tutorial on cue combination and Signal Detection Theory: Using changes in sensitivity to evaluate how observers integrate sensory information, Journal of Mathematical Psychology, 73:117–139. pdf
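The textbook prediction is that an ideal observer weights each cue by its reliability (inverse variance), yielding a combined estimate more precise than either cue alone. A minimal sketch (all numbers illustrative):

    import numpy as np

    def combine(estimates, sigmas):
        """Reliability-weighted average, plus the predicted combined sd."""
        w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
        combined = np.dot(w / w.sum(), estimates)
        sigma_combined = w.sum() ** -0.5
        return combined, sigma_combined

    # vision: source at 10 deg (sd 2); audition: 14 deg (sd 4)
    print(combine([10.0, 14.0], [2.0, 4.0]))   # (10.8, ~1.79)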

In particular, how the ability to combine information develops in childhood:

Jones et al. (2018). Efficient visual information sampling develops late in childhood, J. Exp. Psych.: General, 148(7):1138-1152. link

Jones (2018). The development of perceptual averaging: efficiency metrics in children and adults using a multiple-observation sound-localization task, JASA, 144(1):228-241. link pdf

Jones & Dekker (2018). The development of perceptual averaging: Learning what to do, not just how to do it, Dev. Sci., 21(3):e12584. link pdf

Nardini, et al. (2008). Development of cue integration in human navigation, Current Biology, 18:689-693. pdf

And how the ability to combine information is affected by sensory impairment:

Garcia et al. (2017). Multisensory cue combination after sensory loss: Audio-Visual localization in patients with progressive retinal disease, J. Exp. Psych: Human Perception and Performance. 43(4):729-740. link pdf

Garcia et al. (2017). Auditory localisation biases increase with sensory uncertainty, Sci. Rep., 7:40567. link pdf



Misc: Understanding perceptual learning


Practice improves performance on many basic auditory tasks. However, while the phenomenon of auditory perceptual learning is well established, little is known about the mechanisms underlying such improvements. What is learned during auditory perceptual learning? In my PhD, I attempted to answer this question by applying models of performance to behavioural response data, and examining which parameters change with practice.

On a simple pure tone discrimination task, learning was shown to represent a reduction in internal noise:

Jones et al. (2013). Reduction of internal noise in auditory perceptual learning, JASA, 133:970-981. pdf
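In signal detection terms, sensitivity (d′) depends on both external (stimulus) noise and internal (observer) noise, so reducing internal noise raises d′ even when the stimulus is unchanged. A minimal illustration (values made up):

    import math

    def d_prime(signal, sigma_external, sigma_internal):
        return signal / math.sqrt(sigma_external**2 + sigma_internal**2)

    print(d_prime(1.0, 0.5, 1.0))   # novice:    ~0.89
    print(d_prime(1.0, 0.5, 0.4))   # practised: ~1.56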

However, in a more complex auditory detection task, learning and development were shown to also involve improvements in listening strategy, with listeners becoming better able to selectively attend to task-relevant information:

Jones et al. (2015). Development of auditory selective attention: Why children struggle to hear in noisy environments, Dev. Psych., 51(3):353-69, doi: 10.1037/a0038570. pdf

Jones et al. (2014). Learning to detect a tone in unpredictable noise, JASA, 135:EL128-EL133. pdf

Finally, task performance was shown to be constrained not just by the strength of the sensory evidence, but also by non-sensory factors such as bias and attentiveness:

Jones et al. (2015). The role of response bias in perceptual learning, J. Exp. Psych.: LMC, in press.
