Guest Talk: Tom Freeman
Motion and moving: Across the senses and across individuals
Vision and hearing face similar problems when the observer moves – eye movements create visual image motion and head movements create acoustic image motion. How does the perceptual brain account for this? We know that the visual system uses ‘extra-retinal signals’ to encode eye velocity; I make a similar case for hearing, arguing that it must rely on ‘extra-cochlear signals’ that encode head velocity. Our experiments show that hearing makes the same kinds of mistakes as vision during head rotation, such that stationary sounds do not appear stationary. I argue that in both cases the errors in perceived stability arise from distortions in motion perception, and that these can be accounted for by a Bayesian model that combines noisy sensory evidence with a slow-motion prior. I provide preliminary evidence supporting this account for hearing (the case for vision having already been established). I then explore how the Bayesian framework could account for individual differences in perception, testing a model that combines motion thresholds and autistic traits to predict the range of distortions in perceived speed that occur when stimuli are pursued by eye.
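
To fix ideas, here is a minimal sketch of how a slow-motion prior produces such biases, assuming (for illustration only – the speaker’s exact formulation may differ) a Gaussian likelihood and a zero-mean Gaussian prior over velocity v, with noisy sensory measurement m:

\[
p(m \mid v) = \mathcal{N}(m;\, v,\, \sigma^2), \qquad
p(v) = \mathcal{N}(v;\, 0,\, \sigma_p^2), \qquad
\hat{v} = \mathbb{E}[v \mid m] = \frac{\sigma_p^2}{\sigma_p^2 + \sigma^2}\, m .
\]

The posterior mean shrinks the measurement toward zero, and more strongly when the evidence is noisy (large σ), so motion is systematically underestimated. If the extra-retinal (or extra-cochlear) self-motion signal is estimated the same way, compensation for eye or head movement is incomplete, and a stationary stimulus can appear to move. On this reading, an individual’s motion thresholds would set σ, which is presumably one route by which thresholds could enter the individual-differences model mentioned above.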