Cognitive behavior of humans, animals, and machines: Computational and experimental perspectives

Colloquium
Date: 19 October 2018
Begin time: 10:30
End time: 12:00
Room: 1.204

Multisensory Integration: A Computational Understanding of Behaviour & Brain Activity

Everyday perception requires the brain to organise a plethora of sensory inputs into a seemingly simple and coherent percept. This requires solving two problems: first, deciding which pieces of information belong together, given some belief that they originate from the same object; and second, merging the available redundant multisensory information into a unified representation. Our brain is adept at combining prior knowledge with incoming information to infer the causal structure of the sensory environment. This in turn allows us to flexibly arbitrate between integrating and segregating specific pieces of sensory evidence, an ability crucial for parsing the sensory environment into individual objects. I'll present some of our recent studies investigating the perceptual and neural mechanisms of multisensory perception. These studies use a combination of computational modelling and neuroimaging to link specific sensory computations with behaviour and its underlying neural processes. In particular, I'll focus on studies addressing the contextual dependence of multisensory integration. These reveal a critical role of inferior frontal brain regions in mediating the flexibility to use current and previous sensory information in a context-dependent manner, deciding at each moment how best to use the available evidence.
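
The arbitration between integrating and segregating cues described above is commonly formalised as Bayesian causal inference (in the spirit of Körding et al., 2007). The Python sketch below is a minimal, illustrative implementation of that standard account applied to audio-visual spatial localisation; it is not the specific model used in the studies presented in the talk, and the function name, Gaussian noise assumptions, and parameter values are chosen purely for clarity.

import numpy as np

def causal_inference_estimate(x_a, x_v, sigma_a, sigma_v, sigma_p=20.0, p_common=0.5):
    """Toy Bayesian causal-inference model for audio-visual localisation.

    x_a, x_v       : noisy auditory and visual measurements (degrees)
    sigma_a, sigma_v: sensory noise standard deviations
    sigma_p        : width of a zero-centred spatial prior
    p_common       : prior probability that both cues share one cause
    All values are illustrative, not fitted to data.
    """
    var_a, var_v, var_p = sigma_a**2, sigma_v**2, sigma_p**2

    # Likelihood of both measurements under a single common cause (C = 1)
    var_common = var_a * var_v + var_a * var_p + var_v * var_p
    like_c1 = np.exp(-0.5 * ((x_a - x_v)**2 * var_p
                             + x_a**2 * var_v + x_v**2 * var_a) / var_common) \
              / (2 * np.pi * np.sqrt(var_common))

    # Likelihood under two independent causes (C = 2)
    like_c2 = (np.exp(-0.5 * x_a**2 / (var_a + var_p)) / np.sqrt(2 * np.pi * (var_a + var_p))) \
            * (np.exp(-0.5 * x_v**2 / (var_v + var_p)) / np.sqrt(2 * np.pi * (var_v + var_p)))

    # Posterior probability of a common cause
    post_c1 = like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))

    # Optimal location estimates under each causal structure:
    # reliability-weighted fusion if C = 1, auditory cue alone if C = 2
    s_fused = (x_a / var_a + x_v / var_v) / (1 / var_a + 1 / var_v + 1 / var_p)
    s_aud_only = (x_a / var_a) / (1 / var_a + 1 / var_p)

    # Model averaging: weight the two estimates by the causal posterior
    s_aud = post_c1 * s_fused + (1 - post_c1) * s_aud_only
    return s_aud, post_c1

# A small audio-visual conflict is mostly integrated,
# a large conflict is mostly attributed to separate causes.
for x_v in (3.0, 25.0):
    est, p_c1 = causal_inference_estimate(x_a=0.0, x_v=x_v, sigma_a=8.0, sigma_v=2.0)
    print(f"visual at {x_v:>5.1f} deg -> P(common cause) = {p_c1:.2f}, auditory estimate = {est:.1f} deg")

Running the example shows the context dependence the abstract refers to: with a small audio-visual conflict the posterior favours a common cause and the estimate is pulled towards the fused, reliability-weighted location, whereas a large conflict shifts the posterior towards separate causes and the cues are kept segregated.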