Guest Talk: Wolfram Schenk

Colloquium
Date: 24 November 2015
Begin time: 18:00
End time: 19:30
Room: 2.015

Applications of saccade-triggered visual prediction

The prediction of future sensory or system states is assumed to be important in many areas of motor control, perception, and cognition, both for biological organisms and for artificial agents such as robots. The basis for the studies presented in this talk is a visual forward model which predicts future visual states in the context of saccade-like camera movements on a robot setup. We explore how such a model can enable a cognitive agent to perform various perceptual and motor tasks: learning saccadic eye movements, grasping, mental imagery, and depth perception. Each of these applications is not merely a theoretical proposition but has been fully implemented on a real-world robotic setup, or at least in partial simulations of this setup. In this way, the underlying modeling assumptions have passed a first real-world test and are viable candidates for explaining cognitive processes in biological organisms.
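
The core ingredient described above is a forward model that maps the current visual state and a saccade (camera) command to a prediction of the post-saccadic visual state. The sketch below is only an illustration of that idea, not the speaker's implementation: it assumes the visual state can be summarized as a fixed-length feature vector and fits a simple linear predictor by least squares on hypothetical training pairs; all names, dimensions, and the random stand-in data are placeholders.

import numpy as np

rng = np.random.default_rng(0)

n_visual = 32   # dimensionality of the visual feature vector (assumed)
n_motor = 2     # saccade command: pan and tilt displacement (assumed)

# Hypothetical training data: pre-saccadic features, saccade commands,
# and the post-saccadic features observed after the camera movement.
pre = rng.normal(size=(500, n_visual))
cmd = rng.normal(size=(500, n_motor))
post = rng.normal(size=(500, n_visual))  # stand-in for recorded observations

# Stack visual state and motor command into one input vector per sample.
X = np.hstack([pre, cmd])

# Fit W so that X @ W approximates the post-saccadic visual state.
W, *_ = np.linalg.lstsq(X, post, rcond=None)

def predict_post_saccadic(visual_state, saccade_command):
    """Predict the visual state expected after executing the saccade."""
    x = np.concatenate([visual_state, saccade_command])
    return x @ W

predicted = predict_post_saccadic(pre[0], cmd[0])
print(predicted.shape)  # (32,)

In the applications mentioned in the abstract, such a predictor would typically be learned from the robot's own sensorimotor experience and then reused, for example, to anticipate the visual consequences of a planned saccade or to support mental imagery without executing the movement.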