Prof. Dr. Andrew B. Schwartz | University of Pittsburgh, USA
A better understanding of neural population function would be an important advance in systems neuroscience. The change in emphasis from the single neuron to the neural ensemble has made it possible to extract high-fidelity information about movements that will occur in the near future. Neurons encode many parameters simultaneously. Although the correlation between firing rate and any single parameter may be weak, extraction methods based on multiple neurons are capable of generating a faithful representation, or decoding, of intended movement. The realization that useful information is embedded in the population has spawned the current success of brain-controlled interfaces. We have been gradually increasing the degrees of freedom (DOF) that a subject can control through the interface. Our early work showed that 3 dimensions could be controlled in a virtual reality task. We then demonstrated control of an anthropomorphic physical device with 4 DOF in a self-feeding task. Currently, monkeys in our laboratory are using this interface to control a very realistic prosthetic arm with a wrist and hand to grasp objects in different locations and orientations. This technology has now been extended to a paralyzed patient who cannot move any part of her body below her neck. Based on our laboratory work, and using a high-performance “modular prosthetic limb”, she has been able to control 10 degrees of freedom simultaneously. The control of this artificial limb is intuitive, and the movements are coordinated and graceful, closely resembling natural arm and hand movement. This subject has been able to perform tasks of daily living, such as reaching to, grasping, and manipulating objects, as well as spontaneous acts such as self-feeding. Current work in a second subject, who was implanted with additional stimulating electrodes in the finger region of his sensory cortex, is progressing.
With this addition, we expect to provide tactile feedback to the subject from the prosthetic fingers as he grasps objects and develops dexterity with the device.
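How a faithful movement signal can emerge from many weakly tuned neurons can be illustrated with the classic population vector algorithm from this line of work. The sketch below is a simplified illustration, not the laboratory's actual decoder; the neuron count, cosine-tuning parameters, and noise level are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative ensemble: 100 neurons with cosine tuning, each assigned a
# random preferred movement direction in the 2-D plane (invented parameters).
n_neurons = 100
preferred = rng.uniform(0, 2 * np.pi, n_neurons)  # preferred directions (rad)
baseline, gain = 10.0, 8.0                        # firing-rate parameters (Hz)

def firing_rates(theta):
    """Cosine-tuned firing rates for an intended movement direction theta."""
    return baseline + gain * np.cos(theta - preferred)

def population_vector(rates):
    """Weight each neuron's preferred direction by its rate modulation and sum."""
    w = rates - baseline
    x = np.sum(w * np.cos(preferred))
    y = np.sum(w * np.sin(preferred))
    return np.arctan2(y, x)

intended = np.deg2rad(60.0)
# Each single neuron is noisy, so its rate is only weakly informative...
rates = firing_rates(intended) + rng.normal(0.0, 1.0, n_neurons)
# ...but the population readout recovers the intended direction closely.
decoded = population_vector(rates)
print(np.rad2deg(decoded))
```

The point of the example is the averaging: individual rates are corrupted by noise, yet summing the weighted preferred directions across the ensemble yields a direction estimate within a few degrees of the intended one.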
The research projects at CITEC in the field of motion intelligence aim at understanding the mechanisms underlying the control of natural movement and action sequences. To this end, the adaptive locomotion abilities of insects as well as humans are studied, their multimodal sensory processing is analysed, and the identified mechanisms are applied to technical systems. Select this stream and you will get insights into the research of the Biological Cybernetics group (Dürr), the Neurobiology group (Egelhaaf), and the Cognitive Neuroscience group (Ernst/Boeddeker).
One workshop afternoon will focus on the function of active tactile sensing (touch) and distributed proprioception (the sense of posture) in insects, including a practical part on motion capture of freely walking stick insects. Approaches from behavioural physiology (e.g., motion capture), electrophysiology (e.g., intracellular recordings) and biomimetic modeling will be presented.
Another workshop focuses on the quantitative analysis of insect behaviour, based on the assumption that the insect brain acquires the relevant spatial information about the environment by means of the visual system. Here, neurophysiological data are gathered with electrophysiology, the underlying mechanisms are explored in simulations, and the resulting models are verified in robotic applications. The smart principles of biological visual systems are incorporated into artificial systems, bringing them closer to the performance of their biological counterparts.
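One classic "smart principle" of insect vision that has been carried over into artificial systems is correlation-based motion detection. The sketch below implements a Reichardt-type elementary motion detector (EMD) as a general illustration of the principle, not the groups' specific model; the filter constant and stimulus parameters are invented:

```python
import numpy as np

def low_pass(signal, alpha=0.2):
    """First-order low-pass filter, serving as the EMD's delay stage."""
    out = np.zeros_like(signal)
    for t in range(1, len(signal)):
        out[t] = out[t - 1] + alpha * (signal[t - 1] - out[t - 1])
    return out

def emd_response(left, right):
    """Reichardt correlator: each receptor's delayed signal is multiplied
    with the neighbour's undelayed signal; the difference of the two
    mirror-symmetric half-detectors signals motion direction."""
    return np.mean(low_pass(left) * right - left * low_pass(right))

# A sinusoidal pattern drifting rightward reaches the left receptor first,
# so the right receptor sees the same signal with a temporal phase lag.
t = np.arange(0, 200)
phase_lag = 0.5                       # spatial separation as phase (rad)
left = np.sin(0.2 * t)
right = np.sin(0.2 * t - phase_lag)

rightward = emd_response(left, right)  # positive for rightward motion
leftward = emd_response(right, left)   # sign flips for the opposite direction
print(rightward > 0, leftward < 0)
```

The design choice worth noting is that no single stage measures velocity: direction selectivity emerges from the asymmetric delay-and-multiply structure, which is exactly why such correlator circuits are attractive for lightweight robotic vision.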
Complementary to this, human perception can be understood as a problem of inference, in which the sensory data are often insufficient to uniquely determine the percept. Prior knowledge therefore has to be used to constrain the inference from ambiguous sensory signals. The Bayesian framework provides a principled, probabilistic way to describe this combination of prior knowledge with sensory data. These models serve as a benchmark against which human performance can be tested. To do so, quantitative psychophysical and neuropsychological methods are applied together with Virtual Reality techniques.
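As a minimal illustration of such a Bayesian benchmark model (with invented numbers, and assuming both prior and likelihood are Gaussian), combining prior knowledge with an ambiguous measurement reduces to precision weighting:

```python
# Gaussian prior over a perceptual variable (e.g., an object's size),
# combined with a noisy Gaussian sensory measurement. All values are
# illustrative, not data from the groups' experiments.
prior_mean, prior_var = 10.0, 4.0   # prior knowledge: broad, less reliable
meas, meas_var = 14.0, 1.0          # sensory measurement: narrow, more reliable

# For Gaussians, the posterior is again Gaussian; precisions (inverse
# variances) add, and the posterior mean is a reliability-weighted average.
post_var = 1.0 / (1.0 / prior_var + 1.0 / meas_var)
post_mean = post_var * (prior_mean / prior_var + meas / meas_var)

print(post_mean, post_var)
```

The posterior mean falls between prior and measurement, pulled toward the more reliable cue, and the posterior variance is smaller than either input's. This is the quantitative prediction against which psychophysical cue-combination performance is typically compared.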