A central hypothesis in neuroscience is that learning is ultimately based on synaptic changes. However, there is still a wide gap between the plethora of experimental findings on synaptic plasticity and theoretical models that attempt to explain how neuronal systems can realistically absorb information. All learning rules that have been proposed for learning exact spike times rely on a hypothetical supervisor for which there is no known neural correlate. We bridge this gap by introducing a plasticity mechanism that relies on the neuronal membrane potential as a global signal controlling local synaptic plasticity. Despite its simplicity, the proposed mechanism is sufficient to unify the Perceptron and the Chronotron, two fundamental theoretical concepts of learning. The mechanism works in a purely associative manner and achieves close to maximal memory capacity for Chronotrons, without requiring the supervisory mechanisms that monitor learning progress in more artificial rules. Our mechanism potentially explains the detailed balance of excitation and inhibition and the sparse coding of natural stimuli in cortex. Last but not least, it is suitable for learning inverse sensorimotor models, which we exemplify with birdsong learning.
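To make the core idea concrete, here is a minimal sketch of what membrane-potential-gated plasticity on a leaky integrate-and-fire neuron could look like. The abstract does not specify the actual rule, so the functional form of the update and all constants below are assumptions, chosen only to illustrate how a single globally available signal (the membrane potential) can gate purely local, associative weight changes:

```python
import numpy as np

# Hypothetical sketch, NOT the authors' rule: a leaky integrate-and-fire
# neuron whose synapses update locally, gated by the membrane potential.

rng = np.random.default_rng(0)

n_syn  = 100     # number of synapses
T      = 1000    # simulation steps
dt     = 1.0     # step size (ms)
tau_m  = 20.0    # membrane time constant (ms)
u_rest = 0.0     # resting potential
theta  = 1.0     # spike threshold
eta    = 1e-3    # learning rate (assumed)

w = rng.normal(0.0, 0.1, n_syn)  # synaptic weights
u = u_rest                       # membrane potential

for t in range(T):
    x = (rng.random(n_syn) < 0.02).astype(float)  # presynaptic spikes (~2% rate)
    u += dt / tau_m * (u_rest - u) + w @ x        # leaky integration of input

    # Local, associative update: each synapse changes in proportion to its
    # own presynaptic activity, gated by the global membrane potential
    # relative to threshold (an assumed form of the gating).
    w += eta * x * (u - theta)

    if u >= theta:   # spike and reset
        u = u_rest

print("mean weight after learning:", w.mean())
```

The point of the sketch is the locality: each synapse sees only its own presynaptic input and the one global quantity `u`, with no external supervisor monitoring the learning progress.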
(Klaus Pawelzik, Maren Westkott, and Christian Albers)
Location: INB seminar room