Bayesian synapses

English title Bayesian synapses
Applicant Pfister Jean-Pascal
Number 175644
Funding scheme Project funding (Div. I-III)
Research institution Institut für Physiologie Medizinische Fakultät Universität Bern
Institution of higher education University of Zurich - ZH
Main discipline Biophysics
Start/End 01.10.2018 - 30.09.2022
Approved amount 741'827.00

All Disciplines (2)

Biophysics
Neurophysiology and Brain Research

Keywords (11)

long-term plasticity; Spike-Timing Dependent Plasticity; spiking neuron model; Bayesian regression; learning; stochastic synaptic transmission; generalisation; Bayesian inference; Generative model; synaptic plasticity; short-term plasticity

Lay Summary

Lead
The brain is capable of performing remarkably complex tasks. Yet its operation relies entirely on unreliable components such as synapses, which transmit information only probabilistically. The aim of this project is to elucidate the functional role of probabilistic synapses.
Lay summary
Synapses transmit information from the presynaptic neuron to the postsynaptic neuron. Curiously, this transmission of information is not reliable: when a spike arrives at the presynaptic terminal, the synaptic vesicles do not always release neurotransmitter into the synaptic cleft; they do so only with a certain probability. Although this stochastic transmission has been known for more than 60 years, its functional role remains enigmatic. The aim of this project is to investigate a new hypothesis for the functional role of stochastic synapses. This hypothesis posits that stochastic transmission is beneficial because it allows the neural network to generalise better. Indeed, the value of a neural network (or of any other learning algorithm) lies in its capacity to generalise, that is, to make correct predictions in situations never encountered during learning. To reach this goal, the project will build a bridge between biophysical models (validated with data from Prof. Martin Müller) and functional models that can demonstrate a capacity for generalisation.
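To make the probabilistic release described above concrete, the following minimal sketch (not part of the project; the number of release sites, the release probability and the quantal size are purely illustrative assumptions) simulates a synapse in which each release site independently releases a vesicle with a fixed probability whenever a presynaptic spike arrives:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters of a simple binomial release model (hypothetical values)
n_sites = 10     # number of independent release sites
p_release = 0.3  # probability that a site releases a vesicle on a spike
q = 1.0          # quantal amplitude contributed by one released vesicle (a.u.)

def psp_amplitude(rng, n_sites, p_release, q):
    """Postsynaptic response to one presynaptic spike under binomial release."""
    n_released = rng.binomial(n_sites, p_release)  # how many sites released
    return q * n_released

# Identical presynaptic spikes produce variable postsynaptic responses.
amplitudes = [psp_amplitude(rng, n_sites, p_release, q) for _ in range(5)]
print(amplitudes)  # five different amplitudes despite identical input
```

Even though the presynaptic input is identical on every trial, the postsynaptic response fluctuates from trial to trial; it is the functional role of exactly this variability that the project investigates.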
Last update: 04.09.2018

Publications

Bykowska Ola, Gontier Camille, Sax Anne-Lene, Jia David W., Montero Milton Llera, Bird Alex D., Houghton Conor, Pfister Jean-Pascal, Costa Rui Ponte (2019). Model-Based Inference of Synaptic Transmission. Frontiers in Synaptic Neuroscience, 11, 1-9.

Associated projects

Number   Title                                          Start        Funding scheme
150637   Inference and Learning with Spiking Neurons    01.09.2014   SNSF Professorships
179060   Filtering with Spiking Neurons                 01.09.2018   SNSF Professorships

Abstract

Synapses are highly stochastic and complex transmission units. Upon the arrival of an action potential in the presynaptic terminal, vesicles fuse with the membrane with a certain probability and release neurotransmitter into the synaptic cleft, thereby activating postsynaptic receptors. This probability of release can be influenced by several factors, such as the history of presynaptic activity, the identity of the postsynaptic neuron, the age of the animal, and the presence of neuromodulators. Surprisingly, the functional relevance of this probabilistic release remains largely unknown despite decades of study of this ubiquitous phenomenon. Here, we propose to study a new hypothesis for the functional role of stochastic synapses. This hypothesis states that stochasticity at the level of synaptic transmission is computationally beneficial in the sense that it helps the network to generalise better and therefore avoid overfitting. Concretely, we frame the problem in a machine learning setting and ask whether synaptic stochasticity implements Bayesian regression.

In a regression problem, the task is to learn the mapping from an input to an output. This mapping is characterised by parameters that have to be learned. However, with a finite amount of input and output data there is always some uncertainty in the estimation of the parameters. In a Bayesian approach, the task is therefore to estimate the distribution over the parameters given the data rather than the parameters themselves. This (posterior) distribution over the parameters given the data can be calculated using Bayes' rule. In the context of spiking neural networks, the Bayesian perspective is thus to compute the posterior distribution over the synaptic weights. The present grant will therefore ask the following three questions: how is this distribution over weights implemented in biological synapses (project A)? How should this distribution evolve in order to be computationally efficient (project B)? Do biological synapses implement the computationally optimal solution for this evolution of the weight distribution (project C)?

In project A, the goal will be to quantify stochastic synaptic transmission from a generative-model perspective for a wide range of synapse types and conditions. This will be done from a Bayesian perspective by computing the posterior distribution over the synaptic parameters. In project B, we will assess the generalisation performance of spiking neural networks performing Bayesian regression. In particular, we will derive the optimal learning rule from a Bayesian regression perspective and benchmark the regression performance on standard data sets. The third project (C) will combine the biological side of project A and the machine learning side of project B. Concretely, it aims to validate the Bayesian regression learning rules with electrophysiological data.
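As a concrete and deliberately simplified illustration of the Bayesian regression framing used in the abstract, the sketch below computes a posterior distribution over the weights of a linear model using standard conjugate Gaussian linear regression with known noise precision; the data, prior and parameter values are illustrative assumptions and not the project's actual spiking-network model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression data: y = X w + noise (purely illustrative, not the project's data)
n_samples, n_features = 20, 3
X = rng.normal(size=(n_samples, n_features))
w_true = np.array([1.0, -0.5, 0.25])
sigma_noise = 0.5
y = X @ w_true + sigma_noise * rng.normal(size=n_samples)

# Gaussian prior over the weights, w ~ N(0, (1/alpha) I), and known noise precision beta
alpha = 1.0
beta = 1.0 / sigma_noise**2

# Bayes' rule in closed form for this conjugate model:
#   Sigma_post = (alpha I + beta X^T X)^(-1)
#   mu_post    = beta Sigma_post X^T y
Sigma_post = np.linalg.inv(alpha * np.eye(n_features) + beta * X.T @ X)
mu_post = beta * Sigma_post @ X.T @ y

# The posterior provides both an estimate (mean) and its uncertainty (covariance).
print("posterior mean:", mu_post)
print("posterior std :", np.sqrt(np.diag(Sigma_post)))
```

The point of the example is that the outcome of learning is a distribution over the weights rather than a single weight vector: the posterior mean is the best estimate, and the posterior covariance quantifies the remaining uncertainty, which shrinks as more data are observed. The project's hypothesis is that stochastic biological synapses represent an analogous distribution over synaptic weights.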