2018-03-26 : PhD Program: course in Computational Neuroscience
Computational neuroscience is an expanding field that is proving essential to neuroscience. The aim of this course is to provide a solid common background in computational neuroscience. The course comprises a historical overview of the field and a description of the different modelling approaches currently in use, including their specificities, limits and advantages.
The course aims at introducing students to the major tools they will need during their thesis to model or analyze their neuroscientific results. After a short, generic introduction, we will explore different systems at different scales. On the first day, we will study the different regimes in which a single neuron can operate, progressively introducing the theory of dynamical systems to understand them more globally. On the second day, we will introduce methods for analyzing neuroscientific data in general, such as Bayesian methods and information theory, illustrated with simple practical examples.
Language of instruction
Number of hours
~18 hours (session 1 = 7h + session 2 = 7h + session 3 = 4h)
Number of participants: 15 for the practical sessions (afternoons of Day 2 and Day 3), unlimited for the theoretical courses
PhD students, interested M2 students and postdocs
Institut des Neurosciences de la Timone (INT)
neuronal modelling, neural circuit modelling, information theory, decoding and encoding
Understanding how computational modelling can be used to formulate and solve neuroscience problems at different spatial and temporal scales; learning the formal notions of information, encoding and decoding and experimenting their use on toy datasets
- First session: Introduction to modeling single neurons (morning). An introduction to neural masses: modeling assemblies of neurons up to capturing collective oscillations and resting-state dynamics in a mean-field model, with a presentation of The Virtual Brain software (afternoon).
- Second session: An overview of “What is encoding?” and “What is decoding?”: formalization of the notion of information in neural activity; shared and transferred information; integration, segregation and complexity (morning). Bayesian probabilities, the Free-energy principle and Active Inference, with practical demonstrations in Python (afternoon).
- Third session: The problem of information estimation in practice. Practical exercises in MATLAB: estimating entropy and stimulus decodability from spike trains; comparing coding hypotheses (morning).
Basic knowledge of statistics, probability and calculus (differential equations, …) is useful, but all steps will be explained and complex math avoided as much as possible. Practical exercises are in Python and/or MATLAB, so basic familiarity with these environments is a plus.
day 1 : 2018-03-26 : an introduction to Computational Neuroscience
09:30-12:30 = Introduction to modeling single neurons (LaP)
14:00-17:00 = An introduction to neural masses: modeling assemblies of neurons up to capturing resting state dynamics in a mean-field model - presentation of the Virtual Brain software (DaB)
day 2 : 2018-03-27 : Information theory / bayesian models
09:15-10:30 = An overview of “What is encoding?” and “What is decoding?”: formalization of the notion of information in neural activity (DaB)
11:00-12:15 = (…continued after the coffee break: ) Live information! From sharing information to transferring information (and a glimpse into the zoo of higher-order friends) (DaB)
14:00-17:10 = Probabilities, the Free-energy principle and Active Inference (LuP).
day 3 : 2018-03-28 : Practical course on Information theory
- 09:30-12:30 = Practical course on Information theory (DaB)
More material related to the course
day 1 - morning : the single neuron
Website of the book by Gerstner et al., “Neuronal Dynamics”: http://neuronaldynamics.epfl.ch/
A (longer) introduction to the Hodgkin-Huxley model in three steps by Dr Stefano Luccioli
An interactive course with Wulfram Gerstner https://www.edx.org/course/neuronal-dynamics-computational-epflx-bio465-1x
His book online: http://cn.epfl.ch/~gerstner/NeuronalDynamics-MOOC1.html
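As a complement to these references, the kind of single-neuron model discussed in the morning session can be simulated in a few lines. The sketch below is a minimal leaky integrate-and-fire neuron with Euler integration; all parameter values are illustrative choices, not taken from the course material:

```python
import numpy as np

# Leaky integrate-and-fire neuron: tau * dV/dt = -(V - V_rest) + R * I
# Parameters (ms, mV, MOhm, nA) are illustrative, not from the course.
tau, V_rest, V_th, V_reset, R = 10.0, -65.0, -50.0, -70.0, 10.0
dt, T = 0.1, 200.0   # time step and total duration (ms)
I = 2.0              # constant input current (nA)

V, spikes = V_rest, []
for step in range(int(T / dt)):
    # forward-Euler step of the membrane equation
    V += dt / tau * (-(V - V_rest) + R * I)
    if V >= V_th:    # threshold crossing: emit a spike and reset
        spikes.append(step * dt)
        V = V_reset

print(f"{len(spikes)} spikes in {T:.0f} ms")
```

Sweeping the input current `I` exposes the different regimes (silence below rheobase, tonic spiking above it) that the dynamical-systems viewpoint of the lecture makes explicit.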
day 1 - afternoon : neural mass models
Another interactive course, from the University of Washington: https://www.coursera.org/course/compneuro
Collection of didactic material for the EU FP7 ITN Neural Engineering Transformative Technology http://www.neural-engineering.eu/training/index.html
Didactic material from Lab in Computational Neuroscience http://neuro.fi.isc.cnr.it/index.php?page=didactic-material
An open-source simulator of a whole brain that runs on your laptop, “The Virtual Brain”: http://thevirtualbrain.org
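The neural-mass viewpoint of the afternoon session can be illustrated with a minimal Wilson-Cowan sketch of a single excitatory/inhibitory pair (tools like The Virtual Brain couple many such units through a connectome). All parameter values here are illustrative:

```python
import numpy as np

def sigmoid(x):
    """Saturating transfer function mapping input to a rate in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

# Wilson-Cowan rate equations for one excitatory (E) / inhibitory (I) pair.
# Coupling weights and drives are illustrative values, not from the course.
w_ee, w_ei, w_ie, w_ii = 12.0, 10.0, 10.0, 2.0
tau_e, tau_i = 1.0, 2.0
P_e, P_i = 2.5, 0.0          # external drive to each population

E, I = 0.1, 0.1
dt, steps = 0.01, 5000
trace = []
for _ in range(steps):
    dE = (-E + sigmoid(w_ee * E - w_ei * I + P_e)) / tau_e
    dI = (-I + sigmoid(w_ie * E - w_ii * I + P_i)) / tau_i
    E += dt * dE
    I += dt * dI
    trace.append(E)
```

Depending on the couplings, the pair settles to a fixed point or into the collective oscillations mentioned in the program; plotting `trace` shows which regime these particular values produce.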
day 2 - morning : information theory
The best book on information theory and decoding, freely available directly from the author: http://www.inference.phy.cam.ac.uk/itprnn/book.html
a gentle introduction to Bayesian methods: https://homepages.inf.ed.ac.uk/pseries/Peg_files/Chapter9_SotiropoulosSeries.pdf
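A plug-in estimate of the mutual information between a stimulus and a spike count, the quantity at the heart of the encoding/decoding session, can be sketched as follows. The binary stimulus and Poisson rates are a made-up toy example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy encoding model: stimulus s in {0, 1}, response r is a spike count
# drawn from a stimulus-dependent Poisson distribution (rates are made up).
n_trials = 5000
s = rng.integers(0, 2, n_trials)
rates = np.where(s == 0, 2.0, 8.0)
counts = rng.poisson(rates)

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Plug-in estimate of I(S; R) = H(R) - H(R | S)
p_r = np.bincount(counts) / n_trials
h_r = entropy(p_r)
h_r_given_s = sum(
    np.mean(s == k) * entropy(np.bincount(counts[s == k]) / np.sum(s == k))
    for k in (0, 1)
)
mi = h_r - h_r_given_s
print(f"I(S;R) ~ {mi:.2f} bits")   # at most H(S) = 1 bit for a binary stimulus
```

With few trials the plug-in estimator is biased upward, which is exactly the estimation problem treated in the day-3 practical.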
day 2 - afternoon : bayesian models
an interesting read : http://cognitrn.psych.indiana.edu/busey/q551/PDFs/PredictivCodingRaoBallard.pdf
a tutorial on the free-energy framework, with exercises: http://www.sciencedirect.com/science/article/pii/S0022249615000759
solutions to the tutorial : https://laurentperrinet.github.io/sciblog/posts/2017-01-15-bogacz-2017-a-tutorial-on-free-energy.html
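In the spirit of the Bogacz tutorial linked above, the toy inference problem (infer a hidden cause v from an observation u under a generative model u = g(v) + noise, with g(v) = v²) can be solved exactly on a grid before turning to free-energy approximations. All numerical choices below are illustrative:

```python
import numpy as np

# Exact Bayesian inference on a grid for a toy Gaussian generative model,
# in the spirit of the Bogacz (2017) free-energy tutorial.
# Prior: v ~ N(3, 1); likelihood: u | v ~ N(v**2, 1); observed u = 2.
v_prior_mean, v_prior_sd = 3.0, 1.0
noise_sd, u = 1.0, 2.0

v = np.linspace(0.01, 5.0, 500)
dv = v[1] - v[0]

def gauss(x, mu, sd):
    """Gaussian density with mean mu and standard deviation sd."""
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

prior = gauss(v, v_prior_mean, v_prior_sd)
likelihood = gauss(u, v ** 2, noise_sd)
posterior = prior * likelihood
posterior /= posterior.sum() * dv      # normalize on the grid

v_map = v[np.argmax(posterior)]
print(f"MAP estimate of v: {v_map:.2f}")
```

The free-energy schemes of the afternoon session replace this exhaustive grid computation with a gradient descent on a single estimate, which is what makes them plausible as a neural mechanism.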
LaP: Laurent Pezard «Laurent.Pezard@univ-amu.fr»
DaB: Demian Battaglia «email@example.com», INS
LuP: Laurent Udo Perrinet «firstname.lastname@example.org», INT