This year ICMNS has arranged optional pre-congress tutorials. The number of places is limited; places are allocated on a first-registered, first-accepted basis.
Date: May 29, 2016
Place: Antibes Juan les Pins congress centre
We have reached the maximum number of participants. Registration for tutorials has closed.
Program at a glance
9.00-10.00: Olivier Faugeras “Thermodynamic limits of networks of rate neurons”
10.00-11.00: Jonathan Rubin “Hurry up and wait: multiple timescales in neuronal models”
11.20-12.20: Rachel Kuske “Interactions of noise, bifurcations, and time scales”
13.20-14.20: Alex Roxin “Mean-field theory for networks of spiking neurons”
14.30-15.30: Susanne Ditlevsen “Statistical inference for stochastic processes”
15.30-16.30: Daniele Avitabile “Numerical calculation of coherent structures in spatially-extended neural networks”
Daniele Avitabile
Title: Numerical calculation of coherent structures in spatially-extended neural networks
Abstract: I will discuss the numerical computation and bifurcation analysis of stationary solutions and travelling waves in spatially-extended systems. Fundamental concepts will be introduced using a simple 1D reaction-diffusion system, which we will use to provide examples of differentiation matrices, discretisation methods, nonlinear solvers, continuation algorithms and eigenvalue computations. This example will then be used as a springboard to compute travelling waves in a prototypical neural field model. The talk should be accessible to undergraduates who have followed a basic course on numerics (we will only assume prior knowledge of finite differences and Newton’s method). Working codes will be showcased and distributed during the talk.
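The codes distributed at the tutorial are not reproduced here, but the basic recipe — a finite-difference differentiation matrix plus Newton's method for the steady-state problem — can be sketched as follows, for an illustrative bistable (Nagumo-type) reaction-diffusion equation. All equation and parameter choices below are assumptions made for the example, not the tutorial's own model:

```python
import numpy as np

# Steady states of the bistable equation u_t = u_xx + u(1-u)(u-a) on [0, L]
# with homogeneous Neumann boundary conditions, computed with Newton's method
# on a finite-difference discretisation. All parameters are illustrative.
n, L, a = 200, 20.0, 0.25
h = L / (n - 1)

# Second-order differentiation matrix; the boundary rows use ghost points
# to impose reflecting (Neumann) boundary conditions
D2 = (np.diag(np.full(n - 1, 1.0), -1)
      - 2.0 * np.eye(n)
      + np.diag(np.full(n - 1, 1.0), 1)) / h**2
D2[0, 1] = 2.0 / h**2
D2[-1, -2] = 2.0 / h**2

f = lambda u: u * (1.0 - u) * (u - a)           # bistable reaction term
df = lambda u: (1.0 - 2.0 * u) * (u - a) + u * (1.0 - u)

def newton(u, tol=1e-10, maxit=50):
    """Solve the steady-state problem D2 u + f(u) = 0 by Newton's method."""
    for _ in range(maxit):
        F = D2 @ u + f(u)
        if np.linalg.norm(F, np.inf) < tol:
            break
        J = D2 + np.diag(df(u))                 # Jacobian of the residual
        u = u - np.linalg.solve(J, F)
    return u

# Refining a guess near the stable uniform state recovers u = 1
u = newton(np.full(n, 0.9))
print(np.max(np.abs(u - 1.0)))                  # essentially zero
```

From here, a continuation algorithm of the kind mentioned in the abstract would trace out how such equilibria vary with the parameter a, reusing the same residual and Jacobian.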
Susanne Ditlevsen (University of Copenhagen, Denmark)
Title: Statistical inference for stochastic processes
Abstract: Neuronal processes are inherently random, and modeling with stochastic differential equations is becoming increasingly popular in neuroscience, not only because of the powerful mathematical tools from stochastic analysis, but also because of the increasing availability of measurements and data for dynamical processes in which randomness plays a major role. The development of statistical procedures based on these models has therefore also grown substantially over the last decades. I will discuss statistical inference for stochastic processes relevant to neuronal models.
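As an illustration of the kind of problem involved (an example assumed here, not taken from the tutorial), the sketch below fits an Ornstein-Uhlenbeck process — a classical stochastic model for the subthreshold membrane potential — to discretely sampled data, using the fact that the exactly sampled chain is an AR(1) process:

```python
import numpy as np

# Fit dX = -theta*(X - mu) dt + sigma dW from discrete samples.
# All parameter values are illustrative.
rng = np.random.default_rng(0)
theta, mu, sigma, dt, n = 1.0, 0.0, 0.5, 0.01, 200_000

# Simulate using the exact transition density of the OU process
a = np.exp(-theta * dt)
s = sigma * np.sqrt((1 - a**2) / (2 * theta))
x = np.empty(n)
x[0] = mu
for k in range(n - 1):
    x[k + 1] = mu + (x[k] - mu) * a + s * rng.standard_normal()

# The sampled chain is AR(1), so least squares on lag-1 pairs recovers
# the autoregressive coefficient, and hence theta
x0, x1 = x[:-1] - x.mean(), x[1:] - x.mean()
a_hat = (x0 @ x1) / (x0 @ x0)
theta_hat = -np.log(a_hat) / dt

# Residual variance gives the diffusion coefficient
resid_var = np.mean((x1 - a_hat * x0) ** 2)
sigma_hat = np.sqrt(2 * theta_hat * resid_var / (1 - a_hat**2))
print(theta_hat, sigma_hat)   # should be close to (theta, sigma) = (1.0, 0.5)
```

For this model the lag-1 regression coincides with exact maximum likelihood up to the treatment of the initial value; for nonlinear models one typically resorts to approximate likelihoods.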
Olivier Faugeras (Inria Sophia Antipolis – Méditerranée, France)
Title: Thermodynamic limits of networks of rate neurons
Abstract: The idea of studying the thermodynamic limit (TL), i.e. the limit of the activity of a neuronal network when the number N of neurons grows to infinity, may seem bizarre since the number of neurons in biological brains is obviously finite. But, if this limit is reached rapidly, i.e. for values of N of the same order of magnitude as the number of neurons in observable networks, and if the description of this limit is significantly simpler than that of the actual network, then the problem of characterizing the TL is worth studying not only for its mathematical, but also its biological interest.
I will present an approach for studying the TL of networks of Hopfield, i.e. rate, neurons, using the Large Deviations approach. I will consider networks with random, Gaussian-distributed synaptic weights, first in the case where the weights are independent and identically distributed (i.i.d.). I will characterize the thermodynamic limit in this case as a non-Markovian Gaussian process and show the propagation of chaos effect: in the TL all neurons become independent, resulting in asynchronous activity. I will then consider the more biologically plausible case of correlated synaptic weights and show that one can draw conclusions similar to those in the i.i.d. case, except for the propagation of chaos effect: neurons do not become independent, but the TL can still be described in ways that resemble the i.i.d. case, at a slight increase in complexity.
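The propagation of chaos effect can be probed numerically. The sketch below — an illustration assumed here, not the Large Deviations analysis of the talk — simulates a noisy rate network with i.i.d. Gaussian weights J_ij ~ N(0, g²/N) and checks that pairwise correlations between neurons are small, as asymptotic independence predicts for large N:

```python
import numpy as np

# Euler-Maruyama simulation of the rate network
#   dx_i = (-x_i + sum_j J_ij tanh(x_j)) dt + dW_i,
# with i.i.d. weights J_ij ~ N(0, g^2/N). Parameters are illustrative.
rng = np.random.default_rng(1)
N, g, dt, steps = 300, 0.5, 0.01, 20_000

J = rng.standard_normal((N, N)) * g / np.sqrt(N)
x = np.zeros(N)
traj = np.empty((steps, N))
for t in range(steps):
    drift = -x + J @ np.tanh(x)
    x = x + drift * dt + np.sqrt(dt) * rng.standard_normal(N)
    traj[t] = x

# Mean absolute off-diagonal correlation across neuron pairs:
# small (dominated by finite-time sampling noise) for large N
C = np.corrcoef(traj.T)
off = C[~np.eye(N, dtype=bool)]
print(np.mean(np.abs(off)))
```

The residual correlations here mix a finite-N effect with finite-simulation-time noise; the Large Deviations machinery is what makes the N → ∞ statement precise.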
Rachel Kuske (University of British Columbia, Canada)
Title: Interactions of noise, bifurcations, and time scales
Abstract: Dynamical systems with bifurcations often exhibit an array of complex behaviours. Stochastic effects can change the picture dramatically, particularly if multiple time scales are present. Understanding the effects of noise in these systems requires the combination of a range of ideas, some developed in canonical physics and engineering contexts and recently transferred to areas such as biology and the environment. While the similarities facilitate this transfer, differences in the interplay of nonlinearities, delays, randomness, and discrete and continuous dynamics necessitate novel approaches. We will develop these ideas through a series of examples from several biological applications.
Alex Roxin (Centre de Recerca Matematica, Spain)
Title: Mean-field theory for networks of spiking neurons
Abstract: If functional units in the brain are large networks, then we may not need to know the detailed dynamics of each neuron, but rather just a measure of the mean activity. Can we develop such a mean-field theory for networks of model spiking neurons? I will discuss the by now classical mean-field theory for irregularly spiking neurons, which is based on two main assumptions: 1 – neurons receive large numbers of weak, uncorrelated inputs, and 2 – neuronal firing is well described by a Poisson process. I will show how to calculate, analytically and numerically, stationary states and their stability in such networks. Finally, I will discuss the limitations of the mean-field theory and hint at new directions.
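The self-consistency idea at the heart of this kind of calculation can be shown with a minimal sketch: in a large homogeneous network, the stationary population rate r must reproduce itself through the transfer function, r = φ(J·r + I). The sigmoidal φ and all parameters below are illustrative stand-ins, not the diffusion-approximation formulas for integrate-and-fire neurons used in the classical theory:

```python
import numpy as np

def phi(x, r_max=100.0, beta=0.1, x0=20.0):
    """Illustrative sigmoidal rate transfer function (spikes/s)."""
    return r_max / (1.0 + np.exp(-beta * (x - x0)))

def stationary_rate(J, I, r0=1.0, tol=1e-10, maxit=10_000):
    """Solve the self-consistency equation r = phi(J*r + I)
    by damped fixed-point iteration."""
    r = r0
    for _ in range(maxit):
        r_new = 0.5 * r + 0.5 * phi(J * r + I)   # damping aids convergence
        if abs(r_new - r) < tol:
            return r_new
        r = r_new
    return r

# Recurrent coupling J and external drive I are illustrative values
r_star = stationary_rate(J=0.2, I=10.0)
print(r_star)   # self-consistent stationary population rate
```

Stability of such a stationary state can then be checked by linearizing the rate dynamics around r_star, which is the numerical counterpart of the analysis described in the abstract.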
Jonathan E. Rubin (University of Pittsburgh, USA)
Title: Hurry up and wait: multiple timescales in neuronal models
Abstract: As in many circuits, the components contributing to neuronal membrane dynamics often evolve on disparate timescales. In particular, this property arises in many neuronal models based on or extracted from the Hodgkin-Huxley framework. I will discuss how disparate timescales can be exploited in the analysis of the dynamics of such models as well as how multiple timescales can give rise to complicated, sometimes unexpected dynamics such as bursting and mixed-mode oscillations. I will also discuss properties that can emerge in circuits composed of multi-timescale neurons.
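A standard illustration of such timescale separation (an example assumed here, not necessarily one of the talk's models) is the FitzHugh-Nagumo system, in which a fast voltage variable and a slow recovery variable, separated by a small parameter eps, combine to produce relaxation oscillations:

```python
import numpy as np

# FitzHugh-Nagumo: fast voltage v, slow recovery w (timescale ratio eps).
# The trajectory drifts slowly along the branches of the cubic v-nullcline
# and jumps quickly between them. Parameters are standard illustrative values.
eps, a, b, I = 0.08, 0.7, 0.8, 0.5
dt, steps = 0.01, 40_000

v, w = -1.0, 1.0
vs = np.empty(steps)
for t in range(steps):
    dv = v - v**3 / 3 - w + I          # fast voltage dynamics
    dw = eps * (v + a - b * w)         # slow recovery dynamics
    v, w = v + dt * dv, w + dt * dw
    vs[t] = v

# Count upward crossings of v = 0: one per relaxation-oscillation cycle
crossings = int(np.sum((vs[:-1] < 0) & (vs[1:] >= 0)))
print(crossings)
```

Letting eps → 0 and analysing the fast and slow subsystems separately is the geometric singular perturbation viewpoint that underlies the bursting and mixed-mode phenomena mentioned above.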