DOC2AMU (2016/2019)

DOC2AMU is co-funded by the prestigious Marie Skłodowska-Curie COFUND action within the H2020 Research and Innovation programme of the European Union and by the Regional Council of Provence-Alpes-Côte d’Azur, with a contribution from the A*MIDEX Foundation.

Within this programme, the PhD fellows sign a three-year work contract with one of the 12 Doctoral Schools of AMU.

Numerous advantages

These PhD fellowships are remunerated above the level of a standard French PhD contract, with a gross monthly salary of 2600 € and a gross monthly mobility allowance of 300 €, which after standard deductions amounts to a net salary of approximately 1625 €/month (the net amount may vary slightly). A travel allowance of 500 € per year and per fellow is also provided for travel between Marseille and the fellow’s place of origin.

Tailored training and personalised mentoring: fellows define and follow a Personal Career Development Plan at the beginning of their doctoral thesis and have access to a variety of training options and workshops.

Financial support for international research training and conference participation. A contribution to the research costs is also provided for the benefit of each fellow.

“This work was supported by the Doc2Amu project, which received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 713750. Project co-funded by the Conseil Régional Provence-Alpes-Côte d’Azur, the European Commission and the Investissements d’Avenir programme.”

Laurent U Perrinet
Researcher in Computational Neuroscience

My research interests include machine learning and computational neuroscience applied to vision.

Publications

Meaningful representations emerge from Sparse Deep Predictive Coding

The formation of connections between neural cells is essentially emerging from an unsupervised learning process. During the development …

Top-down feedback in Hierarchical Sparse Coding

From a computer science perspective, the problem of optimal representation using Hierarchical Sparse Coding (HSC) is often solved using …

A hierarchical, multi-layer convolutional sparse coding algorithm based on predictive coding

Sparse coding holds the idea that signals can be concisely described as a linear mixture of few components (called atoms) picked from a …
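To make the "few atoms from a dictionary" idea concrete, here is a minimal, hypothetical sketch of greedy sparse coding (matching pursuit) on a toy random dictionary; it only illustrates the general principle and is not the algorithm proposed in this paper.

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms=5):
    """Greedily approximate `signal` as a sparse linear mixture of dictionary atoms.

    dictionary: (n_features, n_components) matrix with unit-norm columns (atoms).
    Returns the sparse coefficient vector.
    """
    residual = signal.copy()
    coeffs = np.zeros(dictionary.shape[1])
    for _ in range(n_atoms):
        correlations = dictionary.T @ residual               # project residual on every atom
        best = np.argmax(np.abs(correlations))               # pick the most correlated atom
        coeffs[best] += correlations[best]                   # accumulate its coefficient
        residual -= correlations[best] * dictionary[:, best] # explain away its contribution
    return coeffs

# Toy example: 64-dimensional signals, 128 random unit-norm atoms.
rng = np.random.default_rng(0)
D = rng.normal(size=(64, 128))
D /= np.linalg.norm(D, axis=0)
x = rng.normal(size=64)
a = matching_pursuit(x, D, n_atoms=5)
print("non-zero coefficients:", np.count_nonzero(a))
print("reconstruction error:", np.linalg.norm(x - D @ a))
```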

Sparse Deep Predictive Coding To Model Visual Object Recognition

Convolutional Neural Networks (CNNs) are a popular way to model object recognition in the brain. They offer a flexible and convenient framework …

Top-down connection in Hierarchical Sparse Coding

The brain has to solve inverse problems to correctly interpret sensory data and infer the set of causes that generated the sensory …

From biological vision to unsupervised hierarchical sparse coding

The formation of connections between neural cells is essentially emerging from an unsupervised learning process. During the development …

M2APix: a bio-inspired auto-adaptive visual sensor for robust ground height estimation

This paper presents for the first time the embedded stand-alone version of the bio-inspired M2APix (Michaelis-Menten auto-adaptive …
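As a rough illustration of the Michaelis-Menten auto-adaptation named in the title, the sketch below normalises light intensity by an adaptive half-saturation level; the function names, parameters and the way the adaptation state is updated are assumptions made for illustration, not the M2APix circuit itself.

```python
import numpy as np

def michaelis_menten_response(I, I_adapt, V_max=1.0, n=1.0):
    """Michaelis-Menten style saturating response to light intensity I.

    The response saturates at V_max, and the half-saturation point tracks the
    adaptation state I_adapt, so the operating range follows ambient light.
    """
    return V_max * I**n / (I**n + I_adapt**n)

def update_adaptation(I, I_adapt, tau=100.0):
    """Low-pass filter of the input used as a (hypothetical) adaptation state."""
    return I_adapt + (I - I_adapt) / tau

# Toy simulation: a step change in ambient light level.
I_adapt = 10.0
for I in [10.0] * 50 + [1000.0] * 50:
    response = michaelis_menten_response(I, I_adapt)
    I_adapt = update_adaptation(I, I_adapt)
print("final response:", response, "adapted level:", I_adapt)
```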

On the Origins of Hierarchy in Visual Processing

It is widely assumed that visual processing follows a forward sequence of processing steps along a hierarchy of laminar sub-populations …

Unsupervised Hierarchical Sparse Coding algorithm inspired by Biological Vision

The brain has to solve inverse problems to correctly interpret sensory data and infer the set of causes that generated the sensory …

Controlling an aerial robot with human gestures using bio-inspired algorithm

Improving the performance of existing computer vision recognition algorithms with biological concepts. The gains are expected in the …

Efficient learning of sparse image representations using homeostatic regulation

One core advantage of sparse representations is the efficient coding of complex signals using compact codes. For instance, it allows …
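Below is a minimal, hypothetical sketch of one simple form of homeostatic regulation during dictionary learning: each atom's gain is adjusted so that all atoms end up being selected about equally often. It illustrates the general idea of balancing atom usage, not the specific rule proposed in this paper; the function and parameter names are assumptions.

```python
import numpy as np

def homeostatic_gains(usage_counts, learning_rate=0.01):
    """Return multiplicative gains that push atom usage towards uniformity.

    Atoms selected more often than average are down-weighted when competing
    for the next coefficient, and rarely used atoms are boosted.
    """
    target = usage_counts.mean()
    return np.exp(-learning_rate * (usage_counts - target))

# Example: 5 atoms with unbalanced usage; the over-used atom 0 gets a gain < 1.
counts = np.array([120.0, 20.0, 30.0, 15.0, 15.0])
print(homeostatic_gains(counts))
```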

Talks