Portrait

Graduate Student in Applied Mathematics and Machine Learning
Computer Science Department, École Normale Supérieure, Paris
Supervisor: Francis Bach

Education:
MSc in Applied Mathematics, Machine Learning, and Computer Vision (a.k.a. the Master MVA).
École Normale Supérieure, Paris. Completed with honors.

Work Experience:
Research Internship at the Montreal Institute for Learning Algorithms, University of Montreal
Supervisor: Yoshua Bengio

Research Internship at the Laboratory of Computational Neuroscience, EPFL
Supervisors: Johanni Brea and Wulfram Gerstner
→ Exploring deep learning techniques with spiking neurons in energy-based models
→ Investigating backpropagation with random feedback weights in deep neural networks

Research Internship at the Montreal Institute for Learning Algorithms, University of Montreal
Supervisor: Yoshua Bengio
→ Focusing on new biologically plausible deep learning algorithms
→ Exploring new versions of RNNs/Clockwork RNNs using LSTM and GRU

Research Internship at the Laboratory of Neuronal Circuit Development, Institut Curie
Supervisors: Christoph Gebhardt and Filippo Del Bene
→ Mapping the zebrafish visual system using data analysis and machine learning

Curriculum Vitae

Research Interests

I am interested in reinforcement learning, unsupervised learning, and neuroscience.

Publications

2018 “Activation alignment: exploring the use of approximate activity gradients in multilayer networks” by Thomas Mesnard and Blake Richards. In: CCN. pdf

2018 “Fully discretized training of neural networks through direct feedback” by Thomas Mesnard, Gaëtan Vignoud, Jonathan Binas, and Yoshua Bengio. Under Review. pdf

2018 “Ghost Units Yield Biologically Plausible Backprop in Deep Neural Networks” by Thomas Mesnard, Gaëtan Vignoud, Walter Senn, and Yoshua Bengio. In: CCN. pdf

2018 “Extending the Framework of Equilibrium Propagation to General Dynamics” by Benjamin Scellier, Anirudh Goyal, Jonathan Binas, Thomas Mesnard, and Yoshua Bengio. In: ICLR Workshop. pdf

2017 “Towards deep learning with spiking neurons in energy based models with contrastive Hebbian plasticity” by Thomas Mesnard, Wulfram Gerstner, and Johanni Brea. In: NIPS Computing with Spikes Workshop. pdf Poster

2016 “STDP-Compatible Approximation of Backpropagation in an Energy-Based Model” by Yoshua Bengio, Thomas Mesnard, Asja Fischer, Saizheng Zhang, and Yuhuai Wu. In: Neural Computation. pdf

2015 “Towards biologically plausible deep learning” by Yoshua Bengio, Dong-Hyun Lee, Jorg Bornschein, Thomas Mesnard, and Zhouhan Lin. In: arXiv. pdf

Implementations

Outstanding people