
Welcome

I am a former researcher with expertise in machine learning, neural networks, optimization, large-scale learning, and statistical modeling in general. I am now a Scientific Program Manager at Criteo, in charge of defining and driving scientific projects pertaining to the ad prediction problem.

News

  • 10 Oct 2013 - Mark Schmidt released his MATLAB code for SAG, the algorithm described in A Stochastic Gradient Method with an Exponential Convergence Rate for Finite Training Sets (a minimal sketch of the SAG update appears after this list).
  • 7 Dec 2012 - Here are the slides, the poster, and the arXiv version of our paper on the Stochastic Average Gradient.
  • 7 Dec 2012 - The code for our paper "A latent factor model for highly multi-relational data" is available here. You may also wish to download the poster.
  • 28 Feb 2012 - After years of using it, I finally released my MATLAB code for training neural networks. It is meant to be fast, but does not yet use any fancy optimization method, just plain old stochastic gradient descent (a toy version of such a training loop is sketched after this list). Download it here.
  • 13 Sep 2011 - Our paper Convergence Rates of Inexact Proximal-Gradient Methods for Convex Optimization is now available on arXiv. We obtain bounds on the convergence rate of proximal-gradient methods when the gradients are noisy and the proximal subproblem is only solved approximately (an illustrative inexact step is sketched after this list).
  • 13 Sep 2011 - Our technical report on Local Component Analysis is now available on arXiv. It is a metric learning algorithm that tries to make the data locally isotropic. It is a useful preprocessing step for algorithms that assume local isotropy, such as spectral clustering. You can try it using the code available here.
  • 17 Mar 2011 - Slides from my SMILE seminar on Deep Belief Networks.
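
Below is a minimal MATLAB sketch of the SAG update mentioned in the 10 Oct 2013 item, assuming an L2-regularized least-squares loss; the function and variable names are illustrative, and this is not Mark Schmidt's released code. The idea is to keep one stored gradient per training example and to step along the average of the stored gradients, refreshing a single one at each iteration.

    % Minimal SAG sketch for L2-regularized least squares (illustrative only).
    % X is the n-by-d data matrix, y the n-by-1 target vector.
    function w = sag_sketch(X, y, lambda, stepsize, n_iters)
        [n, d] = size(X);
        w = zeros(d, 1);
        grad_memory = zeros(n, d);   % one stored gradient per example
        grad_sum = zeros(d, 1);      % running sum of the stored gradients
        for t = 1:n_iters
            i = randi(n);                                        % sample one example
            g_new = (X(i,:) * w - y(i)) * X(i,:)' + lambda * w;  % fresh gradient at example i
            grad_sum = grad_sum + g_new - grad_memory(i,:)';     % swap it into the sum
            grad_memory(i,:) = g_new';
            w = w - (stepsize / n) * grad_sum;                   % step along the average
        end
    end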
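
The 28 Feb 2012 item refers to plain stochastic gradient descent; here is a toy MATLAB version of such a training loop for a one-hidden-layer network with tanh units and a squared loss. It is a sketch under these assumptions, not the released code, and all names are illustrative.

    % Toy SGD loop for a one-hidden-layer network (illustrative only).
    % X is n-by-d, Y is n-by-k; lr is the learning rate.
    function [W1, b1, W2, b2] = sgd_mlp_sketch(X, Y, n_hidden, lr, n_epochs)
        [n, d] = size(X);
        k = size(Y, 2);
        W1 = 0.1 * randn(d, n_hidden); b1 = zeros(1, n_hidden);
        W2 = 0.1 * randn(n_hidden, k); b2 = zeros(1, k);
        for epoch = 1:n_epochs
            for i = randperm(n)                          % one pass, in random order
                h = tanh(X(i,:) * W1 + b1);              % forward: hidden layer
                yhat = h * W2 + b2;                      % forward: linear output
                delta2 = yhat - Y(i,:);                  % gradient of 0.5*||yhat - y||^2
                delta1 = (delta2 * W2') .* (1 - h.^2);   % backprop through tanh
                W2 = W2 - lr * (h' * delta2);      b2 = b2 - lr * delta2;
                W1 = W1 - lr * (X(i,:)' * delta1); b1 = b1 - lr * delta1;
            end
        end
    end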
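
Finally, to illustrate the setting of the 13 Sep 2011 paper on inexact proximal-gradient methods, here is a sketch of a single step for the lasso, where the gradient of the smooth part is corrupted by noise. The proximal operator (soft thresholding) is computed exactly here, whereas the paper also covers the case where it is only solved approximately; all names are illustrative.

    % One inexact proximal-gradient step for the lasso (illustrative only).
    % A is the design matrix, b the targets, L a Lipschitz constant of the
    % smooth part, and noise_std the level of the gradient error.
    function x = inexact_prox_grad_step(x, A, b, lambda, L, noise_std)
        e = noise_std * randn(size(x));              % gradient error e_t
        g = A' * (A * x - b) + e;                    % noisy gradient of 0.5*||Ax - b||^2
        z = x - g / L;                               % forward (gradient) step
        x = sign(z) .* max(abs(z) - lambda / L, 0);  % backward step: prox of (lambda/L)*||.||_1
    end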

Stuff

  • Download my resume (updated 16 Sep 2014)
  • Email: nicolas@le-roux.name