Hongzhou Lin

Stata Center | 32 Vassar Street, Cambridge, MA 02139 | hongzhou at mit dot edu

I am a postdoc in the Machine Learning Group at MIT, working with Stefanie Jegelka. I am interested in theoretical aspects of mathematical optimization, deep learning, and reinforcement learning.

I received my PhD from Université Grenoble Alpes in November 2017, advised by Zaid Harchaoui and Julien Mairal. My thesis focuses on generic acceleration schemes for first-order optimization methods. In particular, it develops two algorithms, Catalyst and QNing, whose Matlab code is available on GitHub.
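For readers curious about the idea behind Catalyst, here is a minimal sketch of its outer loop: the base method repeatedly (and inexactly) minimizes a proximal subproblem f(x) + κ/2·||x − y||², followed by a Nesterov-style extrapolation step. This is only an illustration, assuming a μ-strongly convex objective and using plain gradient descent as the inner solver; all names (e.g. catalyst, grad_f) are my own and not those of the released Matlab package.

```python
# Hedged sketch of the Catalyst outer loop, not the official implementation.
import numpy as np

def catalyst(grad_f, x0, mu, kappa, n_outer=100, n_inner=50, lr=0.01):
    """Accelerate a first-order method by inexactly solving proximal
    subproblems min_x f(x) + kappa/2 * ||x - y||^2, then extrapolating."""
    x_prev = x0.copy()
    y = x0.copy()
    q = mu / (mu + kappa)
    alpha = np.sqrt(q)  # a valid initialization when mu > 0
    for _ in range(n_outer):
        # Inexactly solve the proximal subproblem with gradient descent.
        x = y.copy()
        for _ in range(n_inner):
            x -= lr * (grad_f(x) + kappa * (x - y))
        # Solve alpha_k^2 = (1 - alpha_k) * alpha_{k-1}^2 + q * alpha_k for alpha_k.
        a2 = alpha ** 2
        alpha_new = 0.5 * (q - a2 + np.sqrt((q - a2) ** 2 + 4.0 * a2))
        beta = alpha * (1.0 - alpha) / (alpha ** 2 + alpha_new)
        # Extrapolation: y_k = x_k + beta_k * (x_k - x_{k-1}).
        y = x + beta * (x - x_prev)
        x_prev, alpha = x, alpha_new
    return x_prev

# Toy usage on a strongly convex quadratic f(x) = 0.5 * x^T A x.
A = np.diag([1.0, 10.0, 100.0])
x_min = catalyst(lambda x: A @ x, x0=np.ones(3), mu=1.0, kappa=10.0)
```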

Before my PhD, I spent four amazing years at ENS Paris studying Mathematics (probability theory) and Cognitive Science (linguistics). Before that, I completed two years of intensive study in classe prépa at Lycée Louis-le-Grand.

Here are my Google Scholar profile and my CV.

You can find my PhD thesis here and my PhD defense slides here.

Publications

ResNet with one-neuron hidden layers is a Universal Approximator
Hongzhou Lin, Stefanie Jegelka.
NIPS 2018 spotlight, [Poster].

Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice
Hongzhou Lin, Julien Mairal, and Zaid Harchaoui.
The Journal of Machine Learning Research (JMLR), volume 18, 2018, [Matlab Code].

Catalyst Acceleration for Gradient-Based Non-Convex Optimization
Courtney Paquette, Hongzhou Lin, Dmitriy Drusvyatskiy, Julien Mairal, and Zaid Harchaoui.
AISTATS, 2018.

A Generic Quasi-Newton Algorithm for Faster Gradient-Based Optimization
Hongzhou Lin, Julien Mairal, and Zaid Harchaoui.
Preprint, 2017.

A Universal Catalyst for First-Order Optimization
Hongzhou Lin, Julien Mairal, and Zaid Harchaoui.
NIPS, 2015.