Views of Deep Networks from Reproducing Kernel Hilbert Spaces

We explore the connections between kernel-based methods and deep neural networks. Kernel-based methods have recently been commonly presented as "shallow architectures", in contrast to "deep architectures" such as deep neural networks. We discuss the scope of this assertion, in light of empirical and theoretical properties of several classes of multi-layer compositional kernels and their corresponding reproducing kernel Hilbert spaces. We also address the question of supervision in training deep architectures and review several approaches other than the popular supervised training with stochastic back-propagation of gradients. Finally, we discuss current challenges and open problems for theorists and practitioners.
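As a concrete illustration (not taken from the talk itself), one classical family of multi-layer compositional kernels is the arc-cosine kernel of Cho and Saul, whose layers can be stacked by re-expressing each layer purely in terms of the previous layer's kernel values. A minimal sketch, with the depth and the choice of degree-1 kernel as illustrative assumptions:

```python
import numpy as np

def arccos_layer(kxx, kyy, kxy):
    # One layer of the degree-1 arc-cosine kernel, written in terms of
    # kernel values so that layers can be composed.
    cos_t = np.clip(kxy / np.sqrt(kxx * kyy), -1.0, 1.0)
    theta = np.arccos(cos_t)
    return np.sqrt(kxx * kyy) / np.pi * (
        np.sin(theta) + (np.pi - theta) * np.cos(theta)
    )

def multilayer_kernel(x, y, depth=3):
    # Layer 0 is the linear kernel; each subsequent layer applies the
    # arc-cosine map to the previous layer's kernel values.
    kxx, kyy, kxy = x @ x, y @ y, x @ y
    for _ in range(depth):
        kxx_new = arccos_layer(kxx, kxx, kxx)  # self-similarity of x
        kyy_new = arccos_layer(kyy, kyy, kyy)  # self-similarity of y
        kxy = arccos_layer(kxx, kyy, kxy)      # cross term
        kxx, kyy = kxx_new, kyy_new
    return kxy
```

Each such composition defines a valid positive-definite kernel, and hence an RKHS, whose feature map mimics the layered structure of a deep network with infinitely wide layers.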

Readings for this lecture

Zaid Harchaoui

Zaid Harchaoui is an Assistant Professor in the Department of Statistics, an Adjunct Professor in the Paul G. Allen School of Computer Science & Engineering, and a Data Science Fellow in the eScience Institute at the University of Washington. He completed his Ph.D. at ParisTech, working with Éric Moulines, Stéphane Canu, and Francis Bach. Before joining the University of Washington, he was a visiting assistant professor at the Courant Institute of Mathematical Sciences at New York University (2015–2016). Prior to this, he was a "chargé de recherche" (research scientist) in the LEAR team at Inria (2010–2015). He was a postdoctoral fellow at Carnegie Mellon University in 2009. He is an Associate Fellow of the "Learning in Machines and Brains" program of CIFAR. He received the Inria award for scientific excellence, the NIPS reviewer award, and the Criteo Faculty Research Award. He gave tutorials on "Frank-Wolfe, greedy algorithms, and friends" at ICML'14, on "Large-scale visual recognition" at CVPR'13, and on "Machine learning for computer vision" at MLSS Kyoto 2015. He recently co-organized the "Future of AI" symposium at New York University, the workshop on "Optimization for Machine Learning" at NIPS'14, and the "Optimization and Statistical Learning" workshops (2013, 2015, 2017) at the École de Physique des Houches (France). He served as Area Chair for ICML 2015, ICML 2016, NIPS 2016, ICLR 2016, ICML 2017, and NIPS 2017.