Theories of Deep Learning (STATS 385)
Stanford University, Fall 2017
Lecture01: Deep Learning Challenge. Is There Theory? (Donoho/Monajemi/Papyan)
Lecture02: Overview of Deep Learning From a Practical Point of View (Donoho/Monajemi/Papyan)
Lecture03: Harmonic Analysis of Deep Convolutional Neural Networks (Helmut Bolcskei)
Lecture04: Convnets from First Principles: Generative Models, Dynamic Programming & EM (Ankit Patel)
Lecture05: When Can Deep Networks Avoid the Curse of Dimensionality and Other Theoretical Puzzles (Tomaso Poggio)
Lecture06: Views of Deep Networks from Reproducing Kernel Hilbert Spaces (Zaid Harchaoui)
Lecture07: Understanding and Improving Deep Learning With Random Matrix Theory (Jeffrey Pennington)
Lecture08: Topology and Geometry of Half-Rectified Network Optimization (Joan Bruna)
Lecture09: What’s Missing from Deep Learning? (Bruno Olshausen)
Lecture10: Convolutional Neural Networks in View of Sparse Coding (Vardan Papyan)