CSC 2532 Spring 2020: Statistical Learning Theory

This course covers several topics in machine learning theory.

Topics may include: asymptotic statistics, uniform convergence, generalization, kernel methods, online learning, and sampling. More details can be found in the syllabus.

This class requires a solid working knowledge of probability theory, linear algebra, and real analysis (at least at the Master's level). Homework 0 is a good way to check your background.


Instructors: Murat A. Erdogdu

Teaching Assistants: Rozhina Ghanavi, Calum MacRury

Time & Location:

Section Room Lecture time
L0101 SS 2108 Th 14-16

Suggested Reading

There are no required textbooks. Suggested reading will be posted after each lecture (see the lectures below).

Lectures and (tentative) course outline

Combined lecture notes.

Week Day Lectures Timeline
1 1/09 Introduction & Warm-up: Gaussian Mean Estimation syllabus
2 1/16 Exponential Families and Information Inequality -
3 1/23 Asymptotic statistics hw1 out
4 1/30 Uniform convergence & Generalization project proposal due
5 2/06 Covering with epsilon-nets hw1 due & hw2 out
6 2/13 Rademacher complexity: Definition -
7 2/20 Rademacher complexity: Properties & Applications hw2 due & hw3 out
8 2/27 Combinatorial Measures of Complexity project progress report due
9 3/05 Chaining and Dudley's theorem hw3 due
10 3/12 Midterm (in class) midterm
11 3/19 PAC-Bayes bounds & Stability -
12 3/26 Kernel Methods: Basics -
13 4/02 Kernel Methods: Properties & Applications -
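As a taste of the Week 1 warm-up topic, here is a minimal sketch (not part of the course materials) checking the classical fact that for i.i.d. samples from N(mu, sigma^2 I_d), the empirical mean has risk E[||muhat - mu||^2] = sigma^2 d / n:

```python
import numpy as np

# Warm-up illustration: risk of the empirical mean for a Gaussian.
# X_1, ..., X_n ~ N(mu, sigma^2 I_d);  muhat = (1/n) sum_i X_i.
rng = np.random.default_rng(0)
d, n, sigma = 5, 200, 1.0
mu = np.zeros(d)

trials = 2000
errors = np.empty(trials)
for t in range(trials):
    X = rng.normal(mu, sigma, size=(n, d))   # n samples in R^d
    muhat = X.mean(axis=0)                   # empirical mean
    errors[t] = np.sum((muhat - mu) ** 2)    # squared-error loss

empirical_risk = errors.mean()
theoretical_risk = sigma**2 * d / n          # = 0.025 for these values
print(empirical_risk, theoretical_risk)
```

Averaged over many trials, the Monte Carlo estimate of the risk should closely match sigma^2 d / n.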


Homework # Out Due TA Office Hours
Homework 0 - V0 1/9 - -
Homework 1 - V0 1/24 2/6 in class Tue 10-11am, Wed 1-2pm @BA5256
Homework 2 - V0 2/10 2/23 via email Tue 10-11am, Wed 3-4pm @Pratt 286B
Homework 3 - V0 2/26 3/05 via email Tue 9:30-10:30am, Wed 3-4pm @BA5256

A LaTeX template can be found here.


Course Project

Your project goal is to make a significant contribution to understanding a machine-learning-related problem. An ideal project begins with an interesting observation, explains it through theory, and ends with a thorough empirical analysis. You will review the relevant literature, identify interesting research directions, and either develop novel methodology or explain an observed behavior of a learning algorithm.

Project Inspiration: For project ideas, you can browse recent papers from COLT, NeurIPS, ICML, ICLR, and JMLR. Several research directions can be found here, but the list is by no means comprehensive, and your project topic need not be drawn from it. If you have suggestions, let me know.

A LaTeX template for reports can be found here.

Computing Resources

For the homework assignments, we will use Python along with libraries such as NumPy, SciPy, and scikit-learn.
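As a quick check that your environment is set up, the short sketch below (an illustration only, not an assigned exercise) fits kernel ridge regression, related to the kernel-methods lectures, to noisy 1-D data using NumPy and scikit-learn:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

# Environment check: fit kernel ridge regression to noisy samples of sin(x).
# The kernel and hyperparameters below are illustrative choices, not course-specified.
rng = np.random.default_rng(1)
X = np.sort(rng.uniform(-3, 3, size=(100, 1)), axis=0)
y = np.sin(X).ravel() + 0.1 * rng.normal(size=100)

model = KernelRidge(kernel="rbf", alpha=0.1, gamma=0.5)
model.fit(X, y)
mse = np.mean((model.predict(X) - y) ** 2)
print(f"training MSE: {mse:.4f}")
```

If the imports succeed and the training MSE is on the order of the noise level, your installation is ready for the assignments.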