This course introduces probabilistic learning tools such as exponential families, directed graphical models, Markov random fields, exact inference techniques, message passing, sampling and MCMC, hidden Markov models, variational inference, the EM algorithm, Bayesian regression, probabilistic PCA, neural networks, kernel methods, Gaussian processes, and variational autoencoders. It also offers a broad view of model-building and optimization techniques based on probabilistic building blocks, which will serve as a foundation for more advanced machine learning courses.
More details can be found in the syllabus and on Piazza.
| Instructors | Murat A. Erdogdu & Piotr Zwiernik |
|---|---|
| Email | sta414-2104prof@cs.toronto.edu |
| Office hours | T 15:30-17:30 (UY 9040) |
Teaching assistants: Hossein Yousefi, Alireza Mousavi, Daniel Eftekhari, Madhu Gunasingam
| Section | Room | Lecture time | Zoom link |
|---|---|---|---|
| STA 414 LEC0101 & STA 2104 LEC0101 | MS 2172 | M 14-17 | link |
| STA 414 LEC5101 & STA 2104 LEC5101 | MS 2172 | T 18-21 | link |
There is no required textbook. Suggested readings will be posted after each lecture (see the lectures below).
For the homework assignments, we will use Python and libraries such as NumPy, SciPy, and scikit-learn. You have two options:

```shell
pip install scipy numpy autograd matplotlib jupyter scikit-learn
```

(Note that the correct package name on PyPI is `scikit-learn`; the legacy `sklearn` package name is deprecated.)
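Once the packages are installed, a quick smoke test can confirm the environment works before starting the assignments. The snippet below is only an illustrative sanity check (not part of any assignment): it imports the main libraries and fits a small least-squares regression on synthetic data.

```python
# Sanity check for the assignment environment: verify the core libraries
# import correctly and a simple model fits as expected.
import numpy as np
import scipy.stats
from sklearn.linear_model import LinearRegression

# Generate synthetic data with known coefficients [1, -2] plus small noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = X @ np.array([1.0, -2.0]) + 0.1 * rng.normal(size=100)

# Fit ordinary least squares; the recovered coefficients
# should be close to [1, -2].
model = LinearRegression().fit(X, y)
print(model.coef_)
```

If the imports fail or the printed coefficients are far from `[1, -2]`, the installation likely did not complete correctly.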