CS 6347: Statistical Methods in AI and ML
Spring 2024

Course Info
Where: ECSW 3.250
When: TR, 10:00am-11:15am
Instructor: Nicholas Ruozzi
Office Hours: TW 11:00am-12pm, and by appointment in ECSS 3.409
TA: TBD
Office Hours: TBD
Grading: problem sets (80%), Lecture Scribe (15%), class participation & extra credit (5%)
Prerequisites: some familiarity with basic probability, linear algebra, and introductory machine learning (helpful, but not required).
Schedule & Lecture Slides
| Week | Dates | Topic | Readings |
| --- | --- | --- | --- |
| 1 | Jan. 18 | Introduction & Basic Probability | K&F: Ch. 1 & 2; Basic Probability |
| 2 | Jan. 23 & 25 | Bayesian Networks; More BNs: D-separation | K&F: Ch. 3, 4, and 9; Octave (free version of MATLAB); BN Notes |
| 3 | Jan. 30 & Feb. 1 | Markov Random Fields | MRF Notes |
| 4 | Feb. 6 & 8 | More MRFs; Variable Elimination & BP (Scribe Group 1) | K&F: 13.1-13.5 |
| 5 | Feb. 13 (Scribe Group 2) & 15 (Scribe Group 3) | More Belief Propagation; Approx. MAP Estimation, MAP LP | Approximate MAP Notes; K&F: 11.1-11.2, 11.5; Sections 1-3 of this paper; K&F: A.5.3; Boyd: Ch. 5.1-5.5 |
| 6 | Feb. 20 (Scribe 4) & 22 (Scribe 5) | Approx. MAP Estimation, MAP LP; Variational Methods | |
| 7 | Feb. 27 (Scribe 6) & 29 (Scribe 7) | More Variational Methods; Intro to Sampling | K&F: 12.1-12.3 |
| 8 | March 5 (Scribe 8) & 7 (Scribe 9) | Markov Chain Monte Carlo | K&F: 17.1-17.4 |
| 9 | Mar. 18 (Scribe 10) & 20 (Scribe 11) | Intro to Machine Learning | K&F: 20.1-20.5 |
| 10 | Mar. 26 (Scribe 12) & 28 (Scribe 13) | MLE for CRFs | |
| 11 | April 2 (Scribe 14) & 4 (Scribe 15) | More MLE; Alternatives to MLE | |
| 12 | April 9 (Scribe 16) & 11 (Scribe 17) | Alternatives to MLE; Expectation Maximization | |
| 13 | April 16 & 18 | More Expectation Maximization; Hidden Markov Models | K&F: 19.1-19.2, 20.6; Box 17.E |
| 14 | April 23 & 25 | Structure Learning; LDA | K&F: 20.6 |
| 15 | Apr. 30 & May 2 | Exponential Families and EP | Free Energy Approximations |
Problem Sets
All problem sets will be available on the eLearning site and are to be turned in there. See the homework guidelines below for the homework policies.
Textbooks & References
This semester, online notes in book form will (hopefully) be available for each lecture. In addition, the following textbooks are suggested:
- Probabilistic Graphical Models: Principles and Techniques, by Daphne Koller and Nir Friedman.
- Modeling and Reasoning with Bayesian Networks, by Adnan Darwiche.
- Machine Learning: a Probabilistic Perspective, by Kevin Murphy.
Homework Guidelines*
We expect you to try solving each problem set on your own. However, if you get stuck on a problem, I encourage you to collaborate with other students in the class, subject to the following rules:
- You may discuss a problem with any student in this class and work together on solving it. This can involve brainstorming, verbally discussing the problem, and working through possible solutions together, but it should not involve one student telling another a complete solution.
- Once you solve the homework, you must write up your solutions on your own, without looking at other people's write-ups or giving your write-up to others.
- In your solution for each problem, you must write down the name of each person with whom you discussed it. This will not affect your grade.
- Do not consult solution manuals or other people's solutions from similar courses - ask the course staff, we are here to help!
*Adapted from David Sontag.