Schedule
- Readings in normal font should be completed and annotated ahead of lecture.
- Readings in italic provide optional additional depth on the material.
- Assignments are listed on the day when we suggest you begin working on them.
- Assessments are listed on the day when they will occur.
Reading sources:
- CN: Class notes written for this course, hosted on the course site.
- PDSH: The Python Data Science Handbook by @vanderplasPythonDataScience2016.
- PDA: Python for Data Analysis, 3rd edition by @mckinneyPythonDataAnalysis2022.
- BHN: Fairness and Machine Learning: Limitations and Opportunities by @barocasFairnessMachineLearning2023.
Week 1
Mon Feb. 09: Welcome!
Course introduction. What is machine learning? How will this class work? Introduction to data as a combination of signal and noise.
- Objectives: Welcome!, Theory
- Notes: Welcome slides

Wed Feb. 11: Data = Signal + Noise
The Gaussian distribution and linear data with Gaussian noise. Likelihoods and log-likelihoods. Introduction to maximum likelihood estimation (MLE); see the sketch after this entry.
- Objectives: Theory, Experimentation
- Reading: Calculus review
- Warmup: Log and critical points
- Notes: Class notes
- Assignments: HW 1 (due 2/18)
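
As a preview of the MLE material, here is a minimal sketch (not from the course materials) of a Gaussian log-likelihood in Python; the sample data and the names `gaussian_log_likelihood`, `mu`, and `sigma` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: draws from a Gaussian with unknown mean (assumed setup).
x = rng.normal(loc=2.0, scale=1.5, size=100)

def gaussian_log_likelihood(x, mu, sigma):
    """Sum of log N(x_i | mu, sigma^2) over the observations."""
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                  - (x - mu) ** 2 / (2 * sigma**2))

# For fixed sigma, the MLE of mu is the sample mean; the log-likelihood
# should be lower at any other value of mu.
mu_hat = x.mean()
print(gaussian_log_likelihood(x, mu_hat, 1.5))
print(gaussian_log_likelihood(x, mu_hat + 0.5, 1.5))  # strictly lower
```
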
Week 2
Mon Feb. 16: Maximum likelihood, gradients
Calculating gradients by hand and checking them with PyTorch. Gradient descent for maximum-likelihood estimation in one dimension; see the sketch below.
- Objectives: Theory, Experimentation
- Notes: Class notes
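
For the gradient-checking idea, a minimal sketch, assuming a one-dimensional Gaussian negative log-likelihood with sigma fixed at 1; the tensors `x` and `mu` are illustrative, not course data.

```python
import torch

# Hand-derived gradient of the negative log-likelihood in mu (sigma = 1):
# d/dmu [ 0.5 * sum((x - mu)^2) ] = -sum(x - mu).
x = torch.tensor([1.0, 2.0, 4.0])
mu = torch.tensor(0.5, requires_grad=True)

loss = 0.5 * torch.sum((x - mu) ** 2)  # NLL up to an additive constant
loss.backward()                        # autograd fills in mu.grad

by_hand = -torch.sum(x - mu.detach())
print(mu.grad, by_hand)  # both should print -5.5
```
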
Wed Feb. 18: Higher Dimensions
The linear-Gaussian model with many features. Matrix-vector notation for linear models. The loss function and its gradient in higher dimensions; see the sketch below.
- Objectives: Theory, Experimentation
- Notes: Class notes
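
To illustrate the matrix-vector notation, a minimal sketch of the squared-error loss L(w) = 0.5 * ||Xw - y||^2 and its gradient X^T (Xw - y); the synthetic data and the names `X`, `w_true`, and `y` are assumptions for illustration, not course notation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative linear-Gaussian data: y = X @ w_true + noise (assumed setup).
n, d = 50, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=n)

def loss(w):
    """Squared-error loss L(w) = 0.5 * ||X w - y||^2."""
    r = X @ w - y
    return 0.5 * r @ r

def grad(w):
    """Gradient of L in matrix-vector form: X^T (X w - y)."""
    return X.T @ (X @ w - y)

# A few steps of gradient descent drive the loss toward its minimum.
w = np.zeros(d)
for _ in range(100):
    w -= 0.01 * grad(w)
print(loss(w), w)  # w should be close to w_true
```
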