Semester: | Spring 2018; also offered in Fall 2020, Spring 2020, Fall 2017, and Fall 2016 |
Time and place: | Tuesday and Thursday, 10.30-11.45am, Lawson Building B155 |
Instructor: | Jean Honorio, Lawson Building 2142-J (Please send an e-mail for appointments) |
TAs: | Adarsh Barik, e-mail: abarik at purdue.edu, Office hours: Thursday, 3pm-5pm, HAAS G50; Shraddha Sahoo, e-mail: sahoo0 at purdue.edu, Office hours: Monday, 3pm-5pm, HAAS G50 |
Date | Topic (Tentative) | Notes |
Tue, Jan 9 | Lecture 1: perceptron (introduction) Notes: [1] | Homework 0: due on Jan 11 at beginning of class - NO EXTENSION DAYS ALLOWED |
Thu, Jan 11 | Lecture 2: perceptron (convergence), max-margin classifiers, support vector machines (introduction) Notes: [1] | Homework 0 due - NO EXTENSION DAYS ALLOWED |
Tue, Jan 16 | Lecture 3: nonlinear feature mappings, kernels (introduction), kernel perceptron | Homework 0 solution |
Thu, Jan 18 | — | |
Tue, Jan 23 | Lecture 4: SVM with kernels, dual solution Notes: [1] Refs: [1] [2] (optional reading) | Homework 1: due on Jan 30, 11.59pm EST |
Thu, Jan 25 | Lecture 5: one-class problems (anomaly detection), one-class SVM, multi-way classification, direct multi-class SVM Notes: [1] Refs: [1] [2] [3] [4] (optional reading) | |
Tue, Jan 30 | Lecture 6: rating (ordinal regression), PRank, ranking, rank SVM Notes: [1] Refs: [1] [2] (optional reading) | Homework 1 due |
Thu, Feb 1 | Lecture 7: linear and kernel regression, feature selection (information ranking, regularization, subset selection) Notes: [1] | |
Tue, Feb 6 | Lecture 8: ensembles and boosting Notes: [1] | |
Thu, Feb 8 | Lecture 9: model selection (finite hypothesis class) Notes: [1] Refs: [1] (optional reading) | Homework 2: due on Feb 15, 11.59pm EST |
Tue, Feb 13 | Lecture 10: model selection (growth function, VC dimension, PAC Bayesian bounds) Notes: [1] | |
Thu, Feb 15 | Lecture 11: performance measures, cross-validation, bias-variance tradeoff, statistical hypothesis testing Notes: [1] | Homework 2 due |
Tue, Feb 20 | Lecture 12: dimensionality reduction, principal component analysis (PCA), kernel PCA Notes: [1] | Homework 3: due on Feb 27, 11.59pm EST |
Thu, Feb 22 | Lecture 13: generative probabilistic modeling, maximum likelihood estimation, mixture models, EM algorithm (introduction) Notes: [1] | |
Tue, Feb 27 | Lecture 14: mixture models, EM algorithm, convergence, model selection Notes: [1] | Homework 3 due |
Thu, Mar 1 | Lecture 15: active learning, kernel regression, Gaussian processes Refs: [1] (optional reading) | |
Tue, Mar 6 | MIDTERM (lectures 1 to 12) | 10.30am-11.45am, Lawson Building B155 |
Thu, Mar 8 | (midterm solution) | Project plan due (see Assignments for details), in [Word] or [LaTeX] format |
Tue, Mar 13 | SPRING VACATION | |
Thu, Mar 15 | SPRING VACATION | |
Tue, Mar 20 | Lecture 16: collaborative filtering (matrix factorization), structured prediction (max-margin approach) Notes: [1] Refs: [1] (optional reading) | |
Thu, Mar 22 | — | |
Tue, Mar 27 | Lecture 17: Bayesian networks (motivation, examples, graph, independence) Notes: [1] Refs: [1] [2] (optional reading) | |
Thu, Mar 29 | Lecture 18: Bayesian networks (independence, equivalence, learning) Notes: [1] Refs: [1] [2] [3, chapters 16-20] (optional reading) | Preliminary project report due (see Assignments for details) |
Tue, Apr 3 | Lecture 19: Bayesian networks (introduction to inference), Markov random fields, factor graphs Notes: [1] Refs: [1] [2] (optional reading) | |
Thu, Apr 5 | Lecture 20: Markov random fields (inference, learning) Notes: [1] Refs: [1] [2] [3, chapters 16-20] (optional reading) | |
Tue, Apr 10 | — | |
Thu, Apr 12 | (lecture continues) | Final project report due (see Assignments for details) |
Tue, Apr 17 | Lecture 21: Markov random fields (inference in general graphs, junction trees) Notes: [1] | |
Thu, Apr 19 | FINAL EXAM (lectures 13 to 21) | 10.30am-11.45am, Lawson Building B155 |
Tue, Apr 24 | — | |
Thu, Apr 26 | — | |