**Syllabus**

Basic classifiers: nearest neighbor, decision trees, linear separators.

Generalization: large deviation theory, VC bounds, cross validation.

Generative models: multivariate Gaussian, Fisher linear discriminant, naive Bayes, logistic regression.

Support vector machines and kernels.

Ensemble methods: boosting, bagging.

Other topics: the PAC model, multiclass methods, regression, active learning, reinforcement learning.

**Course materials**

1. You will need access to MATLAB.

2. Useful supplementary material (all but the first are on reserve in the S&E library):

Stuart Russell and Peter Norvig, *Artificial intelligence: a modern approach* (second edition).

Michael Kearns and Umesh Vazirani, *An introduction to computational learning theory*.

Trevor Hastie, Robert Tibshirani, and Jerome Friedman, *The elements of statistical learning*.

Richard Duda, Peter Hart, and David Stork, *Pattern classification*.

**Homework policy**

Homework should be turned in at the beginning (first five minutes) of class on the due date. No late homework, please.

**Grading**

Homework: 75%

Project: 25%