Instructor: Sargur Srihari, Department of Computer Science and Engineering, University at Buffalo

Machine learning is an exciting field concerned with designing machines that can learn from examples. The course covers the necessary theory, principles, and algorithms for machine learning. The methods are based on statistics and probability, which have become essential to designing systems that exhibit artificial intelligence.

Reference textbooks for different parts of the course are "Pattern Recognition and Machine Learning" by Chris Bishop (Springer 2006), "Probabilistic Graphical Models" by Daphne Koller and Nir Friedman (MIT Press 2009), and "Deep Learning" by Ian Goodfellow, Yoshua Bengio, and Aaron Courville (MIT Press 2016).

Lecture Slides for Machine Learning

Course topics are listed below with lecture slides. The course is taught in the Fall semester and is followed by a course on Probabilistic Graphical Models in the Spring semester. A stand-alone course on Deep Learning is also offered in the Fall semester.

The slides were last updated in Fall 2015; the slides for Chapters 1-13 are more recently taught versions.

  1. Introduction
    1. Machine Learning: Overview(3MB)
    2. Linear Algebra(1.2MB)
    3. Example: Curve Fitting(821KB)
    4. Probability Theory(950KB)
    5. Numerical Computation(1.4MB)
    6. Decision Theory(212KB)
    7. Information Theory(160KB)
    8. MATLAB Introduction(347KB)
  2. Probability Distributions
    1. Discrete Distributions(1MB)
    2. Gaussian Distribution(833KB)
    3. Gaussian Bayesian Networks(738KB)
  3. Linear Models for Regression
    1. Regression with Basis Functions(1.9MB)
    2. Gradient Descent(678KB)
    3. Bias-Variance(702KB)
    4. Bayesian Regression(2.6MB)
    5. Bayesian Model Comparison(478KB)
    6. Evidence Approximation(746KB)
    7. Computer Science Ranking(126KB)
  4. Linear Models for Classification
    1. Introduction(134KB)
    2. Discriminant Functions(840KB)
    3. Generative Models(1.8MB)
    4. Logistic Regression(4.2MB)
    5. Laplace Approximation(1MB)
    6. Bayesian Logistic Regression(1.2MB)
  5. Neural Networks
    1. Biology(1.6MB)
    2. Introduction(926KB)
    3. Training(1.1MB)
    4. Error Backpropagation(1.1MB)
    5. The Hessian Matrix(562KB)
    6. Regularization in Neural Networks(1.2MB)
    7. Convolutional Networks(1.2MB)
    8. Mixture Density Networks(634KB)
    9. Bayesian Neural Networks(716KB)
    10. See course on Deep Learning
  6. Kernel Methods
    1. Kernel Methods(1.1MB)
    2. Radial Basis Function Networks(812KB)
    3. Gaussian Processes(6.8MB)
    4. Support Vector Machines(2.1MB)
  7. Probabilistic Graphical Models
    See course on Probabilistic Graphical Models
  8. Mixture Models and EM
    1. K-means Clustering(1.3MB)
    2. Gaussian Mixture Models(1.5MB)
    3. Latent Variable View of EM(1.1MB)
    4. Bernoulli Mixture Models(3.1MB)
    5. Theoretical Basis of EM(693KB)
  9. Approximate Inference
    1. Approximate Inference(180KB)
    2. Variational Inference(3.3MB)
    3. Variational Mixture of Gaussians(1MB)
  10. Sampling Methods
    1. Need for Sampling(813KB)
    2. Basic Sampling Methods(949KB)
    3. MCMC and Gibbs Sampling(815KB)
  11. Continuous Latent Variables
    1. Principal Components Analysis
    2. Nonlinear Latent Variable Models
  12. Sequential Data
    1. Markov Models(433KB)
    2. Hidden Markov Models(1.3MB)
    3. Extensions to HMMs(287KB)
    4. Linear Dynamical Systems(217KB)
    5. Conditional Random Fields(1.6MB)
  13. Combining Models
    1. Boosting(156KB)
  14. Concept Learning
    1. Hypothesis Space(111KB)
    2. Candidate Elimination(236KB)
  15. Decision Trees
    1. Information Gain and ID3(286KB)
    2. Data Sets and Data Mining(332KB)
    3. Overfitting and Pruning(536KB)
  16. Computational Learning Theory
    1. PAC Learning(98KB)
    2. VC Dimension(321KB)
    3. Mistake Bound(51KB)