DM825 - Introduction to Machine Learning

Syllabus

Lecture 1

  • introduction [B2 sc2.1]
  • linear regression and linear models [B1 sc3.1; B1 sc1.1-1.4] (In R: ?lm)
  • gradient descent, Newton-Raphson (batch and sequential) (In R: ?optim)
  • least squares method [B6, sc5.1-5.10]
  • k-nearest neighbor [B2, 1-2.4; B3, 3.1.3; B6, 5.1-5.10]
  • curse of dimensionality [B1 sc1.4]
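
  To make the lm and optim pointers above concrete, here is a minimal R
  sketch on simulated data (the data and starting values are only
  illustrative):

    ## simulated data for a one-predictor linear model
    set.seed(1)
    x <- runif(100)
    y <- 2 + 3 * x + rnorm(100, sd = 0.5)

    ## least squares via lm
    coef(lm(y ~ x))

    ## the same fit by numerically minimising the sum of squared errors
    sse <- function(beta) sum((y - beta[1] - beta[2] * x)^2)
    optim(c(0, 0), sse, method = "BFGS")$par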

Lecture 2

  • regularized least squares (a.k.a. shrinkage or ridge regression) [B1 sc3.1.4]
  • locally weighted linear regression [B2, sc6.1.1]
  • probability theory [B2 sc1.2]
  • probabilistic interpretation [B1, sc1.1-1.4, sc3.1; B2, sc7.1-7.3, sc7.10-7.11]
  • maximum likelihood approach [B1 sc1.2.5]
  • Bayesian approach and application in linear regression [B1 sc1.2.6, 2.3, 3.3, ex. 3.8]
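
  A small sketch of ridge (regularized least squares) regression using
  lm.ridge from the MASS package (the function is not named in the
  syllabus; the data and the lambda grid are only illustrative):

    library(MASS)
    ## simulated data with a few irrelevant predictors
    set.seed(1)
    X <- matrix(rnorm(100 * 5), 100, 5)
    y <- as.vector(X %*% c(3, 0, 0, 1, 0)) + rnorm(100)
    d <- data.frame(y = y, X)

    ## ridge regression over a grid of shrinkage parameters lambda
    fit <- lm.ridge(y ~ ., data = d, lambda = seq(0, 10, by = 0.5))
    plot(fit)   # coefficient paths as lambda increases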

Lecture 3

  • probabilistic approach to learn parameters of binary variables [B2 sc2.1]
  • model assessment [B1 sc1.5.5; sc3.2; B2 sc7.1-7.3, sc7.10-7.11]
  • logistic regression [B1 sc2.1]
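
  In R, logistic regression is a glm with a binomial family; a minimal
  sketch on simulated binary data:

    ## simulated binary responses from a logistic model
    set.seed(1)
    x <- rnorm(200)
    p <- 1 / (1 + exp(-(-1 + 2 * x)))
    y <- rbinom(200, size = 1, prob = p)

    ## maximum-likelihood fit (iteratively reweighted least squares)
    fit <- glm(y ~ x, family = binomial)
    coef(fit)
    predict(fit, newdata = data.frame(x = 0.5), type = "response")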

Lecture 4

  • linear models for classification
  • multinomial (logistic) regression [B1 sc2.2]
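
  Multinomial logistic regression can be fit with multinom from the nnet
  package (an assumption; the syllabus does not name a function here); a
  minimal sketch on the built-in iris data:

    library(nnet)
    ## multinomial (softmax) regression with three classes
    fit <- multinom(Species ~ ., data = iris)
    ## predicted class probabilities for the first few observations
    predict(fit, head(iris), type = "probs")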

Lecture 5

  • generalized linear models [B1 sc2.4] (In R: ?glm)
  • decision theory [B1 sc1.5]
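
  A minimal glm sketch with a non-Gaussian response (Poisson counts, log
  link) on simulated data:

    ## simulated counts from a log-linear model
    set.seed(1)
    x <- runif(100)
    y <- rpois(100, lambda = exp(0.5 + 1.5 * x))

    ## generalized linear model: Poisson response, canonical log link
    fit <- glm(y ~ x, family = poisson)
    coef(fit)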

Lecture 6

  • neural networks
    • perceptron algorithm [B1 sc5.1]
    • multi-layer perceptrons [B1 sc5.2-5.3, sc5.5; B2 ch11] (in R: library(nnet); ?nnet)
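
  A minimal nnet sketch on the built-in iris data (the hidden-layer size,
  weight decay and iteration limit are only illustrative):

    library(nnet)
    ## single-hidden-layer network with 5 hidden units and weight decay
    set.seed(1)
    fit <- nnet(Species ~ ., data = iris, size = 5, decay = 0.01, maxit = 200)
    ## confusion matrix on the training data
    table(predicted = predict(fit, iris, type = "class"), true = iris$Species)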

Lecture 7

  • generative algorithms
    • Gaussian discriminant analysis [B1 sc4.2] (in R: library(MASS); ?lda, ?plot.lda)
    • naive Bayes (in R: library(e1071); ?naiveBayes)
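
  A minimal sketch of both generative classifiers on the built-in iris
  data, using the functions pointed to above:

    library(MASS)
    library(e1071)

    ## Gaussian discriminant analysis (LDA) and naive Bayes
    fit.lda <- lda(Species ~ ., data = iris)
    fit.nb  <- naiveBayes(Species ~ ., data = iris)

    ## training-set confusion matrices
    table(predict(fit.lda, iris)$class, iris$Species)
    table(predict(fit.nb, iris), iris$Species)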

Lectures 8 and 9

  • support vector machines and kernel methods [B2 sc2.8.2, ch6, sc12.1-12.3.4; B1 sc2.5, sc7-7.1.5]
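
  In R, support vector machines can be fit with svm from the e1071 package
  (an assumption; the syllabus does not name a function for this lecture);
  a minimal sketch with a Gaussian kernel on the built-in iris data:

    library(e1071)
    ## SVM with a radial-basis (Gaussian) kernel; cost and gamma would
    ## normally be chosen by cross-validation, e.g. with tune.svm
    fit <- svm(Species ~ ., data = iris, kernel = "radial", cost = 1, gamma = 0.5)
    table(predict(fit, iris), iris$Species)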

Lecture 10

  • Practical Advice [B12]
  • Learning Theory [B12]

Lecture 11

  • probabilistic graphical models
    • Discrete [B1 sc8.1]
    • Linear Gaussian [B1 sc8.1]
    • Mixed Variables
    • Conditional Independence [B1 sc8.2, Wikipedia]

Lecture 12

  • probabilistic graphical models: inference
    • Exact in Chains and Polytrees [B1 sc8.4]
    • Approximate [B4 sc14.5]

Lecture 13

  • Unsupervised Learning
    • k-means, mixture models, EM algorithm [B1 sc9.1-9.2; B2 sc14.1-14.5] (in R: kmeans; em from the mclust package)
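
  A minimal sketch of k-means and of a Gaussian mixture fitted by EM, using
  kmeans and the mclust package (here through its high-level wrapper Mclust
  rather than the lower-level em routine named above):

    library(mclust)
    x <- iris[, 1:4]

    ## k-means with 3 clusters and several random restarts
    km <- kmeans(x, centers = 3, nstart = 20)

    ## Gaussian mixture model with 3 components, fitted by EM
    gm <- Mclust(x, G = 3)

    ## compare the two clusterings
    table(kmeans = km$cluster, mixture = gm$classification)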

Lecture 14

  • tree-based methods [B1 sc1.6, sc14.4; B2 sc9.2] (in R: rpart from the rpart package; ctree from the party package)
  • principal component analysis [B10; B1 sc12.1; B2 sc14.5.1] (in R: princomp)
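
  Minimal sketches for the tree and PCA pointers above, on the built-in
  iris data:

    library(rpart)
    library(party)

    ## classification trees with two different packages
    fit.rp <- rpart(Species ~ ., data = iris)
    fit.ct <- ctree(Species ~ ., data = iris)

    ## principal component analysis on the four numeric variables
    pc <- princomp(iris[, 1:4], cor = TRUE)
    summary(pc)   # proportion of variance explained by each component
    biplot(pc)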

Author: Marco Chiarandini <marco@imada.sdu.dk>

Date: 2013-03-11 11:04:04 CET
