NUIM CS401 F2006
Mon 12:00-12:50 CS2
Tue 15:00-15:50 JohnHume6
Instructor: Barak A.
Office: Hamilton Institute (NUIM,
Rye Hall, South Wing, room 5)
Office hours: you are welcome any time; just drop by. (Afternoons
are best, except Fridays.) Or feel free to email or ring me
(x6394) to make an appointment. If people would prefer that I set
aside particular weekly times, let me know and I will do so.
We will use notes made available on the web.
Introduction to Machine Learning, prerequisites
Introduction to R
Miscellaneous definitions, and
Intro to Linear Classifiers, i.e. the Perceptron
The Perceptron Learning Rule.
The "homogeneous coordinates" trick
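The perceptron learning rule together with the homogeneous-coordinates trick can be sketched in a few lines (an illustrative sketch only; the toy AND-style dataset is made up):

```python
import numpy as np

# Homogeneous-coordinates trick: append a constant 1 to each input so the
# bias becomes just another weight. Toy AND-like data, labels in {-1, +1}.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([-1, -1, -1, 1])
Xh = np.hstack([X, np.ones((len(X), 1))])   # homogeneous coordinates

w = np.zeros(3)
for epoch in range(100):
    errors = 0
    for xi, yi in zip(Xh, y):
        if yi * (w @ xi) <= 0:    # misclassified (or on the boundary)
            w += yi * xi          # perceptron update: w <- w + y x
            errors += 1
    if errors == 0:               # converged: data is linearly separable
        break

print(np.all(np.sign(Xh @ w) == y))   # True
```

Because the data is linearly separable, the perceptron convergence theorem guarantees this loop terminates.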
Linear regression, gradient descent
More gradient descent
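A sketch of batch gradient descent for least-squares linear regression (the synthetic noise-free data, learning rate, and step count are made up for illustration):

```python
import numpy as np

# Model y_hat = X w, loss (1/2n)||X w - y||^2, gradient (1/n) X^T (X w - y).
rng = np.random.default_rng(0)
X = np.hstack([rng.uniform(-1, 1, (50, 1)),   # one feature
               np.ones((50, 1))])             # bias column (homogeneous trick)
true_w = np.array([2.0, -0.5])
y = X @ true_w                                # noise-free targets

w = np.zeros(2)
eta = 0.5                                     # learning rate
for step in range(500):
    grad = X.T @ (X @ w - y) / len(y)         # gradient of the mean squared error / 2
    w -= eta * grad                           # descend along the negative gradient

print(np.round(w, 3))                         # recovers approximately [2.0, -0.5]
```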
Backpropagation (part 1 of 2)
Backpropagation or reverse-mode AD (part 2 of 2)
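A minimal reverse-mode AD "tape" over scalars shows the mechanism behind backpropagation: record local partial derivatives on the forward pass, then accumulate gradients by the chain rule in reverse. The `Var` class and helper names are invented for this sketch:

```python
import math

class Var:
    """A scalar that remembers its parents and their local derivatives."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents    # list of (parent Var, d(self)/d(parent))
        self.grad = 0.0

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

def sin(v):
    return Var(math.sin(v.value), [(v, math.cos(v.value))])

def backward(out):
    # Reverse pass: push d(out)/d(node) back through the recorded graph.
    out.grad = 1.0
    stack = [out]
    while stack:
        node = stack.pop()
        for parent, local in node.parents:
            parent.grad += node.grad * local   # chain rule accumulation
            stack.append(parent)

x = Var(0.5)
f = sin(x) * (x * x)          # f(x) = x^2 sin(x)
backward(f)
# Analytic derivative: f'(x) = 2x sin(x) + x^2 cos(x)
print(abs(x.grad - (math.sin(0.5) + 0.25 * math.cos(0.5))) < 1e-12)   # True
```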
Convergence of vanilla gradient descent for a linear unit
Example of unsupervised learning: k-means clustering
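Lloyd's algorithm for k-means alternates an assignment step and an update step; a sketch on two synthetic blobs (the data and the crude initialisation are made up):

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (30, 2)),    # blob near (0, 0)
               rng.normal(5.0, 0.3, (30, 2))])   # blob near (5, 5)

k = 2
centers = X[[0, -1]].copy()   # crude init: one point from each end of the data
for _ in range(20):
    # Assignment step: each point joins its nearest center's cluster.
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    # Update step: each center moves to the mean of its cluster.
    centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])

print(np.sort(centers[:, 0]))   # one center near 0, the other near 5
```

In practice one would use a smarter initialisation (e.g. k-means++) and restart from several seeds, since Lloyd's algorithm only finds a local optimum.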
Maximum Likelihood Estimation: definition and toy examples
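A toy MLE example: for Bernoulli coin flips the log-likelihood log L(p) = h log p + (n-h) log(1-p) is maximised at the sample mean p = h/n, which a brute-force grid search confirms (the flip data here are made up):

```python
import numpy as np

flips = np.array([1, 0, 1, 1, 0, 1, 1, 1, 0, 1])   # 7 heads in 10 flips
p_hat = flips.mean()                               # closed-form MLE: h / n

def log_lik(p):
    h, n = flips.sum(), len(flips)
    return h * np.log(p) + (n - h) * np.log(1 - p)

grid = np.linspace(0.01, 0.99, 99)
p_grid = grid[np.argmax(log_lik(grid))]            # numerical maximiser

print(p_hat, p_grid)   # both approximately 0.7
```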
EM of a simple Gaussian mixture model
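EM for a two-component 1-D Gaussian mixture, sketched on made-up data with made-up initial parameters: the E-step computes responsibilities, the M-step re-estimates the mixing weights, means, and variances from the responsibility-weighted data:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 0.5, 200), rng.normal(3, 1.0, 200)])

pi = np.array([0.5, 0.5])     # mixing weights
mu = np.array([-1.0, 1.0])    # initial means
var = np.array([1.0, 1.0])    # initial variances

def normal_pdf(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

for _ in range(50):
    # E-step: posterior probability each point came from each component.
    p = pi * normal_pdf(x[:, None], mu, var)        # shape (n, 2)
    r = p / p.sum(axis=1, keepdims=True)            # responsibilities
    # M-step: maximise the expected complete-data log-likelihood.
    n_k = r.sum(axis=0)
    pi = n_k / len(x)
    mu = (r * x[:, None]).sum(axis=0) / n_k
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n_k

print(np.sort(mu))   # means recovered near -2 and 3
```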
Hidden Markov Models or HMMs (1/3)
Hidden Markov Models or HMMs (2/3)
Hidden Markov Models or HMMs (3/3)
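The forward algorithm, the basic HMM dynamic program, computes the probability of an observation sequence by recursively updating alpha. A sketch with a made-up two-state model and observation sequence:

```python
import numpy as np

A = np.array([[0.7, 0.3],     # transition probabilities A[i, j] = P(j | i)
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],     # emission probabilities B[i, o] = P(o | i)
              [0.2, 0.8]])
p0 = np.array([0.5, 0.5])     # initial state distribution

obs = [0, 0, 1]               # observation sequence

alpha = p0 * B[:, obs[0]]     # alpha_1(i) = p0(i) B(i, o_1)
for o in obs[1:]:
    # alpha_{t+1}(j) = [sum_i alpha_t(i) A(i, j)] B(j, o_{t+1})
    alpha = (alpha @ A) * B[:, o]
seq_prob = alpha.sum()        # P(o_1..o_T) = sum_i alpha_T(i)

print(round(seq_prob, 6))     # 0.119325
```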
Graphical Models: intuitions (1/2)
Graphical Models: energy, Boltzmann distributions, Monte Carlo (2/2)
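Monte-Carlo sampling from a Boltzmann distribution P(s) proportional to exp(-E(s)/T) can be sketched with the Metropolis acceptance rule (the energies and temperature here are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
E = np.array([0.0, 1.0, 2.0])   # energies of three states
T = 1.0                         # temperature

s = 0
counts = np.zeros(3)
for _ in range(100_000):
    s_new = rng.integers(3)     # symmetric proposal: a uniform random state
    # Metropolis rule: always accept downhill, accept uphill with
    # probability exp(-(E_new - E_old) / T).
    if rng.random() < np.exp(-(E[s_new] - E[s]) / T):
        s = s_new
    counts[s] += 1

target = np.exp(-E / T) / np.exp(-E / T).sum()   # exact Boltzmann distribution
print(counts / counts.sum(), target)             # empirical matches target
```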
Support Vector Machines I:
- dual representation of weights
- maximum margin hyperplane
Support Vector Machines II:
- quadratic programming
- the kernel trick!
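The kernel trick in one example: a quadratic kernel evaluated in input space equals an inner product under an explicit feature map, without ever constructing that feature space (the vectors are made up):

```python
import numpy as np

def phi(x):
    # Explicit feature map for the homogeneous quadratic kernel in 2-D:
    # phi(x) = (x1^2, x2^2, sqrt(2) x1 x2), so phi(x).phi(z) = (x.z)^2.
    return np.array([x[0] ** 2, x[1] ** 2, np.sqrt(2) * x[0] * x[1]])

x = np.array([1.0, 2.0])
z = np.array([3.0, -1.0])

lhs = (x @ z) ** 2        # kernel evaluation, purely in input space
rhs = phi(x) @ phi(z)     # inner product in the 3-D feature space

print(np.isclose(lhs, rhs))   # True
```

This is why the dual (weights as a sum of training points) matters: the SVM only ever needs kernel values, never the feature vectors themselves.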
Reinforcement Learning: policy and value iteration, Q-learning, TD learning
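Tabular Q-learning on a tiny made-up corridor MDP (states 0..3, actions left/right, reward 1 for stepping into the terminal goal state; the hyper-parameters are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, n_actions, goal = 4, 2, 3
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.5, 0.9, 0.2   # step size, discount, exploration rate

def step(s, a):
    """Deterministic corridor: action 0 = left, 1 = right."""
    s2 = max(0, s - 1) if a == 0 else min(goal, s + 1)
    return s2, float(s2 == goal)    # reward 1 on entering the goal

for episode in range(200):
    s = 0
    while s != goal:
        # Epsilon-greedy action selection.
        a = rng.integers(n_actions) if rng.random() < eps else int(Q[s].argmax())
        s2, r = step(s, a)
        # Q-learning update: bootstrap from the greedy value of s2.
        target = r + (0.0 if s2 == goal else gamma * Q[s2].max())
        Q[s, a] += alpha * (target - Q[s, a])
        s = s2

print(Q.argmax(axis=1)[:goal])   # greedy policy: move right in every state
```

The optimal action values fall out as Q(2, right) = 1, Q(1, right) = 0.9, Q(0, right) = 0.81, i.e. the reward discounted by gamma per step to the goal.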
Case Studies: ALVINN, PAPnet, TDNN for the E-set,
handwritten digit recognition, TD-Gammon
The author's lecture notes from Machine Learning by Tom Mitchell.
Code written in class, usually scrubbed up a bit.