COMPGI16 - Approximate Inference and Learning in Probabilistic Models
Note: Whilst every effort is made to keep the syllabus and assessment records correct, the precise details must be checked with the lecturer(s).

- Code: COMPGI16
- Year: MSc
- Prerequisites: COMPGI18 Probabilistic and Unsupervised Learning
- Term: 1
- Taught By: Maneesh Sahani (Gatsby Computational Neuroscience Unit) (50%) and Yee Whye Teh (Gatsby Computational Neuroscience Unit) (50%)

Aims:
The module will present the foundations of approximate inference and learning in probabilistic graphical models (e.g. Bayesian networks and Markov networks), with particular focus on models composed from conditional exponential family distributions. Both stochastic (Monte Carlo) methods and deterministic approximations will be covered. The methods will be discussed in relation to practical, real-world inference problems in machine learning, including tracking and learning.
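For background, a conditional exponential family distribution has the standard form (generic textbook notation, not the course's specific parameterisation):

```latex
p(x \mid y) \;=\; h(x)\,\exp\!\left( \eta(y)^{\top} T(x) - A(\eta(y)) \right),
```

where $T(x)$ is the vector of sufficient statistics, $\eta(y)$ the natural parameters (here depending on the conditioning variable $y$), and $A$ the log-normaliser.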
Learning Outcomes:
Students will understand how to derive and implement state-of-the-art approximate inference techniques, and will be able to contribute to research in this area.
Content:
- Monte Carlo methods
  - rejection and importance sampling
  - sequential Monte Carlo (particle filters)
  - Gibbs sampling
  - Metropolis-Hastings (see the sketch after this list)
  - hybrid Monte Carlo
  - slice sampling
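As a concrete illustration of the sampling methods above, here is a minimal random-walk Metropolis-Hastings sketch in Python. It is not course code: the function name `metropolis_hastings`, the Gaussian proposal, and the step size are illustrative assumptions.

```python
import numpy as np

def metropolis_hastings(log_p, x0, n_samples, step=0.5, rng=None):
    """Random-walk Metropolis-Hastings for an unnormalised log-density log_p."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_samples, x.size))
    log_px = log_p(x)
    for i in range(n_samples):
        # Symmetric Gaussian proposal: the Hastings correction cancels,
        # so the acceptance ratio reduces to p(x') / p(x).
        x_new = x + step * rng.standard_normal(x.size)
        log_px_new = log_p(x_new)
        if np.log(rng.uniform()) < log_px_new - log_px:
            x, log_px = x_new, log_px_new  # accept the proposal
        samples[i] = x  # on rejection, the current state is repeated
    return samples

# Example: sample a standard 2-D Gaussian, log p(x) = -||x||^2 / 2 + const.
draws = metropolis_hastings(lambda x: -0.5 * (x @ x), np.zeros(2), 5000)
print(draws.mean(axis=0), draws.std(axis=0))  # roughly [0, 0] and [1, 1]
```

Gibbs sampling, hybrid Monte Carlo, and slice sampling differ mainly in how the move is constructed so that the target distribution remains invariant.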
- Deterministic approximations
  - mean field theories based on Kullback-Leibler divergence minimisation (see the identity after this list)
  - structured mean field approximations
  - expectation propagation
  - loopy belief propagation and other variants
  - the Laplace approximation
  - general convex approximations
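To make "Kullback-Leibler divergence minimisation" concrete, the standard variational identity (textbook notation, not necessarily the course's own) is:

```latex
\log p(x) \;=\; \mathcal{F}(q) + \mathrm{KL}\!\left( q(z) \,\middle\|\, p(z \mid x) \right),
\qquad
\mathcal{F}(q) \;=\; \mathbb{E}_{q(z)}\!\left[ \log \frac{p(x, z)}{q(z)} \right].
```

Since $\log p(x)$ is fixed, maximising the free energy $\mathcal{F}(q)$ over a tractable family, e.g. the mean field factorisation $q(z) = \prod_i q_i(z_i)$, is equivalent to minimising the KL divergence from $q$ to the true posterior.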
Method of Instruction:
Lecture presentations with associated class problems.
Assessment:
The course has the following assessment components:
- Written Examination (2.5 hours, 50%)
- Coursework Section (3 pieces, 50%)
To pass this course, students must:
- Obtain an overall pass mark of 50% for all sections combined
The examination rubric is:
Answer all questions
Resources:
There is no required textbook. However, the following is an excellent source for many of the topics covered here:

David J.C. MacKay (2003) Information Theory, Inference, and Learning Algorithms, Cambridge University Press (also available online).

