COMPGI01 - Supervised Learning
This is the 2017-18 version of the syllabus.
Note: Whilst every effort is made to keep the syllabus and assessment records correct, the precise details must be checked with the lecturer(s).
| Code | COMPGI01 (Also taught as: COMPM055 Supervised Learning) |
| Prerequisites | Basic mathematics, calculus, probability, linear algebra |
| Taught By | Mark Herbster (50%), John Shawe-Taylor (30%), Massi Pontil (20%) |
| Aims | This module covers supervised approaches to machine learning. |
| Learning Outcomes | In-depth familiarity with a range of classical and contemporary supervised learning algorithms; an understanding of the principles and limitations that govern learning algorithms, and of ways to assess and improve their performance. |
The course covers both foundational topics in supervised learning, such as linear regression, nearest neighbours and kernelisation, as well as contemporary research areas such as multi-task learning and optimisation via proximal methods. In a given year, topics will be drawn non-exclusively from the following:
- Nearest Neighbours
- Linear Regression
- Kernels and Regularisation
- Support Vector Machines
- Gaussian Processes
- Decision Trees
- Ensemble Learning
- Sparsity Methods
- Multi-task Learning
- Proximal Methods
- Semi-supervised Learning
- Neural Networks
- Matrix Factorisation
- Online Learning
- Statistical Learning Theory
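To give a flavour of the foundational material, here is a minimal sketch (not course material, just an illustration) of ordinary least-squares linear regression, the first regression topic listed above, using NumPy:

```python
# Illustrative sketch of ordinary least squares; the data below is
# synthetic and invented purely for demonstration.
import numpy as np

def least_squares(X, y):
    """Fit weights w minimising ||Xw - y||^2."""
    # lstsq solves the normal equations and handles rank-deficient X
    # more robustly than explicitly inverting X^T X.
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

# With noise-free data generated from known weights, the fit
# recovers those weights exactly.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
w_hat = least_squares(X, y)
```

The course goes well beyond this, e.g. by regularising the fit and replacing inner products with kernels.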
Method of Instruction
Lecture presentations with associated class problems
The course has the following assessment components:
- Written Examination (2.5 hours, 75%)
- Coursework Section (25%)
To pass this course, students must obtain an overall mark of at least 50% across all assessment components combined.
For full details of coursework see the course web page.
Reading list available via the UCL Library catalogue.