Information Theory, Pattern Recognition and Neural Networks

Minor Option [16 lecture synopsis]
(From 2006, the course was reduced to 12 lectures.)
Lecturer: David MacKay
Introduction to information theory [1]
The possibility of reliable communication over unreliable channels. The (7,4) Hamming code and repetition codes.
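For illustration, a minimal Python sketch of encoding and syndrome decoding for a (7,4) Hamming code, assuming one common systematic convention (the lectures may order the bits differently):

    import numpy as np

    # Generator matrix of a (7,4) Hamming code in systematic form [I | P].
    G = np.array([[1,0,0,0,1,1,0],
                  [0,1,0,0,1,0,1],
                  [0,0,1,0,0,1,1],
                  [0,0,0,1,1,1,1]])
    # Parity-check matrix [P.T | I]: its columns are the seven distinct
    # nonzero 3-bit vectors, so one flipped bit is located by its syndrome.
    H = np.array([[1,1,0,1,1,0,0],
                  [1,0,1,1,0,1,0],
                  [0,1,1,1,0,0,1]])

    def encode(s):
        return s @ G % 2

    def decode(r):
        syndrome = H @ r % 2
        if syndrome.any():                 # nonzero syndrome: one bit flipped
            i = next(j for j in range(7) if np.array_equal(H[:, j], syndrome))
            r = r.copy()
            r[i] ^= 1                      # flip it back
        return r[:4]                       # systematic code: source bits first

    s = np.array([1, 0, 1, 1])
    r = encode(s)
    r[2] ^= 1                              # the channel flips one bit
    assert np.array_equal(decode(r), s)    # single error corrected

The repetition code R3 corrects the same single flip by majority vote, but at rate 1/3 against the Hamming code's 4/7.
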
Entropy and data compression [3]
Entropy, conditional entropy, mutual information, Shannon information content. The idea of typicality and the use of typical sets for source coding. Shannon's source coding theorem. Codes for data compression. Uniquely decodeable codes and the Kraft-McMillan inequality. Completeness of a symbol code. Prefix codes. Huffman codes. Arithmetic coding.
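For illustration, a short Python sketch that builds a Huffman code for a toy ensemble (the probabilities are invented for the example) and checks the Kraft inequality:

    import heapq
    from math import log2

    def entropy(p):
        """Shannon entropy H(X) in bits of a probability vector."""
        return -sum(pi * log2(pi) for pi in p if pi > 0)

    def huffman(probs):
        """Build a binary Huffman code: repeatedly merge the two least
        probable nodes, prefixing 0/1 to the codewords beneath them."""
        heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
        heapq.heapify(heap)
        tiebreak = len(heap)               # keeps tied entries comparable
        while len(heap) > 1:
            p0, _, c0 = heapq.heappop(heap)
            p1, _, c1 = heapq.heappop(heap)
            merged = {s: "0" + w for s, w in c0.items()}
            merged.update({s: "1" + w for s, w in c1.items()})
            heapq.heappush(heap, (p0 + p1, tiebreak, merged))
            tiebreak += 1
        return heap[0][2]

    probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
    code = huffman(probs)
    # Kraft inequality: sum of 2^-length over codewords <= 1
    # (here exactly 1, i.e. the code is complete).
    assert sum(2 ** -len(w) for w in code.values()) <= 1
    L = sum(probs[s] * len(w) for s, w in code.items())
    print(code, entropy(probs.values()), L)   # dyadic probs: L = H = 1.75
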
Communication over noisy channels [3]
Definition of channel capacity. Capacity of binary symmetric channel; of binary erasure channel; of Z channel. Joint typicality, random codes, and Shannon's noisy channel coding theorem. Real channels and practical error-correcting codes. Hash codes.
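For illustration, the three capacities as a Python sketch; the Z-channel case uses the standard closed form for the optimal input distribution:

    from math import log2

    def H2(p):
        """Binary entropy function, in bits."""
        return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

    def capacity_bsc(f):
        """Binary symmetric channel, flip probability f: C = 1 - H2(f)."""
        return 1 - H2(f)

    def capacity_bec(f):
        """Binary erasure channel, erasure probability f: C = 1 - f."""
        return 1 - f

    def capacity_z(f):
        """Z channel (only 1 -> 0 errors, probability f); the optimal
        input weight p1 has a closed form, and C = H2(p1(1-f)) - p1 H2(f)."""
        if f in (0.0, 1.0):
            return 1.0 - f
        p1 = 1.0 / ((1 - f) * (1 + 2 ** (H2(f) / (1 - f))))
        return H2(p1 * (1 - f)) - p1 * H2(f)

    print(capacity_bsc(0.1), capacity_bec(0.1), capacity_z(0.1))
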
Statistical inference, data modelling and pattern recognition [2]
The likelihood function and Bayes' theorem. Clustering as an example.
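For illustration, a toy sketch of clustering as inference: Bayes' theorem gives each point a posterior responsibility for each of two Gaussian clusters, and the means are re-fit by EM (the data and settings are invented for the example):

    import numpy as np

    # Two 1-d Gaussian clusters with known sigma.
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(-2, 1, 50), rng.normal(3, 1, 50)])

    means = np.array([-1.0, 1.0])          # deliberately poor initial guess
    priors, sigma = np.array([0.5, 0.5]), 1.0
    for _ in range(20):
        # E-step (Bayes' theorem): P(k | x_n) proportional to P(x_n | k) P(k)
        lik = np.exp(-0.5 * ((x[:, None] - means) / sigma) ** 2)
        resp = lik * priors
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: each mean becomes a responsibility-weighted average
        means = (resp * x[:, None]).sum(axis=0) / resp.sum(axis=0)

    print(means)                           # close to the true centres -2 and 3
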
Approximation of probability distributions [2]
Laplace's method. (Approximation of probability distributions by Gaussian distributions.)
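For illustration, Laplace's method in a few lines of Python: fit a Gaussian at the mode x0 of p(x), with variance 1/c where c is the negative second derivative of ln p at x0 (a Gamma density, chosen for this example, stands in as the target):

    import numpy as np

    a = 5.0
    logp = lambda x: (a - 1) * np.log(x) - x   # ln p(x) up to a constant

    x0 = a - 1                             # mode: d(logp)/dx = 0
    h = 1e-4                               # finite-difference curvature
    c = -(logp(x0 + h) - 2 * logp(x0) + logp(x0 - h)) / h**2
    sigma = 1 / np.sqrt(c)
    print(x0, sigma)                       # N(4, 2^2), since c = 1/(a-1)
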
Monte Carlo methods: importance sampling, rejection sampling, Gibbs sampling, the Metropolis method. (Slice sampling, Hybrid Monte Carlo, overrelaxation, exact sampling.*)
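For illustration, a minimal Metropolis sampler in Python with a symmetric Gaussian proposal, run on a made-up bimodal target known only up to a constant:

    import numpy as np

    def metropolis(logp, x0, step, n):
        """Metropolis method: accept a proposed move with probability
        min(1, p(x')/p(x)); p need not be normalised."""
        rng = np.random.default_rng(1)
        x, lp, samples = x0, logp(x0), []
        for _ in range(n):
            x_new = x + step * rng.standard_normal()
            lp_new = logp(x_new)
            if np.log(rng.random()) < lp_new - lp:
                x, lp = x_new, lp_new      # accept; otherwise stay put
            samples.append(x)
        return np.array(samples)

    # Unnormalised bimodal target: exp(-(x+2)^2/2) + exp(-(x-2)^2/2).
    logp = lambda x: np.logaddexp(-0.5 * (x + 2) ** 2, -0.5 * (x - 2) ** 2)
    s = metropolis(logp, 0.0, 2.5, 20000)
    print(s.mean(), s.std())               # mean near 0, std near sqrt(5)
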
Variational methods and mean field theory. Ising models.
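For illustration, the mean-field self-consistency equation for a ferromagnetic Ising model in which every spin has q neighbours with coupling J, iterated to a fixed point (a minimal sketch; the full treatment derives this from a variational free energy):

    import numpy as np

    # Mean field: m = tanh(beta * J * q * m).
    def mean_field_m(beta, J=1.0, q=4):
        m = 0.5                            # symmetry-broken starting guess
        for _ in range(200):               # iterate to the fixed point
            m = np.tanh(beta * J * q * m)
        return m

    # The mean-field critical point is beta * J * q = 1 (beta = 0.25 here).
    for beta in (0.2, 0.25, 0.3):
        print(beta, mean_field_m(beta))
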
Neural networks and content-addressable memories [2]
The Hopfield network. [* = non-examinable]
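For illustration, a Hopfield network as a content-addressable memory in a few lines of Python: Hebbian weights store binary patterns, and repeated threshold updates let a corrupted cue settle back to the stored pattern (pattern count, sizes, and seeds here are arbitrary):

    import numpy as np

    def train(patterns):
        """Hebbian learning: W = (1/N) sum_mu x_mu x_mu^T, zero diagonal."""
        W = patterns.T @ patterns / patterns.shape[1]
        np.fill_diagonal(W, 0.0)
        return W

    def recall(W, x, sweeps=20):
        """Binary (+1/-1) threshold updates; each sweep updates every unit."""
        x = x.copy()
        for _ in range(sweeps):
            for i in range(len(x)):
                x[i] = 1.0 if W[i] @ x >= 0 else -1.0
        return x

    rng = np.random.default_rng(2)
    memories = rng.choice([-1.0, 1.0], size=(3, 100))  # three stored patterns
    W = train(memories)
    cue = memories[0].copy()
    cue[:15] *= -1                         # corrupt 15 of the 100 bits
    print((recall(W, cue) == memories[0]).mean())      # expect 1.0: restored
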
