Course overview
Information theory is one of the seminal ideas of the 20th century. It enabled modern telecommunications and computing, and is now widely used in machine learning. Information theory lies at the heart of how we communicate, store and preserve what we know. In some sense it is the most fundamental theory: without symbols (representations of information) we could not even do mathematics. In this course, learners will develop knowledge of information theory and skills in using it to code signals efficiently and reliably. This course will help to unify concepts from statistical mechanics, statistical estimation theory and entropy, broadening and deepening knowledge gained from earlier units.
- Concepts of Information and Uncertainty
- Coding and Communication
- Data Science and Other Applications
Course learning outcomes
- Rationalise different definitions and uses of entropy
- Derive key measures (e.g., mutual information) from known distributions
- Propose mechanisms for coding and compression of data in different contexts
- Implement core theoretical findings in applied settings including cyber security, network coding, game theory, cryptography and statistical estimation
- Quantify uncertainty both mathematically and programmatically
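To illustrate the kind of programmatic uncertainty quantification these outcomes describe, here is a minimal sketch (not course material; the distribution values are chosen purely for illustration) that computes Shannon entropy and mutual information for discrete distributions:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) in bits from a joint distribution given as {(x, y): p}."""
    # Marginal distributions of X and Y.
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0) + p
        py[y] = py.get(y, 0) + p
    # I(X;Y) = sum over (x, y) of p(x,y) * log2(p(x,y) / (p(x) p(y))).
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Two correlated binary variables (hypothetical example values).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
print(entropy([0.5, 0.5]))         # fair coin: 1.0 bit
print(mutual_information(joint))   # positive, since X and Y are correlated
```

A fair coin has exactly one bit of entropy, and mutual information drops to zero when the joint distribution factorises into its marginals.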
Degree list
The following degrees include this course