Course overview
This course equips students with the mathematical skills needed to understand and apply machine learning algorithms. Students will learn to explain and apply the mathematical concepts underpinning machine learning, with a focus on calculus, linear algebra, optimisation, probability, and information theory. As a core component of machine learning and data science curricula, the course strengthens the mathematical foundations that support both advanced study and practical application in the field.
Course learning outcomes
- Demonstrate an understanding of the principles of calculus, linear algebra, and probability theory and of their applications in machine learning
- Apply gradient-based optimisation techniques to simple multivariate problems (see the sketch after this list)
- Apply probability theory to model uncertainty and make predictions in machine learning tasks
- Quantify information content in machine learning systems using information theory principles
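
To illustrate the kind of task behind the optimisation outcome above, here is a minimal sketch of gradient descent applied to a simple two-variable quadratic. The objective function, its gradient, the step size, and the iteration count are illustrative choices for this example, not part of the course materials.

```python
import numpy as np

# Illustrative objective: f(x, y) = (x - 1)^2 + 2*(y + 3)^2,
# a simple multivariate problem with minimum at (1, -3).
def f(v):
    x, y = v
    return (x - 1) ** 2 + 2 * (y + 3) ** 2

def grad_f(v):
    x, y = v
    # Analytic gradient: [2(x - 1), 4(y + 3)]
    return np.array([2 * (x - 1), 4 * (y + 3)])

def gradient_descent(x0, step_size=0.1, n_steps=100):
    """Plain gradient descent: repeatedly step against the gradient."""
    v = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        v = v - step_size * grad_f(v)
    return v

if __name__ == "__main__":
    minimiser = gradient_descent([0.0, 0.0])
    print("approximate minimiser:", minimiser)  # close to (1, -3)
    print("objective value:", f(minimiser))     # close to 0
```

The same update rule, applied to a loss function over model parameters rather than a toy quadratic, is the basis of the gradient-based training methods studied in the course.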