Course overview
- Introductory and preliminary material - Introduction to the concepts, key issues and motivating examples for adaptive filters; discrete-time linear systems and filters; random variables and random processes, covariance matrices; Z-transforms of stationary random processes.
- Optimum linear systems - Error surfaces and minimum mean square error; the optimum discrete-time Wiener filter; the principle of orthogonality and canonical forms; constrained optimisation; the method of steepest descent and its convergence; stochastic gradient descent LMS (sketched below), convergence in the mean and misadjustment; case study.
- Least squares and recursive least squares (RLS).
- Linear prediction - Forward and backward linear prediction; the Levinson-Durbin algorithm; lattice filters.
- Neural networks.
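To give a flavour of the algorithms covered, a minimal MATLAB sketch of the LMS stochastic-gradient update for identifying an unknown FIR system is shown below. The filter length M, step size mu, signals and noise level are arbitrary placeholders for illustration, not values or code supplied by the course.

% Illustrative LMS sketch: identify an unknown FIR system from its noisy output.
rng(0);
N  = 5000;                              % number of samples (placeholder)
M  = 8;                                 % adaptive filter length (placeholder)
mu = 0.01;                              % LMS step size (placeholder)
h  = randn(M,1);                        % "unknown" system to be identified (placeholder)
x  = randn(N,1);                        % white input signal
d  = filter(h,1,x) + 0.01*randn(N,1);   % desired signal = system output + noise
w  = zeros(M,1);                        % adaptive weight vector
e  = zeros(N,1);                        % error signal
for n = M:N
    u    = x(n:-1:n-M+1);               % tap-delay input vector
    y    = w.'*u;                       % filter output
    e(n) = d(n) - y;                    % estimation error
    w    = w + mu*u*e(n);               % stochastic gradient (LMS) weight update
end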
Course learning outcomes
- Examine and derive the FIR Wiener filter
- Explain and use the LMS algorithm
- Apply the RLS algorithm
- Recognise the prediction filter formulation and applications
- Solve for the Wiener filter weights of the prediction filter using the Levinson-Durbin algorithm
- Apply the lattice filter architecture derived from the Levinson-Durbin algorithm
- Use MATLAB to implement the Wiener filter, least squares, LMS and RLS algorithms, and apply them to selected applications; a minimal Wiener filter example is sketched below.
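As a rough illustration of the kind of implementation involved, one way to compute the FIR Wiener solution in MATLAB from time-averaged correlation estimates is sketched below. The signals, filter length and correlation estimator are assumptions made for this sketch, not the coursework specification.

% Illustrative FIR Wiener filter sketch: w_o solves R*w_o = p (Wiener-Hopf equations).
M = 8;                                  % filter length (placeholder)
x = randn(10000,1);                     % input signal (placeholder)
d = filter(randn(M,1),1,x) + 0.01*randn(size(x));   % desired signal (placeholder)
N = length(x);
R = zeros(M);                           % estimated input autocorrelation matrix
p = zeros(M,1);                         % estimated cross-correlation vector
for n = M:N
    u = x(n:-1:n-M+1);                  % tap-delay input vector
    R = R + u*u.';
    p = p + u*d(n);
end
R = R/(N-M+1);
p = p/(N-M+1);
w_o = R \ p;                            % Wiener-Hopf solution for the optimum weights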