This lecture covers the theory behind Kalman filtering, focusing on predictions and innovations. The instructor explains how to compute the statistically optimal predictor and filter, emphasizing the importance of conditioning on past measurements. The lecture then generalizes the results to systems with inputs, showing how a known input affects the accuracy of the prediction. It also introduces the innovation sequence, which quantifies the new information brought by each measurement, and examines its statistical properties, in particular its correlation with past measurements and with the noise. The lecture concludes with practical applications of Kalman filtering, such as estimating the position and velocity of a ground vehicle from GPS measurements.
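As a rough illustration of the ideas summarized above, the sketch below implements a discrete-time Kalman filter for the kind of example mentioned at the end: estimating the position and velocity of a ground vehicle from noisy GPS position measurements. The constant-velocity model, the noise covariances, and all variable names are illustrative assumptions, not taken from the lecture.

```python
import numpy as np

# Illustrative constant-velocity model (assumed, not from the lecture):
# state x = [position, velocity], measurement y = GPS position.
dt = 1.0                                 # sampling interval [s]
A = np.array([[1.0, dt],
              [0.0, 1.0]])               # state-transition matrix
C = np.array([[1.0, 0.0]])               # measurement matrix
Q = np.array([[0.01, 0.0],
              [0.0, 0.01]])              # process-noise covariance (assumed)
R = np.array([[4.0]])                    # GPS measurement-noise covariance (assumed)

def kalman_step(x_pred, P_pred, y):
    """One measurement update followed by one time update.

    x_pred, P_pred: predicted state and covariance given past measurements.
    y: new GPS measurement.
    Returns the filtered estimate, the next one-step prediction, and the innovation.
    """
    # Innovation: the part of y not already explained by the prediction.
    innovation = y - C @ x_pred
    S = C @ P_pred @ C.T + R             # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)  # Kalman gain

    # Filtered estimate, conditioned on all measurements up to the current time.
    x_filt = x_pred + K @ innovation
    P_filt = P_pred - K @ C @ P_pred

    # One-step-ahead prediction for the next measurement time.
    x_next = A @ x_filt
    P_next = A @ P_filt @ A.T + Q
    return x_filt, x_next, P_next, innovation

# Tiny usage example on synthetic data.
rng = np.random.default_rng(0)
true_x = np.array([0.0, 1.0])            # vehicle starts at 0 m with velocity 1 m/s
x_pred = np.zeros(2)                     # initial prediction
P_pred = np.eye(2) * 10.0                # large initial uncertainty

for k in range(5):
    true_x = A @ true_x                            # true motion (no process noise here)
    y = C @ true_x + rng.normal(0.0, 2.0, 1)       # noisy GPS position
    x_filt, x_pred, P_pred, innov = kalman_step(x_pred, P_pred, y)
    print(f"k={k}: innovation={innov[0]:+.2f}, "
          f"est. pos={x_filt[0]:.2f}, est. vel={x_filt[1]:.2f}")
```

When the assumed model matches the data, the printed innovations behave like zero-mean white noise and are uncorrelated with past measurements, which is the statistical property of the innovation sequence highlighted in the lecture.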