Explores explicit stabilised Runge-Kutta methods and their application to Bayesian inverse problems, covering optimization, sampling, and numerical experiments.
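A minimal sketch of one explicit stabilised time step, assuming the classical first-order Chebyshev recursion with stability polynomial T_s(1 + z/s^2); the function names and the quadratic gradient-flow test problem are illustrative choices, not taken from the original.

```python
import numpy as np

def chebyshev_step(f, y, h, s):
    """One explicit stabilised (first-order Chebyshev) step with s internal stages.

    Built on the three-term Chebyshev recurrence, so the real stability
    interval grows like 2 * s**2 instead of the fixed interval of explicit Euler.
    """
    k_prev = y                              # K_0
    k = y + (h / s**2) * f(y)               # K_1
    for _ in range(2, s + 1):
        k, k_prev = 2 * k + (2 * h / s**2) * f(k) - k_prev, k
    return k                                # K_s = y_{n+1}

# Illustrative use: gradient flow y' = -A y for a stiff quadratic potential,
# the kind of flow that arises when stabilised methods are used for
# optimisation in inverse problems.
A = np.diag([1.0, 10.0, 100.0])             # stiff spectrum
f = lambda y: -A @ y
y = np.ones(3)
for _ in range(50):
    y = chebyshev_step(f, y, h=0.05, s=10)
print(y)                                    # decays towards the minimiser 0
```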
Covers vectorization in Python using NumPy for efficient scientific computing, emphasizing the benefits of avoiding for loops and demonstrating practical applications.
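A small illustrative comparison, assuming a pairwise-distance task chosen only to demonstrate the idea: the same computation written with explicit Python loops and then vectorised with NumPy broadcasting.

```python
import numpy as np
import time

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 3))           # 500 points in 3D

def pairwise_sq_dist_loops(X):
    # Explicit double loop: clear but slow, every element touched in Python.
    n = X.shape[0]
    D = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            D[i, j] = np.sum((X[i] - X[j]) ** 2)
    return D

def pairwise_sq_dist_vectorised(X):
    # Broadcasting builds the (n, n, 3) difference array in one C-level pass.
    diff = X[:, None, :] - X[None, :, :]
    return np.sum(diff ** 2, axis=-1)

t0 = time.perf_counter(); D1 = pairwise_sq_dist_loops(X);      t1 = time.perf_counter()
t2 = time.perf_counter(); D2 = pairwise_sq_dist_vectorised(X); t3 = time.perf_counter()
print(np.allclose(D1, D2), f"loops: {t1 - t0:.2f}s, vectorised: {t3 - t2:.4f}s")
```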
Explores error estimation in numerical methods for solving differential equations, focusing on local truncation error, stability, and Lipschitz continuity.
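A hedged sketch of measuring local truncation error, using a test problem chosen for illustration (y' = λy with known exact solution, not necessarily the example in the original): one explicit Euler step from the exact value is compared with the exact solution to confirm the O(h²) scaling of the local error.

```python
import numpy as np

lam, y0 = -2.0, 1.0
for h in [0.1, 0.05, 0.025, 0.0125]:
    y_euler = y0 + h * lam * y0          # one explicit Euler step
    y_exact = y0 * np.exp(lam * h)       # exact solution after one step
    lte = abs(y_exact - y_euler)         # local truncation error
    print(f"h = {h:7.4f}   LTE = {lte:.3e}   LTE/h^2 = {lte / h**2:.3f}")
```

The ratio LTE/h² settles near λ²y₀/2, consistent with the second-order local (first-order global) accuracy of explicit Euler.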
Covers gradient descent in the scalar case, focusing on finding the minimum of a function by iteratively moving in the direction of the negative gradient.
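A minimal sketch, assuming an illustrative objective f(x) = (x - 3)² and step size 0.1 (not taken from the original): repeatedly step opposite the derivative until the update becomes negligible.

```python
def grad_descent(df, x0, lr=0.1, tol=1e-8, max_iter=1000):
    # Scalar gradient descent: x_{k+1} = x_k - lr * f'(x_k).
    x = x0
    for _ in range(max_iter):
        step = lr * df(x)
        x -= step
        if abs(step) < tol:     # stop once the update is tiny
            break
    return x

f  = lambda x: (x - 3.0) ** 2
df = lambda x: 2.0 * (x - 3.0)  # derivative of f
x_min = grad_descent(df, x0=0.0)
print(x_min, f(x_min))          # close to the minimiser x = 3
```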