This lecture covers the fundamentals of back-propagation in neural networks, explaining the process of updating weights based on errors, the concept of pre-activations, and the role of biases. It also delves into the evaluation of neural networks through training and test losses.
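To make the summary concrete, here is a minimal sketch (not taken from the lecture itself) of back-propagation in a one-hidden-layer network, showing pre-activations, weight and bias updates from the output error, and the training and test losses; the sigmoid hidden layer, squared-error loss, and synthetic data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic regression data, split into training and test sets (illustrative only).
X_train, X_test = rng.normal(size=(80, 3)), rng.normal(size=(20, 3))
y_train = np.sin(X_train.sum(axis=1, keepdims=True))
y_test = np.sin(X_test.sum(axis=1, keepdims=True))

# Parameters: weights W1, W2 and biases b1, b2.
W1, b1 = rng.normal(scale=0.5, size=(3, 8)), np.zeros((1, 8))
W2, b2 = rng.normal(scale=0.5, size=(8, 1)), np.zeros((1, 1))
lr = 0.1

def forward(X):
    z1 = X @ W1 + b1   # pre-activation of the hidden layer
    a1 = sigmoid(z1)   # hidden activation
    z2 = a1 @ W2 + b2  # pre-activation of the output, used directly as the prediction
    return z1, a1, z2

def mse(pred, y):
    return float(np.mean((pred - y) ** 2))

for epoch in range(201):
    z1, a1, y_hat = forward(X_train)

    # Back-propagation: push the output error back through each layer.
    n = len(X_train)
    d_z2 = 2.0 * (y_hat - y_train) / n      # dL/d(output pre-activation)
    d_W2 = a1.T @ d_z2                      # gradient for output weights
    d_b2 = d_z2.sum(axis=0, keepdims=True)  # gradient for output bias
    d_a1 = d_z2 @ W2.T                      # error sent back to the hidden layer
    d_z1 = d_a1 * a1 * (1.0 - a1)           # through the sigmoid derivative
    d_W1 = X_train.T @ d_z1
    d_b1 = d_z1.sum(axis=0, keepdims=True)

    # Gradient-descent update of weights and biases.
    W1 -= lr * d_W1; b1 -= lr * d_b1
    W2 -= lr * d_W2; b2 -= lr * d_b2

    if epoch % 50 == 0:
        train_loss = mse(y_hat, y_train)
        test_loss = mse(forward(X_test)[2], y_test)
        print(f"epoch {epoch:3d}  train loss {train_loss:.4f}  test loss {test_loss:.4f}")
```

Comparing the printed training and test losses is the evaluation step mentioned above: the training loss measures the fit on the data used for the updates, while the test loss checks how well the network generalizes.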
This video is available exclusively on MediaSpace for a restricted audience. Please log in to MediaSpace to access it if you have the necessary permissions.