
# Neural Networks: Learning Features & Linear Prediction

Description

This lecture introduces neural networks as a way to learn features directly from observations and to make linear predictions on top of these learned features. The instructor explains how neural networks overcome a limitation of kernel methods by learning the features themselves rather than fixing them in advance. The lecture covers the representational power of neural networks, their role in deep learning, and the importance of large amounts of data for good performance. It also discusses the structure of neural networks, the role of activation functions such as sigmoid and ReLU, and the approximation of functions by piecewise linear functions built from ReLU activations.



Related concepts (117)

Related lectures (146)

Feedforward neural network

A feedforward neural network (FNN) is one of the two broad types of artificial neural network, characterized by the direction of information flow between its layers. The flow is uni-directional: information moves in only one direction, forward, from the input nodes, through the hidden nodes (if any), to the output nodes, without any cycles or loops. This contrasts with recurrent neural networks, whose connections contain cycles.
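The uni-directional flow described above amounts to a sequence of matrix multiplications and activations, input → hidden → output, with no loops. A minimal sketch (layer sizes and random weights are purely illustrative; in practice the weights are learned):

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny feedforward network: 3 inputs -> 4 hidden units -> 2 outputs.
# Random weights here are for illustration only.
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)

def relu(z):
    return np.maximum(0.0, z)

def forward(x):
    """One forward pass: information flows input -> hidden -> output, no cycles."""
    h = relu(W1 @ x + b1)   # hidden layer with ReLU activation
    return W2 @ h + b2      # linear output layer

y = forward(np.array([1.0, -0.5, 2.0]))
print(y.shape)  # (2,)
```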

Deep learning

Deep learning is part of a broader family of machine learning methods based on artificial neural networks with representation learning. The adjective "deep" refers to the use of multiple layers in the network. Deep learning methods can be supervised, semi-supervised, or unsupervised.

Convolutional neural network

A convolutional neural network (CNN) is a regularized type of feedforward neural network that learns its own features through filter (or kernel) optimization. The vanishing and exploding gradients seen during backpropagation in earlier neural networks are mitigated by sharing regularized weights over fewer connections. By contrast, a fully connected layer would require 10,000 weights per neuron just to process an image of 100 × 100 pixels.
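The 10,000-weights figure follows directly from the image size, and weight sharing is what makes the convolutional alternative cheap. A quick check of the arithmetic (the 5 × 5 filter size below is an illustrative choice, not one stated in the text):

```python
# One fully connected neuron reading a 100 x 100 image needs a weight per pixel.
height, width = 100, 100
dense_weights_per_neuron = height * width   # 100 * 100 = 10,000 weights

# A convolutional layer instead shares one small filter across all positions.
# A 5 x 5 filter (illustrative size) has only 25 weights, regardless of image size.
conv_weights_per_filter = 5 * 5

print(dense_weights_per_neuron)   # 10000
print(conv_weights_per_filter)    # 25
```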

Artificial neural network

Artificial neural networks (ANNs, also shortened to neural networks (NNs) or neural nets) are machine learning models built on principles of neuronal organization discovered by connectionism in the biological neural networks constituting animal brains. An ANN is based on a collection of connected units or nodes called artificial neurons, which loosely model the neurons in a biological brain. Each connection, like the synapses in a biological brain, can transmit a signal to other neurons.

Multilayer perceptron

A multilayer perceptron (MLP) is the name commonly used for a modern feedforward artificial neural network consisting of fully connected neurons with nonlinear activation functions, organized in at least three layers, and notable for being able to distinguish data that is not linearly separable. The name is a misnomer because the original perceptron used a Heaviside step function rather than the nonlinear activation functions used by modern networks.
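The distinction behind the "misnomer" remark is between the perceptron's hard step and the smooth nonlinearities of modern MLPs. A small illustration of the two activation functions:

```python
import numpy as np

def heaviside(z):
    """Step activation of the original perceptron: outputs only 0 or 1."""
    return (z >= 0).astype(float)

def sigmoid(z):
    """Smooth nonlinear activation used in modern MLPs, differentiable everywhere."""
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-2.0, 0.0, 2.0])
print(heaviside(z))  # [0. 1. 1.]
print(sigmoid(z))    # roughly [0.119, 0.5, 0.881]
```

The step function's zero gradient almost everywhere is why it cannot be trained by backpropagation, which is precisely what smooth activations enable.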

Kernel Methods: Neural Networks

Covers the fundamentals of neural networks, focusing on RBF kernels and SVMs.

Neural Networks: Training and Activation

Explores neural networks, activation functions, backpropagation, and PyTorch implementation.

Neural Networks: Multilayer Perceptrons

Covers Multilayer Perceptrons, artificial neurons, activation functions, matrix notation, flexibility, regularization, regression, and classification tasks.

Deep Learning Fundamentals

Introduces deep learning, from logistic regression to neural networks, emphasizing the need for handling non-linearly separable data.

Neural Networks: Deep Neural Networks

Explores the basics of neural networks, with a focus on deep neural networks and their architecture and training.