Neural Networks: Basics and History
Related lectures (32)
Kernel Methods: Neural Networks
Covers the fundamentals of neural networks, focusing on RBF kernels and SVM.
Neural Networks: Two Layers Neural Network
Covers the basics of neural networks, focusing on the development from two-layer neural networks to deep neural networks.
Neural Networks: Perceptron Model and Backpropagation Algorithm
Covers the perceptron model and backpropagation algorithm in neural networks.
Neural Networks: Training and Optimization
Explores the training and optimization of neural networks, addressing challenges like non-convex loss functions and local minima.
Multi-layered Perceptron: History and Training Algorithm
Explores the historical development and training of multi-layered perceptrons, emphasizing the backpropagation algorithm and feature design.
Neural Networks: Perceptron and Backpropagation
Covers the basics of neural networks, including the perceptron model and backpropagation.
Feedforward Neural Networks: Activation Functions and Backpropagation
Introduces feedforward neural networks, activation functions, and backpropagation for training, addressing common training challenges and the methods used to overcome them.
PyTorch Intro: MNIST and Digits
Covers PyTorch basics with the MNIST and Digits datasets, focusing on training neural networks for handwritten digit recognition (a minimal illustrative training sketch follows this list).
Recurrent Neural Networks: Training and Challenges
Discusses recurrent neural networks, their training challenges, and solutions like LSTMs and GRUs.
Neural Networks: Perceptron
Covers the main concepts of neural networks, including the Perceptron model and training algorithms.
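The PyTorch lecture listed above covers training a network on MNIST for handwritten digit recognition. As a rough illustration of that kind of workflow, and not code taken from the lecture itself, here is a minimal sketch; the network size, learning rate, batch size, and data path are assumptions.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Load MNIST as normalized tensors (downloads to ./data if not already present).
transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.1307,), (0.3081,)),
])
train_set = datasets.MNIST("./data", train=True, download=True, transform=transform)
train_loader = DataLoader(train_set, batch_size=64, shuffle=True)

# A small two-layer feedforward classifier: 784 inputs -> 128 hidden units -> 10 digit classes.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 128),
    nn.ReLU(),
    nn.Linear(128, 10),
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

# One epoch over the training set: forward pass, cross-entropy loss,
# backpropagation of gradients, and an SGD weight update.
for images, labels in train_loader:
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
```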