Related lectures (192)
Maximum Entropy Modeling: Applications & Inference
Explores maximum entropy modeling applications in neuroscience and protein sequence data.
Understanding Autoencoders
Explores autoencoders, from linear mappings in PCA to nonlinear mappings, deep autoencoders, and their applications.
Neural Networks Recap: Activation Functions
Covers the basics of neural networks, activation functions, training, image processing, CNNs, regularization, and dimensionality reduction methods.
Singular Value Decomposition
Explores Singular Value Decomposition, low-rank approximation, fundamental subspaces, and matrix norms.
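The low-rank approximation mentioned in this lecture summary can be sketched in a few lines. This is a minimal illustration (the matrix `A` is an arbitrary example, not from the lecture): truncating the SVD to the top k singular values gives the best rank-k approximation, and by the Eckart–Young theorem the spectral-norm error equals the first discarded singular value.

```python
import numpy as np

# A small example matrix, chosen arbitrarily for illustration.
A = np.array([[3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])

# Thin SVD: A = U @ diag(s) @ Vt, with s sorted in decreasing order.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Rank-1 approximation: keep only the largest singular value.
k = 1
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Eckart-Young: among all rank-k matrices, A_k minimizes the error,
# and the spectral-norm error equals the (k+1)-th singular value.
err = np.linalg.norm(A - A_k, 2)
print(np.isclose(err, s[k]))  # True
```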
Linear Algebra: Cramer's Rule
Covers Cramer's Rule for solving linear equations using determinants.
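Cramer's rule as described here is easy to sketch: each unknown is a ratio of determinants, where the numerator replaces one column of the coefficient matrix with the right-hand side. A minimal NumPy version (the function name and example system are illustrative, and the rule assumes a square, nonsingular matrix):

```python
import numpy as np

def cramer_solve(A, b):
    """Solve A x = b by Cramer's rule: x_i = det(A_i) / det(A),
    where A_i is A with column i replaced by b.
    Assumes A is square and nonsingular (det(A) != 0)."""
    det_A = np.linalg.det(A)
    n = A.shape[0]
    x = np.empty(n)
    for i in range(n):
        A_i = A.copy()
        A_i[:, i] = b       # replace column i with the right-hand side
        x[i] = np.linalg.det(A_i) / det_A
    return x

# Example: 2x + y = 3, x + 3y = 5  ->  x = 0.8, y = 1.4
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])
print(cramer_solve(A, b))
```

In practice Cramer's rule is mainly of theoretical interest; for large systems, factorization-based solvers such as `np.linalg.solve` are far cheaper and more stable.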
Laplace Equation: Decomposition and Solutions
Covers the Laplace equation, decomposition of linear problems, and solutions through separation of variables.
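The separation-of-variables approach named in this summary can be stated compactly. As a sketch for the 2-D Laplace equation (a standard derivation, not taken from the lecture itself):

```latex
% 2-D Laplace equation with a product ansatz
\Delta u = u_{xx} + u_{yy} = 0, \qquad u(x,y) = X(x)\,Y(y)
% Substituting and dividing by XY separates the variables:
\frac{X''(x)}{X(x)} = -\frac{Y''(y)}{Y(y)} = -\lambda
% Each side depends on a different variable, so both must equal a
% constant -\lambda, yielding two ODEs:
X'' + \lambda X = 0, \qquad Y'' - \lambda Y = 0
```

The boundary conditions then select the admissible values of λ, and the full solution is a superposition of the resulting product solutions, which is where the decomposition of linear problems comes in.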
Unsupervised Learning: PCA & K-means
Covers unsupervised learning with PCA and K-means for dimensionality reduction and data clustering.
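The PCA-then-cluster pipeline this summary describes can be sketched with NumPy alone. Everything below is illustrative: synthetic two-cluster data, PCA via SVD of the centered data, then a plain Lloyd's-algorithm K-means on the reduced coordinates (initialized deterministically from one point per blob to keep the sketch simple).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: two well-separated clusters in 5-D.
c1 = rng.normal(0.0, 0.5, size=(50, 5)) + np.array([3, 3, 0, 0, 0])
c2 = rng.normal(0.0, 0.5, size=(50, 5)) - np.array([3, 3, 0, 0, 0])
X = np.vstack([c1, c2])

# PCA via SVD of the centered data: project onto the top-2 components.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T                     # reduced data, shape (100, 2)

# K-means (Lloyd's algorithm), k=2, deterministic init for the sketch.
k = 2
centers = np.stack([Z[0], Z[50]])
for _ in range(20):
    # Assign each point to its nearest center, then recompute centers.
    labels = np.argmin(((Z[:, None, :] - centers) ** 2).sum(-1), axis=1)
    centers = np.array([Z[labels == j].mean(axis=0) for j in range(k)])

# With this separation, each true cluster maps to one K-means label.
print(labels[:5], labels[50:55])
```

Reducing with PCA first removes noise dimensions and makes the distance computations in K-means both cheaper and more meaningful.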
Modeling Neurobiological Signals: Spikes & Firing Rate
Explores modeling neurobiological signals, focusing on spikes, firing rate, multiple state neurons, and parameter estimation.
Statistical Signal Processing
Covers Gaussian Mixture Models, Denoising, Data Classification, and Spike Sorting using Principal Component Analysis.
Jordan Normal Form
Covers the Jordan Normal Form theorem and invariance of kernels under transformations.
