LASSO Regression: Sparse Signal Induction
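The lecture title refers to the LASSO's defining property: its L1 penalty drives many regression coefficients exactly to zero, recovering a sparse signal. As a minimal, hedged sketch (not taken from the lecture material, and assuming the scikit-learn API and an illustrative penalty weight alpha=0.1), the example below fits ordinary least squares and a LASSO model to synthetic data whose true coefficient vector is sparse, then counts the non-zero coefficients each method returns.

# Minimal sketch (assumption, not from the lecture): how the L1 penalty in
# LASSO zeroes out coefficients, compared with ordinary least squares.
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n_samples, n_features = 100, 50

# Ground truth: only 5 of the 50 coefficients are non-zero (a sparse signal).
true_coef = np.zeros(n_features)
true_coef[:5] = rng.normal(loc=0.0, scale=5.0, size=5)

X = rng.normal(size=(n_samples, n_features))
y = X @ true_coef + 0.5 * rng.normal(size=n_samples)

# Ordinary least squares: virtually every coefficient comes out non-zero.
ols = LinearRegression().fit(X, y)
print("OLS non-zero coefficients:", np.sum(np.abs(ols.coef_) > 1e-8))

# LASSO: the L1 penalty (weight alpha) shrinks small coefficients to exactly zero.
lasso = Lasso(alpha=0.1).fit(X, y)
print("LASSO non-zero coefficients:", np.sum(np.abs(lasso.coef_) > 1e-8))

Increasing alpha zeroes out more coefficients; in practice it is chosen by cross-validation, a topic covered by several of the related lectures listed below.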
Related lectures (30)
Non-smoothness and Compressive Sensing
Explores non-smooth minimization, compressive sensing, sparse signal recovery, and simple representations using atomic sets and atoms.
Ridge Regression: Penalised Least Squares
Explores Ridge Regression for handling multicollinearity and the LASSO method for model selection.
L1 Regularization: Sparse Solutions and Dimensionality Reduction
Delves into L1 regularization, sparse solutions, and dimensionality reduction in the context of machine learning.
Practical Aspects of Gaussian Linear Model
Explores practical aspects of the Gaussian linear model, focusing on variable selection and regularization methods.
Cross-validation & Regularization
Explores polynomial curve fitting, kernel functions, and regularization techniques, emphasizing the importance of model complexity and overfitting.
Supervised Learning in Financial Econometrics
Explores supervised learning in financial econometrics, covering linear regression, model fitting, potential problems, basis functions, subset selection, cross-validation, regularization, and random forests.
Cross-Validation: Techniques and Applications
Explores cross-validation, overfitting, regularization, and regression techniques in machine learning.
Linear Models: Least Squares
Explores linear models, least squares, Gaussian vectors, and model selection methods.
Model Selection Criteria: AIC, BIC, Cp
Explores model selection criteria such as AIC, BIC, and Mallows' Cp in statistics for data science.
Regularization Techniques
Explores regularization in linear models, including Ridge Regression and the Lasso, analytical solutions, and polynomial ridge regression.