
# Lecture: Generalized Linear Regression: Classification

Description

This lecture covers generalized linear regression, multiple linear classification, evaluation of binary classifiers, and Poisson regression. It explains likelihood maximization, supervised learning, confusion matrices, ROC curves, AUC, and noise in data. The instructor demonstrates logistic regression, classification, and regression models on various datasets.


In course

BIO-322: Introduction to machine learning for bioengineers

Students understand basic concepts and methods of machine learning. They can describe them in mathematical terms and can apply them to data using a high-level programming language (Julia/Python/R).

Related concepts (192)

Logistic regression

In statistics, the logistic model (or logit model) is a statistical model that models the probability of an event by expressing the log-odds of the event as a linear combination of one or more independent variables. In regression analysis, logistic regression (or logit regression) estimates the parameters of a logistic model (the coefficients in the linear combination).
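As a rough illustration of the idea, the sketch below fits the log-odds coefficients of a one-variable logistic model on toy data by gradient ascent on the log-likelihood; the data, learning rate, and iteration count are arbitrary choices for the example, not anything prescribed by the lecture.

```python
import numpy as np

def sigmoid(z):
    # Maps log-odds to a probability in (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

# Toy 1-D data: class 1 becomes likely as x grows.
x = np.array([-2.0, -1.0, -0.5, 0.5, 1.0, 2.0])
y = np.array([0, 0, 0, 1, 1, 1])

# Fit slope w and intercept b by gradient ascent on the
# log-likelihood of the logistic model P(y=1|x) = sigmoid(w*x + b).
w, b = 0.0, 0.0
lr = 0.5
for _ in range(2000):
    p = sigmoid(w * x + b)
    w += lr * np.sum((y - p) * x) / len(x)  # gradient wrt w
    b += lr * np.sum(y - p) / len(x)        # gradient wrt b

# After fitting, the model assigns high probability to class 1
# for large x and low probability for small x.
p_hi = float(sigmoid(w * 2.0 + b))
p_lo = float(sigmoid(w * -2.0 + b))
```

Because the toy data are linearly separable, the coefficients keep growing and the predicted probabilities saturate toward 0 and 1; real data would stop at finite coefficients.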

Linear regression

In statistics, linear regression is a linear approach for modelling the relationship between a scalar response and one or more explanatory variables (also known as dependent and independent variables). The case of one explanatory variable is called simple linear regression; for more than one, the process is called multiple linear regression. This term is distinct from multivariate linear regression, where multiple correlated dependent variables are predicted, rather than a single scalar variable.
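The least-squares fit behind simple linear regression can be sketched in a few lines; this uses NumPy's `lstsq` on noiseless toy data (the data values are made up for the example).

```python
import numpy as np

# Simple linear regression: fit y = a + b*x by ordinary least squares.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 3.0, 5.0, 7.0, 9.0])  # exactly y = 1 + 2x

# Design matrix: a column of ones for the intercept, then x.
X = np.column_stack([np.ones_like(x), x])

# Solves min over coef of ||X @ coef - y||^2.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
a, b = coef
```

With more than one explanatory variable, the same call applies unchanged: each extra variable just adds a column to the design matrix, which is exactly the step from simple to multiple linear regression.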

Binary classification

Binary classification is the task of classifying the elements of a set into two groups (each called a class) on the basis of a classification rule. Typical binary classification problems include: medical testing to determine whether a patient has a certain disease; quality control in industry, deciding whether a specification has been met; and, in information retrieval, deciding whether a page should be in the result set of a search. Binary classification is dichotomization applied to a practical situation.
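Binary classifiers are typically evaluated with the confusion-matrix counts mentioned in the lecture description; a minimal sketch, on made-up labels:

```python
# Confusion matrix for a binary classifier: count the four outcomes.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))  # true positives
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))  # false positives
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))  # false negatives
tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))  # true negatives

accuracy = (tp + tn) / len(y_true)
precision = tp / (tp + fp)  # of predicted positives, how many are real
recall = tp / (tp + fn)     # of real positives, how many were found
```

An ROC curve extends this idea by sweeping the classifier's decision threshold and plotting the true-positive rate against the false-positive rate at each setting; the AUC is the area under that curve.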

Nonlinear regression

In statistics, nonlinear regression is a form of regression analysis in which observational data are modeled by a function that is a nonlinear combination of the model parameters and depends on one or more independent variables. The data are fitted by a method of successive approximations. In nonlinear regression, a statistical model of the form y ∼ f(x, β) relates a vector of independent variables x to its associated observed dependent variable y. The function f is nonlinear in the components of the parameter vector β, but otherwise arbitrary.
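The "successive approximations" can be sketched with a Gauss–Newton iteration: linearize the model around the current parameters, solve a least-squares step, and repeat. The exponential model and noiseless data below are invented for the illustration.

```python
import numpy as np

# Nonlinear model y = a * exp(b * x), fitted by Gauss-Newton iteration.
x = np.linspace(0.0, 2.0, 5)
y = 2.0 * np.exp(0.5 * x)  # noiseless data: true a = 2, b = 0.5

a, b = 1.0, 0.1  # initial guess
for _ in range(50):
    f = a * np.exp(b * x)
    r = y - f  # residuals at the current parameters
    # Jacobian of f wrt (a, b): columns d f/da and d f/db.
    J = np.column_stack([np.exp(b * x), a * x * np.exp(b * x)])
    # Linearized least-squares step; repeat until the step vanishes.
    step, *_ = np.linalg.lstsq(J, r, rcond=None)
    a += step[0]
    b += step[1]
```

Because the model is nonlinear in (a, b), there is no closed-form solution as in linear regression; each iteration only improves the current approximation, which is why a starting guess is needed.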

Poisson distribution

In probability theory and statistics, the Poisson distribution is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time or space if these events occur with a known constant mean rate and independently of the time since the last event. It is named after the French mathematician Siméon Denis Poisson (/ˈpwɑːsɒn/; pwasɔ̃). The Poisson distribution can also be used for the number of events in other specified interval types such as distance, area, or volume.
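The probability mass function described above, P(K = k) = λᵏ e^(−λ) / k!, is easy to write down directly; the sketch below evaluates it for an arbitrary example rate λ = 3 and checks two defining properties numerically.

```python
import math

def poisson_pmf(k, lam):
    # P(K = k) = lam**k * exp(-lam) / k!
    return lam ** k * math.exp(-lam) / math.factorial(k)

lam = 3.0  # mean rate of events per interval (example value)

# Truncate the infinite support at 50 terms; the tail beyond that
# is negligible for lam = 3.
probs = [poisson_pmf(k, lam) for k in range(50)]

total = sum(probs)                          # probabilities sum to 1
mean = sum(k * p for k, p in enumerate(probs))  # expected count equals lam
```

The mean equalling λ is what Poisson regression exploits: it models the log of this rate as a linear function of the covariates, in the same generalized-linear-model family as logistic regression.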

Related lectures (926)

Supervised Learning: Likelihood Maximization

Covers supervised learning through likelihood maximization to find optimal parameters.

Supervised Learning Essentials

Introduces the basics of supervised learning, focusing on logistic regression, linear classification, and likelihood maximization.

Supervised Learning Fundamentals

Introduces the fundamentals of supervised learning, including loss functions and probability distributions.

Polynomial Regression: Overview

Covers polynomial regression, flexibility impact, and underfitting vs overfitting.

Logistic Regression: Probabilistic Interpretation

Covers logistic regression's probabilistic interpretation, multinomial regression, KNN, hyperparameters, and curse of dimensionality.