
# Lecture: Jacamar Data Analysis

Description

This lecture analyses data on the response of a rufous-tailed jacamar to different species of palatable butterflies with artificially coloured wing undersides. It covers contingency tables, Poisson response models, and the implications of including parameters for the row margins, along with the associated models, probabilities, and computations. The lecture then turns to an analysis of smoking data, including models, likelihood ratio tests, parameter estimates, and the precision of those estimates, and closes by examining the problems that log-linear models encounter with visual impairment data.
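As a minimal sketch of the Poisson/log-linear ideas mentioned above, the following fits the independence (row + column) model to a hypothetical 2×2 table and computes the likelihood-ratio (deviance) statistic. The counts are invented for illustration, not the lecture's jacamar data.

```python
import numpy as np
from scipy.stats import chi2

# Hypothetical 2x2 contingency table: jacamar response (attack / avoid)
# by artificial wing colour. Counts are invented, not the lecture's data.
obs = np.array([[18, 7],
                [5, 20]], dtype=float)

n = obs.sum()
# Under the independence log-linear model, the fitted counts are the
# products of the row and column margins divided by the grand total.
fitted = np.outer(obs.sum(axis=1), obs.sum(axis=0)) / n

# Likelihood-ratio statistic G^2 = 2 * sum o * log(o / e),
# asymptotically chi-squared with (r-1)(c-1) degrees of freedom.
G2 = 2 * np.sum(obs * np.log(obs / fitted))
df = (obs.shape[0] - 1) * (obs.shape[1] - 1)
p_value = chi2.sf(G2, df)
print(G2, df, p_value)
```

Note that the independence model reproduces the observed row and column margins exactly, which is one consequence of including parameters for the margins.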

Official source

This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.

In course

MATH-408: Regression methods

General graduate course on regression methods

Related concepts (57)

Contingency table

In statistics, a contingency table (also known as a cross tabulation or crosstab) is a type of table in a matrix format that displays the (multivariate) frequency distribution of the variables. They are heavily used in survey research, business intelligence, engineering, and scientific research. They provide a basic picture of the interrelation between two variables and can help find interactions between them.
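A contingency table can be built directly from raw records. A sketch using pandas (the labels and records are invented for illustration):

```python
import pandas as pd

# Invented raw records: one wing colour and one response per observation.
colour = ["red", "red", "yellow", "yellow", "red", "yellow"]
response = ["attack", "avoid", "avoid", "avoid", "attack", "attack"]

# crosstab counts the joint frequencies, giving a 2x2 contingency table.
table = pd.crosstab(pd.Series(colour, name="colour"),
                    pd.Series(response, name="response"))
print(table)
```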

Maximum likelihood estimation

In statistics, maximum likelihood estimation (MLE) is a method of estimating the parameters of an assumed probability distribution, given some observed data. This is achieved by maximizing a likelihood function so that, under the assumed statistical model, the observed data is most probable. The point in the parameter space that maximizes the likelihood function is called the maximum likelihood estimate. The logic of maximum likelihood is both intuitive and flexible, and as such the method has become a dominant means of statistical inference.
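A minimal numerical illustration of maximum likelihood, assuming Poisson-distributed counts with unknown rate λ (the data are invented). Maximising the likelihood numerically should recover the closed-form Poisson MLE, which is the sample mean:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Invented count data, assumed Poisson with unknown rate lambda.
counts = np.array([3, 1, 4, 2, 5, 0, 2, 3])

def neg_log_lik(lam):
    # Poisson log-likelihood up to an additive constant:
    # sum(y) * log(lam) - n * lam
    return -(counts.sum() * np.log(lam) - counts.size * lam)

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 20), method="bounded")
# For the Poisson, the MLE has the closed form lambda_hat = sample mean.
print(res.x, counts.mean())
```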

Probability axioms

The Kolmogorov axioms are the foundations of probability theory introduced by Russian mathematician Andrey Kolmogorov in 1933. These axioms remain central and have direct contributions to mathematics, the physical sciences, and real-world probability cases. An alternative approach to formalising probability, favoured by some Bayesians, is given by Cox's theorem. The assumptions as to setting up the axioms can be summarised as follows: Let (Ω, F, P) be a measure space with P(E) being the probability of some event E, and P(Ω) = 1.
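The three axioms can be stated compactly:

```latex
\begin{aligned}
&\text{(1) Non-negativity:} && P(E) \ge 0 \quad \text{for every event } E \in F,\\
&\text{(2) Unit measure:} && P(\Omega) = 1,\\
&\text{(3) Countable additivity:} && P\Big(\bigcup_{i=1}^{\infty} E_i\Big)
  = \sum_{i=1}^{\infty} P(E_i)
  \quad \text{for pairwise disjoint } E_1, E_2, \ldots \in F.
\end{aligned}
```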

Wald test

In statistics, the Wald test (named after Abraham Wald) assesses constraints on statistical parameters based on the weighted distance between the unrestricted estimate and its hypothesized value under the null hypothesis, where the weight is the precision of the estimate. Intuitively, the larger this weighted distance, the less likely it is that the constraint is true. While the finite-sample distribution of the Wald statistic is generally unknown, it has an asymptotic χ²-distribution under the null hypothesis, a fact that can be used to determine statistical significance.
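A sketch of a Wald test for a single Poisson rate, with invented counts. The weighted distance is the squared standardised difference between the MLE and the hypothesised value, where the weight is the inverse of the estimated variance of the MLE:

```python
import numpy as np
from scipy.stats import chi2

# Invented counts, assumed Poisson; test H0: lambda = 2.
counts = np.array([3, 1, 4, 2, 5, 0, 2, 3, 4, 6])
lam_hat = counts.mean()              # MLE of the Poisson rate
se = np.sqrt(lam_hat / counts.size)  # asymptotic standard error of the MLE

lam0 = 2.0
W = ((lam_hat - lam0) / se) ** 2     # Wald statistic, asymptotically chi2_1
p_value = chi2.sf(W, df=1)
print(W, p_value)
```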

Relative likelihood

In statistics, when selecting a statistical model for given data, the relative likelihood compares the relative plausibilities of different candidate models or of different values of a parameter of a single model. Assume that we are given some data x for which we have a statistical model with parameter θ. Suppose that the maximum likelihood estimate for θ is θ̂. Relative plausibilities of other θ values may be found by comparing the likelihoods of those other values with the likelihood of θ̂.
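The relative likelihood R(θ) = L(θ)/L(θ̂) can be computed directly from the log-likelihood. A sketch for a Poisson rate with invented counts; by construction R equals 1 at the MLE and is below 1 elsewhere:

```python
import numpy as np

# Invented counts, assumed Poisson with rate lambda.
counts = np.array([3, 1, 4, 2, 5, 0, 2, 3])
lam_hat = counts.mean()  # MLE of the Poisson rate

def log_lik(lam):
    # Poisson log-likelihood up to an additive constant.
    return counts.sum() * np.log(lam) - counts.size * lam

def relative_likelihood(lam):
    # Exponentiate the log-likelihood difference for numerical stability.
    return np.exp(log_lik(lam) - log_lik(lam_hat))

print(relative_likelihood(lam_hat))  # 1.0 at the MLE
print(relative_likelihood(1.5))      # below 1 away from the MLE
```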

Related lectures (47)

Probabilistic Linear Regression (PHYS-467: Machine learning for physicists)

Explores probabilistic linear regression, covering joint and conditional probability, ridge regression, and overfitting mitigation.

Multiclass Classification (PHYS-467: Machine learning for physicists)

Covers the concept of multiclass classification and the challenges of linearly separating data with multiple classes.

Linear Regression: Ozone Data Analysis (MATH-131: Probability and statistics)

Explores linear regression analysis of ozone data using statistical models.

Linear Regression: Estimation and Testing (MATH-234(b): Probability and statistics)

Explores linear regression estimation, hypothesis testing, and practical applications in statistics.

Linear Regression: Estimation and Inference (MATH-234(b): Probability and statistics)

Explores linear regression estimation, linearity assumptions, and statistical tests in the context of model comparison.