Summary
In statistics, the likelihood principle is the proposition that, given a statistical model, all the evidence in a sample relevant to model parameters is contained in the likelihood function. A likelihood function arises from a probability density function considered as a function of its distributional parameterization argument. For example, consider a model which gives the probability density function $f_X(x \mid \theta)$ of an observable random variable $X$ as a function of a parameter $\theta$. Then for a specific value $x$ of $X$, the function $\mathcal{L}(\theta \mid x) = f_X(x \mid \theta)$ is a likelihood function of $\theta$: it gives a measure of how "likely" any particular value of $\theta$ is, if we know that $X$ has the value $x$. The density function may be a density with respect to counting measure, i.e. a probability mass function. Two likelihood functions are equivalent if one is a scalar multiple of the other.

The likelihood principle is this: all information from the data that is relevant to inferences about the value of the model parameters is in the equivalence class to which the likelihood function belongs. The strong likelihood principle applies this same criterion to cases such as sequential experiments, where the sample of data that is available results from applying a stopping rule to the observations earlier in the experiment.

Suppose $X$ is the number of successes in twelve independent Bernoulli trials with probability $\theta$ of success on each trial, and $Y$ is the number of independent Bernoulli trials needed to get three successes, again with probability $\theta$ of success on each trial ($\theta = 1/2$ for the toss of a fair coin). Then the observation that $X = 3$ induces the likelihood function
$$\mathcal{L}(\theta \mid X = 3) = \binom{12}{3}\,\theta^3 (1-\theta)^9 = 220\,\theta^3 (1-\theta)^9,$$
while the observation that $Y = 12$ induces the likelihood function
$$\mathcal{L}(\theta \mid Y = 12) = \binom{11}{2}\,\theta^3 (1-\theta)^9 = 55\,\theta^3 (1-\theta)^9.$$
The likelihood principle says that, as the data are the same in both cases, the inferences drawn about the value of $\theta$ should also be the same. In addition, all the inferential content in the data about the value of $\theta$ is contained in the two likelihoods, and is the same if they are proportional to one another. Here the two likelihoods are indeed proportional: their ratio is the constant $220/55 = 4$ for every value of $\theta$, so both observations support exactly the same inferences about $\theta$.
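As a quick illustration (an addition, not part of the original summary; the function names are hypothetical), a short Python sketch can verify that the two likelihood functions above differ only by a constant factor:

```python
from math import comb

def binomial_likelihood(theta, successes=3, trials=12):
    """Likelihood of theta given X = 3 successes observed in 12 Bernoulli trials."""
    return comb(trials, successes) * theta**successes * (1 - theta)**(trials - successes)

def negative_binomial_likelihood(theta, successes=3, trials=12):
    """Likelihood of theta given that the 3rd success arrived on trial 12."""
    # The final trial must be a success, so the other 2 successes
    # are placed among the first 11 trials: C(11, 2) = 55 arrangements.
    return comb(trials - 1, successes - 1) * theta**successes * (1 - theta)**(trials - successes)

# The ratio is the constant 220 / 55 = 4 for every theta in (0, 1),
# so the two likelihood functions are equivalent (proportional).
for theta in (0.1, 0.25, 0.5, 0.9):
    ratio = binomial_likelihood(theta) / negative_binomial_likelihood(theta)
    print(f"theta = {theta}: ratio = {ratio}")
```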
Related concepts (8)
Foundations of statistics
Statistics is the discipline that concerns the collection, organization, analysis, interpretation, and presentation of data, and is used to solve practical problems and draw conclusions. When analyzing data, different approaches can lead to different conclusions from the same data. For example, weather forecasts often vary among forecasting agencies that use different forecasting algorithms and techniques. Conclusions drawn from statistical analysis often involve uncertainty, as they represent the probability of an event occurring.
Likelihood function
[Figure: example of a likelihood function for the parameter of a Poisson distribution.] In probability theory and statistics, the likelihood function (or simply the likelihood) is a function of the parameters of a statistical model, computed from observed data. Likelihood functions play a key role in frequentist statistical inference, in particular for statistical parameter-estimation methods.
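As a brief sketch tying this to the Poisson example in the figure caption (an addition; the data and helper name are hypothetical), the log-likelihood of a Poisson rate given observed counts is the sum of log Poisson probabilities, and it is maximized at the sample mean:

```python
import math

def poisson_log_likelihood(lam, data):
    """Log-likelihood of the Poisson rate `lam` given observed counts `data`."""
    # log P(k | lam) = k*log(lam) - lam - log(k!)
    return sum(k * math.log(lam) - lam - math.log(math.factorial(k)) for k in data)

data = [2, 4, 3, 5, 1]  # hypothetical observed counts
mle = sum(data) / len(data)  # the maximum-likelihood estimate is the sample mean
for lam in (1.0, mle, 5.0):
    print(f"lam = {lam}: log-likelihood = {poisson_log_likelihood(lam, data):.4f}")
```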