Summary
In information geometry, the Fisher information metric is a particular Riemannian metric which can be defined on a smooth statistical manifold, i.e., a smooth manifold whose points are probability measures defined on a common probability space. It can be used to calculate the informational difference between measurements.

The metric is interesting in several respects. By Chentsov's theorem, the Fisher information metric on statistical models is the only Riemannian metric (up to rescaling) that is invariant under sufficient statistics. It can also be understood to be the infinitesimal form of the relative entropy (i.e., the Kullback–Leibler divergence); specifically, it is the Hessian of the divergence. Alternatively, it can be understood as the metric induced by the flat-space Euclidean metric, after appropriate changes of variable. When extended to complex projective Hilbert space, it becomes the Fubini–Study metric; when written in terms of mixed states, it is the quantum Bures metric.

Considered purely as a matrix, it is known as the Fisher information matrix. Considered as a measurement technique, where it is used to estimate hidden parameters in terms of observed random variables, it is known as the observed information.

Given a statistical manifold with coordinates $\theta = (\theta_1, \theta_2, \ldots, \theta_n)$, one writes $p(x, \theta)$ for the probability distribution as a function of $\theta$. Here $x$ is drawn from the value space $R$ for a (discrete or continuous) random variable $X$. The probability is normalized by

$$\int_X p(x,\theta)\,dx = 1.$$

The Fisher information metric then takes the form:

$$g_{jk}(\theta) = \int_X \frac{\partial \log p(x,\theta)}{\partial \theta_j}\,\frac{\partial \log p(x,\theta)}{\partial \theta_k}\,p(x,\theta)\,dx.$$

The integral is performed over all values $x$ in $X$. The variable $\theta$ is now a coordinate on a Riemannian manifold. The labels $j$ and $k$ index the local coordinate axes on the manifold.

When the probability is derived from the Gibbs measure, as it would be for any Markovian process, then $\theta$ can also be understood to be a Lagrange multiplier; Lagrange multipliers are used to enforce constraints, such as holding the expectation value of some quantity constant.
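To make the definition above concrete, here is a minimal numerical sketch (not part of the original page) that estimates $g_{jk}(\theta)$ by Monte Carlo as the expected outer product of the score $\partial \log p / \partial \theta$, for the Gaussian family $N(\mu, \sigma^2)$ where the score is known in closed form. The function names `score` and `fisher_metric` are illustrative, not established API.

# Monte Carlo sketch of the Fisher information metric g_jk(theta)
# for a Gaussian family p(x | mu, sigma), estimated as the expected
# outer product of the score: g_jk = E[ d_j log p * d_k log p ].
import numpy as np

def score(x, mu, sigma):
    """Gradient of log p(x | mu, sigma) w.r.t. (mu, sigma) for N(mu, sigma^2)."""
    d_mu = (x - mu) / sigma**2
    d_sigma = (x - mu)**2 / sigma**3 - 1.0 / sigma
    return np.stack([d_mu, d_sigma], axis=-1)   # shape (n_samples, 2)

def fisher_metric(mu, sigma, n_samples=200_000, seed=0):
    """Estimate g_jk = E[score_j * score_k] by sampling X ~ p(. | theta)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(mu, sigma, size=n_samples)
    s = score(x, mu, sigma)
    return s.T @ s / n_samples                  # 2x2 metric at theta = (mu, sigma)

# For this family the exact metric is diag(1/sigma^2, 2/sigma^2),
# so at sigma = 2 the estimate should be close to [[0.25, 0], [0, 0.5]].
print(fisher_metric(0.0, 2.0))

Using the outer product of the score, rather than the negative Hessian of $\log p$, matches the integral form of $g_{jk}(\theta)$ given above; the two agree in expectation under standard regularity conditions.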
Related publications (2)
Related courses (1)
COM-406: Foundations of Data Science
We discuss a set of topics that are important for the understanding of modern data science but that are typically not taught in an introductory ML course. In particular we discuss fundamental ideas an…