In information geometry, the Fisher information metric is a particular Riemannian metric which can be defined on a smooth statistical manifold, i.e., a smooth manifold whose points are probability measures defined on a common probability space. It can be used to calculate the informational difference between measurements.

The metric is interesting in several respects. By Chentsov's theorem, the Fisher information metric on statistical models is the only Riemannian metric (up to rescaling) that is invariant under sufficient statistics. It can also be understood as the infinitesimal form of the relative entropy (i.e., the Kullback–Leibler divergence); specifically, it is the Hessian of the divergence. Alternately, it can be understood as the metric induced by the flat-space Euclidean metric, after appropriate changes of variable. When extended to complex projective Hilbert space, it becomes the Fubini–Study metric; when written in terms of mixed states, it is the quantum Bures metric. Considered purely as a matrix, it is known as the Fisher information matrix. Considered as a measurement technique, where it is used to estimate hidden parameters in terms of observed random variables, it is known as the observed information.

Given a statistical manifold with coordinates $\theta = (\theta_1, \ldots, \theta_n)$, one writes $p(x, \theta)$ for the probability distribution as a function of $\theta$. Here $x$ is drawn from the value space $R$ for a (discrete or continuous) random variable $X$. The probability is normalized by

$$\int_X p(x, \theta) \, dx = 1.$$

The Fisher information metric then takes the form

$$g_{jk}(\theta) = \int_X \frac{\partial \log p(x, \theta)}{\partial \theta_j} \, \frac{\partial \log p(x, \theta)}{\partial \theta_k} \, p(x, \theta) \, dx.$$

The integral is performed over all values $x$ in $X$. The variable $\theta$ is now a coordinate on a Riemannian manifold, and the labels $j$ and $k$ index the local coordinate axes on the manifold. When the probability is derived from the Gibbs measure, as it would be for any Markovian process, then $\theta$ can also be understood to be a Lagrange multiplier; Lagrange multipliers are used to enforce constraints, such as holding the expectation value of some quantity constant.
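To make the definition concrete, here is a minimal sketch (our illustration, not part of this page) that estimates the integral above by Monte Carlo for the two-parameter Gaussian family $p(x; \mu, \sigma)$, using the fact that $g_{jk}$ is the expectation of the product of scores under $p$; the known closed form $g = \mathrm{diag}(1/\sigma^2, 2/\sigma^2)$ serves as a check. All names and sample sizes are arbitrary choices.

```python
# A minimal sketch (our illustration, not from this page): estimate the
# Fisher information metric g_jk = E[(d_j log p)(d_k log p)] for the
# two-parameter Gaussian family p(x; mu, sigma) by Monte Carlo, and compare
# with the known closed form g = diag(1/sigma^2, 2/sigma^2).
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.0, 2.0
x = rng.normal(mu, sigma, size=2_000_000)    # samples drawn from p(x; mu, sigma)

# Scores: partial derivatives of log p(x; mu, sigma) w.r.t. mu and sigma.
score = np.stack([(x - mu) / sigma**2,
                  ((x - mu)**2 - sigma**2) / sigma**3])

# g_jk = E[score_j * score_k], estimated by the sample average of the
# outer product of the score vector with itself.
g = score @ score.T / x.size
print(np.round(g, 3))                          # ~ [[0.25, 0.], [0., 0.5]]
print(np.diag([1 / sigma**2, 2 / sigma**2]))   # exact values for comparison
```

The off-diagonal entries vanish here because the $\mu$- and $\sigma$-scores of a Gaussian are uncorrelated under $p$; for a general family the metric need not be diagonal.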

Related courses (3)
EE-411: Fundamentals of inference and learning
This is an introductory course in the theory of statistics, inference, and machine learning, with an emphasis on theoretical understanding and practical exercises.
COM-406: Foundations of Data Science
We discuss a set of topics that are important for the understanding of modern data science but that are typically not taught in an introductory ML course.
Related lectures (22)
Estimation: Linear Estimator
Explores linear estimation, optimality criteria, and the orthogonality principle for making good estimation choices.
Maximum Likelihood, MSE, Fisher Information, Cramér-Rao Bound
Explains maximum likelihood estimation, MSE, Fisher information, and Cramér-Rao bound in statistical inference.
Confidence Intervals: Gaussian Estimation
Explores confidence intervals, Gaussian estimation, Cramér-Rao inequality, and Maximum Likelihood Estimators.
Related publications (27)

On the use of Cramér-Rao Lower Bound for least-variance circuit parameters identification of Li-ion cells

Mario Paolone, Vladimir Sovljanski

Electrochemical Impedance Spectroscopy (EIS) and Equivalent Circuit Models (ECMs) are widely used to characterize the impedance and estimate parameters of electrochemical systems such as batteries. We use a generic ECM with ten parameters grouped to model ...
2024

Euclid preparation: XXVIII. Forecasts for ten different higher-order weak lensing statistics

Frédéric Courbin, Gianluca Castignani, Jean-Luc Starck, Austin Chandler Peel, Maurizio Martinelli, Yi Wang, Richard Massey, Fabio Finelli, Marcello Farina

Recent cosmic shear studies have shown that higher-order statistics (HOS) developed by independent teams now outperform standard two-point estimators in terms of statistical precision thanks to their sensitivity to the non-Gaussian features of large-scale ...
EDP Sciences, 2023

Differential Entropy of the Conditional Expectation Under Additive Gaussian Noise

Michael Christoph Gastpar, Alper Köse, Ahmet Arda Atalik

The conditional mean is a fundamental and important quantity whose applications include the theories of estimation and rate-distortion. It is also notoriously difficult to work with. This paper establishes novel bounds on the differential entropy of the conditional expectation ...
IEEE, 2022
Related concepts (2)
Fisher information
In mathematical statistics, the Fisher information (sometimes simply called information) is a way of measuring the amount of information that an observable random variable X carries about an unknown parameter θ of a distribution that models X. Formally, it is the variance of the score, or the expected value of the observed information. The role of the Fisher information in the asymptotic theory of maximum-likelihood estimation was emphasized by the statistician Ronald Fisher (following some initial results by Francis Ysidro Edgeworth).
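As a quick illustration of "variance of the score" (a sketch of ours, assuming a Bernoulli model and NumPy; not part of the concept entry): for $X \sim \mathrm{Bernoulli}(\theta)$ the score is $X/\theta - (1 - X)/(1 - \theta)$, and its variance, the Fisher information, is $1/(\theta(1 - \theta))$.

```python
# A minimal Monte Carlo check (our sketch, not from the entry above): for a
# Bernoulli(theta) model, the score is d/dtheta log p(X; theta)
#   = X/theta - (1 - X)/(1 - theta),
# and its variance -- the Fisher information -- is 1/(theta*(1 - theta)).
import numpy as np

rng = np.random.default_rng(0)
theta = 0.3
x = rng.binomial(1, theta, size=1_000_000)   # draws of X ~ Bernoulli(theta)
score = x / theta - (1 - x) / (1 - theta)    # score of each observation
print(score.var())                           # sample variance of the score
print(1 / (theta * (1 - theta)))             # closed form: 4.7619...
```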
Information geometry
Information geometry is an interdisciplinary field that applies the techniques of differential geometry to study probability theory and statistics. It studies statistical manifolds, which are Riemannian manifolds whose points correspond to probability distributions. Historically, information geometry can be traced back to the work of C. R. Rao, who was the first to treat the Fisher matrix as a Riemannian metric. The modern theory is largely due to Shun'ichi Amari, whose work has been greatly influential on the development of the field.
