Summary
Information geometry is an interdisciplinary field that applies the techniques of differential geometry to study probability theory and statistics. It studies statistical manifolds, which are Riemannian manifolds whose points correspond to probability distributions. Historically, information geometry can be traced back to the work of C. R. Rao, who was the first to treat the Fisher information matrix as a Riemannian metric. The modern theory is largely due to Shun'ichi Amari, whose work has greatly influenced the development of the field.

Classically, information geometry considered a parametrized statistical model as a Riemannian manifold. For such models, there is a natural choice of Riemannian metric, known as the Fisher information metric. In the special case that the statistical model is an exponential family, it is possible to endow the statistical manifold with a Hessian metric (i.e., a Riemannian metric given as the Hessian of a convex potential function). In this case, the manifold naturally inherits two flat affine connections, as well as a canonical Bregman divergence. Historically, much of the work was devoted to studying the geometry associated with these examples. In the modern setting, information geometry applies in a much wider context, including non-exponential families, nonparametric statistics, and even abstract statistical manifolds not induced from a known statistical model. The results combine techniques from information theory, affine differential geometry, convex analysis, and many other fields.

The standard references in the field are Shun'ichi Amari and Hiroshi Nagaoka's book, Methods of Information Geometry, and the more recent book by Nihat Ay and others. A gentle introduction is given in the survey by Frank Nielsen. In 2018, the journal Information Geometry, which is devoted to the field, was launched. The history of information geometry is associated with the discoveries of at least the following people, among many others: Ronald Fisher, Harald Cramér, Calyampudi Radhakrishna Rao, Harold Jeffreys, Solomon Kullback, Jean-Louis Koszul, Richard Leibler, Claude Shannon, Imre Csiszár, and N. Chentsov.
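As a minimal sketch of the structures named above (standard textbook formulas for a parametric model p(x; θ); none of the notation comes from this page):

```latex
% Fisher information metric of a parametric model p(x; \theta):
g_{ij}(\theta) \;=\; \mathbb{E}_{p(x;\theta)}\!\left[
  \partial_i \log p(x;\theta)\,\partial_j \log p(x;\theta) \right],
\qquad \partial_i := \frac{\partial}{\partial \theta^i}.

% For an exponential family
% p(x;\theta) = h(x)\exp\!\big(\theta^\top t(x) - \psi(\theta)\big),
% the metric is the Hessian of the convex potential \psi (a Hessian metric):
g_{ij}(\theta) \;=\; \partial_i \partial_j\, \psi(\theta),

% and \psi also induces the canonical Bregman divergence:
D_\psi(\theta \,\|\, \theta') \;=\; \psi(\theta) - \psi(\theta')
  - \nabla \psi(\theta')^{\top}(\theta - \theta').
```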
Related courses (2)
EE-411: Fundamentals of inference and learning
This is an introductory course in the theory of statistics, inference, and machine learning, with an emphasis on theoretical understanding and practical exercises. The course will combine and alternate …
Related concepts (4)
Fisher information
In statistics, the Fisher information quantifies the information about a parameter contained in a distribution. It is defined as the expectation of the observed information, or equivalently as the variance of the score function. In the multi-parameter case, one speaks of the Fisher information matrix. It was introduced by R. A. Fisher. Let f(x; θ) be the likelihood function of a random variable X (which may be multidimensional), parameterized by θ.
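As a hedged illustration of the "variance of the score" definition, here is a minimal Monte Carlo sketch in Python; the Gaussian model, parameter values, and variable names are chosen for this example and do not come from the page:

```python
import numpy as np

# Sketch: estimate the Fisher information of a Gaussian N(theta, sigma^2)
# with known sigma, using the definition "variance of the score function".
# Closed form for this model: I(theta) = 1 / sigma**2.

rng = np.random.default_rng(0)
theta, sigma, n = 1.5, 2.0, 200_000

x = rng.normal(theta, sigma, size=n)

# Score: d/dtheta log f(x; theta) = (x - theta) / sigma**2
score = (x - theta) / sigma**2

fisher_mc = score.var()          # Monte Carlo estimate of Var(score)
fisher_exact = 1.0 / sigma**2    # known closed form for this model

print(f"Monte Carlo: {fisher_mc:.4f}, exact: {fisher_exact:.4f}")
```

With these settings the empirical variance of the score lands close to the exact value 0.25, illustrating that the two definitions agree.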
Fisher information metric
In information geometry, the Fisher information metric is a particular Riemannian metric which can be defined on a smooth statistical manifold, i.e., a smooth manifold whose points are probability measures defined on a common probability space. It can be used to calculate the informational difference between measurements. The metric is interesting in several respects. By Chentsov’s theorem, the Fisher information metric on statistical models is the only Riemannian metric (up to rescaling) that is invariant under sufficient statistics.
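The page does not write the metric out. As one standard worked example (the univariate Gaussian family, a textbook computation, not taken from this page), the metric in the coordinates (μ, σ) is:

```latex
% Fisher information metric of the univariate Gaussian family
% p(x; \mu, \sigma) = \mathcal{N}(\mu, \sigma^2), in coordinates (\mu, \sigma):
g(\mu, \sigma) \;=\; \begin{pmatrix} 1/\sigma^2 & 0 \\ 0 & 2/\sigma^2 \end{pmatrix},
\qquad
ds^2 \;=\; \frac{d\mu^2 + 2\, d\sigma^2}{\sigma^2}.
% This is a metric of constant negative curvature:
% a rescaled hyperbolic upper half-plane.
```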
Information
Information is a concept of the discipline of information and communication sciences (SIC). In the etymological sense, "information" is that which gives form to the mind. The word comes from the Latin verb informare, which means "to give form to" or "to form an idea of". Information designates both the message to be communicated and the symbols used to write it; it uses a code of meaning-bearing signs, such as an alphabet of letters, a number base, ideograms, or pictograms.