Bayesian programming is a formalism and a methodology for specifying probabilistic models and solving problems when less than the necessary information is available.

Edwin T. Jaynes proposed that probability could be considered as an alternative to, and an extension of, logic for rational reasoning with incomplete and uncertain information. In his founding book Probability Theory: The Logic of Science he developed this theory and proposed what he called "the robot," which was not a physical device but an inference engine to automate probabilistic reasoning, a kind of Prolog for probability instead of logic. Bayesian programming is a formal and concrete implementation of this "robot".

Bayesian programming may also be seen as an algebraic formalism for specifying graphical models such as, for instance, Bayesian networks, dynamic Bayesian networks, Kalman filters or hidden Markov models. Indeed, Bayesian programming is more general than Bayesian networks and has a power of expression equivalent to probabilistic factor graphs.

A Bayesian program is a means of specifying a family of probability distributions. Its constituent elements are the following:

- A program is constructed from a description and a question.
- A description is constructed using some specification (π) given by the programmer, together with an identification or learning process, using a data set (δ), for the parameters not completely specified by the specification.
- A specification is constructed from a set of pertinent variables, a decomposition and a set of forms.
- Forms are either parametric forms or questions to other Bayesian programs.
- A question specifies which probability distribution has to be computed.

The purpose of a description is to specify an effective method of computing a joint probability distribution on a set of variables given a set of experimental data (δ) and some specification (π). This joint distribution is denoted as P(X₁ ∧ X₂ ∧ ⋯ ∧ Xₙ ∣ δ ∧ π).
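The structure above (a description giving a decomposed joint distribution, plus a question answered by inference) can be sketched in code. This is a minimal illustration under assumed names and numbers: the two variables Spam and Word, their domains, and all probability values are invented for the example, not taken from the text.

```python
# Description: the joint distribution is specified by the decomposition
#   P(Spam, Word) = P(Spam) * P(Word | Spam)
# with parametric forms given here as explicit probability tables.
p_spam = {True: 0.3, False: 0.7}                     # prior P(Spam)
p_word_given_spam = {                                # likelihood P(Word | Spam)
    True:  {"offer": 0.8, "meeting": 0.2},
    False: {"offer": 0.1, "meeting": 0.9},
}

def joint(spam, word):
    """Joint probability P(Spam=spam, Word=word) from the decomposition."""
    return p_spam[spam] * p_word_given_spam[spam][word]

def question(word):
    """The question: compute P(Spam | Word=word) by Bayesian inference,
    i.e. normalize the joint over the unknown variable."""
    evidence = sum(joint(s, word) for s in (True, False))  # P(Word=word)
    return {s: joint(s, word) / evidence for s in (True, False)}

posterior = question("offer")
```

The same description can answer many different questions (e.g. P(Word | Spam) or P(Spam)); only the normalization over the joint changes, which is the sense in which a Bayesian program specifies a family of distributions.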
To specify preliminary knowledge (π), the programmer must undertake the following: Define the set of relevant variables on which the joint distribution is defined.
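As a sketch of this first step, the relevant variables can be listed together with their finite domains; the joint distribution is then defined over the Cartesian product of those domains. The variable names and domains below are illustrative assumptions, not part of the original text.

```python
from itertools import product

# Step 1 of a specification: a hypothetical set of relevant variables,
# each with a finite domain.
variables = {
    "Spam": (True, False),
    "Word": ("offer", "meeting", "invoice"),
}

# The joint distribution P(Spam, Word | delta, pi) is defined on the
# Cartesian product of the variable domains: 2 * 3 = 6 joint states.
joint_space = list(product(*variables.values()))
```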