Many algorithms developed in statistical physics, coding theory, signal processing, and artificial intelligence can be expressed by graphical models and solved (either exactly or approximately) with iterative message-passing algorithms on the model. One quantity of interest in these algorithms is the partition function. In graphical models without cycles (trees), the partition function can be computed efficiently by message-passing algorithms such as the generalized distributive law (GDL) or the sum-product algorithm. In contrast, when graphical models contain cycles, computing the partition function is in general intractable. Our contributions in this dissertation are: deriving deterministic upper and lower bounds on the partition function, and developing methods to approximate this quantity with Monte Carlo methods. Specifically, we obtain subtree-based upper and lower bounds, which lead to theoretical results on the optimality properties of the minimum-entropy subtree and, in turn, to a greedy algorithm. Finally, we introduce and analyze a number of estimators that use Gibbs sampling to draw samples from different target distributions in order to estimate the value of the partition function. In one of these estimators, we demonstrate a novel strategy that combines Gibbs sampling with message-passing algorithms on trees.
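To illustrate the tree case mentioned above, here is a minimal sketch (not the dissertation's code) of the sum-product idea on a pairwise chain model, the simplest tree: a single forward pass of messages yields the exact partition function, which a brute-force sum over all configurations confirms. The potentials, sizes, and function names are illustrative assumptions.

```python
import itertools
import numpy as np

# Illustrative sketch: exact partition function Z of a pairwise chain model
# (a tree) via a forward sum-product pass, checked against brute force.
rng = np.random.default_rng(0)
n, k = 6, 3                                   # 6 variables, k = 3 states each
node_pot = rng.random((n, k)) + 0.1           # unary potentials psi_i(x_i)
edge_pot = rng.random((n - 1, k, k)) + 0.1    # pairwise potentials psi_{i,i+1}

def partition_sum_product(node_pot, edge_pot):
    """Forward message pass along the chain; exact on trees, cost O(n k^2)."""
    msg = np.ones(k)                           # message into variable 0
    for i in range(n - 1):
        # absorb psi_i, then marginalize x_i out toward variable i+1
        msg = (node_pot[i] * msg) @ edge_pot[i]
    return float(np.sum(node_pot[-1] * msg))   # sum out the last variable

def partition_brute_force(node_pot, edge_pot):
    """Sum over all k**n configurations; only feasible for tiny models."""
    total = 0.0
    for x in itertools.product(range(k), repeat=n):
        w = np.prod([node_pot[i, x[i]] for i in range(n)])
        w *= np.prod([edge_pot[i, x[i], x[i + 1]] for i in range(n - 1)])
        total += w
    return total

print(partition_sum_product(node_pot, edge_pot))  # identical values
print(partition_brute_force(node_pot, edge_pot))
```

For graphs with cycles this exact pass no longer applies, which is what motivates the bounds and sampling-based estimators described in the abstract.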