
# Variable-order Markov model

## Summary

In the mathematical theory of stochastic processes, variable-order Markov (VOM) models are an important class of models that extend the well-known Markov chain models. In contrast to Markov chain models, where each random variable in a sequence with the Markov property depends on a fixed number of preceding random variables, in VOM models this number of conditioning random variables may vary based on the specific observed realization.
This realization sequence is often called the context; therefore VOM models are also called context trees. VOM models can be represented graphically as probabilistic suffix trees (PSTs). The flexibility in the number of conditioning random variables turns out to be a real advantage for many applications, such as statistical analysis, classification, and prediction.
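The counting scheme behind such a model can be sketched in a few lines of Python. In this sketch (the names `train_vom` and `predict` are illustrative, not from any particular library), conditional probabilities are estimated for every context up to a maximal order, and prediction uses the longest context observed during training; practical context-tree learners additionally prune contexts that add no predictive information.

```python
from collections import defaultdict

def train_vom(text, max_order):
    """Estimate Pr(symbol | context) for all contexts of length 0..max_order.

    Returns a dict mapping each observed context string to a dict of
    next-symbol probabilities. (A minimal counting sketch, without the
    pruning step used by real context-tree learning algorithms.)
    """
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(len(text)):
        for k in range(max_order + 1):
            if i - k < 0:
                break
            counts[text[i - k:i]][text[i]] += 1
    model = {}
    for context, c in counts.items():
        total = sum(c.values())
        model[context] = {s: n / total for s, n in c.items()}
    return model

def predict(model, history, max_order):
    """Return the next-symbol distribution for the longest context seen in training."""
    for k in range(min(max_order, len(history)), -1, -1):
        context = history[len(history) - k:]
        if context in model:
            return model[context]
    return {}

# Train on the periodic string from the example below.
model = train_vom("aaabc" * 50, max_order=2)
print(model["b"])                     # {'c': 1.0}
print(predict(model, "aaabcaa", 2))   # {'a': 0.5, 'b': 0.5}
```

Falling back from longer to shorter contexts in `predict` is what makes the order variable: the model stores only the contexts it has actually seen, however short or long.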
Consider for example a sequence of random variables, each of which takes a value from the ternary alphabet {a, b, c}. Specifically, consider the string constructed from infinite concatenations of the sub-string aaabc: aaabcaaabcaaabcaaabc...aaabc.
The VOM model of maximal order 2 can approximate the above string using only the following five conditional probability components: Pr(a | aa) = 0.5, Pr(b | aa) = 0.5, Pr(c | b) = 1.0, Pr(a | c) = 1.0, Pr(a | ca) = 1.0.
In this example, Pr(c | ab) = Pr(c | b) = 1.0; therefore, the shorter context b is sufficient to determine the next character. Similarly, the VOM model of maximal order 3 can generate the string exactly using only five conditional probability components, which are all equal to 1.0.
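This determinism at order 3 is easy to verify by direct counting (a hypothetical check, not part of the original text): scanning the periodic string with contexts of length 3 shows that exactly five contexts occur and each is always followed by the same symbol.

```python
# Record which symbols follow each length-3 context in the periodic string.
text = "aaabc" * 100
follows = {}
for i in range(3, len(text)):
    ctx = text[i - 3:i]
    follows.setdefault(ctx, set()).add(text[i])

# Every context determines its successor uniquely, so each Pr(. | ctx) = 1.0.
deterministic = all(len(s) == 1 for s in follows.values())
print(sorted(follows))   # ['aaa', 'aab', 'abc', 'bca', 'caa'] — five contexts
print(deterministic)     # True
```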
To construct the Markov chain of order 1 for the next character in that string, one must estimate the following 9 conditional probability components: Pr(a | a), Pr(a | b), Pr(a | c), Pr(b | a), Pr(b | b), Pr(b | c), Pr(c | a), Pr(c | b), Pr(c | c). To construct the Markov chain of order 2 for the next character, one must estimate 27 conditional probability components: Pr(a | aa), Pr(a | ab), ..., Pr(c | cc). And to construct the Markov chain of order 3 for the next character, one must estimate the following 81 conditional probability components: Pr(a | aaa), Pr(a | aab), ..., Pr(c | ccc).
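The counts 9, 27, and 81 follow from a simple formula: for an alphabet of size m, a fixed-order-k Markov chain has m^k possible contexts, each with m next-symbol probabilities, hence m^(k+1) components in total. A quick arithmetic check:

```python
# Number of conditional probability components for a fixed-order-k
# Markov chain over an alphabet of size m: m**k contexts * m symbols.
m = 3
for k in (1, 2, 3):
    print(k, m ** (k + 1))   # 1 9, 2 27, 3 81
```

This exponential growth in k is exactly what the VOM model sidesteps: it keeps only the five contexts the data actually requires.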
