
# Sophie Myriam Hautphenne


## Related research domains (4)

### Branching process

In probability theory, a branching process is a stochastic process, i.e. a collection of random variables indexed by the natural numbers. Branching processes were originally introduced as a mathematical model of a population in which each individual in generation n produces a random number of individuals in generation n + 1, according, in the simplest case, to a fixed probability distribution that does not vary from individual to individual.
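This definition can be made concrete with a short simulation. The sketch below (hypothetical code, not from this page; the function name and argument names are invented for illustration) draws each individual's offspring count independently from one fixed distribution and tracks the generation sizes Z_0, Z_1, ….

```python
import random

def simulate_branching(offspring_pmf, generations, z0=1, seed=42):
    """Simulate generation sizes Z_0, ..., Z_n of a branching process.

    offspring_pmf[k] is the probability that one individual produces
    k offspring; every individual reproduces independently according
    to this same fixed distribution.
    """
    rng = random.Random(seed)
    ks = range(len(offspring_pmf))
    sizes = [z0]
    for _ in range(generations):
        z = sizes[-1]
        if z == 0:            # once extinct, the population stays extinct
            sizes.append(0)
            continue
        # draw one independent offspring count per current individual
        sizes.append(sum(rng.choices(ks, weights=offspring_pmf, k=z)))
    return sizes
```

For example, `simulate_branching([0.25, 0.5, 0.25], 10)` returns eleven generation sizes for a critical process (mean offspring number exactly 1).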

### Galton–Watson process

The Galton–Watson process is a branching stochastic process arising from Francis Galton's statistical investigation of the extinction of family names. The process models family names as patrilineal (passed from father to son), with each child randomly male or female; a family name becomes extinct if its line dies out, i.e. all holders of the name die without male descendants. Since this is also an accurate description of Y-chromosome transmission, the model is useful for understanding human Y-chromosome DNA haplogroups.
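The extinction question Galton and Watson posed has a classical answer: the extinction probability is the smallest non-negative fixed point of the offspring probability generating function f(s) = Σₖ pₖ sᵏ. A minimal sketch (hypothetical code, not from this page) computes it by fixed-point iteration starting from 0, which converges monotonically to that smallest root:

```python
def extinction_probability(offspring_pmf, tol=1e-12, max_iter=100_000):
    """Probability that the family line eventually dies out: the smallest
    non-negative solution of q = f(q), where f is the offspring PGF
    f(s) = sum_k p_k * s**k.  Iterating q <- f(q) from q = 0 converges
    monotonically to this smallest fixed point."""
    q = 0.0
    for _ in range(max_iter):
        q_next = sum(p * q**k for k, p in enumerate(offspring_pmf))
        if abs(q_next - q) < tol:
            return q_next
        q = q_next
    return q
```

For the supercritical distribution p = (0.25, 0.25, 0.5) (mean offspring 1.25), the fixed-point equation reduces to 0.5q² − 0.75q + 0.25 = 0, with roots 1/2 and 1, so the iteration returns approximately 0.5; for any critical or subcritical distribution it returns 1.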

### Markov chain

A Markov chain or Markov process is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event. Informally: what happens next depends only on the state of affairs now. A countably infinite sequence in which the chain changes state at discrete time steps gives a discrete-time Markov chain (DTMC); a continuous-time process is called a continuous-time Markov chain (CTMC).
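Because "what happens next" depends only on the current state, a DTMC is fully specified by its transition matrix, and propagating a distribution over states one step is just a vector–matrix multiply. A small sketch (hypothetical names, plain Python lists rather than a matrix library):

```python
def step_distribution(dist, P, n=1):
    """Push a probability distribution over states through n steps of a
    discrete-time Markov chain.  P is row-stochastic: P[i][j] is the
    probability of moving from state i to state j in one step."""
    m = len(P)
    for _ in range(n):
        dist = [sum(dist[i] * P[i][j] for i in range(m)) for j in range(m)]
    return dist
```

For the two-state chain P = [[0.9, 0.1], [0.5, 0.5]], repeated stepping from any starting distribution approaches the stationary distribution (5/6, 1/6), which solves πP = π.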

## Related publications (8)

Sophie Myriam Hautphenne, Laurent Lehmann, Nicolas Salamin

Understanding macroevolutionary patterns is central to evolutionary biology. This involves the process of divergence within a species, which starts at the microevolutionary level, for instance, when t…

Anthony Christopher Davison, Sophie Myriam Hautphenne, Andrea Kraus

Birth-and-death processes are widely used to model the development of biological populations. Although they are relatively simple models, their parameters can be challenging to estimate, as the likeli…

We consider the extinction events of Galton-Watson processes with countably infinitely many types. In particular, we construct truncated and augmented Galton-Watson processes with finite but increasin…