
# Geometric distribution

Summary

In probability theory and statistics, the geometric distribution is either one of two discrete probability distributions:
The probability distribution of the number X of Bernoulli trials needed to get one success, supported on the set {1, 2, 3, ...};
The probability distribution of the number Y = X − 1 of failures before the first success, supported on the set {0, 1, 2, ...}.
Which of these is called the geometric distribution is a matter of convention and convenience.
These two different geometric distributions should not be confused with each other. Often, the name shifted geometric distribution is adopted for the former one (distribution of the number X); however, to avoid ambiguity, it is considered wise to indicate which is intended, by mentioning the support explicitly.
The geometric distribution gives the probability that the first occurrence of success requires k independent trials, each with success probability p. If the probability of success on each trial is p, then the probability that the kth trial is the first success is

Pr(X = k) = (1 − p)^(k−1) p

for k = 1, 2, 3, 4, ....
The above form of the geometric distribution is used for modeling the number of trials up to and including the first success. By contrast, the following form of the geometric distribution is used for modeling the number of failures until the first success:

Pr(Y = k) = (1 − p)^k p

for k = 0, 1, 2, 3, ....
In either case, the sequence of probabilities is a geometric sequence.
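The two conventions can be sketched in Python as follows (function names are illustrative, not from any library):

```python
def pmf_trials(k, p):
    """P(X = k): probability that the k-th trial is the first success, k = 1, 2, 3, ..."""
    return (1 - p) ** (k - 1) * p

def pmf_failures(k, p):
    """P(Y = k): probability of k failures before the first success, k = 0, 1, 2, ..."""
    return (1 - p) ** k * p

p = 1 / 6
# The two supports are shifted by one: P(X = k) equals P(Y = k - 1).
assert pmf_trials(3, p) == pmf_failures(2, p)
# Successive probabilities form a geometric sequence with common ratio 1 - p.
assert abs(pmf_trials(4, p) / pmf_trials(3, p) - (1 - p)) < 1e-12
```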
For example, suppose an ordinary die is thrown repeatedly until the first time a "1" appears. The probability distribution of the number of times it is thrown is supported on the infinite set { 1, 2, 3, ... } and is a geometric distribution with p = 1/6.
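A small Monte Carlo sketch of the die example (the helper name and sample size are arbitrary choices): the empirical mean number of throws should approach 1/p = 6.

```python
import random

def throws_until_first_one(rng):
    """Throw a fair die until the first '1' appears; return the number of throws."""
    throws = 0
    while True:
        throws += 1
        if rng.randint(1, 6) == 1:
            return throws

rng = random.Random(42)
samples = [throws_until_first_one(rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)
# The mean of Geo(1/6) on {1, 2, 3, ...} is 6; the empirical mean should be close.
assert abs(mean - 6) < 0.2
```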
The geometric distribution is denoted by Geo(p) where 0 < p ≤ 1.
Consider a sequence of trials, where each trial has only two possible outcomes (designated failure and success). The probability of success is assumed to be the same for each trial. In such a sequence of trials, the geometric distribution is useful to model the number of failures before the first success since the experiment can have an indefinite number of trials until success, unlike the binomial distribution which has a set number of trials.
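Because no trial count is fixed in advance, a geometric variate is often drawn without simulating the trials at all: by the inverse-transform method, if U is uniform on (0, 1) then ceil(ln U / ln(1 − p)) is Geo(p) on {1, 2, ...}. A minimal sketch (function name is illustrative):

```python
import math
import random

def sample_geometric(p, rng):
    """Inverse-transform sampling: X = ceil(ln U / ln(1 - p)) is Geo(p) on {1, 2, ...}.

    A single uniform draw replaces the open-ended sequence of Bernoulli trials.
    """
    u = rng.random()
    return math.ceil(math.log(u) / math.log(1 - p))

rng = random.Random(0)
xs = [sample_geometric(0.25, rng) for _ in range(200_000)]
# The mean of Geo(p) on {1, 2, ...} is 1/p = 4.
assert abs(sum(xs) / len(xs) - 4) < 0.1
```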

Official source

This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.


Related concepts (19)

Poisson distribution

In probability theory and statistics, the Poisson distribution is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time or space if these events occur with a known constant mean rate and independently of the time since the last event. It is named after French mathematician Siméon Denis Poisson (/ˈpwɑːsɒn/; [pwasɔ̃]). The Poisson distribution can also be used for the number of events in other specified interval types such as distance, area, or volume.
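The Poisson probability mass function P(N = k) = λ^k e^(−λ) / k! can be sketched directly (function name is illustrative):

```python
import math

def poisson_pmf(k, lam):
    """P(N = k) = lam**k * exp(-lam) / k! for k = 0, 1, 2, ..."""
    return lam ** k * math.exp(-lam) / math.factorial(k)

lam = 3.0
probs = [poisson_pmf(k, lam) for k in range(60)]
# The probabilities over a long prefix of the support sum to ~1,
# and the mean equals the rate parameter lam.
assert abs(sum(probs) - 1) < 1e-9
assert abs(sum(k * pk for k, pk in enumerate(probs)) - lam) < 1e-6
```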

Negative binomial distribution

In probability theory and statistics, the negative binomial distribution is a discrete probability distribution that models the number of failures in a sequence of independent and identically distributed Bernoulli trials before a specified (non-random) number of successes (denoted r) occurs. For example, we can define rolling a 6 on a die as a success, and rolling any other number as a failure, and ask how many failure rolls will occur before we see the third success (r = 3).
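With r = 1 the negative binomial reduces to the geometric distribution (in its failures-before-first-success form), which a short sketch can verify (function names are illustrative):

```python
from math import comb

def negbin_pmf(k, r, p):
    """P(K = k): probability of k failures before the r-th success, success prob p."""
    return comb(k + r - 1, k) * (1 - p) ** k * p ** r

def geom_failures_pmf(k, p):
    """P(Y = k): k failures before the first success."""
    return (1 - p) ** k * p

# For r = 1 the binomial coefficient is 1, so the two pmfs coincide.
for k in range(10):
    assert abs(negbin_pmf(k, 1, 1 / 6) - geom_failures_pmf(k, 1 / 6)) < 1e-15
```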

Bernoulli distribution

In probability theory and statistics, the Bernoulli distribution, named after Swiss mathematician Jacob Bernoulli, is the discrete probability distribution of a random variable which takes the value 1 with probability p and the value 0 with probability q = 1 − p. Less formally, it can be thought of as a model for the set of possible outcomes of any single experiment that asks a yes–no question. Such questions lead to outcomes that are boolean-valued: a single bit whose value is success/yes/true/one with probability p and failure/no/false/zero with probability q.
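A Bernoulli trial is the single building block that the geometric distribution repeats until the first success; a minimal sketch (function name is illustrative):

```python
import random

def bernoulli(p, rng):
    """One Bernoulli trial: 1 (success) with probability p, else 0 (failure)."""
    return 1 if rng.random() < p else 0

rng = random.Random(7)
samples = [bernoulli(0.3, rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)
# For Bernoulli(p): E[X] = p and Var[X] = p * q with q = 1 - p.
assert abs(mean - 0.3) < 0.01
```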

Related courses (21)

MATH-232: Probability and statistics

A basic course in probability and statistics

CS-101: Advanced information, computation, communication I

Discrete mathematics is a discipline with applications to almost all areas of study. It provides a set of indispensable tools to computer science in particular. This course reviews (familiar) topics a ...

CS-457: Geometric computing

This course will cover mathematical concepts and efficient numerical methods for geometric computing. We will explore the beauty of geometry and develop algorithms to simulate and optimize 2D and 3D g ...

Related lectures (38)

Inference Problems & Spin Glass Game (PHYS-642: Statistical physics for optimization & learning)

Covers inference problems related to the Spin Glass Game and the challenges of making mistakes with probability p.

Verifying Single Qubit Hamiltonians

Covers the verification of single qubit Hamiltonians using various techniques and methods.

Probability Basics: Events and Independence (CS-101: Advanced information, computation, communication I)

Introduces the basics of probability, covering event probabilities, conditional probabilities, and independence.

Related publications (77)

Alain Nussbaumer, Heikki Tapani Remes, Abinab Niraula

The local undercut defects at the weld toe provide a potential initiation site for fatigue cracks and significantly impact the structure's fatigue strength. The influence of continuous undercut depth on fatigue performance is widely studied, but the resear ...

2024

Marcos Rubinstein, Farhad Rachidi-Haeri, Hamidreza Karami, Elias Per Joachim Le Boudec, Nicolas Mora Parra

Time reversal exploits the invariance of electromagnetic wave propagation in reciprocal and lossless media to localize radiating sources. Time-reversed measurements are back-propagated in a simulated domain and converge to the unknown source location. The ...

2024

Rachid Guerraoui, Sadegh Farhadkhani, El Mahdi El Mhamdi, Le Nguyen Hoang

The geometric median, an instrumental component of the secure machine learning toolbox, is known to be effective when robustly aggregating models (or gradients), gathered from potentially malicious (or strategic) users. What is less known is the extent to ...

2023