Poisson distribution

Summary

In probability theory and statistics, the Poisson distribution is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time if these events occur with a known constant mean rate and independently of the time since the last event. It is named after the French mathematician Siméon Denis Poisson (/ˈpwɑːsɒn/; French: [pwasɔ̃]). The Poisson distribution can also be used for the number of events in other specified intervals such as distance, area, or volume, and it plays an important role among the discrete-stable distributions.
For instance, suppose a call center receives an average of 180 calls per hour, 24 hours a day. The calls are independent; receiving one does not change the probability of when the next one will arrive. The number of calls received during any minute then has a Poisson distribution with mean 3: the most likely counts are 2 and 3, but 1 and 4 are also likely, and there is a small probability of no calls at all.
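The call-center example can be checked directly from the Poisson probability mass function P(k) = λᵏe^(−λ)/k!. A minimal sketch in Python (the helper name is our own, not from this page):

```python
import math

def poisson_pmf(k, lam):
    """Probability of observing exactly k events when the mean count is lam."""
    return lam**k * math.exp(-lam) / math.factorial(k)

# Calls arrive at 180 per hour, i.e. a mean of lam = 3 per minute.
lam = 3
probs = {k: poisson_pmf(k, lam) for k in range(7)}
# k = 2 and k = 3 are tied as the most likely counts (both about 0.224),
# while k = 0 is possible but unlikely (about 0.050).
```

Note that P(2) = P(3) exactly when λ = 3, since P(k)/P(k−1) = λ/k.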



Related concepts (83)

Normal distribution

In statistics, a normal distribution or Gaussian distribution is a type of continuous probability distribution for a real-valued random variable. The general form of its probability density function is f(x) = (1/(σ√(2π))) exp(−(x − μ)²/(2σ²)).

Binomial distribution

In probability theory and statistics, the binomial distribution with parameters n and p is the discrete probability distribution of the number of successes in a sequence of n independent experiments, each with success probability p.
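The link to the Poisson distribution can be illustrated numerically: when n is large and p is small with np = λ held fixed, the binomial pmf approaches the Poisson pmf. A sketch, with helper names of our own choosing:

```python
import math

def binom_pmf(k, n, p):
    """P(exactly k successes in n independent trials of probability p)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """Poisson probability of exactly k events with mean lam."""
    return lam**k * math.exp(-lam) / math.factorial(k)

# Keep n*p = 3 fixed while n grows: Binomial(n, 3/n) -> Poisson(3).
for n in (10, 100, 10000):
    gap = abs(binom_pmf(2, n, 3 / n) - poisson_pmf(2, 3.0))
    print(n, round(gap, 6))  # the gap shrinks roughly like 1/n
```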

Probability distribution

In probability theory and statistics, a probability distribution is the mathematical function that gives the probabilities of occurrence of different possible outcomes for an experiment. It is a mathematical description of a random phenomenon in terms of its sample space and the probabilities of events.

Related courses (127)

MATH-600: Optimization and simulation

Master state-of-the-art methods in optimization with heuristics and simulation.
Work involves:

- reading the material beforehand
- class hours to discuss the material and solve problems
- homework

ENV-424: Water resources engineering

Water resources engineering designs systems to control the quantity, quality, timing, and distribution of water to support human demands and the needs of the environment.

MATH-408: Regression methods

A general graduate course on regression methods.

Related publications (100)

Ahmed Bassam Sayed Ayoub Mohamed Emam

In this thesis, we study the three challenges described above. First, we study different reconstruction techniques and assess the fidelity of each by means of structured illumination and phase conjugation. By reconstructing the 3D refractive index (RI) of the sample using different algorithms (Born, Rytov, and Radon) and then numerically back-propagating an experimentally measured structured-illumination pattern, we are able to assess the fidelity of each reconstruction algorithm without prior information about the 3D RI distribution of the sample. The second part of the thesis concerns the 3D reconstruction of samples from intensity-only measurements, which removes the need to acquire them holographically. We show that with intensity-only measurements we can still reconstruct the 3D volume of the sample with edge-enhanced effects, which proved useful for drug-delivery applications in which nanoparticles were identified on the cell membrane of immune T-cells. Such a reconstruction technique yields a more robust imaging system, in which commercial imaging microscopes can be combined with LEDs for high-quality, speckle-noise-free imaging. In addition, we show that under certain conditions we can reconstruct the 3D refractive index distribution of different samples. The third part of the thesis contributes to high-speed complex wavefront shaping using DMDs. A new modulation technique is demonstrated that can boost the speed of current time-multiplexing techniques by a factor of 32. It is based on amplitude modulation: an amplitude modulator is synchronized with the DMD to modulate the intensity of each bit-plane of an 8-bit image, and all the modulated bit-planes are then added linearly on the detector.
Such a modulation technique can be used not only for structured-illumination microscopy but also for high-speed 3D printing and for projectors. The last part concerns deep-learning approaches to the missing-cone problem that usually accompanies optical imaging due to the limited numerical aperture of the imaging system. Two techniques are discussed: the first uses a physical model to enhance the quality of the 3D RI reconstruction, and the second uses a deep neural network to solve the missing-cone problem.

Generalized linear models have become a commonly used tool of data analysis. Such models are used to fit regressions for univariate responses with normal, gamma, binomial or Poisson distribution, with maximum likelihood generally applied as the fitting method. In the usual regression setting the least-absolute-deviations estimator (L1-norm) is a popular alternative to least squares (L2-norm) because of its simplicity and its robustness properties. In the first part of this thesis we examine how much of these robustness features carries over to the setting of generalized linear models. We study a robust procedure based on the minimum absolute deviation estimator of Morgenthaler (1992), the Lq quasi-likelihood with q = 1. In particular, we investigate the influence function of these estimates and compare their sensitivity to that of the maximum likelihood estimate. Furthermore, we explore the Lq quasi-likelihood estimates in binary regression. These estimates are difficult to compute, so we derive a simpler estimator with a similar form to the Lq quasi-likelihood estimate. The resulting estimating equation is a simple modification of the familiar maximum likelihood equation with weights wq(μ). This is an improvement over other robust estimates discussed in the literature, which typically have weights depending on the pair (xi, yi) rather than on μi = h(xiT β) alone. Finally, we generalize this estimator to Poisson regression. The resulting estimating equation is a weighted maximum likelihood with weights that depend on μ only.
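For context, the baseline that the abstract above builds on — maximum-likelihood Poisson regression with a log link — can be sketched as follows. The Newton–Raphson fit, all names, and the simulated data are illustrative choices of ours, not taken from the thesis:

```python
import math
import random

def sample_poisson(lam, rng):
    """Draw one Poisson(lam) variate (Knuth's multiplication method)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while p > threshold:
        k += 1
        p *= rng.random()
    return k - 1

def fit_poisson_regression(xs, ys, iters=30):
    """Maximum-likelihood fit of log mu_i = a + b * x_i by Newton-Raphson."""
    a = b = 0.0
    for _ in range(iters):
        mus = [math.exp(a + b * x) for x in xs]
        # Score vector (gradient of the Poisson log-likelihood).
        g0 = sum(y - m for y, m in zip(ys, mus))
        g1 = sum((y - m) * x for y, m, x in zip(ys, mus, xs))
        # Fisher information (negative Hessian) entries.
        h00 = sum(mus)
        h01 = sum(m * x for m, x in zip(mus, xs))
        h11 = sum(m * x * x for m, x in zip(mus, xs))
        det = h00 * h11 - h01 * h01
        # Newton step: (a, b) += H^{-1} * gradient.
        a += (h11 * g0 - h01 * g1) / det
        b += (h00 * g1 - h01 * g0) / det
    return a, b

# Simulated data with true intercept 0.5 and true slope 0.3.
rng = random.Random(0)
xs = [rng.uniform(0.0, 3.0) for _ in range(2000)]
ys = [sample_poisson(math.exp(0.5 + 0.3 * x), rng) for x in xs]
a_hat, b_hat = fit_poisson_regression(xs, ys)
```

The robust variants studied in the thesis replace this equal weighting of observations with weights wq(μ), but the fitting loop keeps the same shape.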

This is an ensemble of two buildings (a heroin-distribution centre plus offices, and a restaurant) arranged around a small square, reached by a staircase linking the two levels of the town.

1999

Related lectures (312)