
# Lecture: Advanced Probability: Independent Random Variables

Description

This lecture covers the concept of independent random variables: X and Y are independent if, for every pair of values x and y, P(X = x and Y = y) = P(X = x) P(Y = y). The instructor illustrates this definition with dice rolls and stresses the distinction between independence of random variables and independence of events. The lecture also works through probability calculations for dependent random variables, showing how to determine the probability of specific outcomes under the given conditions.
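The definition above can be checked by brute-force enumeration. The sketch below (an illustration under assumed setup, not code from the lecture) uses two fair dice: X is the first die and Y the second, so the joint probability factors; taking S = X + Y instead gives a dependent pair where the product rule fails.

```python
from fractions import Fraction
from itertools import product

# Sample space for two fair dice: all 36 equally likely ordered pairs (X, Y).
outcomes = list(product(range(1, 7), repeat=2))

def pr(event):
    """Probability of an event (a predicate over outcomes) under the uniform measure."""
    hits = sum(1 for w in outcomes if event(w))
    return Fraction(hits, len(outcomes))

# Independence: P(X = 3 and Y = 5) equals P(X = 3) * P(Y = 5).
joint = pr(lambda w: w[0] == 3 and w[1] == 5)   # 1/36
p_x3 = pr(lambda w: w[0] == 3)                  # 1/6
p_y5 = pr(lambda w: w[1] == 5)                  # 1/6
assert joint == p_x3 * p_y5

# Dependence: with S = X + Y, P(X = 3 and S = 8) != P(X = 3) * P(S = 8).
p_s8 = pr(lambda w: w[0] + w[1] == 8)                     # 5/36
joint_dep = pr(lambda w: w[0] == 3 and w[0] + w[1] == 8)  # 1/36 (forces Y = 5)
assert joint_dep != p_x3 * p_s8
```

Exact rational arithmetic (`Fraction`) avoids floating-point comparisons, so the equalities hold exactly rather than approximately.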


In course

CS-101: Advanced information, computation, communication I

Discrete mathematics is a discipline with applications to almost all areas of study. It provides a set of indispensable tools to computer science in particular. This course reviews (familiar) topics a

Related concepts (22)

Probability

Probability is the branch of mathematics concerning numerical descriptions of how likely an event is to occur, or how likely it is that a proposition is true. The probability of an event is a number between 0 and 1, where, roughly speaking, 0 indicates impossibility of the event and 1 indicates certainty. The higher the probability of an event, the more likely it is that the event will occur. A simple example is the tossing of a fair (unbiased) coin.

Probability theory

Probability theory or probability calculus is the branch of mathematics concerned with probability. Although there are several different probability interpretations, probability theory treats the concept in a rigorous mathematical manner by expressing it through a set of axioms. Typically these axioms formalise probability in terms of a probability space, which assigns a measure taking values between 0 and 1, termed the probability measure, to a set of outcomes called the sample space.

Bayesian probability

Bayesian probability (ˈbeɪziən or ˈbeɪʒən) is an interpretation of the concept of probability, in which, instead of frequency or propensity of some phenomenon, probability is interpreted as reasonable expectation representing a state of knowledge or as quantification of a personal belief. The Bayesian interpretation of probability can be seen as an extension of propositional logic that enables reasoning with hypotheses; that is, with propositions whose truth or falsity is unknown.

Probability distribution

In probability theory and statistics, a probability distribution is the mathematical function that gives the probabilities of occurrence of different possible outcomes for an experiment. It is a mathematical description of a random phenomenon in terms of its sample space and the probabilities of events (subsets of the sample space). For instance, if X is used to denote the outcome of a coin toss ("the experiment"), then the probability distribution of X would take the value 0.5 (1 in 2 or 1/2) for X = heads, and 0.
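The coin-toss distribution described above can be written directly as a mapping from outcomes to probabilities; the sketch below (illustrative names, not from the text) also checks the defining property that the probabilities over the sample space sum to 1.

```python
from fractions import Fraction

# The probability distribution of a fair coin toss: each outcome in the
# sample space {heads, tails} is assigned probability 1/2.
coin_distribution = {"heads": Fraction(1, 2), "tails": Fraction(1, 2)}

# A valid probability distribution must sum to exactly 1 over the sample space.
assert sum(coin_distribution.values()) == 1
```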

Probability interpretations

The word probability has been used in a variety of ways since it was first applied to the mathematical study of games of chance. Does probability measure the real, physical, tendency of something to occur, or is it a measure of how strongly one believes it will occur, or does it draw on both these elements? In answering such questions, mathematicians interpret the probability values of probability theory. There are two broad categories of probability interpretations which can be called "physical" and "evidential" probabilities.

Related lectures (629)

Random Variables and Expected Value (CS-101: Advanced information, computation, communication I)

Introduces random variables, probability distributions, and expected values through practical examples.

Probability Theory: Examples and Applications (CS-101: Advanced information, computation, communication I)

Explores probability theory through examples like bit strings, Bernoulli trials, and the Monty Hall problem, as well as the generalized Bayes' theorem and random variable distributions.

Bias and Variance in Estimation (MATH-232: Probability and statistics)

Discusses bias and variance in statistical estimation, exploring the trade-off between accuracy and variability.

Independence of Sub-Fields (COM-417: Advanced probability and applications)

Explores the concept of independence of sub-fields within a field and its implications in random variables.

Conditional Expectation Properties (MATH-431: Theory of stochastic calculus)

Explores conditional expectation properties, including measurability, linearity, and independence of random variables.