In probability theory and statistics, complex random variables are a generalization of real-valued random variables to complex numbers, i.e. the possible values a complex random variable may take are complex numbers. Complex random variables can always be considered as pairs of real random variables: their real and imaginary parts. Therefore, the distribution of one complex random variable may be interpreted as the joint distribution of two real random variables.
Some concepts of real random variables have a straightforward generalization to complex random variables—e.g., the definition of the mean of a complex random variable. Other concepts are unique to complex random variables.
Applications of complex random variables are found in digital signal processing, quadrature amplitude modulation and information theory.
A complex random variable $Z$ on the probability space $(\Omega, \mathcal{F}, P)$ is a function $Z \colon \Omega \to \mathbb{C}$ such that both its real part $\Re(Z)$ and its imaginary part $\Im(Z)$ are real random variables on $(\Omega, \mathcal{F}, P)$.
Consider a random variable $Z$ that may take only three complex values $z_1, z_2, z_3$, with given probabilities $p_1, p_2, p_3$. This is a simple example of a complex random variable.
The expectation of this random variable may be simply calculated as the probability-weighted sum of its values, $\operatorname{E}[Z] = p_1 z_1 + p_2 z_2 + p_3 z_3$, which is in general itself a complex number; equivalently, $\operatorname{E}[Z] = \operatorname{E}[\Re(Z)] + i\,\operatorname{E}[\Im(Z)]$.
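A minimal sketch of this calculation, assuming NumPy and using illustrative values and probabilities (not the ones from the original table), is:

```python
import numpy as np

# Hypothetical support and probabilities (illustrative assumptions only).
values = np.array([1 + 1j, 1 - 1j, 2 + 0j])
probs = np.array([0.25, 0.25, 0.5])
assert np.isclose(probs.sum(), 1.0)

# E[Z] = sum_k p_k * z_k, in general a complex number.
expectation = np.sum(probs * values)

# Equivalently, E[Z] = E[Re Z] + i * E[Im Z].
expectation_alt = np.sum(probs * values.real) + 1j * np.sum(probs * values.imag)

print(expectation)      # (1.5+0j)
print(expectation_alt)  # (1.5+0j)
```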
Another example of a complex random variable is the uniform distribution over the filled unit disk, i.e. the set $\{z \in \mathbb{C} : |z| \le 1\}$. This random variable is an example of a complex random variable for which a probability density function is defined: the density is constant, equal to $1/\pi$, on the unit disk and zero outside it.
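One common way to sample from this distribution, sketched below assuming NumPy, is to draw a uniform angle and a radius equal to the square root of a uniform variate, so that area (not radius) is uniformly distributed:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Uniform samples on the filled unit disk: uniform angle, radius = sqrt(U).
theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
r = np.sqrt(rng.uniform(0.0, 1.0, size=n))
z = r * np.exp(1j * theta)

print(np.abs(z).max() <= 1.0)       # True: all samples lie in the unit disk
print(np.mean(z))                   # close to 0, the mean of this distribution
# Fraction inside a smaller disk of radius 0.5: close to the area ratio 0.25,
# consistent with a constant density 1/pi on the unit disk.
print(np.mean(np.abs(z) <= 0.5))
```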
Complex normal distribution
Complex Gaussian random variables are often encountered in applications. They are a straightforward generalization of real Gaussian random variables; an important special case is the circularly symmetric complex Gaussian, whose real and imaginary parts are independent zero-mean real Gaussians with equal variance.
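As an illustrative sketch, assuming NumPy and the unit-variance circularly symmetric case (one convention among several), such samples can be generated from two independent real Gaussians used as real and imaginary parts:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Circularly symmetric complex Gaussian with E[Z] = 0 and E[|Z|^2] = 1:
# real and imaginary parts are independent N(0, 1/2).
z = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2.0)

print(np.mean(z))                # approximately 0
print(np.mean(np.abs(z) ** 2))   # approximately 1 (the variance E[|Z - E Z|^2])
print(np.mean(z ** 2))           # approximately 0 (the pseudo-variance E[Z^2])
```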
The generalization of the cumulative distribution function from real to complex random variables is not obvious, because expressions of the form $P(Z \le z)$ make no sense for complex $z$: there is no natural total ordering of the complex numbers. However, expressions of the form $P(\Re(Z) \le a, \Im(Z) \le b)$ do make sense. The cumulative distribution function of a complex random variable is therefore defined via the joint distribution of its real and imaginary parts: $F_Z(z) = P(\Re(Z) \le \Re(z), \Im(Z) \le \Im(z))$.
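A small Monte Carlo sketch of this definition, assuming NumPy and reusing circularly symmetric Gaussian samples as an example, estimates $F_Z(z)$ from the joint behaviour of the real and imaginary parts:

```python
import numpy as np

def empirical_cdf(samples: np.ndarray, z: complex) -> float:
    """Estimate F_Z(z) = P(Re(Z) <= Re(z), Im(Z) <= Im(z)) from samples."""
    return float(np.mean((samples.real <= z.real) & (samples.imag <= z.imag)))

rng = np.random.default_rng(2)
n = 200_000
# Example samples: standard circularly symmetric complex Gaussian.
samples = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2.0)

# Here Re(Z) and Im(Z) are independent, so F_Z(z) factors into a product of
# two real Gaussian CDFs; the estimate below should be close to
# Phi(sqrt(2)*0.5) * Phi(sqrt(2)*0.3) ≈ 0.76 * 0.66 ≈ 0.50.
print(empirical_cdf(samples, 0.5 + 0.3j))
```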
In probability theory and statistics, the characteristic function of any real-valued random variable completely defines its probability distribution. If a random variable admits a probability density function, then the characteristic function is the Fourier transform of the probability density function. Thus it provides an alternative route to analytical results compared with working directly with probability density functions or cumulative distribution functions.
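For instance (a sketch assuming NumPy), the empirical characteristic function $\hat\varphi(t) = \frac{1}{n}\sum_k e^{itX_k}$ computed from standard normal samples can be compared with the known closed form $\varphi(t) = e^{-t^2/2}$:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.standard_normal(100_000)   # samples of a standard normal random variable

t = np.linspace(-3.0, 3.0, 7)
# Empirical characteristic function: average of exp(i t X) over the samples.
phi_emp = np.mean(np.exp(1j * np.outer(t, x)), axis=1)
# Closed form for the standard normal: exp(-t^2 / 2).
phi_true = np.exp(-t ** 2 / 2.0)

print(np.max(np.abs(phi_emp - phi_true)))   # small, on the order of 1e-2
```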
Independence is a fundamental notion in probability theory, as in statistics and the theory of stochastic processes. Two events are independent, statistically independent, or stochastically independent if, informally speaking, the occurrence of one does not affect the probability of occurrence of the other or, equivalently, does not affect the odds. Similarly, two random variables are independent if the realization of one does not affect the probability distribution of the other.
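As a small numerical sketch (assuming NumPy; the two events below are illustrative choices), independence of events $A$ and $B$ can be checked through the factorization $P(A \cap B) = P(A)\,P(B)$:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500_000

# Two independent fair dice (illustrative example).
die1 = rng.integers(1, 7, size=n)
die2 = rng.integers(1, 7, size=n)

a = die1 == 6        # event A: first die shows 6
b = die2 % 2 == 0    # event B: second die is even

p_a = np.mean(a)
p_b = np.mean(b)
p_ab = np.mean(a & b)

# For independent events the joint probability factorizes.
print(p_ab, p_a * p_b)   # both close to 1/6 * 1/2 = 1/12 ≈ 0.083
```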
In probability theory and statistics, variance is the expected value of the squared deviation from the mean of a random variable; equivalently, it is the square of the standard deviation. Variance is a measure of dispersion, meaning it is a measure of how far a set of numbers is spread out from their average value. It is the second central moment of a distribution, and the covariance of the random variable with itself, and it is often represented by $\sigma^2$, $s^2$, $\operatorname{Var}(X)$, $V(X)$, or $\mathbb{V}(X)$.
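These equivalent descriptions can be checked numerically; the sketch below (assuming NumPy, with an arbitrary example distribution) computes the variance as the mean squared deviation from the mean, as the covariance of the variable with itself, and via the library routine:

```python
import numpy as np

rng = np.random.default_rng(5)
x = 3.0 + 2.0 * rng.standard_normal(100_000)   # samples with mean 3 and variance 4

var_direct = np.mean((x - np.mean(x)) ** 2)    # E[(X - E X)^2]
var_as_cov = np.cov(x, x, bias=True)[0, 1]     # Cov(X, X) with the same 1/n normalization
var_numpy = np.var(x)                          # NumPy's population variance

print(var_direct, var_as_cov, var_numpy)       # all approximately 4
print(np.sqrt(var_numpy))                      # standard deviation, approximately 2
```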