Taylor expansions for the moments of functions of random variables
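Neither the topic title nor the listing below spells out the expansion itself, so here is a brief, standard illustration (often called the delta method): for a smooth f and a random variable X with mean μ and variance σ², a second-order Taylor expansion around μ gives E[f(X)] ≈ f(μ) + ½ f''(μ) σ² and Var[f(X)] ≈ f'(μ)² σ². The Python sketch below is hypothetical code, not taken from any of the listed publications; it compares these approximations with Monte Carlo estimates for f(x) = exp(x).

```python
import numpy as np

# Second-order Taylor ("delta method") approximations of the moments of f(X):
#   E[f(X)]   ~= f(mu) + 0.5 * f''(mu) * sigma^2
#   Var[f(X)] ~= (f'(mu))^2 * sigma^2
# Illustrated for f(x) = exp(x) and X ~ Normal(mu, sigma^2), versus Monte Carlo.

mu, sigma = 0.5, 0.2
f, fp, fpp = np.exp, np.exp, np.exp      # f and its first two derivatives

mean_taylor = f(mu) + 0.5 * fpp(mu) * sigma**2
var_taylor = fp(mu) ** 2 * sigma**2

rng = np.random.default_rng(0)
x = rng.normal(mu, sigma, size=1_000_000)
print("E[f(X)]  : Taylor %.5f   MC %.5f" % (mean_taylor, f(x).mean()))
print("Var[f(X)]: Taylor %.5f   MC %.5f" % (var_taylor, f(x).var()))
```

For this log-normal example the approximations sit close to the simulated values; the gap grows with σ, which is exactly the regime the higher-order expansions studied on this topic page are meant to handle.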
Associated publications (11)
Since the birth of Information Theory, researchers have defined and exploited various information measures, as well as endowed them with operational meanings. Some were born as a "solution to a problem", like Shannon's Entropy and Mutual Information. Other ...
Given two jointly distributed random variables (X,Y), a functional representation of X is a random variable Z independent of Y, and a deterministic function g(⋅,⋅) such that X=g(Y,Z). The problem of finding a minimum entropy functional representation is kn ...
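As a concrete illustration of the definition above (my own sketch, not the paper's construction and not addressing the minimum-entropy question), one standard choice takes Z uniform on [0,1] and independent of Y, and lets g(y, ⋅) be the conditional quantile function of X given Y = y.

```python
import numpy as np

# Hypothetical sketch of a functional representation X = g(Y, Z) with Z
# independent of Y, for a small discrete joint pmf p[x, y]: g(y, z) is the
# conditional quantile function of X given Y = y applied to Z ~ Uniform(0, 1).

p = np.array([[0.10, 0.20],   # p[x, y]
              [0.30, 0.15],
              [0.05, 0.20]])
p_y = p.sum(axis=0)
cond_cdf = np.cumsum(p / p_y, axis=0)          # F_{X|Y}(x | y), column per y

def g(y, z):
    """Smallest x with F_{X|Y}(x | y) >= z (conditional quantile function)."""
    return min(int(np.searchsorted(cond_cdf[:, y], z)), p.shape[0] - 1)

rng = np.random.default_rng(1)
n = 200_000
y = rng.choice(len(p_y), size=n, p=p_y)
z = rng.random(n)                               # drawn independently of y
x = np.array([g(yi, zi) for yi, zi in zip(y, z)])

emp = np.zeros_like(p)
np.add.at(emp, (x, y), 1.0 / n)
print(np.round(emp, 3))                         # should be close to p
```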
We address adaptive multivariate polynomial approximation by means of the discrete least-squares method with random evaluations, to approximate in the L2 probability sense a smooth function depending on a random variable distributed according to a given pr ...
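The least-squares-with-random-evaluations idea can be sketched in a few lines: draw samples of the random input, evaluate the target function, and solve an ordinary least-squares problem in a polynomial basis. The code below is a non-adaptive, one-dimensional toy version with a fixed degree and a uniform input, not the adaptive multivariate method of the paper.

```python
import numpy as np

# 1-D toy of discrete least squares with random evaluations: approximate a
# smooth f of a uniform random variable Y in a Legendre basis, fitting the
# coefficients from random samples.

rng = np.random.default_rng(2)
f = lambda y: np.exp(np.sin(np.pi * y))

degree, n_samples = 8, 400
y = rng.uniform(-1.0, 1.0, size=n_samples)            # random evaluation points
V = np.polynomial.legendre.legvander(y, degree)        # design matrix
coeffs, *_ = np.linalg.lstsq(V, f(y), rcond=None)      # discrete least squares

# Estimate the L2 (probability) error on an independent test sample.
y_test = rng.uniform(-1.0, 1.0, size=100_000)
approx = np.polynomial.legendre.legval(y_test, coeffs)
err = np.sqrt(np.mean((f(y_test) - approx) ** 2))
print(f"estimated L2 error: {err:.2e}")
```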
Given two random variables X and Y, an operational approach is undertaken to quantify the "leakage" of information from X to Y. The resulting measure L(X -> Y) is called maximal leakage, and is defined as the multiplicative increase, upon observing Y, ...
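For finite alphabets, the maximal-leakage literature gives the closed form L(X -> Y) = log Σ_y max_{x: P_X(x) > 0} P_{Y|X}(y|x). The snippet below evaluates it from a joint pmf; it is illustrative code of mine, not taken from the paper.

```python
import numpy as np

# Maximal leakage from a joint pmf p[x, y] over finite alphabets:
#   L(X -> Y) = log sum_y max_{x: P_X(x) > 0} P_{Y|X}(y | x)   (in nats)

def maximal_leakage(p_xy):
    p_x = p_xy.sum(axis=1)
    support = p_x > 0
    p_y_given_x = p_xy[support] / p_x[support, None]   # rows: P_{Y|X=x}
    return np.log(p_y_given_x.max(axis=0).sum())

p = np.array([[0.25, 0.05],
              [0.10, 0.40],
              [0.15, 0.05]])
print(f"L(X -> Y) = {maximal_leakage(p):.4f} nats")
```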
We present a new information-theoretic result which we call the Chaining Lemma. It considers a so-called "chain" of random variables, defined by a source distribution X^(0) with high min-entropy and a number (say, t in total) of arbitrary functions (T_1, ...
We consider the setting of estimating the mean of a random variable by a sequential stopping rule Monte Carlo (MC) method. The performance of a typical second moment based sequential stopping rule MC method is shown to be unreliable in such settings both b ...
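A typical second-moment-based stopping rule keeps drawing samples until the estimated confidence-interval half-width, computed from the running sample variance, falls below a tolerance. The sketch below shows that basic rule only; it is not the remedy proposed in the paper, and the batch size, tolerance, and test distribution are my own choices.

```python
import numpy as np

# Second-moment-based sequential stopping rule for Monte Carlo mean estimation:
# sample in batches until the estimated half-width z * s / sqrt(n) drops below
# a relative tolerance.

def sequential_mc(sampler, rel_tol=0.01, z=1.96, batch=1_000, max_n=10**7):
    samples = np.array([])
    while samples.size < max_n:
        samples = np.concatenate([samples, sampler(batch)])
        mean = samples.mean()
        half_width = z * samples.std(ddof=1) / np.sqrt(samples.size)
        if half_width <= rel_tol * abs(mean):          # relative-error criterion
            break
    return mean, half_width, samples.size

rng = np.random.default_rng(3)
mean, hw, n = sequential_mc(lambda k: rng.lognormal(0.0, 1.0, size=k))
print(f"estimate {mean:.4f} +/- {hw:.4f} after {n} samples "
      f"(true mean {np.exp(0.5):.4f})")
```

Heavy-tailed integrands like this log-normal are precisely where such variance-based rules can stop too early, which is the failure mode the paper analyses.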
Generalized Langevin equations (GLE) with multiplicative white Poisson noise pose the usual prescription dilemma leading to different evolution equations (master equations) for the probability distribution. Contrary to the case of multiplicative Gaussian w ...
A novel estimator for mutual information is proposed. The estimator is useful for the (asymmetric) scenario where only a few samples for one random variable are available, but for each sample, the conditional distribution of the other random variable can b ...
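The paper's estimator is not reproduced here; the snippet below only illustrates a natural baseline under the same access model, which I state as an assumption: with P_{Y|X}(⋅|x_i) known for each observed sample x_i, one can use I(X;Y) = E_X[D(P_{Y|X} || P_Y)] and plug in the mixture of the observed conditionals for P_Y.

```python
import numpy as np

# Baseline plug-in estimate (not the paper's estimator): few samples x_1..x_n of
# X, but full access to P_{Y|X}(. | x_i); estimate I(X;Y) = E_X[ D(P_{Y|X} || P_Y) ]
# with P_Y replaced by the average of the observed conditionals.

def mi_from_conditionals(cond_rows):
    """cond_rows[i, :] = P_{Y|X}(. | x_i) for the i-th observed sample of X."""
    cond_rows = np.asarray(cond_rows, dtype=float)
    p_y = cond_rows.mean(axis=0)                       # plug-in marginal of Y
    ratio = np.divide(cond_rows, p_y,
                      out=np.ones_like(cond_rows), where=cond_rows > 0)
    kl = (cond_rows * np.log(ratio)).sum(axis=1)       # D(P_{Y|X=x_i} || P_Y)
    return kl.mean()                                   # nats

# Toy check: X uniform on {0, 1}, Y the output of a binary symmetric channel.
eps = 0.1
cond = np.array([[1 - eps, eps],
                 [eps, 1 - eps]])
print(f"estimated I(X;Y) ~ {mi_from_conditionals(cond):.4f} nats")
# For a BSC with uniform input, I(X;Y) = log 2 - H_b(eps) ~ 0.368 nats.
```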
The characteristic functional is the infinite-dimensional generalization of the Fourier transform for measures on function spaces. It characterizes the statistical law of the associated stochastic process in the same way as a characteristic function specif ...
We consider multiple description (MD) coding for the Gaussian source with K descriptions under the symmetric mean-squared error (MSE) distortion constraints, and provide an approximate characterization of the rate region. We show that the rate region can b ...