
# An extension of the stochastic sewing lemma and applications to fractional stochastic calculus

## Abstract

We give an extension of Le's stochastic sewing lemma. The stochastic sewing lemma proves convergence in $L_m$ of Riemann-type sums $\sum_{[s,t] \in \pi} A_{s,t}$ for an adapted two-parameter stochastic process $A$, under certain conditions on the moments of $A_{s,t}$ and of the conditional expectations of $A_{s,t}$ given $\mathcal F_s$. Our extension replaces the conditional expectation given $\mathcal F_s$ by that given $\mathcal F_v$ for $v < s$.
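As a generic illustration of the Riemann-type sums in question (not taken from the paper), consider the germ $A_{s,t} = W_s (W_t - W_s)$ over a partition of $[0,T]$, whose sums converge to the Itô integral $\int_0^T W \, dW = (W_T^2 - T)/2$. A minimal Python sketch, where the grid, the seed, and the choice of Brownian germ are illustrative assumptions:

```python
import math
import random

def ito_germ_sum(n_steps=100_000, T=1.0, seed=42):
    """Approximate the Ito integral int_0^T W dW by the Riemann-type sum
    sum_{[s,t] in pi} A_{s,t} with germ A_{s,t} = W_s (W_t - W_s) --
    the kind of two-parameter sum a sewing lemma shows converges."""
    rng = random.Random(seed)
    dt = T / n_steps
    w = 0.0          # current value of the simulated Brownian path
    total = 0.0      # running Riemann-type sum of the germ
    qv = 0.0         # realized quadratic variation: sum of (dW)^2
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))
        total += w * dw
        qv += dw * dw
        w += dw
    return total, w, qv

s, w_T, qv = ito_germ_sum()
# Discrete identity on any grid: sum W dW = (W_T^2 - sum (dW)^2) / 2.
assert math.isclose(s, (w_T**2 - qv) / 2, abs_tol=1e-9)
# As the mesh shrinks, qv -> T, so the sum tends to the Ito value (W_T^2 - T)/2.
```

The assertion is an exact algebraic identity of the discrete sum; the probabilistic content of a sewing-type lemma is that such sums converge in $L_m$ as the partition is refined.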


## Related concepts (17)

### Conditional expectation

In probability theory, the conditional expectation, conditional expected value, or conditional mean of a random variable is its expected value – the value it would take "on average" over an arbitrarily large number of occurrences – given that a certain set of "conditions" is known to occur. If the random variable can take on only a finite number of values, the "conditions" are that the variable can only take on a subset of those values.
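For a variable with finitely many values, conditioning on a subset of those values amounts to reweighting by the probability of the conditioning event. A small illustrative example (a fair die, not from this page), using exact rational arithmetic:

```python
from fractions import Fraction

# Illustrative example: E[X | X is even] for a fair six-sided die.
# The "condition" restricts X to a subset of its possible values.
outcomes = [1, 2, 3, 4, 5, 6]
p = Fraction(1, 6)                      # fair die: uniform probabilities

even = [x for x in outcomes if x % 2 == 0]
p_even = p * len(even)                  # P(X even) = 1/2
cond_mean = sum(x * p for x in even) / p_even

assert cond_mean == Fraction(4)         # (2 + 4 + 6) / 3 = 4
```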

### Law of total expectation

The proposition in probability theory known as the law of total expectation, the law of iterated expectations (LIE), Adam's law, the tower rule, and the smoothing theorem, among other names, states that if $X$ is a random variable whose expected value $\operatorname{E}(X)$ is defined, and $Y$ is any random variable on the same probability space, then $\operatorname{E}(X) = \operatorname{E}(\operatorname{E}(X \mid Y))$, i.e., the expected value of the conditional expected value of $X$ given $Y$ is the same as the expected value of $X$.

### Law of total variance

In probability theory, the law of total variance (also called the variance decomposition formula, the conditional variance formula, the law of iterated variances, or Eve's law) states that if $X$ and $Y$ are random variables on the same probability space, and the variance of $Y$ is finite, then $\operatorname{Var}(Y) = \operatorname{E}[\operatorname{Var}(Y \mid X)] + \operatorname{Var}(\operatorname{E}[Y \mid X])$. In language perhaps better known to statisticians than to probability theorists, the two terms are the "unexplained" and the "explained" components of the variance respectively (cf. fraction of variance unexplained, explained variation).
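The decomposition can likewise be verified exactly on a finite distribution. A sketch with illustrative weights (an assumption, not from this page):

```python
from fractions import Fraction

# Toy check of Var(Y) = E[Var(Y | X)] + Var(E[Y | X]) (Eve's law)
# on a small joint distribution with illustrative weights.
joint = {(0, 0): Fraction(1, 6), (0, 1): Fraction(1, 3),
         (1, 1): Fraction(1, 4), (1, 3): Fraction(1, 4)}

marg_x = {}
for (x, y), p in joint.items():
    marg_x[x] = marg_x.get(x, Fraction(0)) + p

def var(pairs):
    """Variance of a finite distribution given as (value, prob) atoms."""
    m = sum(v * p for v, p in pairs)
    return sum((v - m) ** 2 * p for v, p in pairs)

# Total variance of Y (atoms may repeat a y-value; the sums are still correct).
var_y = var([(y, p) for (x, y), p in joint.items()])

# Conditional mean and variance of Y given each value of X.
cond_mean, cond_var = {}, {}
for x, px in marg_x.items():
    pairs = [(y, p / px) for (xx, y), p in joint.items() if xx == x]
    cond_mean[x] = sum(y * q for y, q in pairs)
    cond_var[x] = var(pairs)

explained = var([(cond_mean[x], px) for x, px in marg_x.items()])
unexplained = sum(cond_var[x] * px for x, px in marg_x.items())

assert var_y == explained + unexplained   # Eve's law holds exactly
```

The `explained` term is the variance of the conditional means, the `unexplained` term the mean of the conditional variances, matching the decomposition above.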

## Related publications (12)

Michael Christoph Gastpar, Alper Köse, Ahmet Arda Atalik

This paper considers an additive Gaussian noise channel with arbitrarily distributed finite variance input signals. It studies the differential entropy of the minimum mean-square error (MMSE) estimator and provides a new lower bound which connects the diff ...
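As background for the MMSE estimator mentioned above (a generic textbook illustration, not the paper's method): when the input itself is Gaussian, $X \sim N(0, \sigma_X^2)$ and $Y = X + N$ with $N \sim N(0, \sigma_N^2)$, the conditional mean $\operatorname{E}[X \mid Y]$ is linear in $Y$ and the MMSE equals $\sigma_X^2 \sigma_N^2 / (\sigma_X^2 + \sigma_N^2)$. A Monte Carlo sketch, with parameter values chosen for illustration:

```python
import math
import random

# Monte Carlo check of the Gaussian-channel MMSE (illustrative parameters).
rng = random.Random(0)
sx2, sn2, n = 2.0, 1.0, 200_000

sq_err = 0.0
for _ in range(n):
    x = rng.gauss(0.0, math.sqrt(sx2))
    y = x + rng.gauss(0.0, math.sqrt(sn2))
    x_hat = (sx2 / (sx2 + sn2)) * y      # conditional mean E[X | Y = y]
    sq_err += (x - x_hat) ** 2

empirical_mmse = sq_err / n
theoretical_mmse = sx2 * sn2 / (sx2 + sn2)   # = 2/3 for these parameters
assert abs(empirical_mmse - theoretical_mmse) < 0.02
```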


In this work, we consider the problem of estimating the probability distribution, the quantile or the conditional expectation above the quantile, the so called conditional-value-at-risk, of output quantities of complex random differential models by the MLM ...
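The conditional-value-at-risk mentioned above — the conditional expectation of a loss above its $\alpha$-quantile — has a simple empirical estimator. A sketch (plain Monte Carlo on an assumed Exp(1) loss, not the paper's multilevel method):

```python
import random

# Empirical VaR and CVaR at level alpha: the alpha-quantile of the losses
# and the conditional expectation of losses at or above that quantile.
def empirical_cvar(losses, alpha):
    xs = sorted(losses)
    k = int(alpha * len(xs))        # index of the empirical alpha-quantile
    tail = xs[k:]                   # losses at or above the quantile
    return xs[k], sum(tail) / len(tail)

rng = random.Random(1)
losses = [rng.expovariate(1.0) for _ in range(100_000)]
var95, cvar95 = empirical_cvar(losses, 0.95)
# For Exp(1), the 95% quantile is ln(20) ~ 3.0 and, by memorylessness,
# the CVaR is the quantile plus 1.
assert abs(var95 - 2.9957) < 0.1 and abs(cvar95 - 3.9957) < 0.15
```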

2022

The conditional mean is a fundamental and important quantity whose applications include the theories of estimation and rate-distortion. It is also notoriously difficult to work with. This paper establishes novel bounds on the differential entropy of the co ...