In probability theory and statistics, the covariance function describes how much two random variables change together (their covariance) with varying spatial or temporal separation. For a random field or stochastic process Z(x) on a domain D, a covariance function C(x, y) gives the covariance of the values of the random field at the two locations x and y:

C(x, y) = Cov(Z(x), Z(y)) = E[(Z(x) − E[Z(x)]) (Z(y) − E[Z(y)])].

The same C(x, y) is called the autocovariance function in two instances: in time series (to denote exactly the same concept except that x and y refer to locations in time rather than in space), and in multivariate random fields (to refer to the covariance of a variable with itself, as opposed to the cross covariance between two different variables at different locations, Cov(Z(x1), Y(x2))).

For locations x1, x2, ..., xN ∈ D the variance of every linear combination can be computed as

Var( Σ_{i=1}^{N} wi Z(xi) ) = Σ_{i=1}^{N} Σ_{j=1}^{N} wi wj C(xi, xj).

A function is a valid covariance function if and only if this variance is non-negative for all possible choices of N and weights w1, ..., wN. A function with this property is called positive semidefinite.

In the case of a weakly stationary random field, where C(xi, xj) = C(xi + h, xj + h) for any lag h, the covariance function can be represented by a one-parameter function

Cs(h) = C(0, h) = C(x, x + h),

which is called a covariogram and also a covariance function. Implicitly the C(xi, xj) can be computed from Cs(h) by

C(x, y) = Cs(y − x).

The positive definiteness of this single-argument version of the covariance function can be checked by Bochner's theorem.

For a given variance σ², a simple stationary parametric covariance function is the "exponential covariance function"

C(d) = σ² exp(−d/V),

where V is a scaling parameter (correlation length), and d = d(x, y) is the distance between two points. Sample paths of a Gaussian process with the exponential covariance function are not smooth. The "squared exponential" (or "Gaussian") covariance function

C(d) = σ² exp(−(d/V)²)

is a stationary covariance function with smooth sample paths. The Matérn covariance function and rational quadratic covariance function are two parametric families of stationary covariance functions.
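As a concrete illustration of these definitions, the following Python sketch builds the exponential and squared exponential covariance matrices on a one-dimensional grid, checks positive semidefiniteness numerically through the eigenvalues, and evaluates the variance of a linear combination as Σi Σj wi wj C(xi, xj). The grid, the weights, and the values σ² = 1 and V = 0.5 are arbitrary choices made purely for this example, not taken from the text above.

```python
# Minimal sketch: stationary covariance matrices on a 1-D grid.
# All parameter values and the grid are illustrative assumptions.
import numpy as np

def exponential_cov(d, sigma2=1.0, V=0.5):
    """Exponential covariance C(d) = sigma2 * exp(-d / V); sample paths are not smooth."""
    return sigma2 * np.exp(-d / V)

def squared_exponential_cov(d, sigma2=1.0, V=0.5):
    """Squared exponential covariance C(d) = sigma2 * exp(-(d / V)**2); sample paths are smooth."""
    return sigma2 * np.exp(-(d / V) ** 2)

x = np.linspace(0.0, 1.0, 50)                       # locations x1, ..., xN in D = [0, 1]
dist = np.abs(x[:, None] - x[None, :])              # pairwise distances d(xi, xj)
w = np.random.default_rng(0).normal(size=len(x))    # arbitrary weights w1, ..., wN

for name, cov in [("exponential", exponential_cov),
                  ("squared exponential", squared_exponential_cov)]:
    C = cov(dist)                                   # covariance matrix with entries C(xi, xj)
    eigvals = np.linalg.eigvalsh(C)                 # should all be >= 0 (up to round-off)
    var_lin_comb = w @ C @ w                        # Var(sum_i wi Z(xi)) = sum_ij wi wj C(xi, xj)
    print(f"{name}: min eigenvalue = {eigvals.min():.2e}, "
          f"Var(linear combination) = {var_lin_comb:.3f}")
```

Tiny negative eigenvalues can appear for the squared exponential kernel purely because of floating-point round-off; adding a small jitter to the diagonal is the usual remedy in practice.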

Related courses (6)
MICRO-455: Applied machine learning
Real-world engineering applications must cope with a large dataset of dynamic variables, which cannot be well approximated by classical or deterministic models. This course gives an overview of method
MICRO-570: Advanced machine learning
This course will present some of the core advanced methods in the field for structure discovery, classification and non-linear regression. This is an advanced class in Machine Learning; hence, student
FIN-417: Quantitative risk management
This course is an introduction to quantitative risk management that covers standard statistical methods, multivariate risk factor models, non-linear dependence structures (copula models), as well as p
Related lectures (31)
Conditional Gaussian Generation
Explores the generation of multivariate Gaussian distributions and the challenges of factorizing covariance matrices.
Time Series: Estimation and Spectral Representation
Explores time series estimation, spectral representation, and p-variate analysis in depth.
Time Series: Fundamentals and Models
Explores the fundamentals of time series analysis, including stationarity, linear processes, forecasting, and practical aspects.
Related publications (34)

Transportation-based functional ANOVA and PCA for covariance operators

Victor Panaretos, Yoav Zemel, Valentina Masarotto

We consider the problem of comparing several samples of stochastic processes with respect to their second-order structure, and describing the main modes of variation in this second order structure, if present. These tasks can be seen as an Analysis of Vari ...
Institute of Mathematical Statistics (IMS), 2024

Multivariate geometric anisotropic Cox processes

Sofia Charlotta Olhede

This paper introduces a new modeling and inference framework for multivariate and anisotropic point processes. Building on recent innovations in multivariate spatial statistics, we propose a new family of multivariate anisotropic random fields, and from th ...
Wiley, 2023

The Completion of Covariance Kernels

Victor Panaretos, Kartik Waghmare

We consider the problem of positive-semidefinite continuation: extending a partially specified covariance kernel from a subdomain Ω of a rectangular domain I × I to a covariance kernel on the entire domain I × I. For a broad class of domains Ω call ...
Institute of Mathematical Statistics (IMS), 2022
Related concepts (6)
Gaussian process
In probability theory and statistics, a Gaussian process is a stochastic process (a collection of random variables indexed by time or space), such that every finite collection of those random variables has a multivariate normal distribution, i.e. every finite linear combination of them is normally distributed. The distribution of a Gaussian process is the joint distribution of all those (infinitely many) random variables, and as such, it is a distribution over functions with a continuous domain, e.g.
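To make the defining property concrete, here is a hedged Python sketch under assumed choices (zero mean, squared exponential covariance with unit variance and correlation length 0.2, both invented for illustration): the values of a Gaussian process at any finite collection of locations can be drawn directly from the corresponding multivariate normal distribution.

```python
# Minimal sketch: a finite collection of Gaussian-process values is one draw
# from a multivariate normal distribution. Mean, covariance and grid are assumptions.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 200)                 # a finite collection of input locations
dist = np.abs(x[:, None] - x[None, :])
C = np.exp(-(dist / 0.2) ** 2)                 # squared exponential covariance matrix
C += 1e-10 * np.eye(len(x))                    # small jitter for numerical stability
mean = np.zeros(len(x))

# Each sample path is one draw from the 200-dimensional normal N(mean, C).
paths = rng.multivariate_normal(mean, C, size=3)
print(paths.shape)                             # (3, 200)
```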
Kriging
In statistics, originally in geostatistics, kriging or Kriging (/ˈkɹiːɡɪŋ/), also known as Gaussian process regression, is a method of interpolation based on a Gaussian process governed by prior covariances. Under suitable assumptions of the prior, kriging gives the best linear unbiased prediction (BLUP) at unsampled locations. Interpolating methods based on other criteria such as smoothness (e.g., smoothing spline) may not yield the BLUP. The method is widely used in the domain of spatial analysis and computer experiments.
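A minimal sketch of the kriging / Gaussian-process-regression predictor, under assumed simplifications (zero prior mean, squared exponential prior covariance, noise-free toy observations; all data and hyperparameters below are invented for illustration):

```python
# Minimal sketch of simple kriging / GP regression: predict at unsampled
# locations from prior covariances only. Data and hyperparameters are assumptions.
import numpy as np

def k(a, b, sigma2=1.0, V=0.3):
    """Squared exponential prior covariance between location arrays a and b."""
    return sigma2 * np.exp(-((a[:, None] - b[None, :]) / V) ** 2)

x_obs = np.array([0.1, 0.4, 0.7, 0.9])      # sampled locations
z_obs = np.sin(2 * np.pi * x_obs)           # observed values (toy data)
x_new = np.linspace(0.0, 1.0, 101)          # unsampled locations to predict

K = k(x_obs, x_obs) + 1e-10 * np.eye(len(x_obs))   # jitter for numerical stability
K_star = k(x_new, x_obs)                           # covariances between new and observed points

# Posterior mean: prediction weights come from the prior covariances alone.
weights = np.linalg.solve(K, z_obs)
z_pred = K_star @ weights

# Posterior variance at the unsampled locations (uncertainty of the prediction).
v = np.linalg.solve(K, K_star.T)
var_pred = np.diag(k(x_new, x_new)) - np.sum(K_star * v.T, axis=1)
print(z_pred[:5], var_pred[:5])
```

The prediction weights depend only on the prior covariances between the sampled and unsampled locations, which reflects how the kriging predictor is constructed under the stated assumptions.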
Random field
In physics and mathematics, a random field is a random function over an arbitrary domain (usually a multi-dimensional space such as R^n). That is, it is a function that takes on a random value at each point of R^n (or some other domain). It is also sometimes thought of as a synonym for a stochastic process with some restriction on its index set. That is, by modern definitions, a random field is a generalization of a stochastic process where the underlying parameter need no longer be real or integer valued "time" but can instead take values that are multidimensional vectors or points on some manifold.