In statistics, originally in geostatistics, kriging or Kriging (pronounced /ˈkriːɡɪŋ/), also known as Gaussian process regression, is a method of interpolation based on a Gaussian process governed by prior covariances. Under suitable assumptions of the prior, kriging gives the best linear unbiased prediction (BLUP) at unsampled locations. Interpolating methods based on other criteria, such as smoothness (e.g., smoothing spline), may not yield the BLUP. The method is widely used in the domains of spatial analysis and computer experiments. The technique is also known as Wiener–Kolmogorov prediction, after Norbert Wiener and Andrey Kolmogorov.
The theoretical basis for the method was developed by the French mathematician Georges Matheron in 1960, based on the master's thesis of Danie G. Krige, the pioneering plotter of distance-weighted average gold grades at the Witwatersrand reef complex in South Africa. Krige sought to estimate the most likely distribution of gold based on samples from a few boreholes. The English verb is to krige, and the most common noun is kriging. The word is sometimes capitalized as Kriging in the literature.
Though computationally intensive in its basic formulation, kriging can be scaled to larger problems using various approximation methods.
Kriging predicts the value of a function at a given point by computing a weighted average of the known values of the function in the neighborhood of the point. The method is closely related to regression analysis. Both theories derive a best linear unbiased estimator based on assumptions about covariances, make use of the Gauss–Markov theorem to prove independence of the estimate and error, and use very similar formulae. Even so, they are useful in different frameworks: kriging is designed to estimate a single realization of a random field, while regression models are based on multiple observations of a multivariate data set.
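As a concrete illustration of this weighted-average view, the following Python sketch implements simple kriging with a known mean under an assumed squared-exponential covariance model; the covariance model, its parameters, and the sample data are illustrative assumptions rather than a reference implementation.

    import numpy as np

    def sq_exp_cov(a, b, sigma2=1.0, length=1.0):
        # Assumed squared-exponential covariance between 1-D location arrays a and b.
        d = a[:, None] - b[None, :]
        return sigma2 * np.exp(-0.5 * (d / length) ** 2)

    # Known sample values z(x_i) at locations x_i (illustrative data).
    x = np.array([0.0, 1.0, 2.5, 4.0])
    z = np.array([0.5, 1.2, 0.3, -0.7])
    mean = 0.0                              # known mean (simple kriging)

    x_star = np.array([1.8])                # unsampled location to predict

    C = sq_exp_cov(x, x)                    # covariances among the samples
    c = sq_exp_cov(x, x_star)               # covariances between samples and target

    # Kriging weights solve C w = c; the prediction is the weighted average
    # of the de-meaned samples plus the known mean.
    w = np.linalg.solve(C, c)[:, 0]
    z_hat = mean + w @ (z - mean)

    # Kriging (prediction) variance at the target location.
    var_hat = sq_exp_cov(x_star, x_star)[0, 0] - c[:, 0] @ w

    print(z_hat, var_hat)

Under the assumed covariance, the weights w are the coefficients of the best linear unbiased predictor at x_star, and the last quantity is the associated kriging variance.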
The kriging estimation may also be seen as a spline in a reproducing kernel Hilbert space, with the reproducing kernel given by the covariance function.
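To make this connection explicit, here is a sketch of the standard identity for a zero-mean field with covariance (kernel) C, observations z at locations x_1, ..., x_n, and noise-free data:

    \hat{Z}(x_*) = \mathbf{c}(x_*)^{\top} \mathbf{C}^{-1} \mathbf{z}
                 = \sum_{i=1}^{n} \alpha_i \, C(x_*, x_i),
    \qquad \boldsymbol{\alpha} = \mathbf{C}^{-1} \mathbf{z},

where C_ij = C(x_i, x_j) and c(x_*)_i = C(x_*, x_i). The right-hand side is a kernel expansion over the data sites, which is the form of the minimum-norm interpolant in the reproducing kernel Hilbert space whose kernel is C.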
The student who follows this course will get acquainted with computational tools used to analyze systems with uncertainty arising in engineering, physics, chemistry, and economics. Focus will be on ...
Students get acquainted with the process of mapping from images (orthophoto and DEM), as well as with methods for monitoring the Earth's surface using remotely sensed data. Methods will span from ...
The objective of this course is to give an overview of machine learning techniques used for real-world applications, and to teach how to implement and use them in practice. Laboratories will be done ...
This course is the second part of a course dedicated to the theoretical and practical bases of Geographic Information Systems (GIS). It offers an introduction to GIS that does not require prior ...
In spatial statistics the theoretical variogram, denoted 2γ(x, y), is a function describing the degree of spatial dependence of a spatial random field or stochastic process Z(x). The semivariogram γ(x, y) is half the variogram. In the case of a concrete example from the field of gold mining, a variogram will give a measure of how much two samples taken from the mining area will vary in gold percentage depending on the distance between those samples. Samples taken far apart will vary more than samples taken close to each other.
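To make the distance dependence concrete, here is a minimal Python sketch of the classical (Matheron) estimator of the empirical semivariogram, which bins half squared differences of sample values by pairwise distance; the synthetic data and the bin edges are illustrative assumptions.

    import numpy as np

    def empirical_semivariogram(coords, values, bin_edges):
        # For each distance bin, gamma(h) is the average of 0.5 * (z_i - z_j)^2
        # over all pairs whose separation falls in that bin.
        diff = coords[:, None, :] - coords[None, :, :]
        dist = np.sqrt((diff ** 2).sum(axis=-1))
        half_sq = 0.5 * (values[:, None] - values[None, :]) ** 2
        iu = np.triu_indices(len(values), k=1)      # each pair counted once
        d, g = dist[iu], half_sq[iu]
        gamma = []
        for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
            in_bin = (d >= lo) & (d < hi)
            gamma.append(g[in_bin].mean() if in_bin.any() else np.nan)
        return np.array(gamma)

    # Illustrative 2-D sample locations and values (think of sampled gold grades).
    rng = np.random.default_rng(0)
    coords = rng.uniform(0.0, 10.0, size=(50, 2))
    values = np.sin(coords[:, 0]) + 0.1 * rng.standard_normal(50)

    print(empirical_semivariogram(coords, values, np.linspace(0.0, 10.0, 11)))

In this sketch the estimated semivariogram typically increases with distance, reflecting the fact that nearby samples are more similar than distant ones.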
In general, a function approximation problem asks us to select a function among a well-defined class that closely matches ("approximates") a target function in a task-specific way. The need for function approximations arises in many branches of applied mathematics, and in computer science in particular, such as predicting the growth of microbes in microbiology. Function approximations are used where theoretical models are unavailable or hard to compute.
In probability theory and statistics, the covariance function describes how much two random variables change together (their covariance) with varying spatial or temporal separation. For a random field or stochastic process Z(x) on a domain D, a covariance function C(x, y) gives the covariance of the values of the random field at the two locations x and y: C(x, y) = Cov(Z(x), Z(y)). The same C(x, y) is called the autocovariance function in two instances: in time series (to denote exactly the same concept, except that x and y refer to locations in time rather than in space), and in multivariate random fields (to refer to the covariance of a variable with itself, as opposed to the cross-covariance between two different variables at different locations, Cov(Z(x1), Y(x2))).
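As a small illustration, the following Python sketch evaluates an assumed exponential covariance model, C(x, y) = sigma2 * exp(-|x - y| / length), on a set of 1-D locations to build the covariance matrix of the field at those points; the model and its parameters are arbitrary choices made for the example.

    import numpy as np

    def exponential_cov(x, y, sigma2=1.0, length=2.0):
        # Assumed exponential covariance model: C(x, y) = sigma2 * exp(-|x - y| / length).
        return sigma2 * np.exp(-np.abs(x[:, None] - y[None, :]) / length)

    # Locations at which the random field Z(x) is observed or evaluated.
    locations = np.linspace(0.0, 5.0, 6)

    # Covariance matrix K with K[i, j] = C(x_i, x_j); a valid covariance function
    # guarantees that K is symmetric and positive semi-definite.
    K = exponential_cov(locations, locations)
    print(np.round(K, 3))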
We derive a covariance formula for the class of 'topological events' of smooth Gaussian fields on manifolds; these are events that depend only on the topology of the level sets of the field, for example, (i) crossing events for level or excursion sets, (ii ...
For the Bargmann–Fock field on R^d with d ≥ 3, we prove that the critical level ℓ_c(d) of the percolation model formed by the excursion sets {f ≥ ℓ} is strictly positive. This implies that for every ℓ sufficiently close to 0 (in particular for the noda ...
PET reconstruction algorithms have long relied on sinogram rebinning. However, as detectors grow smaller in a recent wave of cutting-edge scanners, individual sensors no longer accrue hundreds of photons. Instead, most detect a single photon or none at all ...