Laplace transform
In mathematics, the Laplace transform, named after its discoverer Pierre-Simon Laplace (/ləˈplɑːs/), is an integral transform that converts a function of a real variable (usually t, in the time domain) to a function of a complex variable s (in the complex frequency domain, also known as the s-domain, or s-plane). The transform has many applications in science and engineering because it is a tool for solving differential equations. In particular, it transforms ordinary differential equations into algebraic equations and convolution into multiplication.
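A minimal sketch in standard notation (writing f for the time-domain function and F for its transform, both notational choices here): the one-sided transform is commonly defined as

    F(s) = \mathcal{L}\{f\}(s) = \int_0^{\infty} f(t)\, e^{-st}\, dt,

and, for suitably well-behaved f, differentiation in t becomes multiplication by s, via \mathcal{L}\{f'\}(s) = s F(s) - f(0), which is what turns an ordinary differential equation into an algebraic one.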
Covariance matrix
In probability theory and statistics, a covariance matrix (also known as auto-covariance matrix, dispersion matrix, variance matrix, or variance–covariance matrix) is a square matrix giving the covariance between each pair of elements of a given random vector. Any covariance matrix is symmetric and positive semi-definite, and its main diagonal contains variances (i.e., the covariance of each element with itself). Intuitively, the covariance matrix generalizes the notion of variance to multiple dimensions.
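As a sketch in one common notation (the symbols X and K are notational choices here): for a random vector X = (X_1, \ldots, X_n)^{\mathsf{T}}, the covariance matrix K_{XX} has entries

    (K_{XX})_{ij} = \operatorname{cov}(X_i, X_j) = \operatorname{E}\big[(X_i - \operatorname{E}[X_i])(X_j - \operatorname{E}[X_j])\big],

so the diagonal entry (K_{XX})_{ii} is just \operatorname{var}(X_i), and symmetry follows from \operatorname{cov}(X_i, X_j) = \operatorname{cov}(X_j, X_i).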
Square-integrable function
In mathematics, a square-integrable function, also called a quadratically integrable function, L^2 function, or square-summable function, is a real- or complex-valued measurable function for which the integral of the square of the absolute value is finite. Thus, square-integrability on the real line is defined as follows: f is square integrable if

    \int_{-\infty}^{\infty} |f(x)|^2 \, dx < \infty.

One may also speak of quadratic integrability over bounded intervals such as [a, b] for a \le b. An equivalent definition is to say that the square of the function itself (rather than of its absolute value) is Lebesgue integrable.
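Neither square-integrability nor ordinary Lebesgue integrability implies the other; two standard illustrative examples:

    \frac{\sin x}{x} \in L^2(\mathbb{R}) \setminus L^1(\mathbb{R}), \qquad \frac{1}{\sqrt{x}} \in L^1((0,1)) \setminus L^2((0,1)).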
Poisson summation formula
In mathematics, the Poisson summation formula is an equation that relates the Fourier series coefficients of the periodic summation of a function to values of the function's continuous Fourier transform. Consequently, the periodic summation of a function is completely defined by discrete samples of the original function's Fourier transform. And conversely, the periodic summation of a function's Fourier transform is completely defined by discrete samples of the original function.
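In its most basic form, for a sufficiently well-behaved function f : \mathbb{R} \to \mathbb{C} and with the Fourier transform normalized as \hat{f}(\xi) = \int_{-\infty}^{\infty} f(x)\, e^{-2\pi i x \xi}\, dx (one of several common conventions), the formula reads

    \sum_{n=-\infty}^{\infty} f(n) = \sum_{k=-\infty}^{\infty} \hat{f}(k).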
Schwartz space
In mathematics, Schwartz space \mathcal{S} is the function space of all functions whose derivatives are rapidly decreasing. This space has the important property that the Fourier transform is an automorphism on this space. This property enables one, by duality, to define the Fourier transform for elements in the dual space of \mathcal{S}, that is, for tempered distributions. A function in the Schwartz space is sometimes called a Schwartz function. Schwartz space is named after French mathematician Laurent Schwartz.
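In multi-index notation, the space is commonly written as

    \mathcal{S}(\mathbb{R}^n) = \left\{ f \in C^{\infty}(\mathbb{R}^n) : \sup_{x \in \mathbb{R}^n} \left| x^{\alpha} \partial^{\beta} f(x) \right| < \infty \text{ for all multi-indices } \alpha, \beta \right\};

a typical example of a Schwartz function is the Gaussian f(x) = e^{-|x|^2}.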
Projection-valued measure
In mathematics, particularly in functional analysis, a projection-valued measure (PVM) is a function defined on certain subsets of a fixed set and whose values are self-adjoint projections on a fixed Hilbert space. Projection-valued measures are formally similar to real-valued measures, except that their values are self-adjoint projections rather than real numbers. As in the case of ordinary measures, it is possible to integrate complex-valued functions with respect to a PVM; the result of such an integration is a linear operator on the given Hilbert space.
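A sketch of the usual axioms (the symbols \pi, X, M, H are chosen here for illustration): a PVM on a measurable space (X, M), valued in the bounded operators on a Hilbert space H, is a map \pi : M \to L(H) such that each \pi(E) is a self-adjoint projection, \pi(X) = \operatorname{id}_H, and

    \pi\left( \bigcup_{n=1}^{\infty} E_n \right) = \sum_{n=1}^{\infty} \pi(E_n)

for pairwise disjoint E_1, E_2, \ldots \in M, with the sum converging in the strong operator topology; these conditions imply the multiplicativity \pi(E \cap F) = \pi(E)\,\pi(F).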
Hardy space
In complex analysis, the Hardy spaces (or Hardy classes) H^p are certain spaces of holomorphic functions on the unit disk or upper half plane. They were introduced by Frigyes Riesz in 1923, who named them after G. H. Hardy because of a 1915 paper of Hardy's. In real analysis, Hardy spaces are certain spaces of distributions on the real line, which are (in the sense of distributions) boundary values of the holomorphic functions of the complex Hardy spaces, and are related to the L^p spaces of functional analysis.
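On the unit disk, for 0 < p < \infty, membership in H^p is usually expressed by requiring the integral means over circles of radius r to stay bounded:

    \|f\|_{H^p} = \sup_{0 < r < 1} \left( \frac{1}{2\pi} \int_0^{2\pi} \left| f(re^{i\theta}) \right|^p d\theta \right)^{1/p} < \infty.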
Unit root
In probability theory and statistics, a unit root is a feature of some stochastic processes (such as random walks) that can cause problems in statistical inference involving time series models. A linear stochastic process has a unit root if 1 is a root of the process's characteristic equation. Such a process is non-stationary but does not always have a trend. If the other roots of the characteristic equation lie inside the unit circle—that is, have a modulus (absolute value) less than one—then the first difference of the process will be stationary; otherwise, the process will need to be differenced multiple times to become stationary.
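A minimal illustration with a first-order autoregressive process y_t = a_1 y_{t-1} + \varepsilon_t (notation assumed here): its characteristic equation is m - a_1 = 0, so the process has a unit root exactly when a_1 = 1. In that case it is the random walk y_t = y_{t-1} + \varepsilon_t, which is non-stationary, while its first difference \Delta y_t = y_t - y_{t-1} = \varepsilon_t is stationary.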
Position operator
In quantum mechanics, the position operator is the operator that corresponds to the position observable of a particle. When the position operator is considered with a wide enough domain (e.g. the space of tempered distributions), its eigenvalues are the possible position vectors of the particle. In one dimension, if we denote by |x⟩ the unit-norm eigenvector of the position operator corresponding to the eigenvalue x, then |x⟩ represents the state of the particle in which we know with certainty that the particle is found at position x.
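In the one-dimensional position representation, the operator acts on a wavefunction \psi (in a suitable dense domain of L^2(\mathbb{R})) simply by multiplication:

    (\hat{x}\, \psi)(x) = x\, \psi(x).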
Entropy rate
In the mathematical theory of probability, the entropy rate or source information rate of a stochastic process is, informally, the time density of the average information in a stochastic process. For stochastic processes with a countable index, the entropy rate H(X) is the limit of the joint entropy of n members of the process divided by n, as n tends to infinity:

    H(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \ldots, X_n),

when the limit exists. An alternative, related quantity is the conditional entropy rate

    H'(X) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, \ldots, X_1).

For strongly stationary stochastic processes, H(X) = H'(X).
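Two standard consequences worth recording: for an i.i.d. process the entropy rate reduces to the entropy of a single symbol, H(X) = H(X_1), and for a stationary Markov chain with transition matrix P and stationary distribution \mu it is

    H(X) = -\sum_{i,j} \mu_i\, P_{ij} \log P_{ij}.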