In signal processing, cross-correlation is a measure of similarity of two series as a function of the displacement of one relative to the other. This is also known as a sliding dot product or sliding inner-product. It is commonly used for searching a long signal for a shorter, known feature. It has applications in pattern recognition, single particle analysis, electron tomography, averaging, cryptanalysis, and neurophysiology. The cross-correlation is similar in nature to the convolution of two functions. In an autocorrelation, which is the cross-correlation of a signal with itself, there will always be a peak at a lag of zero, and its size will be the signal energy.

In probability and statistics, the term cross-correlations refers to the correlations between the entries of two random vectors $\mathbf{X}$ and $\mathbf{Y}$, while the correlations of a random vector $\mathbf{X}$ are the correlations between the entries of $\mathbf{X}$ itself, those forming the correlation matrix of $\mathbf{X}$. If each of $\mathbf{X}$ and $\mathbf{Y}$ is a scalar random variable which is realized repeatedly in a time series, then the correlations of the various temporal instances of $\mathbf{X}$ are known as autocorrelations of $\mathbf{X}$, and the cross-correlations of $\mathbf{X}$ with $\mathbf{Y}$ across time are temporal cross-correlations. In probability and statistics, the definition of correlation always includes a standardising factor in such a way that correlations have values between −1 and +1.

If $X$ and $Y$ are two independent random variables with probability density functions $f$ and $g$, respectively, then the probability density of the difference $Y - X$ is formally given by the cross-correlation (in the signal-processing sense) $f \star g$; however, this terminology is not used in probability and statistics. In contrast, the convolution $f * g$ (equivalent to the cross-correlation of $f(t)$ and $g(-t)$) gives the probability density function of the sum $X + Y$.

For continuous functions $f$ and $g$, the cross-correlation is defined as $(f \star g)(\tau) \triangleq \int_{-\infty}^{\infty} \overline{f(t)}\, g(t+\tau)\, dt$, which is equivalent to $(f \star g)(\tau) = \int_{-\infty}^{\infty} \overline{f(t-\tau)}\, g(t)\, dt$, where $\overline{f(t)}$ denotes the complex conjugate of $f(t)$, and $\tau$ is called displacement or lag. For highly-correlated $f$ and $g$ which have a maximum cross-correlation at a particular $\tau^*$, a feature in $f$ at $t$ also occurs later in $g$ at $t+\tau^*$, hence $g$ could be described to lag $f$ by $\tau^*$.
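As an informal illustration of the discrete analogue of this definition (not part of the original page), the sketch below assumes NumPy is available and recovers the lag at which a short, known feature best matches a longer signal by locating the peak of np.correlate; the names feature, signal, and true_lag are hypothetical.

```python
import numpy as np

# Illustrative sketch: find a known feature inside a longer signal
# via the peak of the discrete cross-correlation (sliding dot product).

rng = np.random.default_rng(0)

feature = np.array([0.0, 1.0, 2.0, 1.0, 0.0])        # short, known template
signal = rng.normal(scale=0.1, size=100)              # noisy background
true_lag = 40
signal[true_lag:true_lag + feature.size] += feature   # embed the feature

# In 'valid' mode, np.correlate computes sum_t signal[t + tau] * feature[t]
# for every admissible displacement tau (real data, so no conjugation needed).
xcorr = np.correlate(signal, feature, mode="valid")

estimated_lag = int(np.argmax(xcorr))
print(f"true lag = {true_lag}, estimated lag = {estimated_lag}")
```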

Related courses (32)
PHYS-739: Conformal Field theory and Gravity
This course is an introduction to holography, the modern approach to quantum gravity.
PHYS-732: Plasma Diagnostics in Basic Plasma Physics Devices and Tokamaks: from Principles to Practice
The programme will allow students to learn plasma diagnostics and data processing methods of modern fusion experiments and to bridge the gap between diagnostics theory and experimental practice.
EE-607: Advanced Methods for Model Identification
This course introduces the principles of model identification for non-linear dynamic systems, and provides a set of possible solution methods that are thoroughly characterized in terms of modelling as ...
Related lectures (217)
Large N Expansion: Vector Models
Explores the Large N expansion in vector models, focusing on matrix models, gauge theories, and the 't Hooft coupling.
Renormalization Group: Ising Model
Explores the renormalization group applied to the Ising model and the derivation of the central limit theorem.
Nanotoxicology: Emerging Discipline from Ultrafine Particles
Explores nanotoxicology and atmospheric chemistry, focusing on particle deposition, emission, and acid deposition.
Related publications (600)

Constraints on the intergalactic magnetic field from Fermi/LAT observations of the 'pair echo' of GRB 221009A

Andrii Neronov

Delayed 'pair-echo' signal from interactions of very-high-energy gamma rays in the intergalactic medium can be used for the detection of the intergalactic magnetic field (IGMF). We used the data of the Fermi/LAT telescope coupled with LHAASO observatory me ...
EDP Sciences SA, 2024

Influence of Ventilation on Formation and Growth of 1-20 nm Particles via Ozone-Human Chemistry

Dusan Licina, Shen Yang, Marouane Merizak, Meixia Zhang

Ozone reaction with human surfaces is an important source of ultrafine particles indoors. However, 1-20 nm particles generated from ozone-human chemistry, which mark the first step of particle formation and growth, remain understudied. Ventilation and indo ...
Washington, 2024

Recovering Static and Time-Varying Communities Using Persistent Edges

Maximilien Claude Robert Dreveton

This article focuses on spectral methods for recovering communities in temporal networks. In the case of fixed communities, spectral clustering on the simple time-aggregated graph (i.e., the weighted graph formed by the sum of the interactions over all tem ...
IEEE Computer Soc., 2024
Related concepts (16)
Scaled correlation
In statistics, scaled correlation is a form of a coefficient of correlation applicable to data that have a temporal component such as time series. It is the average short-term correlation. If the signals have multiple components (slow and fast), scaled coefficient of correlation can be computed only for the fast components of the signals, ignoring the contributions of the slow components. This filtering-like operation has the advantages of not having to make assumptions about the sinusoidal nature of the signals.
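A minimal sketch of this idea (not from the page; it assumes NumPy, and scaled_correlation is a hypothetical helper): split both series into non-overlapping segments of scale s, compute the Pearson correlation within each segment, and average the segment coefficients, which suppresses the contribution of components slower than the segment length.

```python
import numpy as np

def scaled_correlation(x, y, s):
    """Average of within-segment Pearson correlations at segment scale s."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n_segments = len(x) // s
    coeffs = []
    for k in range(n_segments):
        xs = x[k * s:(k + 1) * s]
        ys = y[k * s:(k + 1) * s]
        if xs.std() > 0 and ys.std() > 0:      # skip degenerate segments
            coeffs.append(np.corrcoef(xs, ys)[0, 1])
    return np.mean(coeffs) if coeffs else np.nan
```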
Discrete-time Fourier transform
In mathematics, the discrete-time Fourier transform (DTFT), also called the finite Fourier transform, is a form of Fourier analysis that is applicable to a sequence of values. The DTFT is often used to analyze samples of a continuous function. The term discrete-time refers to the fact that the transform operates on discrete data, often samples whose interval has units of time. From uniformly spaced samples it produces a function of frequency that is a periodic summation of the continuous Fourier transform of the original continuous function.
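The summation behind the DTFT can be made concrete with a short sketch (assuming NumPy; dtft is a hypothetical helper, not a library routine) that evaluates $X(\omega) = \sum_n x[n]\, e^{-j\omega n}$ on a chosen frequency grid; the result is $2\pi$-periodic in $\omega$.

```python
import numpy as np

def dtft(x, omegas):
    """Evaluate X(omega) = sum_n x[n] * exp(-1j * omega * n) at given omegas."""
    x = np.asarray(x, dtype=complex)
    n = np.arange(len(x))
    # One row per frequency: exp(-1j * omega * n), then sum over n.
    return np.exp(-1j * np.outer(omegas, n)) @ x

x = np.array([1.0, 0.5, 0.25, 0.125])
omegas = np.linspace(-np.pi, np.pi, 9)
print(np.round(dtft(x, omegas), 3))
```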
Autocovariance
In probability theory and statistics, given a stochastic process, the autocovariance is a function that gives the covariance of the process with itself at pairs of time points. Autocovariance is closely related to the autocorrelation of the process in question. With the usual notation $\operatorname{E}$ for the expectation operator, if the stochastic process $\{X_t\}$ has the mean function $\mu_t = \operatorname{E}[X_t]$, then the autocovariance is given by $K_{XX}(t_1, t_2) = \operatorname{E}[(X_{t_1} - \mu_{t_1})(X_{t_2} - \mu_{t_2})]$, where $t_1$ and $t_2$ are two instances in time.
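For a single realization of a weakly stationary process, the autocovariance at lag $h$ can be estimated as sketched below (assuming NumPy; sample_autocovariance is a hypothetical helper using the usual biased divide-by-n estimator).

```python
import numpy as np

def sample_autocovariance(x, h):
    """Biased sample autocovariance K(h) of a (weakly stationary) series x."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    mu = x.mean()
    # Pair x[t] with x[t+h], center both by the sample mean, divide by n.
    return np.sum((x[:n - h] - mu) * (x[h:] - mu)) / n

rng = np.random.default_rng(1)
x = rng.normal(size=500)
print([round(sample_autocovariance(x, h), 3) for h in range(4)])
```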
Related MOOCs (4)
Neuronal Dynamics - Computational Neuroscience of Single Neurons
The activity of neurons in the brain and the code used by these neurons is described by mathematical neuron models at different levels of detail.
Neuronal Dynamics 2- Computational Neuroscience: Neuronal Dynamics of Cognition
This course explains the mathematical and computational models that are used in the field of theoretical neuroscience to analyze the collective dynamics of thousands of interacting neurons.
