Sampling (signal processing)
In signal processing, sampling is the reduction of a continuous-time signal to a discrete-time signal. A common example is the conversion of a sound wave to a sequence of "samples". A sample is a value of the signal at a point in time and/or space; this definition differs from the term's usage in statistics, which refers to a set of such values. A sampler is a subsystem or operation that extracts samples from a continuous signal. A theoretical ideal sampler produces samples equivalent to the instantaneous value of the continuous signal at the desired points.
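A minimal Python sketch of an ideal sampler, taking instantaneous values of a continuous-time signal at uniform instants t = n/fs; the function name, signal, and sample rate are illustrative, not part of any standard API.

```python
import numpy as np

def ideal_sample(x, fs, duration):
    """Sample a continuous-time signal x(t), given as a Python function,
    at uniform instants t = n/fs (an idealized, instantaneous sampler)."""
    n = np.arange(int(duration * fs))   # sample indices
    t = n / fs                          # sampling instants
    return t, x(t)                      # samples x[n] = x(n/fs)

# Example: sample a 3 Hz tone at fs = 100 Hz for 1 second.
t, samples = ideal_sample(lambda t: np.sin(2 * np.pi * 3.0 * t),
                          fs=100.0, duration=1.0)
```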
Spectral theorem
In mathematics, particularly linear algebra and functional analysis, a spectral theorem is a result about when a linear operator or matrix can be diagonalized (that is, represented as a diagonal matrix in some basis). This is extremely useful because computations involving a diagonalizable matrix can often be reduced to much simpler computations involving the corresponding diagonal matrix. The concept of diagonalization is relatively straightforward for operators on finite-dimensional vector spaces but requires some modification for operators on infinite-dimensional spaces.
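For a real symmetric matrix, the finite-dimensional case can be illustrated numerically: numpy.linalg.eigh returns an orthonormal eigenvector basis in which the matrix is diagonal. The particular matrix below is an arbitrary example.

```python
import numpy as np

# A real symmetric matrix, for which the spectral theorem guarantees an
# orthonormal basis of eigenvectors, i.e. A = Q @ diag(w) @ Q.T.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

w, Q = np.linalg.eigh(A)          # eigenvalues w, orthonormal eigenvectors Q
D = np.diag(w)                    # A represented as a diagonal matrix in that basis

assert np.allclose(Q @ D @ Q.T, A)       # diagonalization reproduces A
assert np.allclose(Q.T @ Q, np.eye(2))   # the eigenvector basis is orthonormal

# Computations simplify in the diagonal form, e.g. a matrix power:
A_cubed = Q @ np.diag(w ** 3) @ Q.T
assert np.allclose(A_cubed, np.linalg.matrix_power(A, 3))
```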
Primitive data type
In computer science, primitive data types are a set of basic data types from which all other data types are constructed. Specifically, the term often refers to the limited set of data representations used by a particular processor, which all compiled programs must use. Most processors support a similar set of primitive data types, although the specific representations vary. More generally, "primitive data types" may refer to the standard data types built into a programming language (built-in types).
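A small sketch, assuming NumPy is available, of processor-level primitives as seen from Python: fixed-width integers and IEEE 754 floating-point numbers, each with a definite size and range.

```python
import numpy as np
import struct

# Fixed-width types that mirror common processor-level primitives.
for dtype in (np.int8, np.int32, np.int64, np.float32, np.float64):
    info = np.iinfo(dtype) if np.issubdtype(dtype, np.integer) else np.finfo(dtype)
    print(dtype.__name__, np.dtype(dtype).itemsize, "bytes, range",
          info.min, "to", info.max)

# The same value stored in two different primitive representations occupies
# a different number of bytes and carries different precision.
print(struct.pack("<f", 3.14159).hex())   # 32-bit IEEE 754 float
print(struct.pack("<d", 3.14159).hex())   # 64-bit IEEE 754 double
```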
Validity (statistics)
Validity is the extent to which a concept, conclusion or measurement is well-founded and likely corresponds accurately to the real world. The word "valid" is derived from the Latin validus, meaning strong. The validity of a measurement tool (for example, a test in education) is the degree to which the tool measures what it claims to measure. Validity is based on the strength of a collection of different types of evidence (e.g. face validity, construct validity, etc.) described in greater detail below.
Spike-and-wave
Spike-and-wave is a pattern of the electroencephalogram (EEG) typically observed during epileptic seizures. A spike-and-wave discharge is a regular, symmetrical, generalized EEG pattern seen particularly during absence epilepsy, also known as ‘petit mal’ epilepsy. The basic mechanisms underlying these patterns are complex and involve part of the cerebral cortex, the thalamocortical network, and intrinsic neuronal mechanisms. The first spike-and-wave pattern was recorded in the early twentieth century by Hans Berger.
Functional analysis
Functional analysis is a branch of mathematical analysis, the core of which is formed by the study of vector spaces endowed with some kind of limit-related structure (for example, inner product, norm, or topology) and the linear functions defined on these spaces that suitably respect these structures. The historical roots of functional analysis lie in the study of spaces of functions and the formulation of properties of transformations of functions, such as the Fourier transform, as transformations defining, for example, continuous or unitary operators between function spaces.
Bartlett's method
In time series analysis, Bartlett's method (also known as the method of averaged periodograms) is used for estimating power spectra. It provides a way to reduce the variance of the periodogram in exchange for a reduction of resolution, compared to standard periodograms. A final estimate of the spectrum at a given frequency is obtained by averaging the estimates from the periodograms (at the same frequency) derived from non-overlapping portions of the original series. The method is used in physics, engineering, and applied mathematics.
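A minimal sketch of the method, assuming a uniformly sampled real-valued series and a simple |FFT|² periodogram; the helper name and scaling convention are illustrative rather than standard.

```python
import numpy as np

def bartlett_psd(x, fs, n_segments):
    """Bartlett's method sketch: average the periodograms of non-overlapping
    segments of x. More segments -> lower variance, coarser frequency resolution."""
    seg_len = len(x) // n_segments
    segments = np.reshape(x[:seg_len * n_segments], (n_segments, seg_len))
    # Periodogram of each segment: |FFT|^2 scaled by segment length and sample rate.
    periodograms = np.abs(np.fft.rfft(segments, axis=1)) ** 2 / (seg_len * fs)
    freqs = np.fft.rfftfreq(seg_len, d=1.0 / fs)
    return freqs, periodograms.mean(axis=0)   # average at each frequency

# Example: noisy 50 Hz tone sampled at 1 kHz, averaged over 8 segments.
fs = 1000.0
t = np.arange(8192) / fs
x = np.sin(2 * np.pi * 50.0 * t) + np.random.randn(t.size)
freqs, psd = bartlett_psd(x, fs, n_segments=8)
```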
Iterative reconstruction
Iterative reconstruction refers to iterative algorithms used to reconstruct 2D and 3D images in certain imaging techniques. For example, in computed tomography an image must be reconstructed from projections of an object. Here, iterative reconstruction techniques are usually a better, but computationally more expensive, alternative to the common filtered back projection (FBP) method, which directly calculates the image in a single reconstruction step.
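As an illustrative sketch only, the Landweber iteration below reconstructs an image vector from linear projections by repeatedly correcting the estimate with the back-projected residual; the random "system matrix" merely stands in for a real CT projector, and the function name is hypothetical.

```python
import numpy as np

def landweber_reconstruct(A, b, n_iter=200, step=None):
    """Minimal iterative-reconstruction sketch (Landweber iteration):
    repeatedly correct the image estimate x using the residual between the
    measured projections b and the forward projection A @ x."""
    if step is None:
        # A convergent step size must satisfy 0 < step < 2 / ||A||^2.
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + step * A.T @ (b - A @ x)   # gradient step on ||A x - b||^2 / 2
    return x

# Toy example: a random matrix standing in for a CT projection operator.
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 30))        # 60 measurements of a 30-pixel image
x_true = rng.standard_normal(30)
b = A @ x_true                           # noise-free projections
x_hat = landweber_reconstruct(A, b)
```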
Spectrum (physical sciences)
In the physical sciences, the term spectrum was first introduced into optics by Isaac Newton in the 17th century, referring to the range of colors observed when white light was dispersed through a prism. Soon the term referred to a plot of light intensity or power as a function of frequency or wavelength, also known as a spectral density plot. Later it expanded to apply to other waves, such as sound waves and sea waves that could also be measured as a function of frequency (e.g., noise spectrum, sea wave spectrum).
Phase noise
In signal processing, phase noise is the frequency-domain representation of random fluctuations in the phase of a waveform, corresponding to time-domain deviations from perfect periodicity (jitter). Generally speaking, radio-frequency engineers speak of the phase noise of an oscillator, whereas digital-system engineers work with the jitter of a clock. Historically there have been two conflicting yet widely used definitions for phase noise.
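A small sketch, assuming a simple random-walk phase model, of how the same fluctuations appear in both views: as spectral "skirts" around the carrier (phase noise) and as deviations of the waveform's timing from perfect periodicity (jitter). All parameters are illustrative.

```python
import numpy as np

# A nominal 1 kHz "oscillator" whose phase undergoes a slow random walk.
fs = 100_000.0                                             # sample rate, Hz
t = np.arange(2 ** 18) / fs
f0 = 1000.0                                                # carrier frequency, Hz
phase_noise = np.cumsum(1e-3 * np.random.randn(t.size))   # random-walk phase, rad
v = np.cos(2 * np.pi * f0 * t + phase_noise)

# Frequency-domain view: power spreads into skirts around the carrier at f0.
spectrum = np.abs(np.fft.rfft(v)) ** 2
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

# Time-domain view of the same fluctuations: timing deviations (jitter),
# here obtained directly from the phase error.
jitter_seconds = phase_noise / (2 * np.pi * f0)
```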