Detection theory, or signal detection theory, is a means to measure the ability to differentiate between information-bearing patterns (called stimulus in living organisms, signal in machines) and random patterns that distract from the information (called noise, consisting of background stimuli and of random activity in the detection machine and in the operator's nervous system).
In the field of electronics, signal recovery is the separation of such patterns from a disguising background.
According to the theory, there are a number of determiners of how a detecting system will detect a signal and where its threshold levels will lie. The theory can explain how changing the threshold will affect the ability to discern, often exposing how well adapted the system is to the task, purpose, or goal at which it is aimed. When the detecting system is a human being, characteristics such as experience, expectations, and physiological state (e.g., fatigue) can affect the threshold applied. For instance, a sentry in wartime might detect fainter stimuli than the same sentry in peacetime because of a lower criterion; however, the wartime sentry might also be more likely to treat innocuous stimuli as a threat.
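The sentry example separates into two quantities in the standard equal-variance Gaussian model of detection theory: sensitivity (d'), how far apart the signal and noise distributions sit, and criterion (c), where the observer places the decision threshold. Below is a minimal Python sketch assuming that model; the hit and false-alarm rates are illustrative numbers, not data from any study.

```python
# A minimal sketch of the equal-variance Gaussian model: recover sensitivity
# (d') and criterion (c) from hit and false-alarm rates.
from scipy.stats import norm

def dprime_and_criterion(hit_rate: float, fa_rate: float) -> tuple[float, float]:
    """Return (d', c) under the equal-variance Gaussian SDT model."""
    z_hit = norm.ppf(hit_rate)          # z-transform of the hit rate
    z_fa = norm.ppf(fa_rate)            # z-transform of the false-alarm rate
    d_prime = z_hit - z_fa              # separation of signal and noise distributions
    criterion = -0.5 * (z_hit + z_fa)   # bias: 0 = neutral, < 0 = liberal, > 0 = conservative
    return d_prime, criterion

# Wartime sentry: more hits, but also more false alarms (lower criterion).
print(dprime_and_criterion(hit_rate=0.90, fa_rate=0.30))  # liberal criterion
# Peacetime sentry: fewer false alarms, but also fewer hits.
print(dprime_and_criterion(hit_rate=0.70, fa_rate=0.10))  # conservative criterion
```

With these particular numbers both sentries come out with the same d'; only the criterion differs, which is exactly the distinction between sensitivity and response bias that the theory makes explicit.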
Much of the early work in detection theory was done by radar researchers. By 1954 the theory was fully developed on the theoretical side, as described by Peterson, Birdsall, and Fox, and the foundation for the psychological theory was laid by Wilson P. Tanner, David M. Green, and John A. Swets, also in 1954.
In 1966, John A. Swets and David M. Green applied detection theory to psychophysics. Green and Swets criticized the traditional methods of psychophysics for their inability to discriminate between the real sensitivity of subjects and their (potential) response biases.
Detection theory has applications in many fields such as diagnostics of any kind, quality control, telecommunications, and psychology. The concept is similar to the signal-to-noise ratio used in the sciences and confusion matrices used in artificial intelligence.
Estimation theory is a branch of statistics that deals with estimating the values of parameters based on measured empirical data that has a random component. The parameters describe an underlying physical setting in such a way that their value affects the distribution of the measured data. An estimator attempts to approximate the unknown parameters using the measurements.
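As a concrete instance of this setup, consider estimating a constant level from noisy measurements. The sketch below assumes the classic Gaussian model x[n] = A + w[n] with w[n] ~ N(0, sigma^2), in which the sample mean is the maximum-likelihood estimator of A; all numbers are illustrative.

```python
# A minimal sketch: estimating an unknown parameter (a constant level A)
# from measurements x[n] = A + w[n] that have a random component.
import numpy as np

rng = np.random.default_rng(0)
A_true, sigma, N = 3.0, 1.0, 1000
x = A_true + sigma * rng.standard_normal(N)  # measured empirical data

A_hat = x.mean()                 # estimator: the sample mean
se = sigma / np.sqrt(N)          # its standard error, which shrinks as 1/sqrt(N)
print(f"estimate = {A_hat:.3f}, true = {A_true}, std error = {se:.3f}")
```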
In signal processing, a matched filter is obtained by correlating a known delayed signal, or template, with an unknown signal to detect the presence of the template in the unknown signal. This is equivalent to convolving the unknown signal with a conjugated time-reversed version of the template. The matched filter is the optimal linear filter for maximizing the signal-to-noise ratio (SNR) in the presence of additive stochastic noise.
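A minimal sketch of that recipe, assuming real-valued discrete-time signals and additive Gaussian noise; the template, delay, and noise level are illustrative.

```python
# A minimal matched-filter sketch: detect a known template buried in noise by
# convolving the observation with the conjugated, time-reversed template.
import numpy as np

rng = np.random.default_rng(1)
template = np.array([1.0, 1.0, -1.0, 1.0, -1.0, -1.0, 1.0])  # known signal shape

clean = np.zeros(200)
delay = 120
clean[delay:delay + len(template)] = template            # template at unknown delay
observation = clean + 0.5 * rng.standard_normal(200)     # additive stochastic noise

h = np.conj(template[::-1])                              # matched-filter impulse response
output = np.convolve(observation, h, mode="valid")       # equivalently: correlation

print("estimated delay:", int(np.argmax(output)))        # peaks at the true delay
```

Because the template here is real-valued, the conjugation is a no-op; it matters for complex baseband signals, where the same construction applies unchanged.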
Sensitivity and specificity mathematically describe the accuracy of a test that reports the presence or absence of a condition. If individuals who have the condition are considered "positive" and those who do not are considered "negative", then sensitivity measures how well a test identifies true positives and specificity measures how well it identifies true negatives. Sensitivity (true positive rate) is the probability of a positive test result, conditioned on the individual truly being positive; specificity (true negative rate) is the probability of a negative test result, conditioned on the individual truly being negative.
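Both rates fall directly out of a 2x2 confusion matrix. A minimal sketch, with counts that are made up purely for illustration:

```python
# A minimal sketch: sensitivity and specificity from a 2x2 confusion matrix.
# The counts below are illustrative, not real data.
tp, fn = 90, 10    # condition-positive individuals: correctly / wrongly classified
tn, fp = 80, 20    # condition-negative individuals: correctly / wrongly classified

sensitivity = tp / (tp + fn)   # P(test positive | truly positive) = 0.90
specificity = tn / (tn + fp)   # P(test negative | truly negative) = 0.80

print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```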
We discuss a set of topics that are important for the understanding of modern data science but that are typically not taught in an introductory ML course. In particular, we discuss fundamental ideas ...
The physics of optical communication components and their applications to communication systems will be covered. The course is intended to present the operation principles of contemporary optical communication ...
This course is neither an introduction to the mathematics of statistics nor an introduction to a statistics program such as R. The aim of the course is to understand statistics from its experimental ...
Covers stochastic models for binary transmission in communications systems.
Introduces statistical signal processing tools for wireless communications, emphasizing practical applications and hands-on experience with Python or Matlab.
Covers sampling, Fourier Transform, and reconstruction using low-pass filters in signal processing.
Taking advantage of Capella's ability to dwell on a target for an extended period of time (nominally 30 s) in its spotlight (SP) mode, this paper presents an unsupervised methodology for detecting moving targets in these data. By colourizing short seg ...
New York, 2023
Vital sign detection is used across ubiquitous scenarios in medical and health settings, and contact and wearable sensors have been widely deployed. However, they are unsuitable for patients with burn wounds or infants with insufficient areas for attachmen ...
In communication systems, there are many tasks, like modulation classification, for which Deep Neural Networks (DNNs) have obtained promising performance. However, these models have been shown to be susceptible to adversarial perturbations, namely impercep ...