
# Signal processing

Summary

Signal processing is an electrical engineering subfield that focuses on analyzing, modifying and synthesizing signals, such as sound, images, potential fields, seismic signals, altimetry processing, and scientific measurements. Signal processing techniques are used to optimize transmissions, improve digital storage efficiency, correct distorted signals, improve subjective video quality, and detect or pinpoint components of interest in a measured signal.
According to Alan V. Oppenheim and Ronald W. Schafer, the principles of signal processing can be found in the classical numerical analysis techniques of the 17th century. They further state that the digital refinement of these techniques can be found in the digital control systems of the 1940s and 1950s.
In 1948, Claude Shannon wrote the influential paper "A Mathematical Theory of Communication" which was published in the Bell System Technical Journal. The paper laid the groundwork for later development of information communication systems and the processing of signals for transmission.
Signal processing matured and flourished in the 1960s and 1970s, and digital signal processing became widely used with specialized digital signal processor chips in the 1980s.
A signal is a function x(t), where this function is either
deterministic (then one speaks of a deterministic signal) or
a path (a single realization) of a stochastic process.
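The distinction above can be sketched in a few lines of Python. This is an illustrative example, not from the source: the 5 Hz sine and the noise level are arbitrary choices.

```python
import math
import random

# Deterministic signal: each time t maps to exactly one fixed value.
def x_det(t):
    return math.sin(2 * math.pi * 5 * t)  # a 5 Hz sine (arbitrary choice)

# Stochastic signal: each evaluation yields one realization (sample path)
# of a random process -- here, the same sine plus Gaussian noise.
def x_path(t, rng):
    return math.sin(2 * math.pi * 5 * t) + rng.gauss(0.0, 0.1)

rng = random.Random(0)
ts = [n / 100 for n in range(100)]           # sample times
deterministic = [x_det(t) for t in ts]       # identical on every run
realization = [x_path(t, rng) for t in ts]   # differs from run to run
```

Re-evaluating `x_det` always returns the same values, while each fresh random seed gives a different path of `x_path`.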
Analog signal processing
Analog signal processing is for signals that have not been digitized, as in most 20th-century radio, telephone, and television systems. This involves linear electronic circuits as well as nonlinear ones. The former are, for instance, passive filters, active filters, additive mixers, integrators, and delay lines. Nonlinear circuits include compandors, multipliers (frequency mixers, voltage-controlled amplifiers), voltage-controlled filters, voltage-controlled oscillators, and phase-locked loops.
Continuous-time signal processing is for signals that vary with the change of continuous domain (without considering some individual interrupted points).

Official source

This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.

Related courses (264)

Related MOOCs (46)

Related concepts (262)

Related units (63)

Related startups (9)

Related people (804)

Related publications (1,000)

EE-205: Signals and systems (for EL)

This course lays the foundations of an essential concept in engineering: the notion of a system. More specifically, the course presents the theory of linear time-invariant (LTI) systems, which are

EE-550: Image and video processing

This course covers fundamental notions in image and video processing, as well as the most popular tools used, such as edge detection, motion estimation, segmentation, and compression. It is compose

COM-500: Statistical signal and data processing through applications

Building up on the basic concepts of sampling, filtering and Fourier transforms, we address stochastic modeling, spectral analysis, estimation and prediction, classification, and adaptive filtering, w

Related lectures (998)

Digital Signal Processing I

Basic signal processing concepts, Fourier analysis and filters. This module can
be used as a starting point or a basic refresher in elementary DSP

Digital Signal Processing II

Adaptive signal processing, A/D and D/A. This module provides the basic
tools for adaptive filtering and a solid mathematical framework for sampling and
quantization

Digital Signal Processing III

Advanced topics: this module covers real-time audio processing (with
examples on a hardware board), image processing and communication system design.

Related categories (60)

Variable bitrate (VBR) is a term used in telecommunications and computing that relates to the bitrate used in sound or video encoding. As opposed to constant bitrate (CBR), VBR files vary the amount of output data per time segment. VBR allows a higher bitrate (and therefore more storage space) to be allocated to the more complex segments of media files while less space is allocated to less complex segments. The average of these rates can be calculated to produce an average bitrate for the file.
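The averaging step described above is simple arithmetic: total bits divided by total duration. A minimal sketch, using hypothetical per-segment sizes (the numbers are invented for illustration):

```python
# Hypothetical encoded sizes (bits) of four 1-second VBR segments:
# more bits go to complex segments, fewer to simple ones.
segment_bits = [800_000, 2_400_000, 1_200_000, 3_600_000]
segment_seconds = 1.0

# Average bitrate = total bits / total duration.
total_bits = sum(segment_bits)
duration = segment_seconds * len(segment_bits)
avg_bitrate = total_bits / duration
print(f"{avg_bitrate / 1e6:.1f} Mbit/s")  # -> 2.0 Mbit/s
```

A CBR encoding of the same file would spend 2 Mbit/s on every segment, wasting bits on the simple ones and starving the complex ones.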

In signal processing, a digital biquad filter is a second-order recursive linear filter, containing two poles and two zeros. "Biquad" is an abbreviation of "biquadratic", which refers to the fact that in the Z domain, its transfer function is the ratio of two quadratic functions: H(z) = (b0 + b1 z^-1 + b2 z^-2) / (a0 + a1 z^-1 + a2 z^-2). The coefficients are often normalized such that a0 = 1: H(z) = (b0 + b1 z^-1 + b2 z^-2) / (1 + a1 z^-1 + a2 z^-2). High-order infinite impulse response filters can be highly sensitive to quantization of their coefficients, and can easily become unstable.
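The recursion implied by that transfer function can be written directly as a difference equation. Below is a minimal Direct Form I sketch (one common realization among several; coefficient names follow the usual b0, b1, b2, a1, a2 convention with a0 normalized to 1):

```python
def biquad(x, b0, b1, b2, a1, a2):
    """Direct Form I biquad; coefficients normalized so that a0 = 1.

    y[n] = b0*x[n] + b1*x[n-1] + b2*x[n-2] - a1*y[n-1] - a2*y[n-2]
    """
    y = []
    x1 = x2 = y1 = y2 = 0.0  # delayed inputs and outputs, zero initial state
    for xn in x:
        yn = b0 * xn + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
        y.append(yn)
        x2, x1 = x1, xn   # shift the input delay line
        y2, y1 = y1, yn   # shift the output delay line
    return y

# Identity filter (b0 = 1, all other coefficients 0) passes the input through.
signal = [1.0, 0.5, -0.25, 0.0]
assert biquad(signal, 1, 0, 0, 0, 0) == signal
```

With a1 = -0.5 and an impulse input, the feedback path produces the geometrically decaying response 1, 0.5, 0.25, ..., showing the recursive (IIR) character of the filter.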

Speaker recognition is the identification of a person from characteristics of voices. It is used to answer the question "Who is speaking?" The term voice recognition can refer to speaker recognition or speech recognition. Speaker verification (also called speaker authentication) contrasts with identification, and speaker recognition differs from speaker diarisation (recognizing when the same speaker is speaking).

Active in photogrammetry, digitization and research and development. Pix4D is a leading provider of photogrammetry software, empowering users to digitize reality and measure from various image sources, with a strong emphasis on research and development.

Active in semiconductor, LiDAR and image sensors. Fastree 3D Imagers is a semiconductor company specializing in image sensors for industrial and automotive applications, offering a Hardware Development Kit called Falcon for LiDAR solutions integration.

Active in innovation, quality and sustainability. Logitech, a Swiss company with a focus on innovation and quality, designs products to help customers connect and interact with the digital world, emphasizing design in all aspects of their development.

Statistical Signal Processing Tools

Explores statistical signal processing tools for wireless communications, including spectral estimation and signal detection, classification, and adaptive filtering.

Sampling Theorem and Control Systems

Explores the Sampling Theorem, digital control, signal reconstruction, and anti-aliasing filters.

Signal Processing Fundamentals

Explores signal processing fundamentals, including discrete time signals, spectral factorization, and stochastic processes.

Data compression

In information theory, data compression, source coding, or bit-rate reduction is the process of encoding information using fewer bits than the original representation. Any particular compression is either lossy or lossless. Lossless compression reduces bits by identifying and eliminating statistical redundancy. No information is lost in lossless compression. Lossy compression reduces bits by removing unnecessary or less important information.
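Run-length encoding is one of the simplest lossless schemes and makes the "eliminate statistical redundancy, lose no information" point concrete. A minimal sketch (not a production codec):

```python
def rle_encode(s):
    """Run-length encoding: collapse each run of repeated symbols
    into a (symbol, count) pair -- a minimal lossless compression."""
    out = []
    i = 0
    while i < len(s):
        j = i
        while j < len(s) and s[j] == s[i]:
            j += 1
        out.append((s[i], j - i))
        i = j
    return out

def rle_decode(pairs):
    """Inverse transform: expand each (symbol, count) pair back."""
    return "".join(ch * n for ch, n in pairs)

data = "aaaabbbcca"
encoded = rle_encode(data)          # [('a', 4), ('b', 3), ('c', 2), ('a', 1)]
assert rle_decode(encoded) == data  # lossless: the original is fully recovered
```

The decoder recovers the input exactly, which is the defining property of lossless compression; a lossy codec would instead discard information judged unimportant.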

Digital image processing

Digital image processing is the use of a digital computer to process images through an algorithm. As a subcategory or field of digital signal processing, digital image processing has many advantages over analog image processing. It allows a much wider range of algorithms to be applied to the input data and can avoid problems such as the build-up of noise and distortion during processing. Since images are defined over two dimensions (perhaps more), digital image processing may be modeled in the form of multidimensional systems.

Fourier analysis

In mathematics, Fourier analysis (/ˈfʊrieɪ/) is the study of the way general functions may be represented or approximated by sums of simpler trigonometric functions. Fourier analysis grew from the study of Fourier series, and is named after Joseph Fourier, who showed that representing a function as a sum of trigonometric functions greatly simplifies the study of heat transfer. The subject of Fourier analysis encompasses a vast spectrum of mathematics.
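The "sum of simpler trigonometric functions" idea can be seen in the classic Fourier series of a square wave, whose partial sums improve as terms are added. A small sketch (the square wave is a standard textbook example, not taken from the source):

```python
import math

def square_wave_partial(t, n_terms):
    """Partial Fourier series of an ideal square wave (value 1 on (0, pi)):
    (4 / pi) * sum over odd k of sin(k t) / k, truncated to n_terms terms."""
    return (4 / math.pi) * sum(
        math.sin(k * t) / k for k in range(1, 2 * n_terms, 2)
    )

# Evaluating at t = pi/2: more terms bring the sum closer to the true value 1.
approx = [square_wave_partial(math.pi / 2, n) for n in (1, 5, 50)]
```

With a single term the estimate overshoots (4/pi ≈ 1.27); with 50 odd harmonics it sits close to 1, illustrating how a discontinuous function is approximated by a sum of smooth sinusoids.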

Touradj Ebrahimi, Michela Testolina, Davi Nachtigall Lazzarotto

The recent rise in interest in point clouds as an imaging modality has motivated standardization groups such as JPEG and MPEG to launch activities aiming at developing compression standards for point clouds. Lossy compression usually introduces visual arti ...

Tatiana Pieloni, Nicolas Frank Mounet, Christophe Emmanuel R. Lannoy

The Schottky monitors of the Large Hadron Collider (LHC) can be used for non-invasive beam diagnostics to estimate various bunch characteristics, such as tune, chromaticity, bunch profile or synchrotron frequency distribution. However, collective effects, ...

Alireza Karimi, Philippe Louis Schuchert

Modern control synthesis methods rely on accurate models to derive a performant controller. Obtaining a good model is often a costly step, and has led to a renewed interest in data-driven synthesis methods. Frequency-response-based synthesis methods have b ...

2024