
# Least-squares spectral analysis

Summary

Least-squares spectral analysis (LSSA) is a method of estimating a frequency spectrum based on a least-squares fit of sinusoids to data samples, similar to Fourier analysis. Fourier analysis, the most widely used spectral method in science, generally boosts long-periodic noise in long and gapped records; LSSA mitigates such problems. Unlike Fourier analysis, LSSA does not require the data to be equally spaced.
Developed in 1969 and 1971, LSSA is also known as the Vaníček method or the Gauss–Vaníček method, after Petr Vaníček, and as the Lomb method or the Lomb–Scargle periodogram, after the simplifications introduced first by Nicholas R. Lomb and then by Jeffrey D. Scargle.
The close connections between Fourier analysis, the periodogram, and the least-squares fitting of sinusoids have been known for a long time.
However, most developments have been restricted to complete data sets of equally spaced samples. In 1963, Freek J. M. Barning of Mathematisch Centrum, Amsterdam, handled unequally spaced data with similar techniques, including both a periodogram analysis equivalent to what is nowadays called the Lomb method and least-squares fitting of selected frequencies of sinusoids determined from such periodograms, connected by a procedure known today as matching pursuit with post-backfitting, or orthogonal matching pursuit.
Petr Vaníček, a Canadian geophysicist and geodesist at the University of New Brunswick, also proposed the matching-pursuit approach for equally and unequally spaced data in 1969, which he called "successive spectral analysis", calling the result a "least-squares periodogram". In 1971 he generalized the method to account for systematic components beyond a simple mean, such as a "predicted linear (quadratic, exponential, ...) secular trend of unknown magnitude", and applied it to a variety of samples.
Vaníček's strictly least-squares method was then simplified in 1976 by Nicholas R. Lomb of the University of Sydney, who pointed out its close connection to periodogram analysis.
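To illustrate the core idea (a minimal sketch, not Vaníček's exact formulation): for each trial frequency, fit a cosine/sine pair to the unequally spaced samples by ordinary least squares, and take the fraction of variance that pair explains as the spectral power. All names below (`lssa_periodogram`, the test signal) are illustrative assumptions.

```python
import numpy as np

def lssa_periodogram(t, y, freqs):
    """Least-squares spectral power of samples y(t) at each trial frequency."""
    y = y - y.mean()                    # remove the mean, the simplest systematic component
    ss_total = np.sum(y ** 2)
    power = np.empty(len(freqs))
    for i, f in enumerate(freqs):
        w = 2.0 * np.pi * f
        A = np.column_stack([np.cos(w * t), np.sin(w * t)])  # design matrix
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)         # least-squares sinusoid fit
        resid = y - A @ coef
        power[i] = 1.0 - np.sum(resid ** 2) / ss_total       # fraction of variance explained
    return power

# Unequally spaced samples of a 0.3 Hz sinusoid plus noise
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 50.0, 200))
y = np.sin(2 * np.pi * 0.3 * t) + 0.3 * rng.normal(size=200)
freqs = np.linspace(0.05, 1.0, 200)
power = lssa_periodogram(t, y, freqs)
peak = freqs[np.argmax(power)]          # peak should lie near 0.3 Hz
```

Note that no resampling or gap-filling is needed: the design matrix is simply evaluated at the actual sample times, which is exactly what distinguishes this approach from a standard FFT-based periodogram.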


Related concepts (34)

Non-uniform discrete Fourier transform

In applied mathematics, the nonuniform discrete Fourier transform (NUDFT or NDFT) of a signal is a type of Fourier transform, related to a discrete Fourier transform or discrete-time Fourier transform, but in which the input signal is not sampled at equally spaced points or frequencies (or both). It is a generalization of the shifted DFT. It has important applications in signal processing, magnetic resonance imaging, and the numerical solution of partial differential equations.
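The defining sum can be written down directly (a sketch: the O(NM) direct evaluation, not the fast NUFFT algorithms used in practice; the function name `nudft` is an assumption). On equally spaced points it reduces to the ordinary DFT:

```python
import numpy as np

def nudft(t, x, n_freqs):
    """Direct nonuniform DFT: X[k] = sum_n x[n] * exp(-2j*pi*k*t[n]),
    with sample times t expressed in units of the fundamental period."""
    k = np.arange(n_freqs)
    return np.exp(-2j * np.pi * np.outer(k, t)) @ x

# Sanity check: for t = n/N the NUDFT coincides with the ordinary DFT
N = 8
t = np.arange(N) / N
x = np.random.default_rng(1).normal(size=N)
assert np.allclose(nudft(t, x, N), np.fft.fft(x))
```

Fast NUFFT implementations reach O(M log M) by spreading samples onto an oversampled uniform grid and applying an FFT, but the direct sum above is the definition they approximate.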

Spectral estimation

Spectral estimation covers all techniques for estimating the power spectral density (PSD). Parametric spectral-estimation methods use a model to obtain an estimate of the spectrum. These models rely on a priori knowledge of the process and fall into three broad categories: autoregressive (AR) models, moving-average (MA) models, and autoregressive moving-average (ARMA) models. The parametric approach consists of three steps, the first being to choose a model that describes the process appropriately.
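As a minimal sketch of the AR branch of this family (illustrative only; the function `ar_psd` and the test process are assumptions, and tested implementations exist in SciPy/statsmodels), one can fit an AR(p) model by solving the Yule–Walker equations and read the PSD off the model:

```python
import numpy as np

def ar_psd(x, order, n_freqs=256):
    """Fit an AR(p) model by Yule-Walker and return its power spectral density."""
    x = x - x.mean()
    n = len(x)
    # Biased autocovariance estimates r[0..order]
    r = np.array([np.dot(x[: n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])  # Toeplitz
    a = np.linalg.solve(R, r[1:])          # AR coefficients
    sigma2 = r[0] - a @ r[1:]              # innovation variance
    w = np.linspace(0.0, np.pi, n_freqs)   # frequency grid in rad/sample
    denom = np.abs(1.0 - np.exp(-1j * np.outer(w, np.arange(1, order + 1))) @ a) ** 2
    return w, sigma2 / denom               # AR model PSD

# AR(2) process with a resonance near 0.66 rad/sample
rng = np.random.default_rng(2)
x = np.zeros(4000)
for n in range(2, 4000):
    x[n] = 1.5 * x[n - 1] - 0.9 * x[n - 2] + rng.normal()
w, psd = ar_psd(x, order=2)
```

The estimated spectrum is smooth by construction, which is the typical trade-off of parametric methods: good resolution from short records when the model order is right, bias when it is not.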

Related courses (13)

Building up on the basic concepts of sampling, filtering and Fourier transforms, we address stochastic modeling, spectral analysis, estimation and prediction, classification, and adaptive filtering, w

A first course in statistical time series analysis and applications.

The programme will allow students to learn plasma diagnostics and data processing methods of modern fusion experiments and to bridge the gap between diagnostics theory and experimental practice.

Related publications (2)

Background: Cerebral autoregulation (CA) is a protective mechanism that maintains the steadiness of cerebral blood flow (CBF) across a broad range of systemic blood pressure (BP). Acute hypertension has been shown to reduce cerebrovascular adaptation to BP variations. However, it is still unknown whether CA is impaired in chronic hypertension. This study evaluated whether strict control of BP affects CA in patients with chronic hypertension, comparing a valsartan-based regimen with a regimen that does not inhibit the renin-angiotensin-aldosterone system (non-RAAS). Methods: Eighty untreated patients with isolated systolic hypertension were randomized to valsartan 320 mg or to a non-RAAS regimen for 6 months. The medication was upgraded to obtain BP

Pablo Antolin Sanchez, Felipe Figueredo Rocha

In many applications, such as textiles, fibreglass, paper and several kinds of biological fibrous tissues, the main load-bearing constituents at the micro-scale are arranged as a fibre network. In these materials, rupture is usually driven by micro-mechanical failure mechanisms, and strain localisation due to progressive damage evolution in the fibres is the main cause of macro-scale instability. We propose a strain-driven computational homogenisation formulation based on a Representative Volume Element (RVE), within a framework in which micro-scale fibre damage can lead to macro-scale localisation phenomena. The mechanical stiffness considered here for the fibrous structure system is due to: i) an intra-fibre mechanism in which each fibre is axially stretched and, as a result, can suffer damage; ii) an inter-fibre mechanism in which the stiffness results from the variation of the relative angle between pairs of fibres. The homogenised tangent tensor, which comes from the contribution of these two mechanisms, is required to detect the so-called bifurcation point at the macro-scale through the spectral analysis of the acoustic tensor. This analysis can precisely determine the instant at which the macro-scale problem becomes ill-posed. At such a point, the spectral analysis provides information about the macro-scale failure pattern (unit normal and crack-opening vectors). Special attention is devoted to presenting the theoretical fundamentals rigorously in the light of variational formulations for multi-scale models. Also, the impact of a recently derived, more general boundary condition for fibre networks is assessed in the context of materials undergoing softening. Numerical examples showing the suitability of the present methodology are also presented and discussed.

2021

Related lectures (151)

Signal processing: basics and spectral analysis (EE-350: Signal processing)

Covers the basics of signal processing, linear estimation, and digital filters.

Harmonic signals and spectrum estimation (COM-500: Statistical signal and data processing through applications)

Explores harmonic signals, spectrum estimation, and signal-analysis methods using MATLAB tools.

Power spectral density computation (COM-302: Principles of digital communications)

Covers the computation of power spectral density and the design of communication systems.