
# Weak gravitational lensing

Summary

Weak gravitational lensing is gravitational lensing whose effects are weak compared with those of strong gravitational lensing. Although more common than strong lenses, weak gravitational lenses are much more difficult to observe.
Like any type of gravitational lens, weak gravitational lenses can be produced by a variety of more or less massive celestial bodies. The lensing effects vary according to the body or bodies involved.
Observation
Observing weak gravitational lenses presents many challenges.
Any gravitational lens acts as a coordinate transformation that distorts the images of background objects (usually galaxies) near a foreground mass. The transformation can be separated into two terms: the convergence, which magnifies the image, and the shear, which stretches it.
To measure this alignment, it is necessary to measure the ellipticities of a large number of background galaxies and to build a statistical estimate of their systematic alignment: the intrinsic ellipticity of any individual galaxy is much larger than the lensing-induced distortion, so the shear signal only emerges as an average over many sources.
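The decomposition into convergence and shear can be made concrete with a short numerical sketch (standard weak-lensing formalism; the function names are illustrative, not from any EPFL code). For an intrinsically circular source, the lensing Jacobian maps the source to an ellipse whose ellipticity equals the modulus of the reduced shear g = γ/(1 − κ):

```python
import numpy as np

def lensing_jacobian(kappa, gamma1, gamma2):
    """Distortion matrix A mapping image-plane to source-plane coordinates."""
    return np.array([[1.0 - kappa - gamma1, -gamma2],
                     [-gamma2, 1.0 - kappa + gamma1]])

def observed_ellipticity(kappa, gamma1, gamma2):
    """Ellipticity (a - b)/(a + b) of the image of a circular source.

    The image semi-axes are the inverses of the eigenvalues of A, which
    equal (1 - kappa) -/+ |gamma|, so the result is |gamma| / (1 - kappa),
    the modulus of the reduced shear.
    """
    evals = np.linalg.eigvalsh(lensing_jacobian(kappa, gamma1, gamma2))
    a, b = 1.0 / evals.min(), 1.0 / evals.max()  # semi-major, semi-minor axes
    return (a - b) / (a + b)

# Weak-lensing regime: kappa and |gamma| are both much smaller than 1.
print(round(observed_ellipticity(0.02, 0.01, 0.0), 4))  # → 0.0102
```

In this regime the induced ellipticity is of order a percent, far smaller than typical intrinsic galaxy ellipticities, which is why the measurement must be statistical.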


Related courses (4)

PHYS-402: Astrophysics IV: observational cosmology

Cosmology is the study of the structure and evolution of the universe as a whole. This course describes the principal themes of cosmology, as seen from the point of view of observations.

PHYS-401: Astrophysics III: stellar and galactic dynamics

The aim of this course is to acquire the basic knowledge on specific dynamical phenomena related to the origin, equilibrium, and evolution of star clusters, galaxies, and galaxy clusters.

PHYS-439: Introduction to astroparticle physics

We present the role of particle physics in cosmology and in the description of astrophysical phenomena. We also present the methods and technologies for the observation of cosmic particles.


Related concepts (24)

ΛCDM model

In cosmology, the ΛCDM model (pronounced "Lambda CDM", for Lambda – Cold Dark Matter, i.e. the "lambda – cold dark matter" model), also called the concordance model, is a cosmological model…

Dark matter

[Figure: distribution of the energy density of the Universe after analysis of the first data from the Planck satellite; dark matter is one of its main components.]

Gravitational lens

In astrophysics, a gravitational lens, or gravitational mirage, is produced by the presence of a very massive celestial body (such as, for example, a cluster of galaxies) located between an observer…

Related lectures (24)

The purpose of this Thesis is to develop, test, and characterize different models attempting to tackle the problem of measurement of galaxy shapes applied in interferometric observations. Shape measurement is a tool for estimating the underlying shear due to weak gravitational lensing by large-scale foreground matter distributions. The enormous recent progress in radio interferometric imaging, with projects such as MeerKAT, ASKAP and, in the future, SKA, motivates the development of advanced algorithms that utilize radio observations for this task. In most of these algorithms, the main disadvantage is that the proposed models' parameters have to be estimated in advance, assuming certain prior knowledge of the form of the objects. Our motivation is to develop and evaluate frameworks that do not require significant prior information on this form to make accurate measurements.

To achieve this goal, we pursue two implementation directions. In the first, we build on an existing approach that decomposes objects over a dictionary of shapelet functions. Our proposal goes a step further by estimating the decomposition's key size parameter from a model built on advanced regularization. More specifically, we create an over-redundant dictionary formed as the concatenation of several groups of orthogonal shapelet basis functions with different parameters, and we use structured sparsity penalties to estimate the size parameter of the best group. As an alternative strategy, we form an algorithm that employs multi-resolution least-squares analysis, which attempts to identify the size value that minimizes the relative residual in the fitting.

The second path, instead of making the shape estimation directly on radio interferometric data, performs image reconstruction from the visibilities and measures the ellipticity of the objects in the resulting images. For this purpose, we adopt a sparsity averaging analysis algorithm that has been developed in the past and restores the intensity image from the visibilities with high precision. We also implement a similar framework that uses the CLEAN algorithm for image reconstruction, which helps compare the results between the two options and evaluate the quality of the measurements achieved.

All our models are tested using a collection of simulated data that include objects of different profile types, ellipticity, size, orientation, and position in the field of view. The visibilities are generated using either a simulated Gaussian-profile coverage or a realistic SKA-like one. Additionally, we present an initial study of the same algorithms on real objects from the COSMOS survey.
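As a rough, generic illustration of the dictionary-based direction (this is not the thesis's shapelet dictionary, its structured-sparsity penalty, or radio visibilities — the data here are synthetic and the penalty is plain, unstructured ℓ1), a signal can be decomposed over an overcomplete dictionary by iterative soft thresholding (ISTA):

```python
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(D, y, lam=0.01, n_iter=500):
    """Minimise 0.5*||y - D c||^2 + lam*||c||_1 by iterative soft thresholding."""
    step = 1.0 / np.linalg.norm(D, 2) ** 2  # 1 / Lipschitz constant of the gradient
    c = np.zeros(D.shape[1])
    for _ in range(n_iter):
        c = soft_threshold(c + step * D.T @ (y - D @ c), lam * step)
    return c

rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)   # unit-norm atoms; overcomplete (50 atoms, 20 samples)
c_true = np.zeros(50)
c_true[[3, 17]] = [1.5, -2.0]    # signal built from just two atoms
y = D @ c_true
c_hat = ista(D, y)
print(np.sort(np.argsort(np.abs(c_hat))[-2:]).tolist())  # → [3, 17]
```

The thesis replaces the plain ℓ1 term with structured (group) penalties over the concatenated shapelet bases, so that the selected group identifies the best size parameter.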

There is now very strong evidence that our Universe is undergoing an accelerated expansion period as if it were under the influence of a gravitationally repulsive “dark energy” component. Furthermore, most of the mass of the Universe seems to be in the form of non-luminous matter, the so-called “dark matter”. Together, these “dark” components, whose nature remains unknown today, represent around 96 % of the matter-energy budget of the Universe. Unraveling the true nature of the dark energy and dark matter has thus, obviously, become one of the primary goals of present-day cosmology. Weak gravitational lensing, or weak lensing for short, is the effect whereby light emitted by distant galaxies is slightly deflected by the tidal gravitational fields of intervening foreground structures. Because it only relies on the physics of gravity, weak lensing has the unique ability to probe the distribution of mass in a direct and unbiased way. This technique is at present routinely used to study the dark matter, typical applications being the mass reconstruction of galaxy clusters and the study of the properties of dark halos surrounding galaxies. Another and more recent application of weak lensing, on which we focus in this thesis, is the analysis of the cosmological lensing signal induced by large-scale structures, the so-called “cosmic shear”. This signal can be used to measure the growth of structures and the expansion history of the Universe, which makes it particularly relevant to the study of dark energy. Of all weak lensing effects, the cosmic shear is the most subtle and its detection requires the accurate analysis of the shapes of millions of distant, faint galaxies in the near infrared. So far, the main factor limiting cosmic shear measurement accuracy has been the relatively small sky areas covered. The next generation of wide-field, multicolor surveys will, however, overcome this hurdle by covering a much larger portion of the sky with improved image quality. The resulting statistical errors will then become subdominant compared to systematic errors, the latter becoming instead the main source of uncertainty. In fact, uncovering key properties of dark energy will only be achievable if these systematics are well understood and reduced to the required level. The major sources of uncertainty reside in the shape measurement algorithm used, the convolution of the original image by the instrumental and possibly atmospheric point spread function (PSF), the pixelation effect caused by the integration of light falling on the detector pixels, and the degradation caused by various sources of noise.

Measuring the cosmic shear thus entails solving the difficult inverse problem of recovering the shear signal from blurred, pixelated and noisy galaxy images while keeping errors within the limits demanded by future weak lensing surveys. Reaching this goal is not without challenges. In fact, the best available shear measurement methods would need a tenfold improvement in accuracy to match the requirements of a space mission like Euclid from ESA, scheduled at the end of this decade. Significant progress has nevertheless been made in the last few years, with substantial contributions from initiatives such as the GREAT (GRavitational lEnsing Accuracy Testing) challenges. The main objective of these open competitions is to foster the development of new and more accurate shear measurement methods.

We start this work with a quick overview of modern cosmology: its fundamental tenets, achievements and the challenges it faces today. We then review the theory of weak gravitational lensing and explain how it can make use of cosmic shear observations to place constraints on cosmology. The last part of this thesis focuses on the practical challenges associated with the accurate measurement of the cosmic shear. After a review of the subject we present the main contributions we have brought in this area: the development of the gfit shear measurement method and new algorithms for PSF interpolation and image denoising. The gfit method emerged as one of the top performers in the GREAT10 Galaxy Challenge. It essentially consists in fitting two-dimensional elliptical Sérsic light profiles to observed galaxy images in order to produce estimates for the shear power spectrum. PSF correction is automatic and an efficient shape-preserving denoising algorithm can be optionally applied prior to fitting the data. PSF interpolation is also an important issue in shear measurement because the PSF is only known at star positions while PSF correction has to be performed at any position on the sky. We have developed innovative PSF interpolation algorithms on the occasion of the GREAT10 Star Challenge, a competition dedicated to the PSF interpolation problem. Our participation was very successful since one of our interpolation methods won the Star Challenge while the remaining four achieved the next highest scores of the competition. Finally, we have participated in the development of a wavelet-based, shape-preserving denoising method particularly well suited to weak lensing analysis.
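The model-fitting idea behind gfit can be sketched with generic tools. The following toy uses a simplified elliptical Sérsic parameterisation with the common b_n ≈ 2n − 1/3 approximation, no PSF and no noise; parameter names and bounds are illustrative, not gfit's actual interface. It recovers the ellipticity of a simulated galaxy by least squares:

```python
import numpy as np
from scipy.optimize import least_squares

def sersic(params, x, y):
    """Elliptical Sérsic surface brightness; (e1, e2) set the ellipticity."""
    amp, r_e, n, e1, e2 = params
    xp = (1.0 - e1) * x - e2 * y        # shear-like coordinate distortion
    yp = -e2 * x + (1.0 + e1) * y
    r = np.hypot(xp, yp)
    b_n = 2.0 * n - 1.0 / 3.0           # common approximation to the Sersic b_n
    return amp * np.exp(-b_n * ((r / r_e) ** (1.0 / n) - 1.0))

grid = np.linspace(-5.0, 5.0, 32)
x, y = np.meshgrid(grid, grid)
truth = [1.0, 1.5, 1.0, 0.05, -0.02]    # exponential profile, small ellipticity
image = sersic(truth, x, y)             # noiseless "observation"

fit = least_squares(
    lambda p: (sersic(p, x, y) - image).ravel(),
    x0=[0.8, 1.0, 1.2, 0.0, 0.0],
    bounds=([0.1, 0.1, 0.3, -0.5, -0.5], [10.0, 10.0, 10.0, 0.5, 0.5]),
)
print(np.round(fit.x[3:], 3).tolist())  # recovered (e1, e2)
```

In the real problem the model must additionally be convolved with the (interpolated) PSF and compared with noisy pixel data, which is where the denoising and PSF-interpolation contributions enter.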

Aims. We introduce a novel approach to reconstructing dark matter mass maps from weak gravitational lensing measurements. The cornerstone of the proposed method lies in a new modelling of the matter density field in the Universe as a mixture of two components: (1) a sparsity-based component that captures the non-Gaussian structure of the field, such as peaks or halos at different spatial scales, and (2) a Gaussian random field, which is known to represent the linear characteristics of the field well.

Methods. We propose an algorithm called MCALens that jointly estimates these two components. MCALens is based on an alternating minimisation incorporating both sparse recovery and proximal iterative Wiener filtering.

Results. Experimental results on simulated data show that the proposed method exhibits improved estimation accuracy compared to customised mass-map reconstruction methods.

2021
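The two-component idea can be illustrated with a deliberately simplified 1-D toy (synthetic data; a fixed Gaussian smoothing stands in for MCALens's proximal Wiener step, and plain soft thresholding for its sparse-recovery step): alternating the two updates separates isolated peaks from a smooth Gaussian background.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def soft(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def two_component_estimate(y, smooth_scale=5.0, lam=1.0, n_iter=20):
    """Alternately estimate a sparse 'peak' component s and a smooth Gaussian
    component g from y = s + g + noise. Gaussian smoothing stands in for the
    Wiener filter of a correlated Gaussian field."""
    s = np.zeros_like(y)
    for _ in range(n_iter):
        g = gaussian_filter1d(y - s, smooth_scale)  # smooth-component update
        s = soft(y - g, lam)                        # sparse-component update
    return s, g

rng = np.random.default_rng(1)
n = 200
s_true = np.zeros(n)
s_true[[40, 120]] = [5.0, -4.0]                           # non-Gaussian peaks
g_true = 3.0 * gaussian_filter1d(rng.normal(size=n), 5.0)  # smooth Gaussian field
y = s_true + g_true + rng.normal(0.0, 0.3, n)              # noisy mixture
s_hat, g_hat = two_component_estimate(y)
print(np.sort(np.argsort(np.abs(s_hat))[-2:]).tolist())    # locations of the two peaks
```

MCALens solves the analogous 2-D problem with a proper proximal Wiener filter and multiscale sparse recovery, jointly producing the Gaussian and non-Gaussian parts of the mass map.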