The Square Kilometre Array (SKA) will form the largest radio telescope ever built, generating on the order of one terabyte of data per second. To reduce the data flow sent to the central processor, hierarchical designs have been proposed: the data is first collected in groups of antennas and summed coherently by beamforming. Historically, Fourier analysis has played a prominent role in radio astronomy interferometry, legitimated by the celebrated van Cittert-Zernike theorem. We show that, in the case of modern hierarchical designs, beamformed data has a less direct, and thus more complicated, relationship to the Fourier domain. Unsatisfactory attempts to compensate have been proposed; they implicitly retain the Fourier framework and are limited to directive beamforming. We show that, when stepping away from Fourier, we can embed the data in a more natural domain originating from the telescope configuration and the specific beamforming technique. This leads to a new, more accurate imaging pipeline. Standard techniques such as w-projection and gridding are no longer needed, as the reconstruction is performed directly on the celestial sphere. The proposed imager operates in two steps. First, a preconditioning based on the Gram-Schmidt orthogonalisation procedure is performed in order to facilitate the computation of the pseudoinverse sky estimate. Then, from this, the LASSO estimate is approximated very efficiently. The quality of this approximation is shown to be directly linked to the effective support of the instrument point spread function. Due to the greater flexibility of this framework, information-maximising beamforming techniques such as randomised beamforming can be readily incorporated. Moreover, we use the Bonferroni method to construct global confidence intervals on the Gram-Schmidt least-squares estimate, and use them to test the statistical significance of each pixel. The complexity of the proposed technique is assessed and compared to that of the state-of-the-art combined CLEAN and A-projection algorithm. In the case of LOFAR, we show that our algorithm can be 2 to 34 times faster. The accuracy and sensitivity of the new technique are also shown, for simulated data, to be superior.
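The sketch below illustrates, in NumPy/SciPy, the two-step pipeline described above: a Gram-Schmidt (QR) preconditioned least-squares sky estimate, followed by an approximation of the LASSO estimate, with Bonferroni-corrected confidence intervals for per-pixel significance tests. It is a minimal illustration under stated assumptions, not the authors' implementation: the linear sampling operator A, the beamformed data vector y, the use of proximal gradient (ISTA) for the LASSO step, and all function names and parameters are assumptions made for the example.

```python
import numpy as np
from scipy.stats import norm

def least_squares_sky(A, y):
    """Pseudoinverse (least-squares) sky estimate, computed through a
    Gram-Schmidt / QR orthogonalisation of the sampling operator A."""
    Q, R = np.linalg.qr(A)                   # Gram-Schmidt factors of A
    return np.linalg.solve(R, Q.conj().T @ y)

def soft_threshold(x, tau):
    """Proximal operator of the l1 norm (handles complex-valued x)."""
    mag = np.abs(x)
    return np.where(mag > tau, (1.0 - tau / np.maximum(mag, 1e-12)) * x, 0.0)

def lasso_sky(A, y, lam, n_iter=200):
    """Approximate the LASSO sky estimate with proximal gradient descent (ISTA),
    warm-started from the least-squares estimate."""
    x = least_squares_sky(A, y)
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of the data-fit gradient
    for _ in range(n_iter):
        x = soft_threshold(x + step * (A.conj().T @ (y - A @ x)), step * lam)
    return x

def bonferroni_intervals(x_ls, pixel_std, alpha=0.05):
    """Global (1 - alpha) confidence intervals on the least-squares estimate,
    using a Bonferroni correction over all pixels; pixels whose interval
    excludes zero are deemed statistically significant."""
    n_pix = x_ls.size
    z = norm.ppf(1.0 - alpha / (2.0 * n_pix))
    return x_ls - z * pixel_std, x_ls + z * pixel_std
```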