Network information theory studies the communication of information over networks and its fundamental limits. Motivated by the ubiquity of networks in daily life, this thesis studies the fundamental limits of particular networks, including channel-coding problems such as the Gaussian multiple-access channel with feedback and source-coding problems such as the lossy Gaussian Gray-Wyner network.
In the first part, we establish the sum-capacity of the Gaussian multiple-access channel with feedback. The converse bounds, derived from the dependence-balance argument of Hekstra and Willems, meet the achievable rates of the scheme introduced by Kramer. Although the underlying optimization problem is not convex, the factorization of the lower convex envelope, a method introduced by Geng and Nair, combined with a Gaussian property, allows us to compute the sum-capacity. Additionally, we characterize the rate region of the lossy Gaussian Gray-Wyner network under symmetric distortion. This problem is also non-convex, so the factorization of the lower convex envelope is again used to establish the optimality of Gaussian auxiliaries. Both networks are long-standing open problems.
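For orientation only (this is the classical two-user special case due to Ozarow, not the thesis's general statement), the sum-capacity with feedback takes the form
\[
C_{\mathrm{sum}} = \tfrac{1}{2}\log\!\bigl(1 + P_1 + P_2 + 2\rho^{*}\sqrt{P_1 P_2}\bigr),
\]
where \(\rho^{*}\in[0,1]\) is the unique solution of
\[
1 + P_1 + P_2 + 2\rho\sqrt{P_1 P_2} \;=\; \bigl(1 + P_1(1-\rho^{2})\bigr)\bigl(1 + P_2(1-\rho^{2})\bigr).
\]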
In the second part, we consider the common information introduced by Wyner and a natural relaxation of it. Wyner's common information is a measure that quantifies the commonality between two random variables. The operational significance of the relaxed quantity lies in the Gray-Wyner network: computing the relaxed Wyner's common information is directly connected to computing the rate region of the Gray-Wyner network. We derive a lower bound on Wyner's common information that holds for arbitrary sources. The bound is tight for sources that can be expressed as the sum of a common random variable and independent Gaussian noises. Moreover, we derive an upper bound on an extended variant of the information bottleneck.
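As a brief sketch of the quantities involved (standard definitions; the symbol \(\gamma\) for the relaxation parameter is our notational choice here), Wyner's common information and its relaxation can be written as
\[
C(X;Y) \;=\; \min_{W:\; X - W - Y} I(X,Y;W),
\qquad
C_{\gamma}(X;Y) \;=\; \min_{W:\; I(X;Y\mid W)\le \gamma} I(X,Y;W),
\]
so that \(C_{0}(X;Y) = C(X;Y)\). For a jointly Gaussian pair with correlation coefficient \(\rho\), the well-known closed form is \(C(X;Y) = \tfrac{1}{2}\log\frac{1+|\rho|}{1-|\rho|}\).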
Finally, we use Wyner's common information and its relaxation as tools to extract common information between datasets. To this end, we introduce a novel procedure to construct features from data, referred to as Common Information Components Analysis (CICA). We establish that, in the case of Gaussian statistics, CICA reduces precisely to Canonical Correlation Analysis (CCA), where the relaxation parameter determines the number of extracted CCA components. In this sense, we establish a novel, rigorous connection between information measures and CCA, with CICA being a strict generalization of the latter. Moreover, we show that CICA has several desirable features, including a natural extension to more than two datasets.
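To make the Gaussian special case concrete, the following is a minimal sketch (not the thesis's implementation; the function and variable names are illustrative) of CCA computed from sample covariances, which is what CICA reduces to under Gaussian statistics:

import numpy as np

def cca_from_samples(X, Y, k):
    """Return the top-k canonical correlations and directions for data
    matrices X (n x p) and Y (n x q). Illustrative sketch only."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = Xc.T @ Xc / (n - 1)
    Syy = Yc.T @ Yc / (n - 1)
    Sxy = Xc.T @ Yc / (n - 1)

    def inv_sqrt(S):
        # Inverse matrix square root via eigendecomposition.
        vals, vecs = np.linalg.eigh(S)
        return vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T

    # Canonical correlations are the singular values of the whitened
    # cross-covariance Sxx^{-1/2} Sxy Syy^{-1/2}.
    M = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    U, s, Vt = np.linalg.svd(M)
    A = inv_sqrt(Sxx) @ U[:, :k]   # canonical directions for X
    B = inv_sqrt(Syy) @ Vt[:k].T   # canonical directions for Y
    return s[:k], A, B

# Example: two views driven by shared Gaussian latent components.
rng = np.random.default_rng(0)
Z = rng.normal(size=(1000, 2))
X = Z @ rng.normal(size=(2, 5)) + 0.5 * rng.normal(size=(1000, 5))
Y = Z @ rng.normal(size=(2, 4)) + 0.5 * rng.normal(size=(1000, 4))
rho, A, B = cca_from_samples(X, Y, k=2)
print(rho)   # top-2 canonical correlations

In this sketch, the number of components k plays the role that the relaxation parameter plays in CICA for the Gaussian case: it controls how many correlated directions are extracted.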