Publication

Lossy source coding for a cascade communication system with side-informations

Suhas Diggavi
2006
Conference paper
Abstract

We investigate source coding in a cascade communication system consisting of an encoder, a relay and an end terminal, where both the relay and the end terminal wish to reconstruct source X with certain fidelities. Additionally, side-informations Z and Y are available at the relay and the end terminal, respectively. The side-information Z at the relay is a physically degraded version of side-information Y at the end terminal. Inner and outer bounds for the rate distortion region are provided in this work for general discrete memoryless sources. The rate distortion region is characterized when the source and side-informations are jointly Gaussian and physically degraded. The doubly symmetric binary source is also investigated, and the inner and outer bounds are shown to coincide in certain distortion regimes. A complete equivalence of the rate-distortion region is established between the problem being considered and the side-information scalable source coding problem when there is no side-information at the relay. As a byproduct, the same equivalence can be established between the well-known successive refinement problem and Yamamoto's cascade communication system, without relying on their rate-distortion characterizations.

Related concepts (32)
Information theory
Information theory is the mathematical study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley, in the 1920s, and Claude Shannon in the 1940s. The field, in applied mathematics, is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering. A key measure in information theory is entropy.
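As an illustrative sketch (not part of this publication), the entropy of a discrete distribution mentioned above can be computed directly from its definition, H(p) = -Σ p(x) log2 p(x), in bits:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of uncertainty.
print(entropy([0.5, 0.5]))  # 1.0
# A biased coin is more predictable, hence lower entropy.
print(entropy([0.9, 0.1]))  # ~0.469
```

The function name and interface here are illustrative choices; the zero-probability terms are skipped by convention, since p log p → 0 as p → 0.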
Distortion
In signal processing, distortion is the alteration of the original shape (or other characteristic) of a signal. In communications and electronics it means the alteration of the waveform of an information-bearing signal, such as an audio signal representing sound or a video signal representing images, in an electronic device or communication channel. Distortion is usually unwanted, and so engineers strive to eliminate or minimize it. In some situations, however, distortion may be desirable.
Mutual information
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats or hartleys) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable.
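As a minimal numerical sketch (again, not part of the publication), the mutual information of two discrete variables can be computed from their joint distribution via I(X;Y) = Σ p(x,y) log2 [p(x,y) / (p(x)p(y))]; the helper name and the 2-D-list representation of the joint pmf are illustrative assumptions:

```python
from math import log2

def mutual_information(joint):
    """I(X;Y) in bits from a joint pmf given as a 2-D list joint[x][y]."""
    px = [sum(row) for row in joint]            # marginal of X
    py = [sum(col) for col in zip(*joint)]      # marginal of Y
    mi = 0.0
    for x, row in enumerate(joint):
        for y, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * log2(pxy / (px[x] * py[y]))
    return mi

# Independent fair bits share no information:
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
# Perfectly correlated bits share one full bit:
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
```

Note that the second example also equals the entropy of either marginal, reflecting the link between mutual information and entropy described above.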
Related publications (46)

Time vs. Truth: Age-Distortion Tradeoffs and Strategies for Distributed Inference

Yunus Inan

In 1948, Claude Shannon laid the foundations of information theory, which grew out of a study to find the ultimate limits of source compression, and of reliable communication. Since then, information theory has proved itself not only as a quest to find the ...
EPFL, 2023

Converse for Multi-Server Single-Message PIR with Side Information

Michael Christoph Gastpar, Su Li

Multi-server single-message private information retrieval is studied in the presence of side information. In this problem, K independent messages are replicatively stored at N non-colluding servers. The user wants to privately download one message from the ...
2020

Hard-sphere displacive model of deformation twinning in hexagonal close-packed metals. Revisiting the case of the (56°, a) contraction twins in magnesium

Cyril Cayron

Contraction twinning in magnesium alloys leads to new grains that are misoriented from the parent grain by a rotation (56°, a). The classical shear theory of deformation twinning does not specify the atomic displacements and does not explain why contractio ...
Int Union Crystallography, 2017
Related MOOCs (8)
Digital Signal Processing [retired]
The course provides a comprehensive overview of digital signal processing theory, covering discrete time, Fourier analysis, filter design, sampling, interpolation and quantization; it also includes a
Digital Signal Processing
Digital Signal Processing is the branch of engineering that, in the space of just a few decades, has enabled unprecedented levels of interpersonal communication and of on-demand entertainment. By rewo
Digital Signal Processing I
Basic signal processing concepts, Fourier analysis and filters. This module can be used as a starting point or a basic refresher in elementary DSP