
# Joint source-channel coding with feedback

EPFL thesis, 2005

Abstract

Communication is about conveying information from one point to another subject to certain performance constraints. The information is assumed to be generated by a source and may, for example, represent a voice waveform, the reading of a thermal sensor, or packets generated by a node in a network. The channel is the physical medium over which the actual transmission takes place: it might be a telephone line, a fibre-optic link, or the air over which our voice propagates during a conversation. The channel is usually subject to disturbances of diverse nature, which, in the case of a telephone link, might include crosstalk from other users, frequency-selective distortion, or thermal noise.

There are various approaches to the problem of conveying the source across the channel, the most common being the separation-based approach. Here, the problem is broken down into two parts: efficient representation (also called source coding) and reliable transmission (also called channel coding). Shannon showed that for a wide variety of situations, breaking the original problem into these two simpler problems imposes no restriction on the performance of a communication system. However, this property may fail to hold in point-to-point communications if the source is non-ergodic, the channel is time-varying, the system is delay- or complexity-constrained, or when we are dealing with a network scenario. In such cases, there are potential benefits to be gained by looking at the problem from a joint source-channel coding perspective rather than the two narrower perspectives of source and channel coding. Another result, again due to Shannon, shows that a noiseless feedback link from the output of the channel to the encoder does not increase the capacity of a point-to-point memoryless channel; the presence of feedback can, however, help decrease the implementation complexity and the delay of the system.

In this thesis, we combine these two ideas and study feedback communications from a joint source-channel coding perspective. We first outline a joint source-channel coding scheme and show rigorously that the expected number of channel uses needed to decode each source symbol approaches the source entropy divided by the channel capacity. This is a block decoding strategy whose complexity is exponential in the block length; the scheme is nevertheless optimal in the sense that no other scheme can decode source symbols with fewer channel uses on average.

We next focus on a modified low-complexity variant of this scheme, which has a number of desirable properties. First, the source symbols are decoded one by one, and the delay per source symbol is essentially constant. Second, simulation results show that symbol errors remain confined locally and do not propagate. Third, the scheme can adaptively learn the parameters of interest and approach optimal performance. Finally, the presence of a high-capacity feedback link makes the structure of the encoder extremely simple.

In the final part of the thesis, we extend the single-user scheme to the case where two users send data to the same receiver. We show that the computational complexity of each user remains the same as in the single-user case, while the space complexity of one of the users increases; the reduction in complexity over the feedback-free case is nevertheless still significant.
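The benchmark stated in the abstract, source entropy H divided by channel capacity C channel uses per source symbol, can be made concrete with standard formulas. A minimal sketch, assuming a hypothetical Bernoulli(0.1) source and a BSC with crossover probability 0.05; the parameters and the retransmit-until-correct comparison scheme are illustrative, not the thesis's construction:

```python
import math
import random

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Hypothetical parameters: a Bernoulli(0.1) source sent over a BSC(0.05).
source_bias = 0.1      # P(source symbol = 1)
crossover = 0.05       # BSC flip probability

H = h2(source_bias)    # source entropy (bits per source symbol)
C = 1 - h2(crossover)  # BSC capacity (bits per channel use)
limit = H / C          # optimal channel uses per source symbol

# Naive comparison scheme that also uses noiseless feedback: the encoder
# sees each received symbol and simply retransmits until it arrives
# intact, costing 1/(1-p) channel uses per raw bit in expectation.
random.seed(0)
trials = 100_000
uses = 0
for _ in range(trials):
    while True:
        uses += 1
        if random.random() > crossover:  # symbol received correctly
            break
naive = uses / trials

print(f"H/C lower bound         : {limit:.3f} channel uses per source symbol")
print(f"retransmit-until-correct: {naive:.3f} channel uses per raw bit")
```

The naive scheme already exploits feedback but spends about 1/(1-p) uses per raw bit; capacity-achieving schemes close the gap down to H/C.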


Related MOOCs (14)

Related concepts (46)

Digital Signal Processing I

Basic signal processing concepts, Fourier analysis, and filters. This module can
be used as a starting point or a basic refresher in elementary DSP.

Digital Signal Processing II

Adaptive signal processing, A/D and D/A. This module provides the basic
tools for adaptive filtering and a solid mathematical framework for sampling and
quantization.

Digital Signal Processing III

Advanced topics: this module covers real-time audio processing (with
examples on a hardware board), image processing and communication system design.

Noisy-channel coding theorem

In information theory, the noisy-channel coding theorem (sometimes Shannon's theorem or Shannon's limit) establishes that for any given degree of noise contamination of a communication channel, it is possible to communicate discrete data (digital information) nearly error-free up to a computable maximum rate through the channel. This result was presented by Claude Shannon in 1948 and was based in part on earlier work and ideas of Harry Nyquist and Ralph Hartley.
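The tradeoff the theorem formalizes can be seen in a small sketch using standard formulas: the capacity of a binary symmetric channel is C = 1 - H2(p), while a simple repetition code drives its error down only by letting its rate go to zero. The crossover probability below is an illustrative value:

```python
from math import comb, log2

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel: C = 1 - H2(p) bits/use."""
    return 1.0 - h2(p)

def repetition_error(p, n):
    """Majority-vote decoding error of an n-fold repetition code (n odd)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

p = 0.1
print(f"BSC({p}) capacity: {bsc_capacity(p):.3f} bits per channel use")
for n in (1, 3, 5, 7):
    # Error falls with n, but the rate 1/n falls too; the theorem
    # promises vanishing error at any fixed rate below capacity.
    print(f"n={n}: rate={1/n:.3f}, error={repetition_error(p, n):.4f}")
```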

Channel capacity

Channel capacity, in electrical engineering, computer science, and information theory, is the tight upper bound on the rate at which information can be reliably transmitted over a communication channel. By the noisy-channel coding theorem, the channel capacity of a given channel is the highest information rate (in units of information per unit time) that can be achieved with arbitrarily small error probability.
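As a concrete instance of this definition, the Shannon–Hartley theorem gives the capacity of a band-limited AWGN channel as C = B log2(1 + S/N). The bandwidth and SNR below are illustrative values for a telephone-grade line, not figures from this page:

```python
from math import log2

def shannon_hartley(bandwidth_hz, snr_db):
    """Shannon-Hartley capacity C = B * log2(1 + S/N) in bits/second."""
    snr = 10 ** (snr_db / 10)  # convert dB to a linear power ratio
    return bandwidth_hz * log2(1 + snr)

# A 3 kHz channel at 30 dB SNR supports roughly 30 kbit/s.
c = shannon_hartley(3000, 30)
print(f"capacity ~ {c:.0f} bit/s")
```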

Complexity class

In computational complexity theory, a complexity class is a set of computational problems "of related resource-based complexity". The two most commonly analyzed resources are time and memory. In general, a complexity class is defined in terms of a type of computational problem, a model of computation, and a bounded resource like time or memory. In particular, most complexity classes consist of decision problems that are solvable with a Turing machine, and are differentiated by their time or space (memory) requirements.
