
# Feedback communication over unknown channels

Abstract

Suppose Q is a family of discrete memoryless channels. An unknown member of Q will be available, with perfect, causal output feedback, for communication. Is there a coding scheme (possibly with variable transmission time) that can achieve the Burnashev error exponent uniformly over Q? For two families of channels we show that the answer is yes. Furthermore, for each of these two classes, in addition to achieving the maximum error exponent, it is possible to uniformly attain any given fraction of the channel capacity. Therefore, in terms of achievable rates and delay, there are situations in which knowledge of the channel becomes irrelevant. In the second part of the thesis, we show that for arbitrary sets of channels the Burnashev error exponent cannot, in general, be uniformly achieved. In particular, we give a sufficient condition on a pair of channels under which no coding strategy attains Burnashev's exponent simultaneously on both channels. In the third part, we study a scenario in which communication proceeds by first testing the channel by means of a training sequence, then coding according to the channel estimate. We provide an upper bound on the maximum error exponent achievable by such coding schemes. This bound is typically much lower than the maximum error exponent achievable over a channel with feedback; for example, in the case of binary symmetric channels, its slope vanishes at capacity. This result suggests that, in terms of error exponent, a good universal feedback scheme combines channel estimation with information delivery rather than separating them. In the final chapter, we address the question of communicating quickly and reliably. We consider the simple setting of two-message communication over a known channel with feedback. We propose a simple decoding rule and show that it minimizes a weighted combination of the probability of error and the decoding delay, for a certain range of crossover probabilities and combination weights.
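For context (this formula is background, not part of the abstract): the Burnashev error exponent referred to above is the best error exponent achievable over a known discrete memoryless channel with noiseless feedback and variable-length coding. For rates R below the capacity C it takes the form

```latex
% Burnashev's error exponent for a DMC with noiseless feedback,
% valid for rates 0 <= R <= C:
E_B(R) = C_1 \left( 1 - \frac{R}{C} \right),
\qquad
C_1 = \max_{x,\, x'} D\!\left( P_{Y \mid X = x} \,\middle\|\, P_{Y \mid X = x'} \right),
```

where C_1 is the largest Kullback–Leibler divergence between the output distributions induced by two input letters. Note that E_B(R) is a straight line in R, positive all the way up to capacity.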


Related concepts (7)

Related publications (4)

Error exponent

In information theory, the error exponent of a channel code or source code over the block length of the code is the rate at which the error probability decays exponentially with the block length. Formally, it is defined as the limiting ratio of the negative logarithm of the error probability to the block length of the code, for large block lengths. For example, if the probability of error of a decoder drops as $e^{-n\alpha}$, where $n$ is the block length, the error exponent is $\alpha$. In this example, $-\frac{\ln P_{\text{error}}}{n}$ approaches $\alpha$ for large $n$.
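The definition can be illustrated numerically. A minimal sketch (not from the thesis): for the length-n repetition code over a BSC with crossover probability p, decoded by majority vote, the error probability is an exact binomial tail and its exponent is known to be $D(1/2 \| p) = -\ln\!\big(2\sqrt{p(1-p)}\big)$. The snippet below evaluates $-\ln P_{\text{error}}/n$ at a large n and compares it to that limit.

```python
import math

def log_binom_tail(n, k0, p):
    """log of P(Binomial(n, p) >= k0), computed stably in log domain."""
    lp, lq = math.log(p), math.log1p(-p)
    logs = [math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + k * lp + (n - k) * lq
            for k in range(k0, n + 1)]
    m = max(logs)  # log-sum-exp to avoid underflow
    return m + math.log(sum(math.exp(x - m) for x in logs))

p = 0.1
n = 1001  # odd length, so majority decoding has no ties
# Repetition code over BSC(p): decoding fails iff more than half the bits flip.
log_pe = log_binom_tail(n, (n + 1) // 2, p)
exponent_estimate = -log_pe / n
# Known limiting exponent for this code: D(1/2 || p) = -ln(2 * sqrt(p * (1 - p)))
exponent_limit = -math.log(2 * math.sqrt(p * (1 - p)))
print(exponent_estimate, exponent_limit)
```

At n = 1001 the finite-length estimate is already within a few thousandths of the limit, matching the "limiting ratio" in the definition above.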

Binary symmetric channel

A binary symmetric channel (or BSCp) is a common communications channel model used in coding theory and information theory. In this model, a transmitter wishes to send a bit (a zero or a one), and the receiver will receive a bit. The bit will be "flipped" with a "crossover probability" of p, and otherwise is received correctly. This model can be applied to varied communication channels such as telephone lines or disk drive storage.
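The BSC is simple enough to simulate directly. A minimal sketch (illustrative, not from the source): pass bits through the channel by flipping each one independently with probability p, and compute the channel's capacity $C = 1 - h_2(p)$, where $h_2$ is the binary entropy function.

```python
import math
import random

def bsc(bits, p, rng):
    """Pass a bit sequence through a BSC: flip each bit independently w.p. p."""
    return [b ^ (rng.random() < p) for b in bits]

def bsc_capacity(p):
    """Capacity of BSC(p) in bits per channel use: C = 1 - h2(p)."""
    if p in (0.0, 1.0):
        return 1.0
    h2 = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1 - h2

rng = random.Random(0)
sent = [rng.randint(0, 1) for _ in range(100_000)]
received = bsc(sent, 0.11, rng)
flip_rate = sum(s != r for s, r in zip(sent, received)) / len(sent)
print(flip_rate, bsc_capacity(0.11))
```

The empirical flip rate concentrates around p = 0.11, for which the capacity is almost exactly 0.5 bits per channel use; capacity drops to 0 at p = 0.5, where the output is independent of the input.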

Feedback

Feedback occurs when outputs of a system are routed back as inputs as part of a chain of cause-and-effect that forms a circuit or loop. The system can then be said to feed back into itself. The notion of cause-and-effect has to be handled carefully when applied to feedback systems: Simple causal reasoning about a feedback system is difficult because the first system influences the second and second system influences the first, leading to a circular argument.

Emre Telatar, Aslan Tchamkerten

Suppose Q is a family of discrete memoryless channels. An unknown member of Q will be available, with perfect, causal output feedback for communication. We study a scenario where communication is carr…

2006

Shannon, in his landmark 1948 paper, developed a framework for characterizing the fundamental limits of information transmission. Among other results, he showed that reliable communication over a chan…

Emre Telatar, Aslan Tchamkerten

Suppose Q is a family of discrete memoryless channels. An unknown member of Q is available with perfect (causal) feedback for communication. A recent result (A. Tchamkerten and I.E. Telatar) shows the…

2005