
# On the use of training sequences for channel estimation

## Abstract

Suppose Q is a family of discrete memoryless channels. An unknown member of Q is available, with perfect (causal) feedback, for communication. A recent result (A. Tchamkerten and I. E. Telatar) shows the existence, for certain families of channels (e.g., binary symmetric channels and Z channels), of coding schemes that achieve Burnashev's exponent universally over these families. In other words, in certain cases there is no loss in error exponent from ignoring the channel: transmitter and receiver can design optimal blind coding schemes that perform as well as the best feedback coding schemes tuned for the channel under use. Here we study the situation where communication is carried out by first testing the channel by means of a training sequence, and then coding the information according to the channel estimate. We provide an upper bound on the maximum error exponent achievable by any such scheme. For binary symmetric channels and Z channels, this bound is much lower than Burnashev's exponent. This suggests that, in terms of error exponent, a good universal feedback scheme entangles channel estimation with information delivery rather than separating them.
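The separated scheme the abstract critiques can be sketched in two phases: spend the first channel uses on a known training sequence, form an empirical estimate of the channel from the feedback, then tune the subsequent coding phase to that estimate. Below is a minimal illustration of the training phase for a binary symmetric channel; all names and parameters are ours, chosen for exposition, not taken from the paper.

```python
import random

def estimate_crossover(m, true_p, rng=None):
    """Training phase: send m known bits (say, all zeros) through a BSC
    with crossover probability true_p; perfect feedback lets the
    transmitter count the flips and form an empirical estimate of p."""
    rng = rng or random.Random(0)
    flips = sum(rng.random() < true_p for _ in range(m))
    return flips / m

# The coding phase (not shown) would then be tuned to p_hat rather than
# to the true, unknown crossover probability.
p_hat = estimate_crossover(m=1000, true_p=0.1)
```

The paper's upper bound shows that any such two-phase strategy pays a price in error exponent, however accurate the estimate becomes.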



## Related concepts (7)

### Binary symmetric channel

A binary symmetric channel (or BSCp) is a common communications channel model used in coding theory and information theory. In this model, a transmitter wishes to send a bit (a zero or a one), and the receiver will receive a bit. The bit will be "flipped" with a "crossover probability" of p, and otherwise is received correctly. This model can be applied to varied communication channels such as telephone lines or disk drive storage.
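The model above is easy to simulate: each transmitted bit is flipped independently with probability p. A brief sketch (function names are ours):

```python
import random

def bsc(bits, p, rng=None):
    """Pass a bit sequence through a binary symmetric channel:
    each bit is flipped independently with crossover probability p."""
    rng = rng or random.Random(0)
    return [b ^ (rng.random() < p) for b in bits]

received = bsc([0, 1, 1, 0, 1], p=0.1)
```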

### Error exponent

In information theory, the error exponent of a channel code or source code over the block length of the code is the rate at which the error probability decays exponentially with the block length. Formally, it is defined as the limiting ratio of the negative logarithm of the error probability to the block length of the code for large block lengths. For example, if the probability of error of a decoder drops as e^(−nα), where n is the block length, the error exponent is α. In this example, −ln(P_error)/n approaches α for large n.
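To see the definition in action, one can estimate −ln(P_error)/n numerically. The Monte Carlo sketch below uses an n-fold repetition code with majority decoding over a BSC; as n grows, the ratio settles toward the code's error exponent. All parameters here are illustrative choices of ours.

```python
import math
import random

def repetition_error_prob(n, p, trials=20000, rng=None):
    """Monte Carlo estimate of the majority-decoding error probability
    of an n-fold repetition code over a BSC with crossover p (n odd)."""
    rng = rng or random.Random(1)
    errors = sum(
        # more than n/2 flips means the majority vote decodes wrongly
        sum(rng.random() < p for _ in range(n)) > n // 2
        for _ in range(trials)
    )
    return errors / trials

for n in (3, 7, 11):
    pe = repetition_error_prob(n, p=0.2)
    exponent = -math.log(pe) / n if pe > 0 else float("inf")
    print(f"n={n:2d}  P_error~{pe:.4f}  -ln(P_error)/n~{exponent:.3f}")
```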

### Coding theory

Coding theory is the study of the properties of codes and their respective fitness for specific applications. Codes are used for data compression, cryptography, error detection and correction, data transmission and data storage. Codes are studied by various scientific disciplines—such as information theory, electrical engineering, mathematics, linguistics, and computer science—for the purpose of designing efficient and reliable data transmission methods.

## Related publications (5)


- Shannon, in his landmark 1948 paper, developed a framework for characterizing the fundamental limits of information transmission. Among other results, he showed that reliable communication over a chan…
- Suppose Q is a family of discrete memoryless channels. An unknown member of Q will be available, with perfect, causal output feedback for communication. Is there a coding scheme (possibly with variabl…
- Emre Telatar, Aslan Tchamkerten (2006): Suppose Q is a family of discrete memoryless channels. An unknown member of Q will be available, with perfect, causal output feedback for communication. We study a scenario where communication is carr…