Exponential bounds for list size moments and error probability
Related publications (32)
Suppose Q is a family of discrete memoryless channels. An unknown member of Q will be available, with perfect, causal output feedback for communication. We study a scenario where communication is carried by first testing the channel by means of a training ...
We present a new model for LT codes which simplifies the analysis of the error probability of decoding by belief propagation. For any given degree distribution, we provide the first rigorous expression for the limiting bit-error probability as the length o ...
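As a rough illustration of the mechanics this abstract refers to, here is a minimal Python sketch of LT encoding and peeling (belief-propagation) decoding over an erasure channel. The degree distribution, block length, and binary symbol alphabet are illustrative assumptions, not the model or parameters analysed in the paper.

```python
# Minimal LT-code sketch: encode by XOR-ing randomly chosen source symbols,
# decode by the peeling (belief-propagation) decoder on an erasure channel.
import random

def lt_encode(source, rng, degree_dist):
    """Produce one coded symbol: XOR of d source symbols chosen uniformly at random."""
    d = rng.choices(range(1, len(degree_dist) + 1), weights=degree_dist)[0]
    neighbors = rng.sample(range(len(source)), d)
    value = 0
    for i in neighbors:
        value ^= source[i]
    return set(neighbors), value

def lt_decode(received, k):
    """Peeling decoder: repeatedly resolve coded symbols of residual degree one."""
    decoded = [None] * k
    symbols = [(set(nbrs), val) for nbrs, val in received]  # mutable copies
    progress = True
    while progress:
        progress = False
        for nbrs, val in symbols:
            if len(nbrs) == 1:
                i = next(iter(nbrs))
                if decoded[i] is None:
                    decoded[i] = val
                    progress = True
        # subtract every decoded source symbol from the coded symbols that cover it
        remaining = []
        for nbrs, val in symbols:
            for i in list(nbrs):
                if decoded[i] is not None:
                    nbrs.discard(i)
                    val ^= decoded[i]
            if nbrs:
                remaining.append((nbrs, val))
        symbols = remaining
    return decoded

if __name__ == "__main__":
    rng = random.Random(0)
    k = 100                                   # number of source bits
    source = [rng.randint(0, 1) for _ in range(k)]
    degree_dist = [0.1, 0.4, 0.3, 0.2]        # toy distribution; real LT codes use a (robust) soliton
    n = 160                                   # coded symbols collected by the receiver
    received = [lt_encode(source, rng, degree_dist) for _ in range(n)]
    decoded = lt_decode(received, k)
    errors = sum(1 for s, d in zip(source, decoded) if d != s)  # unrecovered symbols count as errors
    print(f"unrecovered or wrong symbols: {errors}/{k}")
```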
Suppose Q is a family of discrete memoryless channels. An unknown member of Q is available with perfect (causal) feedback for communication. A recent result (A. Tchamkerten and I.E. Telatar) shows the existence, for certain families of channels (e.g. binar ...
Fountain codes have been successfully employed for reliable and efficient transmission of information via erasure channels with unknown erasure rates. This paper introduces the notion of fountain capacity for arbitrary channels, and shows that it is equal ...
We consider communication over a time-invariant discrete memoryless channel with noiseless and instantaneous feedback. We assume that the communicating parties are not aware of the underlying channel; however, they know that it belongs to some specific fami ...
Communications is about conveying information from one point to another subject to certain performance constraints. The information is assumed to be generated by a source and may, for example, represent a voice waveform, the reading of a thermal sensor, or ...
Suppose Q is a family of discrete memoryless channels. An unknown member of Q will be available, with perfect, causal output feedback for communication. Is there a coding scheme (possibly with variable transmission time) that can achieve the Burnashev erro ...
The random coding capacity of the Gaussian arbitrarily varying channel (GAVC) under a maximal probability of error criterion is equal to that of an additive white Gaussian noise (AWGN) channel with the interference power as additional noise. The determinis ...
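As a hedged restatement of the result this abstract quotes (the symbols below are ours: input power constraint P, Gaussian noise variance sigma^2, interference power constraint Lambda), the random coding capacity takes the AWGN form with the interference power folded into the noise:

```latex
% Random coding capacity of the Gaussian arbitrarily varying channel under
% input power constraint P, ambient noise variance \sigma^2, and
% interference (jammer) power constraint \Lambda:
\[
  C_{\text{random}} \;=\; \tfrac{1}{2}\log\!\left(1 + \frac{P}{\sigma^2 + \Lambda}\right).
\]
```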
Burnashev in 1976 gave an exact expression for the reliability function of a discrete memoryless channel (DMC) with noiseless feedback. A coding scheme that achieves this exponent needs, in general, to know the statistics of the channel. Suppose now that t ...
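For reference, Burnashev's classical expression for the feedback reliability function is the following (the notation is ours and only restates the 1976 result, not the cited paper's contribution):

```latex
% Burnashev (1976): reliability function of a DMC with noiseless feedback,
% achievable with variable-length coding, for all rates 0 <= R <= C:
\[
  E(R) \;=\; C_1\!\left(1 - \frac{R}{C}\right),
  \qquad
  C_1 \;=\; \max_{x,\,x'} D\!\big(P(\cdot\mid x)\,\big\|\,P(\cdot\mid x')\big),
\]
% where C is the Shannon capacity of the channel and D(.||.) is the
% Kullback-Leibler divergence between the output distributions induced
% by the input letters x and x'.
```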
We consider communication over a time-invariant discrete memoryless channel (DMC) with noiseless and instantaneous feedback. We assume that the transmitter and the receiver are not aware of the underlying channel; however, they know that it belongs to some ...