# Decoding methods

Summary

In coding theory, decoding is the process of translating received messages into codewords of a given code. There are many common methods of mapping messages to codewords. These are often used to recover messages sent over a noisy channel, such as a binary symmetric channel.
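As a concrete illustration (not from the original page), a binary symmetric channel with crossover probability $p$ can be simulated by flipping each transmitted bit independently with probability $p$; the message and seed below are illustrative assumptions:

```python
import random

def bsc(bits, p, rng=random.Random(0)):
    """Simulate a binary symmetric channel: flip each bit independently
    with probability p (the fixed seed just makes the sketch repeatable)."""
    return [b ^ (rng.random() < p) for b in bits]

sent = [1, 0, 1, 1, 0, 0, 1, 0]
received = bsc(sent, p=0.1)
errors = sum(s != r for s, r in zip(sent, received))
```

Decoding, as described below, is the receiver's attempt to undo the flips this channel introduces.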
Let $C \subset \mathbb{F}_2^n$ be a binary code of length $n$; let $x, y$ be elements of $\mathbb{F}_2^n$; and let $d(x,y)$ be the Hamming distance between those elements.
Given the received message $x \in \mathbb{F}_2^n$, ideal observer decoding generates the codeword $y \in C$ that maximises
$$\mathbb{P}(y \text{ sent} \mid x \text{ received}).$$
For example, a person can choose the codeword $y$ that is most likely to be received as the message $x$ after transmission.
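This rule can be sketched by brute force for a small code. The sketch below assumes a binary symmetric channel with crossover probability $p < 1/2$, so that $\mathbb{P}(x \mid y) = p^d (1-p)^{n-d}$ with $d$ the Hamming distance; the toy code and priors are illustrative assumptions, not part of the original page:

```python
def hamming(a, b):
    """Hamming distance: number of differing positions."""
    return sum(x != y for x, y in zip(a, b))

def ideal_observer_decode(received, codewords, priors, p):
    """Pick the codeword maximizing P(y sent | x received).

    By Bayes' theorem this is proportional to P(x | y) * P(y), where
    over a BSC, P(x | y) = p**d * (1 - p)**(n - d).
    """
    n = len(received)
    def posterior(y):
        d = hamming(received, y)
        return (p ** d) * ((1 - p) ** (n - d)) * priors[y]
    return max(codewords, key=posterior)

# Illustrative 3-bit repetition code with uniform priors.
code = [(0, 0, 0), (1, 1, 1)]
priors = {c: 0.5 for c in code}
decoded = ideal_observer_decode((1, 0, 1), code, priors, p=0.1)  # -> (1, 1, 1)
```

With non-uniform priors the same routine would favour the more frequently sent codeword, which is exactly how ideal observer decoding differs from maximum likelihood decoding.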
The decoded codeword is not always unique: there may be more than one codeword with an equal likelihood of mutating into the received message. In such a case, the sender and receiver(s) must agree ahead of time on a decoding convention. Popular conventions include:

- Request that the codeword be resent (automatic repeat-request).
- Choose any codeword at random from the set of most likely codewords.
- If another code follows, mark the ambiguous bits of the codeword as erasures and hope that the outer code disambiguates them.
Given a received vector $x \in \mathbb{F}_2^n$, maximum likelihood decoding picks a codeword $y \in C$ that maximises
$$\mathbb{P}(x \text{ received} \mid y \text{ sent}),$$
that is, the codeword $y$ that maximises the probability that $x$ was received, given that $y$ was sent. If all codewords are equally likely to be sent, then this scheme is equivalent to ideal observer decoding.
In fact, by Bayes' theorem,
$$\mathbb{P}(x \text{ received} \mid y \text{ sent}) = \frac{\mathbb{P}(x \text{ received},\, y \text{ sent})}{\mathbb{P}(y \text{ sent})} = \mathbb{P}(y \text{ sent} \mid x \text{ received}) \cdot \frac{\mathbb{P}(x \text{ received})}{\mathbb{P}(y \text{ sent})}.$$
Upon fixing the received vector $x$, the factor $\mathbb{P}(x \text{ received})$ is constant, and $\mathbb{P}(y \text{ sent})$ is constant because all codewords are equally likely to be sent. Therefore, $\mathbb{P}(x \text{ received} \mid y \text{ sent})$ is maximised as a function of the variable $y$ precisely when $\mathbb{P}(y \text{ sent} \mid x \text{ received})$ is maximised, and the claim follows.
As with ideal observer decoding, a convention must be agreed to for non-unique decoding.
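Over a binary symmetric channel with crossover probability $p < 1/2$, the likelihood $p^d (1-p)^{n-d}$ is strictly decreasing in the Hamming distance $d$, so maximum likelihood decoding reduces to picking the nearest codeword. A minimal brute-force sketch, assuming a toy repetition code (an illustrative choice, not from the page):

```python
def hamming(a, b):
    """Hamming distance: number of differing positions."""
    return sum(x != y for x, y in zip(a, b))

def ml_decode_bsc(received, codewords):
    """Maximum likelihood decoding over a BSC with p < 1/2:
    equivalent to minimum Hamming distance decoding."""
    return min(codewords, key=lambda y: hamming(received, y))

# Illustrative (3, 1) repetition code: codewords 000 and 111.
code = [(0, 0, 0), (1, 1, 1)]
decoded = ml_decode_bsc((1, 1, 0), code)  # nearest codeword: (1, 1, 1)
```

Enumerating all codewords is exponential in general, which is why structured codes come with dedicated decoding algorithms instead of this brute-force search.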
The maximum likelihood decoding problem can also be modeled as an integer programming problem.
The maximum likelihood decoding algorithm is an instance of the "marginalize a product function" problem which is solved by applying the generalized distributive law.


Related publications (2)

Related concepts (11)

Related courses (9)

Related lectures (114)

Error correction code

In computing, telecommunication, information theory, and coding theory, forward error correction (FEC) or channel coding is a technique used for controlling errors in data transmission over unreliable or noisy communication channels. The central idea is that the sender encodes the message in a redundant way, most often by using an error correction code or error correcting code (ECC). The redundancy allows the receiver not only to detect errors that may occur anywhere in the message, but often to correct a limited number of errors.
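As a toy illustration of this idea (not taken from the page), a triple-repetition code adds redundancy that lets the receiver correct any single bit error per block by majority vote:

```python
def encode(bits):
    """Rate-1/3 repetition code: repeat each message bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(coded):
    """Majority vote over each block of three received bits."""
    return [int(sum(coded[i:i + 3]) >= 2) for i in range(0, len(coded), 3)]

msg = [1, 0, 1]
tx = encode(msg)   # [1, 1, 1, 0, 0, 0, 1, 1, 1]
tx[4] ^= 1         # one channel error in the second block
corrected = decode(tx)  # -> [1, 0, 1]
```

Practical FEC schemes (Hamming, Reed-Solomon, convolutional codes) achieve far better rate-versus-protection trade-offs than repetition, but the mechanism is the same: redundancy at the sender, error correction at the receiver.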

Hamming distance

In information theory, the Hamming distance between two strings of equal length is the number of positions at which the corresponding symbols are different. In other words, it measures the minimum number of substitutions required to change one string into the other, or the minimum number of errors that could have transformed one string into the other. In a more general context, the Hamming distance is one of several string metrics for measuring the edit distance between two sequences.
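The definition translates directly into code; a minimal sketch:

```python
def hamming_distance(a, b):
    """Number of positions at which two equal-length sequences differ."""
    if len(a) != len(b):
        raise ValueError("sequences must have equal length")
    return sum(x != y for x, y in zip(a, b))

assert hamming_distance("karolin", "kathrin") == 3
assert hamming_distance("10111", "10010") == 2
```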

COM-102: Advanced information, computation, communication II

Text, sound, and images are examples of information sources stored in our computers and/or communicated over the Internet. How do we measure, compress, and protect the information they contain?

COM-302: Principles of digital communications

This course is on the foundations of digital communication. The focus is on the transmission problem (rather than being on source coding).

EE-543: Advanced wireless receivers

Students extend their knowledge on wireless communication systems to spread-spectrum communication and to multi-antenna systems. They also learn about the basic information theoretic concepts, about c

Error-Correcting Codes: Reed-Solomon (COM-102: Advanced information, computation, communication II)

Explains Reed-Solomon error-correcting codes and their unique decoding property over finite fields.

Error Correction Codes: Decoding and Communication (COM-404: Information theory and coding)

Explores error correction codes, decoding algorithms, and their role in communication systems.

Convolutional Codes: Encoding and Decoding (COM-302: Principles of digital communications)

Explains the encoding and decoding process of convolutional codes using constellation prints.

Anna-Lena Horlemann, Joachim Rosenthal

Cryptosystems based on rank metric codes have been considered as an alternative to McEliece cryptosystems due to the relative difficulty of solving the rank syndrome decoding problem. Generic attacks

We revise the proof of low-rate upper bounds on the reliability function of discrete memoryless channels for ordinary and list-decoding schemes, in particular Berlekamp and Blinovsky's zero-rate bound