Information theory
Information theory is the mathematical study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley in the 1920s, and Claude Shannon in the 1940s. The field, a branch of applied mathematics, is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering. A key measure in information theory is entropy.
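As a brief illustration of the entropy measure mentioned above, the following Python sketch computes the Shannon entropy H = -Σ p·log2(p) of a discrete probability distribution; the distributions used here are made-up examples.

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution given as a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of entropy; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # about 0.469
```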
Streaming media
Streaming media is multimedia that is delivered and consumed in a continuous manner from a source, with little or no intermediate storage in network elements. Streaming refers to the delivery method of content, rather than the content itself. Distinguishing delivery method from the media applies specifically to telecommunications networks, as most of the traditional media delivery systems are either inherently streaming (e.g. radio, television) or inherently non-streaming (e.g. books, videotapes, audio CDs).
Huffman coding
In computer science and information theory, a Huffman code is a particular type of optimal prefix code that is commonly used for lossless data compression. The process of finding or using such a code is Huffman coding, an algorithm developed by David A. Huffman while he was a Sc.D. student at MIT, and published in the 1952 paper "A Method for the Construction of Minimum-Redundancy Codes". The output from Huffman's algorithm can be viewed as a variable-length code table for encoding a source symbol (such as a character in a file).
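To make the variable-length code table concrete, here is a minimal Python sketch of the Huffman construction: repeatedly merge the two least-frequent nodes and read each symbol's code off the sequence of merges. The symbol frequencies are invented for illustration.

```python
import heapq
from itertools import count

def huffman_code(freqs):
    """Build a prefix code (symbol -> bit string) from a dict of symbol frequencies."""
    tiebreak = count()  # keeps heap entries comparable when frequencies tie
    heap = [(f, next(tiebreak), {sym: ""}) for sym, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, codes1 = heapq.heappop(heap)  # least frequent subtree
        f2, _, codes2 = heapq.heappop(heap)  # next least frequent subtree
        merged = {s: "0" + c for s, c in codes1.items()}
        merged.update({s: "1" + c for s, c in codes2.items()})
        heapq.heappush(heap, (f1 + f2, next(tiebreak), merged))
    return heap[0][2]

print(huffman_code({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5}))
```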
Mutual information
In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats or hartleys) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable.
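For discrete variables, mutual information can be computed directly from a joint probability table as I(X;Y) = Σ p(x,y)·log2[p(x,y) / (p(x)·p(y))]. The sketch below shows this; the joint distributions are made up for illustration.

```python
import math

def mutual_information(joint):
    """Mutual information in bits; `joint` is a 2-D list of joint probabilities p(x, y)."""
    px = [sum(row) for row in joint]        # marginal distribution of X
    py = [sum(col) for col in zip(*joint)]  # marginal distribution of Y
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log2(pxy / (px[i] * py[j]))
    return mi

# Perfectly correlated variables share 1 bit; independent variables share 0 bits.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```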
Time–frequency representation
A time–frequency representation (TFR) is a view of a signal (taken to be a function of time) represented over both time and frequency. Time–frequency analysis means analysis into the time–frequency domain provided by a TFR. This is achieved by using a formulation often called "time–frequency distribution", abbreviated as TFD. TFRs are often complex-valued fields over time and frequency, where the modulus of the field represents either amplitude or "energy density" (the concentration of the root mean square over time and frequency), and the argument of the field represents phase.
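One widely used TFR is the short-time Fourier transform (STFT), whose squared modulus is the spectrogram. Below is a minimal NumPy sketch; the window length, hop size, and test tone are arbitrary choices for illustration.

```python
import numpy as np

def stft(x, win_len=256, hop=128):
    """Complex STFT: rows are time frames, columns are frequency bins."""
    window = np.hanning(win_len)
    frames = []
    for start in range(0, len(x) - win_len + 1, hop):
        segment = x[start:start + win_len] * window
        frames.append(np.fft.rfft(segment))
    return np.array(frames)

# A 440 Hz tone sampled at 8 kHz; its magnitude peaks near bin 440/8000*256 ≈ 14.
fs = 8000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 440 * t)
Z = stft(tone)
print(Z.shape)                # (number of frames, win_len//2 + 1)
print(np.abs(Z[0]).argmax())  # around 14
```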
Broadband
In telecommunications, broadband is wide-bandwidth data transmission that transports multiple signals over a wide range of frequencies and Internet traffic types, enabling messages to be sent simultaneously; it is used in fast internet connections. The medium can be coaxial cable, optical fiber, wireless Internet (radio), twisted pair, or satellite. In the context of Internet access, broadband is used to mean any high-speed Internet access that is always on and faster than dial-up access over traditional analog or ISDN PSTN services.
Entropy coding
In information theory, an entropy coding (or entropy encoding) is any lossless data compression method that attempts to approach the lower bound declared by Shannon's source coding theorem, which states that any lossless data compression method must have an expected code length greater than or equal to the entropy of the source. More precisely, the source coding theorem states that for any source distribution, the expected code length satisfies $\mathbb{E}_{x\sim P}[\ell(d(x))] \ge \mathbb{E}_{x\sim P}[-\log_b(P(x))]$, where $\ell$ is the number of symbols in a code word, $d$ is the coding function, $b$ is the number of symbols used to make output codes, and $P$ is the probability of the source symbol.
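A small numerical check of this bound, assuming a hypothetical source distribution and a binary prefix code (b = 2) chosen for illustration:

```python
import math

# Hypothetical source distribution and a prefix code for it.
probs = {"a": 0.5, "b": 0.25, "c": 0.25}
code  = {"a": "0", "b": "10", "c": "11"}

expected_length = sum(p * len(code[s]) for s, p in probs.items())
entropy = -sum(p * math.log2(p) for p in probs.values())

print(expected_length)  # 1.5 bits per symbol
print(entropy)          # 1.5 bits; the bound holds with equality here
```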
Information content
In information theory, the information content, self-information, surprisal, or Shannon information is a basic quantity derived from the probability of a particular event occurring from a random variable. It can be thought of as an alternative way of expressing probability, much like odds or log-odds, but which has particular mathematical advantages in the setting of information theory. The Shannon information can be interpreted as quantifying the level of "surprise" of a particular outcome.
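For a concrete sense of surprisal, the self-information of an outcome with probability p is I(p) = -log2(p) bits, so rarer outcomes carry more information; the example probabilities below are illustrative.

```python
import math

def self_information(p):
    """Shannon information (surprisal) in bits of an outcome with probability p."""
    return -math.log2(p)

print(self_information(0.5))    # 1.0 bit (a fair coin flip)
print(self_information(1 / 6))  # ~2.585 bits (one face of a fair die)
print(self_information(0.999))  # ~0.0014 bits (an almost-certain event)
```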
Data communication
Data communication or digital communications, including data transmission and data reception, is the transfer and reception of data in the form of a digital bitstream or a digitized analog signal transmitted over a point-to-point or point-to-multipoint communication channel. Examples of such channels are copper wires, optical fibers, wireless communication using radio spectrum, storage media and computer buses. The data are represented as an electromagnetic signal, such as an electrical voltage, radiowave, microwave, or infrared signal.
Ogg
Ogg is a free, open container format maintained by the Xiph.Org Foundation. The authors of the Ogg format state that it is unrestricted by software patents and is designed to provide for efficient streaming and manipulation of high-quality digital multimedia. Its name is derived from "ogging", jargon from the computer game Netrek. The Ogg container format can multiplex a number of independent streams for audio, video, text (such as subtitles), and metadata. In the Ogg multimedia framework, Theora provides a lossy video layer.