Concept

Units of information

In computing and telecommunications, a unit of information is the capacity of some standard data storage system or communication channel, used to measure the capacities of other systems and channels. In information theory, units of information are also used to measure the information contained in messages and the entropy of random variables.

The most commonly used units of data storage capacity are the bit, the capacity of a system that has only two states, and the byte (or octet), which is equivalent to eight bits. Multiples of these units can be formed with the SI prefixes (power-of-ten prefixes) or the newer IEC binary prefixes (power-of-two prefixes).

In 1928, Ralph Hartley observed a fundamental storage principle, which was further formalized by Claude Shannon in 1945: the information that can be stored in a system is proportional to the logarithm of the number N of possible states of that system, denoted logb N. Changing the base of the logarithm from b to a different number c multiplies the value of the logarithm by a fixed constant, namely logc N = (logc b) logb N. The choice of the base b therefore determines the unit used to measure information. In particular, if b is a positive integer, then the unit is the amount of information that can be stored in a system with b possible states.

When b is 2, the unit is the shannon, equal to the information content of one "bit" (a portmanteau of binary digit). A system with 8 possible states, for example, can store up to log2 8 = 3 bits of information. Other units that have been named include:

Base b = 3: the unit is called the "trit", and is equal to log2 3 (≈ 1.585) bits.
Base b = 10: the unit is called the decimal digit, hartley, ban, decit, or dit, and is equal to log2 10 (≈ 3.322) bits.
Base b = e (the base of natural logarithms): the unit is called the nat, nit, or nepit (from Neperian), and is worth log2 e (≈ 1.443) bits.

The trit, ban, and nat are rarely used to measure storage capacity, but the nat in particular is often used in information theory, because natural logarithms are mathematically more convenient than logarithms in other bases.
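As an informal illustration (not part of the page itself), the base-change rule and the unit conversions above can be checked numerically. The small Python helper below, information_content, is a hypothetical name introduced only for this sketch:

```python
import math

def information_content(num_states: int, base: float = 2.0) -> float:
    """Information storable in a system with `num_states` equally likely
    states, in the unit defined by `base`: 2 gives bits (shannons),
    3 gives trits, 10 gives hartleys, and math.e gives nats."""
    return math.log(num_states, base)

# An 8-state system stores log2(8) = 3 bits; the same capacity
# expressed in the other named units:
print(information_content(8))          # 3.0 bits
print(information_content(8, 3))       # ~1.893 trits
print(information_content(8, 10))      # ~0.903 hartleys
print(information_content(8, math.e))  # ~2.079 nats

# Base-change rule: log_c(N) = log_c(b) * log_b(N)
N, b, c = 8, 2, 10
assert abs(math.log(N, c) - math.log(b, c) * math.log(N, b)) < 1e-12
```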

Related courses (2)
BIO-205: Cellular and molecular biology I
The course covers the regulation of gene expression, which translates the information contained in the genome into function, by adjusting the levels and activities of mRNAs and proteins to the needs o ...
CS-119(c): Information, Computation, Communication
The objective of this course is to introduce students to algorithmic thinking, to familiarize them with the fundamentals of computer science, and to develop a first competence in programming ( ...
Related lectures (18)
Distributed Computing Execution Models
Explores challenges in handling large data sizes in distributed computing and discusses declustering techniques and failure management strategies.
The Z-Transform: Summary and Experiences
Explores the z-transform of sequences, its convergence and properties, and practical applications with function generators and resistors.
Channel Coding: Convolutional Codes
Explores channel coding with a focus on convolutional codes, emphasizing error detection, correction, and decoding processes.
Related publications (24)

EdgeAI-Aware Design of In-Memory Computing Architectures

Marco Antonio Rios

Driven by the demand for real-time processing and the need to minimize latency in AI algorithms, edge computing has experienced remarkable progress. Decision-making AI applications stand out for their heavy reliance on data-centric operations, predominantl ...
EPFL, 2024

HetCache: Synergising NVMe Storage and GPU acceleration for Memory-Efficient Analytics

Anastasia Ailamaki, Periklis Chrysogelos, Hamish Mcniece Hill Nicholson, Syed Mohammad Aunn Raza

Accessing input data is a critical operation in data analytics: i) slow data access significantly degrades performance, and ii) storing everything in the fastest medium, i.e., memory, incurs high operational and hardware costs. Further, while GPUs offer in ...
2023

2D Nanosystems: Applications of 2D Semiconductors for In-Memory Computing

Guilherme Migliato Marega

Machine learning and data processing algorithms have been thriving in finding ways of processing and classifying information by exploiting the hidden trends of large datasets. Although these emerging computational methods have become successful in today's ...
EPFL, 2023
Related concepts (16)
Information
Information is an abstract concept that refers to that which has the power to inform. At the most fundamental level, information pertains to the interpretation (perhaps formally) of that which may be sensed, or their abstractions. Any natural process that is not completely random and any observable pattern in any medium can be said to convey some amount of information. Whereas digital signals and other data use discrete signs to convey information, other phenomena and artefacts such as analogue signals, poems, pictures, music or other sounds, and currents convey information in a more continuous form.
Binary logarithm
In mathematics, the binary logarithm (log2 n) is the power to which the number 2 must be raised to obtain the value n. That is, for any real number x, x = log2 n if and only if 2^x = n. For example, the binary logarithm of 1 is 0, the binary logarithm of 2 is 1, the binary logarithm of 4 is 2, and the binary logarithm of 32 is 5. The binary logarithm is the logarithm to the base 2 and is the inverse function of the power-of-two function. As well as log2, an alternative notation for the binary logarithm is lb (the notation preferred by ISO 31-11 and ISO 80000-2).
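A minimal numerical check of this definition, assuming Python's standard math.log2 (an illustration added here, not part of the page):

```python
import math

# x = log2(n) exactly when 2**x == n, recovering each example above:
for n in (1, 2, 4, 32):
    x = math.log2(n)
    print(f"log2({n}) = {x}")  # 0.0, 1.0, 2.0, 5.0
    assert 2 ** x == n         # inverse of the power-of-two function
```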
Hartley (unit)
The hartley (symbol Hart), also called a ban, or a dit (short for decimal digit), is a logarithmic unit that measures information or entropy, based on base-10 logarithms and powers of 10. One hartley is the information content of an event if the probability of that event occurring is 1/10. It is therefore equal to the information contained in one decimal digit (or dit), assuming a priori equiprobability of each possible value. It is named after Ralph Hartley.
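As a hedged illustration of this definition (added here, not from the page), the information content of an event with probability 1/10 comes out to exactly one hartley, and multiplying by log2 10 converts it to bits:

```python
import math

p = 1 / 10
hartleys = -math.log10(p)        # 1.0 Hart, by definition
bits = hartleys * math.log2(10)  # ~3.322 bits, via the base-change rule
print(hartleys, bits)
```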
