Summary
In mathematics, the binary logarithm (log₂ n) is the power to which the number 2 must be raised to obtain the value n. That is, for any real number x, x = log₂ n if and only if 2^x = n. For example, the binary logarithm of 1 is 0, the binary logarithm of 2 is 1, the binary logarithm of 4 is 2, and the binary logarithm of 32 is 5. The binary logarithm is the logarithm to base 2 and is the inverse function of the power of two function. Besides log₂, an alternative notation for the binary logarithm is lb (the notation preferred by ISO 31-11 and ISO 80000-2).

Historically, the first application of binary logarithms was in music theory, by Leonhard Euler: the binary logarithm of a frequency ratio of two musical tones gives the number of octaves by which the tones differ. Binary logarithms can be used to calculate the length of the representation of a number in the binary numeral system, or the number of bits needed to encode a message in information theory. In computer science, they count the number of steps needed for binary search and related algorithms. Other areas in which the binary logarithm is frequently used include combinatorics, bioinformatics, the design of sports tournaments, and photography.

Binary logarithms are included in the standard C mathematical functions and other mathematical software packages. The integer part of a binary logarithm can be found using the find first set operation on an integer value, or by looking up the exponent of a floating-point value. The fractional part of the logarithm can be calculated efficiently.

History
The powers of two have been known since antiquity; for instance, they appear in Euclid's Elements, Props. IX.32 (on the factorization of powers of two) and IX.36 (half of the Euclid–Euler theorem, on the structure of even perfect numbers). The binary logarithm of a power of two is just its position in the ordered sequence of powers of two. On this basis, Michael Stifel has been credited with publishing the first known table of binary logarithms in 1544.
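As a rough illustration of the computational remarks in the summary (the standard C log2 function, and the integer part of a binary logarithm obtained from a find-first-set-style operation), here is a minimal C sketch. The GCC/Clang builtin __builtin_clz stands in for a hardware find-first-set instruction and is an assumption of this example; log2 itself is part of the standard C math library (link with -lm on most toolchains).

```c
#include <math.h>
#include <stdint.h>
#include <stdio.h>

/* Integer part of log2(n) for n > 0, taken from the position of the highest
 * set bit. __builtin_clz is a GCC/Clang builtin (an assumption for this
 * sketch); a portable loop of right shifts would do the same job. */
static unsigned floor_log2(uint32_t n)
{
    return 31u - (unsigned)__builtin_clz(n);
}

int main(void)
{
    uint32_t n = 100;

    /* Exact value from the standard C math library (C99 and later). */
    printf("log2(%u)        = %f\n", (unsigned)n, log2((double)n));

    /* Integer part via the highest set bit: floor(log2(100)) = 6. */
    printf("floor(log2(%u)) = %u\n", (unsigned)n, floor_log2(n));

    /* Number of bits in the binary representation of n: floor(log2(n)) + 1. */
    printf("bits needed      = %u\n", floor_log2(n) + 1u);

    return 0;
}
```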
About this result
This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.
Related courses (11)
MATH-200: Analysis III
Learn the basics of vector analysis and complex analysis.
MATH-101(g): Analysis I
Study the fundamental concepts of analysis and the differential and integral calculus of real functions of one variable.
MATH-189: Mathematics
This course aims to provide the mathematical foundations needed by the contemporary architect working in a polytechnic school.
Related lectures (49)
Random-Subcube Model
Introduces the Random-Subcube Model (RSM) for constraint satisfaction problems, exploring its structure, phase transitions, and variable freezing.
Limit Study: Sinus and Logarithm
Focuses on studying a limit involving sinus and logarithm functions as x approaches positive infinity.
Differential Calculation: Trigonometric Derivatives
Explores trigonometric derivatives, composition of functions, and inflection points in differential calculation.
Related publications (29)

Chung-type law of the iterated logarithm and exact moduli of continuity for a class of anisotropic Gaussian random fields
Cheuk Yin Lee
We establish a Chung-type law of the iterated logarithm and the exact local and uniform moduli of continuity for a large class of anisotropic Gaussian random fields with a harmonizable-type integral representation and the property of strong local nondeterm ...
INT STATISTICAL INST, 2023
Related people (1)
Related concepts (16)
Indian mathematics
Indian mathematics emerged in the Indian subcontinent from 1200 BCE until the end of the 18th century. In the classical period of Indian mathematics (400 CE to 1200 CE), important contributions were made by scholars like Aryabhata, Brahmagupta, Bhaskara II, and Varāhamihira. The decimal number system in use today was first recorded in Indian mathematics. Indian mathematicians made early contributions to the study of the concept of zero as a number, negative numbers, arithmetic, and algebra.
Nat (unit)
The natural unit of information (symbol: nat), sometimes also nit or nepit, is a unit of information or information entropy, based on natural logarithms and powers of e, rather than the powers of 2 and base 2 logarithms, which define the shannon. This unit is also known by its unit symbol, the nat. One nat is the information content of an event when the probability of that event occurring is 1/e. One nat is equal to 1/ln 2 shannons ≈ 1.44 Sh or, equivalently, 1/ln 10 hartleys ≈ 0.434 Hart.
Hartley (unit)
The hartley (symbol Hart), also called a ban, or a dit (short for decimal digit), is a logarithmic unit that measures information or entropy, based on base 10 logarithms and powers of 10. One hartley is the information content of an event if the probability of that event occurring is 1/10. It is therefore equal to the information contained in one decimal digit (or dit), assuming a priori equiprobability of each possible value. It is named after Ralph Hartley.
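Since the shannon, nat, and hartley differ only in the base of the logarithm used, converting between them is a single multiplication by a change-of-base factor. The short C sketch below illustrates this; the helper function names are hypothetical and chosen for this example only.

```c
#include <math.h>
#include <stdio.h>

/* Hypothetical helpers: convert an amount of information measured in
 * shannons (base 2) into nats (base e) and hartleys (base 10).
 * 1 Sh = ln(2) nat and 1 Sh = log10(2) Hart. */
static double shannons_to_nats(double sh)     { return sh * log(2.0);   }
static double shannons_to_hartleys(double sh) { return sh * log10(2.0); }

int main(void)
{
    /* Prints: 1 Sh = 0.6931 nat = 0.3010 Hart, consistent with the
     * 1 nat ≈ 1.44 Sh and 1 nat ≈ 0.434 Hart figures quoted above. */
    printf("1 Sh = %.4f nat = %.4f Hart\n",
           shannons_to_nats(1.0), shannons_to_hartleys(1.0));
    return 0;
}
```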
Related MOOCs (2)
Trigonometric Functions, Logarithms and Exponentials
This course provides the fundamental knowledge related to trigonometric, logarithmic, and exponential functions. The presentation of the concepts and propositions is supported by a wide range
Trigonometric Functions, Logarithms and Exponentials
This course provides the fundamental knowledge related to trigonometric, logarithmic, and exponential functions. The presentation of the concepts and propositions is supported by a wide range