Related concepts (38)
Speed
In everyday use and in kinematics, the speed (commonly denoted v) of an object is the magnitude of the change of its position per unit of time; it is thus a scalar quantity. The average speed of an object in an interval of time is the distance travelled by the object divided by the duration of the interval; the instantaneous speed is the limit of the average speed as the duration of the time interval approaches zero.
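As a minimal illustrative sketch (the function names and the example position x(t) = 5t² are our own, not from the source), both definitions translate directly into Python, with instantaneous speed approximated by the average speed over a very small interval:

```python
def average_speed(distance, duration):
    """Average speed: distance travelled divided by the duration of the interval."""
    return distance / duration

def instantaneous_speed(position, t, dt=1e-6):
    """Approximate instantaneous speed as the average speed over a
    vanishingly small time interval (central difference)."""
    return abs(position(t + dt) - position(t - dt)) / (2 * dt)

print(average_speed(100.0, 9.58))                      # ~10.44 m/s (100 m in 9.58 s)
print(instantaneous_speed(lambda t: 5 * t**2, t=3.0))  # ~30.0, matching |x'(t)| = 10t
```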
Chebyshev distance
In mathematics, Chebyshev distance (or Tchebychev distance), maximum metric, or L∞ metric is a metric defined on a vector space where the distance between two vectors is the greatest of their differences along any coordinate dimension. It is named after Pafnuty Chebyshev. It is also known as chessboard distance, since in the game of chess the minimum number of moves needed by a king to go from one square on a chessboard to another equals the Chebyshev distance between the centers of the squares, if the squares have side length one, as represented in 2-D spatial coordinates with axes aligned to the edges of the board.
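For illustration, a minimal Python sketch of the definition (the function name is our own):

```python
def chebyshev_distance(p, q):
    """L-infinity metric: the greatest of the coordinate-wise differences."""
    return max(abs(a - b) for a, b in zip(p, q))

# A king needs max(|delta file|, |delta rank|) moves: from (1, 1) to (4, 3) takes 3.
print(chebyshev_distance((1, 1), (4, 3)))  # 3
```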
Distance from a point to a line
In Euclidean geometry, the distance from a point to a line is the shortest distance from a given point to any point on an infinite straight line. It is the perpendicular distance of the point to the line, i.e. the length of the line segment which joins the point to the nearest point on the line. The algebraic expression for calculating it can be derived and expressed in several ways. Knowing the distance from a point to a line can be useful in various situations, for example finding the shortest distance to reach a road or quantifying the scatter on a graph.
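One common form of the algebraic expression, for a line written as ax + by + c = 0 and a point (x₀, y₀), is |ax₀ + by₀ + c| / √(a² + b²). A minimal Python sketch (the function name is our own):

```python
import math

def point_line_distance(a, b, c, x0, y0):
    """Perpendicular distance from the point (x0, y0) to the line ax + by + c = 0."""
    return abs(a * x0 + b * y0 + c) / math.hypot(a, b)

# Distance from the origin to the line x + y - 2 = 0 is 2/sqrt(2) = sqrt(2).
print(point_line_distance(1, 1, -2, 0, 0))  # ~1.4142
```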
Kullback–Leibler divergence
In mathematical statistics, the Kullback–Leibler divergence (also called relative entropy and I-divergence), denoted D_KL(P ∥ Q), is a type of statistical distance: a measure of how one probability distribution P is different from a second, reference probability distribution Q. A simple interpretation of the KL divergence of P from Q is the expected excess surprise from using Q as a model when the actual distribution is P.
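For discrete distributions, D_KL(P ∥ Q) = Σᵢ pᵢ log(pᵢ / qᵢ). A minimal Python sketch (the function name is our own; it assumes qᵢ > 0 wherever pᵢ > 0):

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) for discrete distributions; terms with p_i = 0 contribute 0
    by convention, and q_i is assumed positive wherever p_i is."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))  # ~0.51; kl_divergence(q, p) differs: KL is not symmetric
```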
Divergence (statistics)
In information geometry, a divergence is a kind of statistical distance: a binary function which establishes the separation from one probability distribution to another on a statistical manifold. The simplest divergence is squared Euclidean distance (SED), and divergences can be viewed as generalizations of SED. The other most important divergence is relative entropy (also called Kullback–Leibler divergence), which is central to information theory.
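As a concrete sketch of the simplest case (the helper name is our own), squared Euclidean distance is non-negative and vanishes exactly when the two points coincide, though as a divergence it need not satisfy the metric axioms such as the triangle inequality:

```python
def squared_euclidean(p, q):
    """Squared Euclidean distance (SED), the simplest divergence."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

print(squared_euclidean([0.2, 0.8], [0.5, 0.5]))  # ~0.18
print(squared_euclidean([0.5, 0.5], [0.5, 0.5]))  # 0.0 only when the arguments coincide
```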
Proportionality (mathematics)
In mathematics, two sequences of numbers, often experimental data, are proportional or directly proportional if their corresponding elements have a constant ratio. The ratio is called the coefficient of proportionality (or proportionality constant) and its reciprocal is known as the constant of normalization (or normalizing constant). Two sequences are inversely proportional if corresponding elements have a constant product, which is also called the coefficient of proportionality.
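A minimal Python sketch of both notions (the function names and tolerance are our own):

```python
def proportionality_constant(xs, ys, tol=1e-9):
    """Return the constant ratio k with ys[i] = k * xs[i] for all i, or None.
    Assumes the xs are non-zero."""
    k = ys[0] / xs[0]
    return k if all(abs(y - k * x) <= tol for x, y in zip(xs, ys)) else None

def inverse_proportionality_constant(xs, ys, tol=1e-9):
    """Return the constant product c with xs[i] * ys[i] = c for all i, or None."""
    c = xs[0] * ys[0]
    return c if all(abs(x * y - c) <= tol for x, y in zip(xs, ys)) else None

print(proportionality_constant([1, 2, 3], [2.5, 5.0, 7.5]))           # 2.5
print(inverse_proportionality_constant([1, 2, 4], [12.0, 6.0, 3.0]))  # 12.0
```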
Similarity measure
In statistics and related fields, a similarity measure or similarity function (also called a similarity metric) is a real-valued function that quantifies the similarity between two objects. Although no single definition of similarity exists, such measures are usually in some sense the inverse of distance metrics: they take on large values for similar objects and either zero or a negative value for very dissimilar objects. In broader terms, though, a similarity function may also satisfy metric axioms.
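One common construction (among many; the function name is our own) turns a distance into a similarity that is large for similar objects:

```python
def similarity_from_distance(d):
    """Map a distance d >= 0 to a similarity in (0, 1]."""
    return 1.0 / (1.0 + d)

print(similarity_from_distance(0.0))  # 1.0 for identical objects (distance 0)
print(similarity_from_distance(9.0))  # 0.1 for very dissimilar objects
```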
Lee distance
In coding theory, the Lee distance is a distance between two strings x₁x₂…xₙ and y₁y₂…yₙ of equal length n over the q-ary alphabet {0, 1, ..., q − 1} of size q ≥ 2. It is a metric defined as d(x, y) = Σᵢ min(|xᵢ − yᵢ|, q − |xᵢ − yᵢ|), where the sum runs over i = 1, …, n. If q = 2 or q = 3, the Lee distance coincides with the Hamming distance, because both distances are 0 for two equal single symbols and 1 for two unequal single symbols. For q > 3 this is no longer the case; the Lee distance between single letters can be greater than 1. However, there exists a Gray isometry (weight-preserving bijection) between ℤ₄ⁿ with the Lee weight and ℤ₂²ⁿ with the Hamming weight.
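A minimal Python sketch of the definition (the function name is our own):

```python
def lee_distance(x, y, q):
    """Lee distance: each coordinate contributes min(|xi - yi|, q - |xi - yi|)."""
    return sum(min(abs(a - b), q - abs(a - b)) for a, b in zip(x, y))

# For q > 3 a single coordinate can contribute more than 1:
print(lee_distance([3, 1, 4], [1, 5, 9], q=10))  # 2 + 4 + 5 = 11
print(lee_distance([0, 1, 1], [1, 1, 0], q=2))   # 2, same as the Hamming distance
```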
Absolute difference
The absolute difference of two real numbers x and y is given by |x − y|, the absolute value of their difference. It describes the distance on the real line between the points corresponding to x and y. It is a special case of the Lp distance for all 1 ≤ p ≤ ∞ and is the standard metric used for both the set of rational numbers ℚ and their completion, the set of real numbers ℝ. As with any metric, the metric properties hold: |x − y| ≥ 0, since absolute value is always non-negative; |x − y| = 0 if and only if x = y; |x − y| = |y − x| (symmetry or commutativity); and |x − z| ≤ |x − y| + |y − z| (triangle inequality).
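The metric properties can be checked directly; a minimal Python sketch (the sample values are our own arbitrary illustrations):

```python
def absolute_difference(x, y):
    """Distance on the real line between the points x and y."""
    return abs(x - y)

x, y, z = -2.0, 3.5, 7.0
assert absolute_difference(x, y) >= 0                          # non-negativity
assert absolute_difference(x, x) == 0                          # identity
assert absolute_difference(x, y) == absolute_difference(y, x)  # symmetry
assert absolute_difference(x, z) <= absolute_difference(x, y) + absolute_difference(y, z)  # triangle inequality
```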
Great-circle distance
The great-circle distance, orthodromic distance, or spherical distance is the distance along a great circle. It is the shortest distance between two points on the surface of a sphere, measured along the surface of the sphere (as opposed to a straight line through the sphere's interior). The distance between two points in Euclidean space is the length of a straight line between them, but on the sphere there are no straight lines. In spaces with curvature, straight lines are replaced by geodesics.
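One standard way to compute it is the haversine formula; the Python sketch below (the function name and the rounded city coordinates are our own illustration) assumes a spherical Earth of radius 6371 km:

```python
import math

def great_circle_distance(lat1, lon1, lat2, lon2, radius=6371.0):
    """Great-circle distance via the haversine formula; coordinates in degrees,
    result in the units of `radius` (here kilometres)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius * math.asin(math.sqrt(a))

# Paris (48.86, 2.35) to Lausanne (46.52, 6.63): roughly 410 km along the surface.
print(great_circle_distance(48.86, 2.35, 46.52, 6.63))
```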