Force

In physics, a force is an influence that can cause an object to change its velocity, i.e., to accelerate, unless counterbalanced by other forces. The concept of force makes the everyday notion of pushing or pulling mathematically precise. Because the magnitude and direction of a force are both important, force is a vector quantity. It is measured in the SI unit of the newton (N) and is often represented by the symbol F.
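For a body of constant mass, Newton's second law makes the link between force and acceleration explicit, and it also fixes the newton in SI base units; as a compact reference:

\[
\mathbf{F}_{\text{net}} = m\,\mathbf{a}, \qquad 1\ \mathrm{N} = 1\ \mathrm{kg \cdot m/s^2}.
\]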
Centrifugal force

In Newtonian mechanics, the centrifugal force is an inertial force (also called a "fictitious" or "pseudo" force) that appears to act on all objects when viewed in a rotating frame of reference. It is directed away from an axis that is parallel to the axis of rotation and that passes through the coordinate system's origin. If the axis of rotation itself passes through the coordinate system's origin, the centrifugal force is directed radially outwards from that axis.
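In a frame rotating with constant angular velocity \(\boldsymbol{\omega}\), the standard expression for the centrifugal force on a particle of mass \(m\) at position \(\mathbf{r}\) (measured from the origin) is

\[
\mathbf{F}_{\text{cf}} = -m\,\boldsymbol{\omega}\times(\boldsymbol{\omega}\times\mathbf{r}),
\]

which has magnitude \(m\,\omega^2 \rho\), where \(\rho\) is the perpendicular distance from the rotation axis, and points directly away from that axis.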
Archimedes

Archimedes of Syracuse (/ˌɑːrkᵻˈmiːdiːz/; 287–212 BC) was an Ancient Greek mathematician, physicist, engineer, astronomer, and inventor from the ancient city of Syracuse in Sicily. Although few details of his life are known, he is regarded as one of the leading scientists in classical antiquity. Considered the greatest mathematician of ancient history, and one of the greatest of all time, Archimedes anticipated modern calculus and analysis by applying the concept of the infinitely small and the method of exhaustion to derive and rigorously prove a range of geometrical theorems.
Divergence (statistics)

In information geometry, a divergence is a kind of statistical distance: a binary function which establishes the separation from one probability distribution to another on a statistical manifold. The simplest divergence is the squared Euclidean distance (SED), and divergences can be viewed as generalizations of SED. The other most important divergence is relative entropy (also called Kullback–Leibler divergence), which is central to information theory.
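For discrete distributions \(p\) and \(q\) on the same finite support, the two divergences named above take the forms

\[
D_{\text{SED}}(p, q) = \sum_i (p_i - q_i)^2, \qquad
D_{\text{KL}}(p \parallel q) = \sum_i p_i \log \frac{p_i}{q_i}.
\]

Unlike a metric, a divergence need not be symmetric or satisfy the triangle inequality; relative entropy, for instance, has \(D_{\text{KL}}(p \parallel q) \neq D_{\text{KL}}(q \parallel p)\) in general.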
F-divergence

In probability theory, an f-divergence is a function that measures the difference between two probability distributions P and Q. Many common divergences, such as the KL-divergence, Hellinger distance, and total variation distance, are special cases of f-divergence. These divergences were introduced by Alfréd Rényi in the same paper where he introduced the well-known Rényi entropy. He proved that these divergences decrease in Markov processes.
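Given a convex generator \(f\) on \((0, \infty)\) with \(f(1) = 0\), and assuming Q is dominated by no mass where P has mass (i.e., the likelihood ratio is well defined), the f-divergence is

\[
D_f(P \parallel Q) = \int f\!\left(\frac{dP}{dQ}\right) dQ,
\]

and each named divergence corresponds to a choice of generator: \(f(t) = t \log t\) yields the KL-divergence, \(f(t) = \tfrac{1}{2}\lvert t - 1 \rvert\) yields the total variation distance, and \(f(t) = (\sqrt{t} - 1)^2\) yields the squared Hellinger distance (up to a factor of 2, depending on convention). As a minimal sketch of the discrete case (the helper names and example distributions here are illustrative, not from the source):

import numpy as np

def f_divergence(p, q, f):
    """Discrete f-divergence D_f(P || Q) = sum_i q_i * f(p_i / q_i).

    Assumes p and q are probability vectors on the same support with
    all q_i > 0, so the likelihood ratio p_i / q_i is well defined.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(q * f(p / q)))

kl  = lambda t: t * np.log(t)          # generator for KL-divergence
tv  = lambda t: 0.5 * np.abs(t - 1.0)  # generator for total variation
hel = lambda t: (np.sqrt(t) - 1.0)**2  # generator for squared Hellinger (x2)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(f_divergence(p, q, kl))  # ~0.0253
print(f_divergence(p, q, tv))  # 0.1 (half the L1 distance between p and q)

Computing all three through the same routine, varying only the generator, illustrates the sense in which they are special cases of one construction.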