Linear approximation
In mathematics, a linear approximation is an approximation of a general function using a linear function (more precisely, an affine function). They are widely used in the method of finite differences to produce first order methods for solving or approximating solutions to equations. Given a twice continuously differentiable function $f$ of one real variable, Taylor's theorem for the case $n = 1$ states that
$$f(x) = f(a) + f'(a)(x - a) + R_2,$$
where $R_2$ is the remainder term.
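As a concrete illustration (not part of the original text), the following Python sketch builds the first-order Taylor approximation about a point and compares it with the true function value; the helper name `linear_approx` and the choice of the square-root function are illustrative assumptions.

```python
import math

def linear_approx(f, df, a):
    """Return the linear (first-order Taylor) approximation of f about a:
    L(x) = f(a) + f'(a) * (x - a)."""
    return lambda x: f(a) + df(a) * (x - a)

# Approximate sqrt near a = 4, where f(x) = sqrt(x) and f'(x) = 1 / (2 * sqrt(x)).
L = linear_approx(math.sqrt, lambda x: 1 / (2 * math.sqrt(x)), a=4.0)

print(L(4.1))          # 2.025   (linear approximation)
print(math.sqrt(4.1))  # 2.0248... (true value); the gap is the remainder term
```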
Rotation matrix
In linear algebra, a rotation matrix is a transformation matrix that is used to perform a rotation in Euclidean space. For example, using the convention below, the matrix
$$R = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$$
rotates points in the xy plane counterclockwise through an angle θ about the origin of a two-dimensional Cartesian coordinate system. To perform the rotation on a plane point with standard coordinates v = (x, y), it should be written as a column vector, and multiplied by the matrix R:
$$Rv = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}\begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} x\cos\theta - y\sin\theta \\ x\sin\theta + y\cos\theta \end{bmatrix}.$$
If x and y are the endpoint coordinates of a vector, where x is cosine and y is sine, then the above equations become the trigonometric summation angle formulae.
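A minimal NumPy sketch of the convention above (not from the source): it builds the 2×2 rotation matrix for an angle θ and applies it to a point written as a column of coordinates.

```python
import numpy as np

def rotation_matrix(theta):
    """2x2 matrix rotating points counterclockwise by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s],
                     [s,  c]])

# Rotate the point (1, 0) by 90 degrees; the result is (0, 1) up to floating-point error.
v = np.array([1.0, 0.0])
print(rotation_matrix(np.pi / 2) @ v)
```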
Jackson network
In queueing theory, a discipline within the mathematical theory of probability, a Jackson network (sometimes Jacksonian network) is a class of queueing network where the equilibrium distribution is particularly simple to compute as the network has a product-form solution. It was the first significant development in the theory of networks of queues, and generalising and applying the ideas of the theorem to search for similar product-form solutions in other networks has been the subject of much research, including ideas used in the development of the Internet.
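To make the product-form idea concrete, here is a minimal sketch (an illustration, not taken from the source) for an open network of single-server nodes: the effective arrival rates are obtained from the traffic equations, and the equilibrium probability of a joint queue-length vector factorises into per-node geometric terms, assuming every node's utilisation is below one. The function name and the example rates are assumptions.

```python
import numpy as np

def jackson_equilibrium(gamma, P, mu, state):
    """Equilibrium probability of a joint state in an open Jackson network
    of single-server nodes, assuming every utilisation rho_i < 1.

    gamma : external arrival rate at each node
    P     : routing matrix, P[i, j] = probability of moving from node i to node j
    mu    : service rate at each node
    state : tuple of queue lengths (k_1, ..., k_J)
    """
    # Traffic equations: lambda = gamma + P^T lambda  =>  (I - P^T) lambda = gamma
    lam = np.linalg.solve(np.eye(len(gamma)) - P.T, gamma)
    rho = lam / mu  # per-node utilisation
    # Product form: pi(k) = prod_i (1 - rho_i) * rho_i ** k_i
    return np.prod((1 - rho) * rho ** np.asarray(state))

gamma = np.array([1.0, 0.5])
P = np.array([[0.0, 0.4],
              [0.2, 0.0]])
mu = np.array([3.0, 2.0])
print(jackson_equilibrium(gamma, P, mu, state=(2, 1)))
```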
Pushforward measure
In measure theory, a pushforward measure (also known as push forward, push-forward or image measure) is obtained by transferring ("pushing forward") a measure from one measurable space to another using a measurable function. Given measurable spaces $(X_1, \Sigma_1)$ and $(X_2, \Sigma_2)$, a measurable mapping $f \colon X_1 \to X_2$ and a measure $\mu \colon \Sigma_1 \to [0, +\infty]$, the pushforward of $\mu$ is defined to be the measure $f_{*}\mu \colon \Sigma_2 \to [0, +\infty]$ given by
$$(f_{*}\mu)(B) = \mu\bigl(f^{-1}(B)\bigr) \quad \text{for } B \in \Sigma_2.$$
This definition applies mutatis mutandis for a signed or complex measure. The pushforward measure is also denoted as $\mu \circ f^{-1}$, $f_{*}\mu$, $f_{\#}\mu$, or $f\sharp\mu$.
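A small sketch of the definition in the discrete case (an illustration, not from the source): the mass assigned to a point in the image is the measure of its preimage under $f$. The helper name `pushforward` and the dice example are assumptions.

```python
from collections import defaultdict

def pushforward(measure, f):
    """Push a discrete measure (dict: point -> mass) forward through f.
    The mass of a point y in the image is the measure of the preimage f^{-1}({y})."""
    out = defaultdict(float)
    for x, mass in measure.items():
        out[f(x)] += mass
    return dict(out)

# Uniform measure on {1, ..., 6} pushed forward through the parity map x -> x % 2.
die = {k: 1 / 6 for k in range(1, 7)}
print(pushforward(die, lambda x: x % 2))   # {1: 0.5, 0: 0.5}
```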
Finite group
In abstract algebra, a finite group is a group whose underlying set is finite. Finite groups often arise when considering symmetry of mathematical or physical objects, when those objects admit just a finite number of structure-preserving transformations. Important examples of finite groups include cyclic groups and permutation groups. The study of finite groups has been an integral part of group theory since it arose in the 19th century.
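As a concrete illustration of the two example families mentioned above (added here, not in the original text), a short Python sketch enumerates the cyclic group of order 4 via its Cayley table and the symmetric group on three symbols.

```python
from itertools import permutations

# Cyclic group Z/4Z under addition modulo 4: its Cayley table.
n = 4
print([[(a + b) % n for b in range(n)] for a in range(n)])

# Symmetric group S_3: all 6 permutations of three symbols.
print(list(permutations(range(3))))
```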
Vandermonde matrix
In linear algebra, a Vandermonde matrix, named after Alexandre-Théophile Vandermonde, is a matrix with the terms of a geometric progression in each row: an $(m+1) \times (n+1)$ matrix with entries $V_{ij} = x_i^{\,j}$, the $j$th power of the number $x_i$, for all zero-based indices $i$ and $j$. Most authors define the Vandermonde matrix as the transpose of the above matrix. The determinant of a square Vandermonde matrix (when $m = n$) is called a Vandermonde determinant or Vandermonde polynomial. Its value is:
$$\det(V) = \prod_{0 \le i < j \le m} (x_j - x_i).$$
This is non-zero if and only if all $x_i$ are distinct (no two are equal), making the Vandermonde matrix invertible.
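A quick NumPy check of the determinant formula (an added illustration, not from the source): `numpy.vander` with `increasing=True` matches the convention $V_{ij} = x_i^j$, and the determinant agrees with the product of pairwise differences.

```python
import numpy as np
from itertools import combinations

x = np.array([1.0, 2.0, 3.0, 4.0])

# numpy.vander uses decreasing powers by default; increasing=True gives V[i, j] = x[i]**j.
V = np.vander(x, increasing=True)

det_direct = np.linalg.det(V)
det_product = np.prod([x[j] - x[i] for i, j in combinations(range(len(x)), 2)])
print(det_direct, det_product)   # both 12.0: (2-1)(3-1)(4-1)(3-2)(4-2)(4-3)
```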
Finite ring
In mathematics, more specifically abstract algebra, a finite ring is a ring that has a finite number of elements. Every finite field is an example of a finite ring, and the additive part of every finite ring is an example of an abelian finite group, but the concept of finite rings in their own right has a more recent history. Although rings have more structure than groups, the theory of finite rings is simpler than that of finite groups.
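A brief added illustration (not part of the original text): the ring of integers modulo 6 is a finite ring, and the addition table below is exactly the Cayley table of its additive part, an abelian cyclic group of order 6.

```python
# The finite ring Z/6Z: addition and multiplication tables modulo 6.
n = 6
add_table = [[(a + b) % n for b in range(n)] for a in range(n)]
mul_table = [[(a * b) % n for b in range(n)] for a in range(n)]
print(add_table)  # additive part: abelian group of order 6
print(mul_table)
```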
NumPy
NumPy (pronounced /ˈnʌmpaɪ/ or sometimes /ˈnʌmpi/) is a library for the Python programming language, adding support for large, multi-dimensional arrays and matrices, along with a large collection of high-level mathematical functions to operate on these arrays. The predecessor of NumPy, Numeric, was originally created by Jim Hugunin with contributions from several other developers. In 2005, Travis Oliphant created NumPy by incorporating features of the competing Numarray into Numeric, with extensive modifications.
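A short example of the multi-dimensional arrays and high-level functions described above (added for illustration):

```python
import numpy as np

# A 2-D array (matrix) and a few of NumPy's high-level operations on it.
a = np.arange(6).reshape(2, 3)    # [[0, 1, 2], [3, 4, 5]]
print(a.T @ a)                    # matrix product of the transpose with a
print(a.sum(axis=0), a.mean())    # column sums and overall mean
print(np.sin(a))                  # elementwise universal function
```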
Quadratic variation
In mathematics, quadratic variation is used in the analysis of stochastic processes such as Brownian motion and other martingales. Quadratic variation is just one kind of variation of a process. Suppose that $X_t$ is a real-valued stochastic process defined on a probability space $(\Omega, \mathcal{F}, \mathbb{P})$ and with time index $t$ ranging over the non-negative real numbers. Its quadratic variation is the process, written as $[X]_t$, defined as
$$[X]_t = \lim_{\|P\| \to 0} \sum_{k=1}^{n} \left(X_{t_k} - X_{t_{k-1}}\right)^2,$$
where $P = \{0 = t_0 < t_1 < \cdots < t_n = t\}$ ranges over partitions of the interval $[0, t]$ and the norm $\|P\|$ of the partition is the mesh.
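A numerical illustration (added, not from the source): for Brownian motion the quadratic variation on $[0, t]$ equals $t$, so summing squared increments of a simulated path over a fine grid should come out close to the horizon $T$. The grid size and seed are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a Brownian path on [0, T] and compute the realised quadratic variation
# over the simulation grid; for Brownian motion [B]_T = T, so the sum should be near T.
T, n = 1.0, 100_000
dt = T / n
increments = rng.normal(0.0, np.sqrt(dt), size=n)

quadratic_variation = np.sum(increments ** 2)
print(quadratic_variation)   # close to T = 1.0
```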
Inner product space
In mathematics, an inner product space (or, rarely, a Hausdorff pre-Hilbert space) is a real vector space or a complex vector space with an operation called an inner product. The inner product of two vectors in the space is a scalar, often denoted with angle brackets such as in $\langle a, b \rangle$. Inner products allow formal definitions of intuitive geometric notions, such as lengths, angles, and orthogonality (zero inner product) of vectors. Inner product spaces generalize Euclidean vector spaces, in which the inner product is the dot product or scalar product of Cartesian coordinates.
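A small sketch in the Euclidean case (added for illustration): the dot product plays the role of the inner product, and lengths, angles, and orthogonality fall out of it directly.

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])
b = np.array([2.0, -1.0, 0.0])

inner = np.dot(a, b)                                # the dot product <a, b>
length_a = np.sqrt(np.dot(a, a))                    # length induced by the inner product
cos_angle = inner / (length_a * np.linalg.norm(b))  # cosine of the angle between a and b
print(inner, length_a, cos_angle)                   # inner = 0.0, so a and b are orthogonal
```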