Singular value
In mathematics, in particular functional analysis, the singular values, or s-numbers, of a compact operator T : X → Y acting between Hilbert spaces X and Y are the square roots of the (necessarily non-negative) eigenvalues of the self-adjoint operator T*T (where T* denotes the adjoint of T). The singular values are non-negative real numbers, usually listed in decreasing order (σ1(T), σ2(T), ...). The largest singular value σ1(T) is equal to the operator norm of T (see Min-max theorem).
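As a finite-dimensional check (a sketch with NumPy; the matrix A below is an arbitrary illustration), the singular values of a matrix can be computed either as the square roots of the eigenvalues of A*A or directly via the SVD, and the largest one equals the operator norm:

```python
import numpy as np

# Any matrix serves as a finite-dimensional "operator"; this A is arbitrary.
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

# Eigenvalues of the self-adjoint operator A*A (here A^T A) are non-negative;
# their square roots are the singular values, listed in decreasing order.
eigvals = np.linalg.eigvalsh(A.T @ A)                # ascending, real, >= 0
sigma_from_eig = np.sqrt(eigvals)[::-1]              # decreasing order

sigma_from_svd = np.linalg.svd(A, compute_uv=False)  # already decreasing

print(np.allclose(sigma_from_eig, sigma_from_svd))         # True
print(np.isclose(np.linalg.norm(A, 2), sigma_from_svd[0])) # sigma_1 = ||A||
```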
Square matrix
In mathematics, a square matrix is a matrix with the same number of rows and columns. An n-by-n matrix is known as a square matrix of order n. Any two square matrices of the same order can be added and multiplied. Square matrices are often used to represent simple linear transformations, such as shearing or rotation. For example, if R is a square matrix representing a rotation (rotation matrix) and v is a column vector describing the position of a point in space, the product Rv yields another column vector describing the position of that point after that rotation.
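A minimal NumPy sketch of that example (the quarter-turn angle and the point (1, 0) are illustrative choices):

```python
import numpy as np

theta = np.pi / 2                    # rotate by a quarter turn (90 degrees)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # 2-by-2 rotation matrix

v = np.array([[1.0],
              [0.0]])                # column vector: the point (1, 0)

print(R @ v)                         # ~ (0, 1): the point after rotation
```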
Torsion tensor
In differential geometry, the notion of torsion is a manner of characterizing a twist or screw of a moving frame around a curve. The torsion of a curve, as it appears in the Frenet–Serret formulas, for instance, quantifies the twist of a curve about its tangent vector as the curve evolves (or rather the rotation of the Frenet–Serret frame about the tangent vector). In the geometry of surfaces, the geodesic torsion describes how a surface twists about a curve on the surface.
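To make the Frenet–Serret notion concrete, here is a numerical sketch (the helix and sample point are illustrative) using the standard formula τ = ((r′ × r″) · r‴) / |r′ × r″|²:

```python
import numpy as np

a, b, t = 1.0, 1.0, 0.7          # helix parameters and a sample point

# Derivatives of the helix r(t) = (a cos t, a sin t, b t), taken by hand.
r1 = np.array([-a*np.sin(t),  a*np.cos(t), b])    # r'(t)
r2 = np.array([-a*np.cos(t), -a*np.sin(t), 0.0])  # r''(t)
r3 = np.array([ a*np.sin(t), -a*np.cos(t), 0.0])  # r'''(t)

cross = np.cross(r1, r2)
tau = np.dot(cross, r3) / np.dot(cross, cross)    # Frenet-Serret torsion

print(tau)                 # b / (a**2 + b**2) = 0.5 for this helix
```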
Angle
In Euclidean geometry, an angle is the figure formed by two rays, called the sides of the angle, sharing a common endpoint, called the vertex of the angle. Angles formed by two rays are also known as plane angles as they lie in the plane that contains the rays. Angles are also formed by the intersection of two planes; these are called dihedral angles. Two intersecting curves may also define an angle, which is the angle of the rays lying tangent to the respective curves at their point of intersection.
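Numerically, the angle between two tangent rays follows from the dot-product formula cos θ = u·v / (|u||v|); a small NumPy sketch (the two direction vectors are arbitrary examples):

```python
import numpy as np

u = np.array([1.0, 0.0])           # tangent direction of the first curve
v = np.array([1.0, 1.0])           # tangent direction of the second curve

cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))   # clip guards rounding

print(np.degrees(theta))           # 45.0 degrees between the two rays
```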
Matrix decomposition
In the mathematical discipline of linear algebra, a matrix decomposition or matrix factorization is a factorization of a matrix into a product of matrices. There are many different matrix decompositions; each finds use among a particular class of problems. In numerical analysis, different decompositions are used to implement efficient matrix algorithms. For instance, when solving a system of linear equations Ax = b, the matrix A can be decomposed via the LU decomposition.
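A sketch of that use with SciPy (the 2-by-2 system is an arbitrary example): factor A once, then reuse the LU factors to solve Ax = b:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

A = np.array([[4.0, 3.0],
              [6.0, 3.0]])
b = np.array([10.0, 12.0])

lu, piv = lu_factor(A)        # LU decomposition with partial pivoting
x = lu_solve((lu, piv), b)    # reuse the factors to solve A x = b

print(x)                      # [1. 2.], the solution of the system
print(np.allclose(A @ x, b))  # True
```

Factoring once and reusing the factors is the usual payoff: solving for many right-hand sides b costs far less than refactoring A each time.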
Invertible matrix
In linear algebra, an n-by-n square matrix A is called invertible (also nonsingular, nondegenerate or (rarely used) regular) if there exists an n-by-n square matrix B such that AB = BA = In, where In denotes the n-by-n identity matrix and the multiplication used is ordinary matrix multiplication. If this is the case, then the matrix B is uniquely determined by A, and is called the (multiplicative) inverse of A, denoted by A−1. Matrix inversion is the process of finding the matrix B that satisfies the prior equation for a given invertible matrix A.
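A minimal NumPy check of the definition (the matrix A is an arbitrary invertible example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])            # det(A) = 1, so A is invertible

B = np.linalg.inv(A)                  # the (multiplicative) inverse A^-1

print(np.allclose(A @ B, np.eye(2)))  # True: AB = I_2
print(np.allclose(B @ A, np.eye(2)))  # True: BA = I_2
```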
Pauli matrices
In mathematical physics and mathematics, the Pauli matrices are a set of three 2 × 2 complex matrices which are Hermitian, involutory and unitary. Usually indicated by the Greek letter sigma (σ), they are occasionally denoted by tau (τ) when used in connection with isospin symmetries. These matrices are named after the physicist Wolfgang Pauli. In quantum mechanics, they occur in the Pauli equation which takes into account the interaction of the spin of a particle with an external electromagnetic field.
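The three stated properties are easy to verify numerically; a NumPy sketch (the matrices themselves are the standard Pauli matrices):

```python
import numpy as np

sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
sigma_y = np.array([[0, -1j], [1j, 0]])
sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)

I2 = np.eye(2)
for s in (sigma_x, sigma_y, sigma_z):
    assert np.allclose(s, s.conj().T)       # Hermitian: s = s^dagger
    assert np.allclose(s @ s, I2)           # involutory: s^2 = I
    assert np.allclose(s @ s.conj().T, I2)  # unitary: s s^dagger = I
print("Hermitian, involutory and unitary: verified.")
```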
Gamma matrices
In mathematical physics, the gamma matrices, also called the Dirac matrices, are a set of conventional matrices with specific anticommutation relations that ensure they generate a matrix representation of the Clifford algebra Cl1,3(R). It is also possible to define higher-dimensional gamma matrices. When interpreted as the matrices of the action of a set of orthogonal basis vectors for contravariant vectors in Minkowski space, the column vectors on which the matrices act become a space of spinors, on which the Clifford algebra of spacetime acts.
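A sketch verifying the anticommutation relations {γμ, γν} = 2ημν I4 in the Dirac representation, built from the Pauli matrices (the (+, −, −, −) metric signature is a convention chosen for this example):

```python
import numpy as np

# Pauli matrices, used as 2x2 building blocks.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)
Z2 = np.zeros((2, 2), dtype=complex)

# Dirac representation: gamma^0 = diag(I, -I); gamma^k has sigma_k off-diagonal.
gamma0 = np.block([[I2, Z2], [Z2, -I2]])
gammas = [gamma0] + [np.block([[Z2, s], [-s, Z2]]) for s in (sx, sy, sz)]

eta = np.diag([1.0, -1.0, -1.0, -1.0])  # Minkowski metric, signature (+,-,-,-)
I4 = np.eye(4, dtype=complex)

for mu in range(4):
    for nu in range(4):
        anticomm = gammas[mu] @ gammas[nu] + gammas[nu] @ gammas[mu]
        assert np.allclose(anticomm, 2 * eta[mu, nu] * I4)
print("Clifford algebra anticommutation relations: verified.")
```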
Right angle
In geometry and trigonometry, a right angle is an angle of exactly 90 degrees or π/2 radians, corresponding to a quarter turn. If a ray is placed so that its endpoint is on a line and the adjacent angles are equal, then they are right angles. The term is a calque of Latin angulus rectus; here rectus means "upright", referring to the vertical perpendicular to a horizontal base line. Closely related and important geometrical concepts are perpendicular lines, meaning lines that form right angles at their point of intersection, and orthogonality, which is the property of forming right angles, usually applied to vectors.
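Orthogonality of vectors amounts to a vanishing dot product; a short NumPy check (the vectors are arbitrary examples):

```python
import numpy as np

u = np.array([3.0, 4.0])
v = np.array([-4.0, 3.0])          # u rotated by a quarter turn

print(np.dot(u, v))                # 0.0: the vectors are orthogonal,
                                   # i.e. they form a right angle
```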
Degenerate bilinear form
In mathematics, specifically linear algebra, a degenerate bilinear form f(x, y) on a vector space V is a bilinear form such that the map from V to V∗ (the dual space of V) given by v ↦ (x ↦ f(x, v)) is not an isomorphism. An equivalent definition when V is finite-dimensional is that f has a non-trivial kernel: there exist some non-zero x in V such that f(x, y) = 0 for all y in V. A nondegenerate or nonsingular form is a bilinear form that is not degenerate, meaning that v ↦ (x ↦ f(x, v)) is an isomorphism, or equivalently in finite dimensions, that f(x, y) = 0 for all y in V implies that x = 0.
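In finite dimensions, representing the form as f(x, y) = xᵀMy for a matrix M, degeneracy is exactly rank deficiency of M; a NumPy sketch (the matrices and the witness vector are illustrative):

```python
import numpy as np

def is_degenerate(M):
    """f(x, y) = x^T M y on R^n is degenerate iff rank(M) < n,
    i.e. iff M has a non-trivial kernel."""
    return np.linalg.matrix_rank(M) < M.shape[0]

M_deg = np.array([[1.0, 2.0],
                  [2.0, 4.0]])      # det = 0: degenerate
M_nondeg = np.array([[1.0, 0.0],
                     [0.0, -1.0]])  # det = -1: nondegenerate

print(is_degenerate(M_deg))        # True
print(is_degenerate(M_nondeg))     # False

# Witness for the degenerate case: x = (2, -1) pairs to zero with every y.
x = np.array([2.0, -1.0])
print(x @ M_deg)                   # [0. 0.]: f(x, y) = 0 for all y
```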