K-frame: In linear algebra, a branch of mathematics, a k-frame is an ordered set of k linearly independent vectors in a vector space; thus k ≤ n, where n is the dimension of the space, and if k = n an n-frame is precisely an ordered basis. If the vectors are orthogonal, or orthonormal, the frame is called an orthogonal frame, or orthonormal frame, respectively. The set of k-frames (particularly the set of orthonormal k-frames) in a given space X is known as the Stiefel manifold, and denoted V_k(X).
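As a concrete illustration (the vectors below are an arbitrary choice, not from the text), this numpy sketch checks that three vectors in R⁴ form a 3-frame and then produces an orthonormal 3-frame spanning the same subspace, i.e. a point of the Stiefel manifold V_3(R⁴):

```python
import numpy as np

# Columns are three vectors in R^4 (illustrative choice).
frame = np.column_stack([[1.0, 0.0, 0.0, 1.0],
                         [0.0, 1.0, 0.0, 1.0],
                         [1.0, 1.0, 1.0, 0.0]])

# Linear independence: the 4x3 matrix has full column rank, so this is a 3-frame.
assert np.linalg.matrix_rank(frame) == 3

# Reduced QR factorization: the columns of q form an orthonormal 3-frame
# with the same span, hence an element of the Stiefel manifold V_3(R^4).
q, _ = np.linalg.qr(frame)
assert np.allclose(q.T @ q, np.eye(3))
```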
Inner product space: In mathematics, an inner product space (or, rarely, a Hausdorff pre-Hilbert space) is a real vector space or a complex vector space with an operation called an inner product. The inner product of two vectors in the space is a scalar, often denoted with angle brackets, as in ⟨a, b⟩. Inner products allow formal definitions of intuitive geometric notions, such as lengths, angles, and orthogonality (zero inner product) of vectors. Inner product spaces generalize Euclidean vector spaces, in which the inner product is the dot product or scalar product of Cartesian coordinates.
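A minimal numpy sketch of the standard inner product on ℂ³ (the vectors are chosen here for illustration), showing how length and orthogonality fall out of the inner product:

```python
import numpy as np

# Standard inner product on C^3: <u, v> = sum of conj(u_i) * v_i.
u = np.array([1 + 2j, 0, 3j])
v = np.array([2, 1 - 1j, 1])

ip = np.vdot(u, v)  # np.vdot conjugates its first argument

# Length is recovered from the inner product: |u| = sqrt(<u, u>).
norm_u = np.sqrt(np.vdot(u, u).real)
assert np.isclose(norm_u, np.linalg.norm(u))

# Orthogonality means zero inner product.
e1, e2 = np.array([1, 0, 0]), np.array([0, 1, 0])
assert np.vdot(e1, e2) == 0
```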
Dot product: In mathematics, the dot product or scalar product is an algebraic operation that takes two equal-length sequences of numbers (usually coordinate vectors), and returns a single number. In Euclidean geometry, the dot product of the Cartesian coordinates of two vectors is widely used. It is often called the inner product (or rarely projection product) of Euclidean space, even though it is not the only inner product that can be defined on Euclidean space (see Inner product space for more).
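For instance (a standard worked example, not taken from the text above), the dot product of (1, 3, −5) and (4, −2, −1) is the sum of pairwise products, and the same quantity gives the angle between the vectors:

```python
import numpy as np

a = np.array([1.0, 3.0, -5.0])
b = np.array([4.0, -2.0, -1.0])

# Algebraic form: sum of pairwise products of coordinates.
assert np.dot(a, b) == 1 * 4 + 3 * (-2) + (-5) * (-1)  # = 3.0

# Geometric form: a . b = |a| |b| cos(theta).
cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
```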
Fourier series: A Fourier series (/ˈfʊrieɪ, -iər/) is an expansion of a periodic function into a sum of trigonometric functions. The Fourier series is an example of a trigonometric series, but not all trigonometric series are Fourier series. By expressing a function as a sum of sines and cosines, many problems involving the function become easier to analyze because trigonometric functions are well understood. For example, Fourier series were first used by Joseph Fourier to find solutions to the heat equation.
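A short numpy sketch of approximating a periodic function by partial Fourier sums; the 2π-periodic square wave and its sine series (4/π)·Σ sin(kx)/k over odd k is a textbook example, not one taken from the text above:

```python
import numpy as np

def square_wave_partial_sum(x, n_terms):
    """Partial Fourier sum of the square wave sign(sin(x))."""
    s = np.zeros_like(x)
    for k in range(1, 2 * n_terms, 2):  # odd harmonics only
        s += np.sin(k * x) / k
    return 4.0 / np.pi * s

x = np.linspace(-np.pi, np.pi, 1001)
approx = square_wave_partial_sum(x, 50)
# approx tracks sign(sin(x)) away from the jumps
# (with the familiar Gibbs overshoot near them).
```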
Eigenfunction: In mathematics, an eigenfunction of a linear operator D defined on some function space is any non-zero function f in that space that, when acted upon by D, is only multiplied by some scaling factor called an eigenvalue. As an equation, this condition can be written as Df = λf for some scalar eigenvalue λ. The solutions to this equation may also be subject to boundary conditions that limit the allowable eigenvalues and eigenfunctions. An eigenfunction is a type of eigenvector.
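A sympy sketch verifying the defining relation Df = λf for two classic illustrative choices: f(x) = e^(λx) under the derivative operator D = d/dx, and sin(nx) under D² = d²/dx²:

```python
import sympy as sp

x, lam, n = sp.symbols('x lambda n')

# f(x) = exp(lambda * x) is an eigenfunction of D = d/dx with eigenvalue lambda.
f = sp.exp(lam * x)
assert sp.simplify(sp.diff(f, x) - lam * f) == 0

# sin(n*x) is an eigenfunction of D^2 = d^2/dx^2 with eigenvalue -n^2.
g = sp.sin(n * x)
assert sp.simplify(sp.diff(g, x, 2) + n**2 * g) == 0
```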
Orthogonal group: In mathematics, the orthogonal group in dimension n, denoted O(n), is the group of distance-preserving transformations of a Euclidean space of dimension n that preserve a fixed point, where the group operation is given by composing transformations. The orthogonal group is sometimes called the general orthogonal group, by analogy with the general linear group. Equivalently, it is the group of n × n orthogonal matrices, where the group operation is given by matrix multiplication (an orthogonal matrix is a real matrix whose inverse equals its transpose).
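A numpy check (the rotation angle is chosen arbitrarily) that a plane rotation is an element of O(2): its inverse equals its transpose, and it preserves distances from the fixed point (the origin):

```python
import numpy as np

theta = 0.7  # arbitrary rotation angle
r = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert np.allclose(r.T @ r, np.eye(2))     # orthogonality: R^T R = I
assert np.allclose(np.linalg.inv(r), r.T)  # inverse equals transpose

# Distance preservation: |R v| = |v| for any vector v.
v = np.array([3.0, -4.0])
assert np.isclose(np.linalg.norm(r @ v), np.linalg.norm(v))
```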
Dimension (vector space): In mathematics, the dimension of a vector space V is the cardinality (i.e., the number of vectors) of a basis of V over its base field. It is sometimes called Hamel dimension (after Georg Hamel) or algebraic dimension to distinguish it from other types of dimension. For every vector space there exists a basis, and all bases of a vector space have equal cardinality; as a result, the dimension of a vector space is uniquely defined. We say V is finite-dimensional if the dimension of V is finite, and infinite-dimensional if its dimension is infinite.
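A finite-dimensional illustration with numpy (the vectors are chosen for the example): the dimension of the span of a finite set of vectors equals the rank of the matrix having them as columns, i.e. the size of any basis of that span:

```python
import numpy as np

# Three vectors in R^3; the third is the sum of the first two,
# so the span is a 2-dimensional subspace.
cols = np.column_stack([[1, 0, 0], [0, 1, 0], [1, 1, 0]])
assert np.linalg.matrix_rank(cols) == 2  # dimension of the span
```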
Orthogonal basis: In mathematics, particularly linear algebra, an orthogonal basis for an inner product space V is a basis for V whose vectors are mutually orthogonal. If the vectors of an orthogonal basis are normalized, the resulting basis is an orthonormal basis. Any orthogonal basis can be used to define a system of orthogonal coordinates. Orthogonal (not necessarily orthonormal) bases are important due to their appearance from curvilinear orthogonal coordinates in Euclidean spaces, as well as in Riemannian and pseudo-Riemannian manifolds.
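A minimal classical Gram–Schmidt sketch (a standard construction, written here under the assumption that the input list is a basis) that turns a basis into an orthogonal basis, which can then be normalized into an orthonormal one:

```python
import numpy as np

def gram_schmidt(basis):
    """Classical Gram-Schmidt: assumes `basis` is linearly independent."""
    ortho = []
    for v in basis:
        # Subtract the projection of v onto each previous orthogonal vector.
        w = v - sum(np.dot(v, u) / np.dot(u, u) * u for u in ortho)
        ortho.append(w)
    return ortho

basis = [np.array([3.0, 1.0]), np.array([2.0, 2.0])]
u1, u2 = gram_schmidt(basis)
assert np.isclose(np.dot(u1, u2), 0.0)  # mutually orthogonal

# Normalizing yields an orthonormal basis.
orthonormal = [u / np.linalg.norm(u) for u in (u1, u2)]
```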
Change of basis: In mathematics, an ordered basis of a vector space of finite dimension n allows representing uniquely any element of the vector space by a coordinate vector, which is a sequence of n scalars called coordinates. If two different bases are considered, the coordinate vector that represents a vector v on one basis is, in general, different from the coordinate vector that represents v on the other basis. A change of basis consists of converting every assertion expressed in terms of coordinates relative to one basis into an assertion expressed in terms of coordinates relative to the other basis.
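A numpy sketch of a change of basis in R² (the basis b₁ = (1, 1), b₂ = (1, −1) is an arbitrary choice): if the columns of P hold the new basis vectors expressed in the old basis, old coordinates x and new coordinates x′ are related by x = P x′:

```python
import numpy as np

# Columns of p are the new basis vectors b1 = (1, 1), b2 = (1, -1)
# expressed in the standard (old) basis.
p = np.column_stack([[1.0, 1.0], [1.0, -1.0]])

v_old = np.array([3.0, 1.0])      # coordinates in the standard basis
v_new = np.linalg.solve(p, v_old) # coordinates relative to b1, b2

assert np.allclose(p @ v_new, v_old)  # converting back recovers v
# Here v = 2*b1 + 1*b2, so v_new == [2.0, 1.0].
```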
Square-integrable function: In mathematics, a square-integrable function, also called a quadratically integrable function, an L² function, or a square-summable function, is a real- or complex-valued measurable function for which the integral of the square of the absolute value is finite. Thus, square-integrability on the real line is defined as follows: f : ℝ → ℂ is square integrable if ∫ |f(x)|² dx < ∞, with the integral taken over the whole real line. One may also speak of quadratic integrability over bounded intervals such as [a, b] for a ≤ b. An equivalent definition is to say that the square of the function itself (rather than of its absolute value) is Lebesgue integrable.
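A numerical sanity check with scipy (f(x) = 1/(1 + x²) is an illustrative choice; its squared integral over ℝ is finite and equals π/2, so f is square-integrable):

```python
import numpy as np
from scipy.integrate import quad

# Integrate |f(x)|^2 = 1/(1 + x^2)^2 over the whole real line.
integral, _ = quad(lambda x: 1.0 / (1.0 + x**2) ** 2, -np.inf, np.inf)
assert np.isclose(integral, np.pi / 2)  # finite, so f is in L^2(R)

# By contrast, the constant function g(x) = 1 is not square-integrable
# on the real line: the integral of |g|^2 diverges.
```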