Power iteration
In mathematics, power iteration (also known as the power method) is an eigenvalue algorithm: given a diagonalizable matrix A, the algorithm will produce a number λ, which is the greatest (in absolute value) eigenvalue of A, and a nonzero vector v, which is a corresponding eigenvector of λ, that is, Av = λv. The algorithm is also known as the Von Mises iteration. Power iteration is a very simple algorithm, but it may converge slowly.
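A minimal sketch of the method in Python; the example matrix, the iteration count, and the tolerance below are illustrative assumptions rather than part of the original text:

```python
import numpy as np

def power_iteration(A, num_iterations=1000, tol=1e-10):
    """Approximate the dominant eigenvalue and eigenvector of A."""
    # Start from a random nonzero vector.
    b = np.random.rand(A.shape[0])
    lam = 0.0
    for _ in range(num_iterations):
        # Multiply by A and renormalize so the iterate stays bounded.
        Ab = A @ b
        b_next = Ab / np.linalg.norm(Ab)
        # The Rayleigh quotient gives the current eigenvalue estimate.
        lam_next = b_next @ A @ b_next
        if abs(lam_next - lam) < tol:
            return lam_next, b_next
        lam, b = lam_next, b_next
    return lam, b

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
eigenvalue, eigenvector = power_iteration(A)
print(eigenvalue, eigenvector)  # largest eigenvalue is (5 + sqrt(5)) / 2 ≈ 3.618
```

Each iteration only requires a matrix-vector product, which is why the method is simple; convergence slows when the two largest eigenvalues are close in magnitude.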
Matrix multiplication algorithm
Because matrix multiplication is such a central operation in many numerical algorithms, much work has been invested in making matrix multiplication algorithms efficient. Applications of matrix multiplication in computational problems are found in many fields including scientific computing and pattern recognition, and in seemingly unrelated problems such as counting the paths through a graph. Many different algorithms have been designed for multiplying matrices on different types of hardware, including parallel and distributed systems, where the computational work is spread over multiple processors (perhaps over a network).
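As an illustration of the "counting paths through a graph" application, a short Python sketch under assumed data: the adjacency matrix below is a made-up three-node graph, and the (i, j) entry of its k-th matrix power counts the walks of length k from node i to node j.

```python
import numpy as np

# Adjacency matrix of a small directed graph (illustrative assumption):
# edges 0 -> 1, 0 -> 2, 1 -> 2, 2 -> 0.
A = np.array([[0, 1, 1],
              [0, 0, 1],
              [1, 0, 0]])

# Repeated matrix multiplication counts walks of a given length.
A2 = A @ A
A3 = A2 @ A
print(A2)        # entry (i, j): number of length-2 walks from i to j
print(A3[0, 0])  # length-3 walks from node 0 back to itself (here: 1)
```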
Vector multiplication
In mathematics, vector multiplication may refer to one of several operations between two (or more) vectors. It may concern any of the following articles: Dot product – also known as the "scalar product", a binary operation that takes two vectors and returns a scalar quantity. The dot product of two vectors can be defined as the product of the magnitudes of the two vectors and the cosine of the angle between the two vectors. Alternatively, it is defined as the product of the projection of the first vector onto the second vector and the magnitude of the second vector.
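A small Python sketch showing that the componentwise definition and the magnitude-times-cosine definition of the dot product agree; the vectors u and v are arbitrary examples chosen here, not taken from the original text.

```python
import math

def dot(u, v):
    """Dot product as the sum of componentwise products."""
    return sum(a * b for a, b in zip(u, v))

u = [3.0, 4.0]
v = [4.0, 3.0]

d = dot(u, v)  # 3*4 + 4*3 = 24

# Equivalently, |u| * |v| * cos(theta), where theta is the angle between u and v.
norm_u = math.sqrt(dot(u, u))
norm_v = math.sqrt(dot(v, v))
cos_theta = d / (norm_u * norm_v)
print(d, norm_u * norm_v * cos_theta)  # both print 24.0
```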
Computational complexity of matrix multiplication
In theoretical computer science, the computational complexity of matrix multiplication dictates how quickly the operation of matrix multiplication can be performed. Matrix multiplication algorithms are a central subroutine in theoretical and numerical algorithms for numerical linear algebra and optimization, so determining how quickly it can be performed is of major practical relevance. Directly applying the mathematical definition of matrix multiplication gives an algorithm that requires n³ field operations to multiply two n × n matrices over that field (Θ(n³) in big O notation).
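A sketch of the definition-based algorithm in Python: the three nested loops each run n times, so the inner update executes n³ times, matching the Θ(n³) operation count above. The small test matrices are an illustrative assumption.

```python
def matmul_naive(A, B):
    """Multiply two n x n matrices directly from the definition:
    C[i][j] = sum over k of A[i][k] * B[k][j], using n^3 scalar multiplications."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
    return C

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul_naive(A, B))  # [[19, 22], [43, 50]]
```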
Eigenvalues and eigenvectors
In linear algebra, an eigenvector (/ˈaɪɡənˌvɛktər/) or characteristic vector of a linear transformation is a nonzero vector that changes at most by a constant factor when that linear transformation is applied to it. The corresponding eigenvalue, often represented by λ, is the multiplying factor. Geometrically, a transformation matrix rotates, stretches, or shears the vectors it acts upon. The eigenvectors for a linear transformation matrix are the set of vectors that are only stretched, with no rotation or shear.
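A brief Python check of the defining relation Av = λv, using an arbitrarily chosen symmetric 2 × 2 matrix as an assumed example: each eigenvector returned by the solver is only scaled by A, never rotated or sheared.

```python
import numpy as np

# Example matrix (illustrative assumption); its eigenvalues are 3 and 1.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column of `eigenvectors` satisfies A @ v == lambda * v.
for lam, v in zip(eigenvalues, eigenvectors.T):
    print(lam, np.allclose(A @ v, lam * v))  # True for every eigenpair
```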