In linear algebra, a diagonal matrix is a matrix in which the entries outside the main diagonal are all zero; the term usually refers to square matrices. Elements of the main diagonal can either be zero or nonzero. An example of a 2×2 diagonal matrix is $\begin{bmatrix} 3 & 0 \\ 0 & 2 \end{bmatrix}$, while an example of a 3×3 diagonal matrix is $\begin{bmatrix} 6 & 0 & 0 \\ 0 & 5 & 0 \\ 0 & 0 & 4 \end{bmatrix}$. An identity matrix of any size, or any scalar multiple of it (a scalar matrix), is a diagonal matrix.
A diagonal matrix is sometimes called a scaling matrix, since matrix multiplication with it results in a change of scale (size). Its determinant is the product of its diagonal entries.
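To make the scaling interpretation and the determinant fact concrete, here is a minimal NumPy sketch (the library calls are NumPy's, not notation from this article): multiplying a vector by a diagonal matrix rescales each coordinate independently, and the determinant equals the product of the diagonal entries.

```python
import numpy as np

D = np.diag([3.0, 2.0])        # 2x2 diagonal (scaling) matrix
v = np.array([1.0, 1.0])

print(D @ v)                   # [3. 2.] -- each coordinate scaled by its diagonal entry
print(np.linalg.det(D))        # 6.0
print(np.prod(np.diag(D)))     # 6.0 -- the product of the diagonal entries agrees
```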
As stated above, a diagonal matrix is a matrix in which all off-diagonal entries are zero. That is, the matrix $D = (d_{i,j})$ with $n$ columns and $n$ rows is diagonal if
$$d_{i,j} = 0 \quad \text{whenever } i \neq j.$$
However, the main diagonal entries are unrestricted.
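This condition is straightforward to check numerically; the following is a small illustrative sketch (the helper `is_diagonal` is hypothetical, not a standard function), assuming a square NumPy array as input.

```python
import numpy as np

def is_diagonal(M: np.ndarray) -> bool:
    """Return True if every off-diagonal entry of the square matrix M is zero."""
    off_diagonal = ~np.eye(M.shape[0], dtype=bool)   # mask selecting entries with i != j
    return bool(np.all(M[off_diagonal] == 0))

print(is_diagonal(np.array([[1, 0], [0, 0]])))   # True  -- diagonal entries may even be zero
print(is_diagonal(np.array([[1, 2], [0, 5]])))   # False -- a nonzero off-diagonal entry
```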
The term diagonal matrix may sometimes refer to a rectangular diagonal matrix, which is an m-by-n matrix with all the entries not of the form $d_{i,i}$ being zero. For example:
$$\begin{bmatrix} 1 & 0 & 0 \\ 0 & 4 & 0 \\ 0 & 0 & -3 \\ 0 & 0 & 0 \end{bmatrix} \quad \text{or} \quad \begin{bmatrix} 1 & 0 & 0 & 0 & 0 \\ 0 & 4 & 0 & 0 & 0 \\ 0 & 0 & -3 & 0 & 0 \end{bmatrix}$$
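Such a rectangular diagonal matrix can be assembled programmatically; the sketch below uses a hypothetical helper `rect_diag` built on NumPy, assuming the number of supplied diagonal values does not exceed min(m, n).

```python
import numpy as np

def rect_diag(vals, m, n):
    """Build an m-by-n matrix with vals on the (i, i) positions and zeros elsewhere."""
    D = np.zeros((m, n))
    k = len(vals)
    D[:k, :k] += np.diag(vals)   # place vals on the main diagonal positions
    return D

print(rect_diag([1, 4, -3], 4, 3))   # tall rectangular diagonal matrix
print(rect_diag([1, 4, -3], 3, 5))   # wide rectangular diagonal matrix
```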
More often, however, diagonal matrix refers to square matrices, which can be specified explicitly as square diagonal matrices. A square diagonal matrix is a symmetric matrix, so it can also be called a symmetric diagonal matrix.
The following matrix is a square diagonal matrix:
$$\begin{bmatrix} 1 & 0 & 0 \\ 0 & 4 & 0 \\ 0 & 0 & -2 \end{bmatrix}$$
If the entries are real numbers or complex numbers, then it is a normal matrix as well.
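As a quick numerical illustration of normality (a NumPy sketch, not part of the original text): a diagonal matrix with complex entries commutes with its conjugate transpose.

```python
import numpy as np

D = np.diag([1 + 2j, -3.0, 0.5j])      # square diagonal matrix with complex entries
lhs = D @ D.conj().T                   # D D*
rhs = D.conj().T @ D                   # D* D
print(np.allclose(lhs, rhs))           # True -- D is a normal matrix
```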
In the remainder of this article we will consider only square diagonal matrices, and refer to them simply as "diagonal matrices".
A diagonal matrix $\mathbf{D}$ can be constructed from a vector $\mathbf{a} = \begin{bmatrix} a_1 & \cdots & a_n \end{bmatrix}^{\mathsf{T}}$ using the $\operatorname{diag}$ operator:
$$\mathbf{D} = \operatorname{diag}(a_1, \dots, a_n) = \begin{bmatrix} a_1 & & \\ & \ddots & \\ & & a_n \end{bmatrix}.$$
This may be written more compactly as $\mathbf{D} = \operatorname{diag}(\mathbf{a})$.
The same $\operatorname{diag}$ operator is also used to represent block diagonal matrices as $\mathbf{A} = \operatorname{diag}(A_1, \dots, A_n)$, where each argument $A_i$ is a matrix.
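Both uses of the operator have direct counterparts in common numerical libraries; the following sketch uses NumPy's np.diag and SciPy's scipy.linalg.block_diag (library function names, not this article's notation).

```python
import numpy as np
from scipy.linalg import block_diag

a = np.array([1, 4, -2])
D = np.diag(a)                 # diagonal matrix built from a vector
print(D)
# [[ 1  0  0]
#  [ 0  4  0]
#  [ 0  0 -2]]

A1 = np.array([[1, 2],
               [3, 4]])
A2 = np.array([[5]])
print(block_diag(A1, A2))      # block diagonal matrix: each argument is itself a matrix
# [[1 2 0]
#  [3 4 0]
#  [0 0 5]]
```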
The $\operatorname{diag}$ operator may be written as
$$\operatorname{diag}(\mathbf{a}) = \left(\mathbf{a} \mathbf{1}^{\mathsf{T}}\right) \circ \mathbf{I},$$
where $\circ$ represents the Hadamard product and $\mathbf{1}$ is a constant vector with elements 1.
The inverse matrix-to-vector $\operatorname{diag}$ operator is sometimes denoted by the identically named $\operatorname{diag}(\mathbf{D}) = \begin{bmatrix} a_1 & \cdots & a_n \end{bmatrix}^{\mathsf{T}}$, where the argument is now a matrix and the result is a vector of its diagonal entries.
The following property holds:
$$\operatorname{diag}(\mathbf{a})\,\mathbf{b} = \operatorname{diag}(\mathbf{b})\,\mathbf{a} = \mathbf{a} \circ \mathbf{b}.$$
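These identities are easy to spot-check numerically; the sketch below uses NumPy, where np.diag plays both roles (vector-to-matrix and matrix-to-vector) and elementwise multiplication is the Hadamard product.

```python
import numpy as np

a = np.array([1.0, 4.0, -2.0])
b = np.array([3.0, 0.5, 2.0])
ones = np.ones(3)
I = np.eye(3)

print(np.allclose(np.diag(a), np.outer(a, ones) * I))   # diag(a) == (a 1^T) ∘ I
print(np.diag(np.diag(a)))                               # [ 1.  4. -2.] -- matrix-to-vector diag
print(np.allclose(np.diag(a) @ b, a * b))                # diag(a) b == a ∘ b
print(np.allclose(np.diag(b) @ a, a * b))                # diag(b) a == a ∘ b
```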
A diagonal matrix with equal diagonal entries is a scalar matrix; that is, a scalar multiple $\lambda I$ of the identity matrix $I$.