In linear algebra, a Householder transformation (also known as a Householder reflection or elementary reflector) is a linear transformation that describes a reflection about a plane or hyperplane containing the origin. The Householder transformation was used in a 1958 paper by Alston Scott Householder.
Its analogue over general inner product spaces is the Householder operator.
The reflection hyperplane can be defined by its normal vector, a unit vector $v$ (a vector with length $1$) that is orthogonal to the hyperplane. The reflection of a point $x$ about this hyperplane is the linear transformation:
$$x - 2\langle x, v\rangle v = x - 2v(v^* x),$$
where $v$ is given as a column unit vector with conjugate transpose $v^*$.
The matrix constructed from this transformation can be expressed in terms of an outer product as:
$$P = I - 2vv^*.$$
$P$ is known as the Householder matrix, where $I$ is the identity matrix.
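As a concrete numerical illustration (a minimal sketch, not part of the original article; the vectors $v$ and $x$ are arbitrary choices and NumPy is assumed), the Householder matrix can be built directly from this outer-product formula and applied to reflect a point:

```python
import numpy as np

# Unit normal vector of the reflection hyperplane (arbitrary example choice).
v = np.array([1.0, 0.0, 0.0])
v = v / np.linalg.norm(v)          # ensure ||v|| = 1

# Householder matrix P = I - 2 v v^T (v is real here, so v^* = v^T).
P = np.eye(len(v)) - 2.0 * np.outer(v, v)

# Reflect an arbitrary point x about the hyperplane orthogonal to v.
x = np.array([3.0, 1.0, 2.0])
print(P @ x)                       # [-3.  1.  2.]: the component along v flips sign
```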
The Householder matrix has the following properties:
it is Hermitian: $P^* = P$,
it is unitary: $P^{-1} = P^*$,
hence it is involutory: $P^2 = I$.
A Householder matrix has eigenvalues $\pm 1$. To see this, notice that if $u$ is orthogonal to the vector $v$ which was used to create the reflector, then $Pu = u$, i.e., $1$ is an eigenvalue of multiplicity $n-1$, since there are $n-1$ independent vectors orthogonal to $v$. Also, notice $Pv = -v$, and so $-1$ is an eigenvalue with multiplicity $1$.
The determinant of a Householder reflector is $-1$, since the determinant of a matrix is the product of its eigenvalues, in this case one of which is $-1$ with the remainder being $1$ (as in the previous point).
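The properties above can be checked numerically; the following sketch (with an arbitrary random vector, assuming NumPy) verifies symmetry, orthogonality, involution, the eigenvalues, and the determinant for a real 4 × 4 Householder matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
v = rng.standard_normal(4)
v /= np.linalg.norm(v)                      # unit vector defining the reflector
P = np.eye(4) - 2.0 * np.outer(v, v)

print(np.allclose(P, P.T))                  # Hermitian (symmetric, since P is real)
print(np.allclose(P @ P.T, np.eye(4)))      # unitary (orthogonal)
print(np.allclose(P @ P, np.eye(4)))        # involutory
print(np.round(np.linalg.eigvalsh(P), 6))   # eigenvalues: one -1, three +1
print(np.linalg.det(P))                     # approximately -1.0
```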
In geometric optics, specular reflection can be expressed in terms of the Householder matrix.
Householder transformations are widely used in numerical linear algebra, for example, to annihilate the entries below the main diagonal of a matrix, to perform QR decompositions, and in the first step of the QR algorithm. They are also widely used for transforming a matrix to Hessenberg form. For symmetric or Hermitian matrices, the symmetry can be preserved, resulting in tridiagonalization.
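As an illustrative sketch of the QR use case (a standard textbook Householder QR, not taken from this article; NumPy is assumed and the function name householder_qr is hypothetical), each column is processed by a reflector that annihilates its below-diagonal entries:

```python
import numpy as np

def householder_qr(A):
    """QR factorization of a real matrix via Householder reflectors (sketch)."""
    A = np.array(A, dtype=float)
    m, n = A.shape
    Q = np.eye(m)
    for k in range(min(m, n)):
        x = A[k:, k]
        # Pick v so that P = I - 2 v v^T maps x onto a multiple of e_1,
        # zeroing the entries of column k below the diagonal.
        e1 = np.zeros_like(x)
        e1[0] = 1.0
        v = x + np.copysign(np.linalg.norm(x), x[0]) * e1
        norm_v = np.linalg.norm(v)
        if norm_v == 0.0:                # column already has the desired form
            continue
        v /= norm_v
        # Apply the reflector to the trailing submatrix and accumulate Q.
        A[k:, :] -= 2.0 * np.outer(v, v @ A[k:, :])
        Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)
    return Q, A                          # A has been overwritten with R

A = np.array([[12.0, -51.0, 4.0],
              [6.0, 167.0, -68.0],
              [-4.0, 24.0, -41.0]])
Q, R = householder_qr(A)
print(np.allclose(Q @ R, A))             # True: A = QR
print(np.allclose(Q.T @ Q, np.eye(3)))   # True: Q is orthogonal
print(np.allclose(R, np.triu(R)))        # True: R is upper triangular
```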