In mathematics, the spectral radius of a square matrix is the maximum of the absolute values of its eigenvalues. More generally, the spectral radius of a bounded linear operator is the supremum of the absolute values of the elements of its spectrum. The spectral radius is often denoted by ρ(·).
Let λ1, ..., λn be the eigenvalues of a matrix A ∈ C^{n×n}. The spectral radius of A is defined as

ρ(A) = max{ |λ1|, ..., |λn| }.
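As a quick check of this definition (not part of the original article), the sketch below computes the spectral radius numerically with NumPy; the matrix A is an arbitrary example chosen only for illustration.

```python
import numpy as np

def spectral_radius(A: np.ndarray) -> float:
    """Spectral radius: the largest absolute value among the eigenvalues of A."""
    return max(abs(np.linalg.eigvals(A)))

# Arbitrary example matrix (chosen for illustration only).
A = np.array([[2.0, 1.0],
              [0.0, -3.0]])

print(spectral_radius(A))  # eigenvalues are 2 and -3, so the spectral radius is 3.0
```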
The spectral radius can be thought of as an infimum of all norms of a matrix. Indeed, on the one hand, ρ(A) ≤ ‖A‖ for every natural matrix norm ‖·‖; and on the other hand, Gelfand's formula states that ρ(A) = lim_{k→∞} ‖A^k‖^{1/k}. Both of these results are shown below.
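The following NumPy sketch (an illustrative addition, with an arbitrarily chosen matrix A) checks both facts numerically: ρ(A) is bounded above by the induced 1-, 2-, and ∞-norms, and ‖A^k‖^{1/k} approaches ρ(A) as k grows.

```python
import numpy as np

# Arbitrary illustration matrix; any square matrix would do.
A = np.array([[0.5, 0.9],
              [0.1, 0.4]])

rho = max(abs(np.linalg.eigvals(A)))

# rho(A) <= ||A|| for the induced 1-, 2-, and infinity-norms.
for p in (1, 2, np.inf):
    print(p, np.linalg.norm(A, p), np.linalg.norm(A, p) >= rho)

# Gelfand's formula: ||A^k||^(1/k) -> rho(A) as k -> infinity.
for k in (1, 5, 20, 100):
    print(k, np.linalg.norm(np.linalg.matrix_power(A, k), 2) ** (1.0 / k))

print("spectral radius:", rho)
```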
However, the spectral radius does not necessarily satisfy ‖Av‖ ≤ ρ(A)‖v‖ for arbitrary vectors v ∈ C^n. To see why, let r > 1 be arbitrary and consider the matrix

C_r = ( 0    r^{-1} )
      ( r    0      ).

The characteristic polynomial of C_r is λ^2 − 1, so its eigenvalues are {−1, 1} and thus ρ(C_r) = 1. However, taking v = e_1 (the first standard basis vector) gives C_r e_1 = r e_2. As a result,

‖C_r e_1‖ = r > 1 = ρ(C_r) ‖e_1‖.

As an illustration of Gelfand's formula, note that ‖C_r^k‖^{1/k} → 1 as k → ∞, since C_r^k = I if k is even and C_r^k = C_r if k is odd.
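A small numerical check of this counterexample (an illustrative addition, assuming NumPy and the arbitrary choice r = 4; any r > 1 behaves the same way):

```python
import numpy as np

r = 4.0  # any r > 1 works; this value is just for illustration
C = np.array([[0.0, 1.0 / r],
              [r,   0.0]])

rho = max(abs(np.linalg.eigvals(C)))
e1 = np.array([1.0, 0.0])

print(rho)                     # 1.0: the eigenvalues are +1 and -1
print(np.linalg.norm(C @ e1))  # r = 4.0 > rho * ||e1|| = 1.0

# Gelfand's formula: ||C^k||^(1/k) approaches rho(C) = 1 as k grows.
for k in (1, 3, 11, 101):
    print(k, np.linalg.norm(np.linalg.matrix_power(C, k), 2) ** (1.0 / k))
```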
A special case in which ‖Av‖ ≤ ρ(A)‖v‖ for all v is when A is a Hermitian matrix and ‖·‖ is the Euclidean norm. This is because any Hermitian matrix is diagonalizable by a unitary matrix, and unitary matrices preserve vector length. Writing A = U* D U with U unitary and D diagonal (so that the diagonal entries of D are the eigenvalues of A), we get

‖Av‖ = ‖U* D U v‖ = ‖D U v‖ ≤ ρ(A) ‖U v‖ = ρ(A) ‖v‖.
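As an illustrative check (not from the article), the sketch below draws a random Hermitian matrix and verifies the inequality in the Euclidean norm for a few random vectors:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random Hermitian matrix (illustrative; any Hermitian A works).
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (B + B.conj().T) / 2

rho = max(abs(np.linalg.eigvalsh(A)))  # eigvalsh: eigenvalues of a Hermitian matrix (real)

# For random vectors v, the Euclidean norm satisfies ||A v|| <= rho(A) ||v||.
for _ in range(5):
    v = rng.standard_normal(4) + 1j * rng.standard_normal(4)
    print(np.linalg.norm(A @ v) <= rho * np.linalg.norm(v) + 1e-12)
```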
In the context of a bounded linear operator A on a Banach space, the eigenvalues need to be replaced with the elements of the spectrum of the operator, i.e. the values λ for which A − λI is not bijective. We denote the spectrum by

σ(A) = { λ ∈ C : A − λI is not bijective }.
The spectral radius is then defined as the supremum of the magnitudes of the elements of the spectrum:

ρ(A) = sup { |λ| : λ ∈ σ(A) }.
Gelfand's formula, also known as the spectral radius formula, also holds for bounded linear operators: letting ‖·‖ denote the operator norm, we have

ρ(A) = lim_{k→∞} ‖A^k‖^{1/k}.
A bounded operator (on a complex Hilbert space) is called a spectraloid operator if its spectral radius coincides with its numerical radius. An example of such an operator is a normal operator.
The spectral radius of a finite graph is defined to be the spectral radius of its adjacency matrix.
This definition extends to the case of infinite graphs with bounded degrees of vertices (i.e. there exists some real number C such that the degree of every vertex of the graph is at most C). In this case, the spectral radius of the graph is defined as the spectral radius of its adjacency operator, viewed as a bounded linear operator on the Hilbert space ℓ²(V(G)) of square-summable functions on the vertex set.
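For a finite graph the computation is direct; the sketch below (an illustrative addition using NumPy, with the cycle graph C_5 as an arbitrary example) evaluates the spectral radius of the adjacency matrix:

```python
import numpy as np

def graph_spectral_radius(adj: np.ndarray) -> float:
    """Spectral radius of a finite graph, i.e. of its symmetric adjacency matrix."""
    return max(abs(np.linalg.eigvalsh(adj)))

# Illustrative example: the cycle graph C_5.
n = 5
adj = np.zeros((n, n))
for i in range(n):
    adj[i, (i + 1) % n] = adj[(i + 1) % n, i] = 1

print(graph_spectral_radius(adj))  # 2.0 (a k-regular graph has spectral radius k)
```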
In mathematics, a matrix norm is a vector norm in a vector space whose elements (vectors) are matrices (of given dimensions). Given a field K of either real or complex numbers, let K^{m×n} be the K-vector space of matrices with m rows and n columns and entries in the field K. A matrix norm is a norm on K^{m×n}. Such norms are typically written with double vertical bars (like so: ‖A‖).
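As an illustrative aside (not from the article), several common matrix norms are available directly in NumPy; the matrix below is arbitrary:

```python
import numpy as np

# Arbitrary illustration matrix.
A = np.array([[1.0, -2.0],
              [3.0,  4.0]])

print(np.linalg.norm(A, 'fro'))   # Frobenius norm
print(np.linalg.norm(A, 1))       # induced 1-norm (maximum absolute column sum)
print(np.linalg.norm(A, np.inf))  # induced infinity-norm (maximum absolute row sum)
print(np.linalg.norm(A, 2))       # induced 2-norm (largest singular value)
```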
In linear algebra, an eigenvector (/ˈaɪɡənˌvɛktər/) or characteristic vector of a linear transformation is a nonzero vector that changes at most by a constant factor when that linear transformation is applied to it. The corresponding eigenvalue, often represented by λ, is the multiplying factor. Geometrically, a transformation matrix rotates, stretches, or shears the vectors it acts upon. The eigenvectors of a linear transformation matrix are the set of vectors that are only stretched, with no rotation or shear.
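A minimal numerical illustration of the defining relation Av = λv (added here, with an arbitrarily chosen diagonal matrix):

```python
import numpy as np

# Illustrative matrix with a simple eigenstructure.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)  # columns of `eigenvectors` are eigenvectors
v = eigenvectors[:, 0]
lam = eigenvalues[0]

# A v equals lambda * v: the vector is only scaled, not rotated or sheared.
print(np.allclose(A @ v, lam * v))  # True
```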
In mathematics, spectral theory is an inclusive term for theories extending the eigenvector and eigenvalue theory of a single square matrix to a much broader theory of the structure of operators in a variety of mathematical spaces. It is a result of studies of linear algebra and the solutions of systems of linear equations and their generalizations. The theory is connected to that of analytic functions because the spectral properties of an operator are related to analytic functions of the spectral parameter.