In mathematics, spectral graph theory is the study of the properties of a graph in relationship to the characteristic polynomial, eigenvalues, and eigenvectors of matrices associated with the graph, such as its adjacency matrix or Laplacian matrix.
The adjacency matrix of a simple undirected graph is a real symmetric matrix and is therefore orthogonally diagonalizable; its eigenvalues are real algebraic integers.
While the adjacency matrix depends on the vertex labeling, its spectrum is a graph invariant, although not a complete one.
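As a minimal sketch of these facts (assuming NumPy is available; the 4-cycle C4 is just a convenient example), the spectrum of an adjacency matrix can be computed directly, and conjugating by a permutation matrix, which is what a relabelling of the vertices does, leaves it unchanged:

```python
import numpy as np

# Adjacency matrix of the 4-cycle C4 with edges 0-1, 1-2, 2-3, 3-0.
A = np.array([
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
], dtype=float)

# A is real symmetric, so eigvalsh returns real eigenvalues (here -2, 0, 0, 2).
print(np.linalg.eigvalsh(A))

# Relabelling the vertices replaces A by P A P^T for a permutation matrix P,
# which does not change the multiset of eigenvalues.
P = np.eye(4)[[2, 0, 3, 1]]
print(np.linalg.eigvalsh(P @ A @ P.T))
```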
Spectral graph theory is also concerned with graph parameters that are defined via multiplicities of eigenvalues of matrices associated to the graph, such as the Colin de Verdière number.
Two graphs are called cospectral or isospectral if their adjacency matrices are isospectral, that is, if they have equal multisets of eigenvalues.
Cospectral graphs need not be isomorphic, but isomorphic graphs are always cospectral.
A graph G is said to be determined by its spectrum if any other graph with the same spectrum as G is isomorphic to G.
Some first examples of families of graphs that are determined by their spectrum include:
The complete graphs.
The finite starlike trees.
A pair of graphs are said to be cospectral mates if they have the same spectrum, but are non-isomorphic.
The smallest pair of cospectral mates is {K1,4, C4 ∪ K1}, comprising the 5-vertex star and the graph union of the 4-vertex cycle and the single-vertex graph, as reported by Collatz and Sinogowitz in 1957.
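This pair can be checked directly; the sketch below (assuming NumPy) builds both adjacency matrices by hand, confirms that the two spectra agree, and notes that the degree sequences already rule out an isomorphism:

```python
import numpy as np

# K_{1,4}: the star with centre 0 and leaves 1..4.
star = np.zeros((5, 5))
star[0, 1:] = star[1:, 0] = 1

# C_4 ∪ K_1: the 4-cycle on vertices 0..3 plus the isolated vertex 4.
union = np.zeros((5, 5))
for i in range(4):
    j = (i + 1) % 4
    union[i, j] = union[j, i] = 1

# Both spectra are {-2, 0, 0, 0, 2}, so the graphs are cospectral mates ...
print(np.round(np.linalg.eigvalsh(star), 6))
print(np.round(np.linalg.eigvalsh(union), 6))

# ... but they are not isomorphic: the degree sequences differ.
print(sorted(star.sum(axis=0)))   # degrees 1, 1, 1, 1, 4
print(sorted(union.sum(axis=0)))  # degrees 0, 2, 2, 2, 2
```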
The smallest pair of polyhedral cospectral mates are enneahedra with eight vertices each.
Almost all trees are cospectral, i.e., as the number of vertices grows, the fraction of trees for which there exists a cospectral tree goes to 1.
A pair of regular graphs are cospectral if and only if their complements are cospectral.
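One direction is easy to see from the spectrum of a complement: for a k-regular graph on n vertices with adjacency eigenvalues k = λ₁ ≥ λ₂ ≥ … ≥ λₙ, the complement has eigenvalues n − 1 − k and −1 − λᵢ for i ≥ 2, so the complement's spectrum is determined by n, k and the spectrum itself. A small numerical check of this relation on the 3-regular prism graph (a sketch, assuming NumPy):

```python
import numpy as np

# The triangular prism: triangles 0-1-2 and 3-4-5, joined by edges 0-3, 1-4, 2-5.
A = np.array([
    [0, 1, 1, 1, 0, 0],
    [1, 0, 1, 0, 1, 0],
    [1, 1, 0, 0, 0, 1],
    [1, 0, 0, 0, 1, 1],
    [0, 1, 0, 1, 0, 1],
    [0, 0, 1, 1, 1, 0],
], dtype=float)

n = A.shape[0]
k = int(A[0].sum())                       # the graph is 3-regular
complement = np.ones((n, n)) - np.eye(n) - A

spec = np.linalg.eigvalsh(A)              # ascending order; spec[-1] == k
predicted = np.sort(np.concatenate(([n - 1 - k], -1 - spec[:-1])))

print(np.allclose(np.linalg.eigvalsh(complement), predicted))  # True
```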
A pair of distance-regular graphs are cospectral if and only if they have the same intersection array.
Cospectral graphs can also be constructed by means of the Sunada method.
In graph theory, a strongly regular graph (SRG) is defined as follows. Let G = (V, E) be a regular graph with v vertices and degree k. G is said to be strongly regular if there are also integers λ and μ such that:
Every two adjacent vertices have λ common neighbours.
Every two non-adjacent vertices have μ common neighbours.
The complement of an srg(v, k, λ, μ) is also strongly regular: it is an srg(v, v − k − 1, v − 2 − 2k + μ, v − 2k + λ). A strongly regular graph is a distance-regular graph with diameter 2 whenever μ is non-zero.
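These conditions can be verified by brute force on small graphs. The sketch below (plain NumPy; the helper name srg_parameters is just ours) checks the 5-cycle, which is an srg(5, 2, 0, 1), and its complement, again a 5-cycle, in agreement with the complement formula quoted above:

```python
import numpy as np
from itertools import combinations

def srg_parameters(A):
    """Return (v, k, lambda, mu) if the graph with adjacency matrix A is
    strongly regular, otherwise None (a brute-force check for small graphs)."""
    v = A.shape[0]
    degrees = A.sum(axis=0)
    if len(set(degrees)) != 1:
        return None                     # not regular
    k = int(degrees[0])
    common = A @ A                      # common[i, j] = number of common neighbours of i and j
    lams = {int(common[i, j]) for i, j in combinations(range(v), 2) if A[i, j]}
    mus = {int(common[i, j]) for i, j in combinations(range(v), 2) if not A[i, j]}
    if len(lams) > 1 or len(mus) > 1:
        return None                     # common-neighbour counts are not constant
    return v, k, (lams.pop() if lams else 0), (mus.pop() if mus else 0)

# The 5-cycle C5.
A = np.zeros((5, 5), dtype=int)
for i in range(5):
    A[i, (i + 1) % 5] = A[(i + 1) % 5, i] = 1

print(srg_parameters(A))                              # (5, 2, 0, 1)
print(srg_parameters(1 - A - np.eye(5, dtype=int)))   # (5, 2, 0, 1)
```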
In the mathematical field of graph theory, the Laplacian matrix, also called the graph Laplacian, admittance matrix, Kirchhoff matrix or discrete Laplacian, is a matrix representation of a graph. Named after Pierre-Simon Laplace, the graph Laplacian matrix can be viewed as a matrix form of the negative discrete Laplace operator on a graph approximating the negative continuous Laplacian obtained by the finite difference method. The Laplacian matrix relates to many useful properties of a graph.
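Concretely, for a simple undirected graph the combinatorial Laplacian is L = D − A, where D is the diagonal matrix of vertex degrees and A the adjacency matrix. A short sketch (assuming NumPy) on the 4-vertex path illustrates two basic properties: rows sum to zero, so the all-ones vector is an eigenvector for eigenvalue 0, and all eigenvalues are non-negative:

```python
import numpy as np

# The path graph 0-1-2-3.
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

D = np.diag(A.sum(axis=1))   # degree matrix
L = D - A                    # graph Laplacian

print(L @ np.ones(4))                       # zero vector: every row of L sums to 0
print(np.round(np.linalg.eigvalsh(L), 4))   # eigenvalues >= 0, smallest is 0
```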
In linear algebra, an eigenvector (/ˈaɪɡənˌvɛktər/) or characteristic vector of a linear transformation is a nonzero vector that changes at most by a constant factor when that linear transformation is applied to it. The corresponding eigenvalue, often represented by λ, is the multiplying factor. Geometrically, a transformation matrix rotates, stretches, or shears the vectors it acts upon. The eigenvectors of a linear transformation matrix are the set of vectors that are only stretched, with no rotation or shear.
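As a brief numeric illustration (assuming NumPy), for a symmetric 2 × 2 matrix each eigenvector satisfies Mv = λv, i.e. it is only scaled by the transformation:

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eigh(M)   # eigenvectors are the columns
for lam, v in zip(eigenvalues, eigenvectors.T):
    # M v equals lambda * v: the vector is stretched, not rotated or sheared.
    print(lam, np.allclose(M @ v, lam * v))
```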
The class covers topics related to statistical inference and algorithms on graphs: basic random graph concepts, thresholds, subgraph containment (planted clique), connectivity, broadcasting on trees, ...
The main goal of this course is to give the student a solid introduction to approaches, methods, and tools for brain network analysis. The student will learn about principles of network science and ...
The students gain in-depth knowledge of several current and emerging areas of theoretical computer science. The course familiarizes them with advanced techniques and develops an understanding of ...
Explores consensus algorithms in networked control systems, covering topics like Metropolis-Hasting models and distributed computation of Least-Squares regression.
We prove the non-planarity of a family of 3-regular graphs constructed from the solutions to the Markoff equation x² + y² + z² = xyz modulo prime numbers greater than 7. The proof uses Euler characteristic and an enumeration of the short cycles in these graphs ...
Technology mapping transforms a technology-independent representation into a technology-dependent one given a library of cells. This process is performed by means of local replacements that are extracted by matching sections of the subject graph to library ...
Orthogonal group synchronization is the problem of estimating n elements Z(1), …, Z(n) from the r × r orthogonal group given some relative measurements R_ij ≈ Z(i)Z(j)⁻¹. The least-squares formulation is nonconvex. To avoid its local minim ...