
# Transpose

Summary

In linear algebra, the transpose of a matrix is an operator which flips a matrix over its diagonal; that is, it switches the row and column indices of the matrix A, producing another matrix, often denoted by A^T (among other notations).
The transpose of a matrix was introduced in 1858 by the British mathematician Arthur Cayley. In the case of a logical matrix representing a binary relation R, the transpose corresponds to the converse relation R^T.
The transpose of a matrix A, denoted by A^T, ⊤A, A^⊤, A′, A^tr, ^tA or A^t, may be constructed by any one of the following methods:
- Reflect A over its main diagonal (which runs from top-left to bottom-right) to obtain A^T
- Write the rows of A as the columns of A^T
- Write the columns of A as the rows of A^T
Formally, the i-th row, j-th column element of A^T is the j-th row, i-th column element of A: [A^T]_ij = [A]_ji.
If A is an m × n matrix, then A^T is an n × m matrix.
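As a sketch of the row/column swap above (the function name is illustrative, not from the source), the element-wise rule [A^T]_ij = [A]_ji can be written directly in Python, with NumPy's `.T` attribute as a cross-check:

```python
import numpy as np

def transpose(A):
    """Transpose a matrix given as a list of rows: result[j][i] = A[i][j]."""
    m, n = len(A), len(A[0])          # A is m x n
    return [[A[i][j] for i in range(m)] for j in range(n)]  # n x m

A = [[1, 2, 3],
     [4, 5, 6]]                        # a 2 x 3 matrix
B = transpose(A)                       # a 3 x 2 matrix

print(B)                               # [[1, 4], [2, 5], [3, 6]]
print((np.array(A).T == np.array(B)).all())  # True
```

Note how the 2 × 3 input becomes a 3 × 2 output, matching the statement that an m × n matrix transposes to an n × m matrix.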
In the case of square matrices, A^T may also denote the Tth power of the matrix A. To avoid possible confusion, many authors use left superscripts, that is, they denote the transpose as ^T A. An advantage of this notation is that no parentheses are needed when exponents are involved: since (^T A)^n = ^T(A^n), the notation ^T A^n is not ambiguous.
In this article this confusion is avoided by never using the symbol T as a variable name.
A square matrix whose transpose is equal to itself is called a symmetric matrix; that is, A is symmetric if A^T = A.
A square matrix whose transpose is equal to its negative is called a skew-symmetric matrix; that is, A is skew-symmetric if A^T = −A.
A square complex matrix whose transpose is equal to the matrix with every entry replaced by its complex conjugate (denoted here with an overline) is called a Hermitian matrix (equivalent to the matrix being equal to its conjugate transpose); that is, A is Hermitian if A^T = A̅.
A square complex matrix whose transpose is equal to the negation of its complex conjugate is called a skew-Hermitian matrix; that is, A is skew-Hermitian if A^T = −A̅.
A square matrix whose transpose is equal to its inverse is called an orthogonal matrix; that is, A is orthogonal if A^T = A⁻¹.
A square complex matrix whose transpose is equal to its conjugate inverse is called a unitary matrix; that is, A is unitary if A^T = A̅⁻¹.
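These matrix classes are easy to test numerically. A minimal sketch (the helper names are my own; the Hermitian and unitary checks use the equivalent conjugate-transpose form of the definitions above):

```python
import numpy as np

def is_symmetric(A):      return np.allclose(A.T, A)            # A^T = A
def is_skew_symmetric(A): return np.allclose(A.T, -A)           # A^T = -A
def is_hermitian(A):      return np.allclose(A.conj().T, A)     # A* = A
def is_unitary(A):        return np.allclose(A.conj().T @ A, np.eye(A.shape[0]))

S = np.array([[1, 2], [2, 3]])                 # symmetric
K = np.array([[0, 1], [-1, 0]])                # skew-symmetric
H = np.array([[2, 1j], [-1j, 2]])              # Hermitian
U = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)  # unitary

print(is_symmetric(S), is_skew_symmetric(K), is_hermitian(H), is_unitary(U))
# True True True True
```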
Let A and B be matrices and c be a scalar. The transpose then satisfies the following identities (for the product rule, A and B must have compatible dimensions):
(A^T)^T = A
(A + B)^T = A^T + B^T
(cA)^T = c A^T
(AB)^T = B^T A^T
det(A^T) = det(A)
A is invertible if and only if A^T is, and (A^T)⁻¹ = (A⁻¹)^T
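The standard algebraic identities of the transpose, such as linearity and the order reversal in (AB)^T = B^T A^T, can be spot-checked numerically on random matrices (a sketch, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((3, 4))
C = rng.standard_normal((4, 2))   # conformable for the product A @ C
c = 2.5

assert np.allclose((A.T).T, A)               # (A^T)^T = A
assert np.allclose((A + B).T, A.T + B.T)     # (A + B)^T = A^T + B^T
assert np.allclose((c * A).T, c * A.T)       # (cA)^T = c A^T
assert np.allclose((A @ C).T, C.T @ A.T)     # (AC)^T = C^T A^T, order reversed
```

The order reversal in the product rule is forced by the dimensions: if A is 3 × 4 and C is 4 × 2, then (AC)^T is 2 × 3, which only C^T A^T (2 × 4 times 4 × 3) can produce.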


Related concepts (94)

Related courses (98)

MATH-111(e): Linear Algebra

The objective of the course is to introduce the basic notions of linear algebra and its applications.

MATH-115(b): Advanced linear algebra II

The objective of the course is to introduce the basic notions of linear algebra and to rigorously prove the main results of the subject.

AR-201(r): Studio BA3 (Gay et Menzel)

The studio focuses on the contextual imaginary. Developing narratives, inspired both by the site and by the student's imagination, leads to a theme sustaining the project. The narrative helps the student

Matrix (mathematics)

In mathematics, a matrix (plural matrices) is a rectangular array or table of numbers, symbols, or expressions, arranged in rows and columns, which is used to represent a mathematical object or a property of such an object. A matrix with two rows and three columns, for example, is often referred to as a "two by three matrix", a "2 × 3 matrix", or a matrix of dimension 2 × 3. Without further specification, matrices represent linear maps and allow explicit computations in linear algebra.

Unitary matrix

In linear algebra, an invertible complex square matrix U is unitary if its conjugate transpose U* is also its inverse, that is, if U*U = UU* = I, where I is the identity matrix. In physics, especially in quantum mechanics, the conjugate transpose is referred to as the Hermitian adjoint of a matrix and is denoted by a dagger (†), so the equation above is written U†U = UU† = I. For real numbers, the analogue of a unitary matrix is an orthogonal matrix. Unitary matrices have significant importance in quantum mechanics because they preserve norms, and thus, probability amplitudes.
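The norm preservation mentioned above can be checked on a small example (the particular matrix U below is an illustrative choice, not from the source):

```python
import numpy as np

# A 2x2 unitary matrix in C^2 (illustrative example).
U = np.array([[1, 1j],
              [1j, 1]]) / np.sqrt(2)

# U* U = I: the conjugate transpose is the inverse.
assert np.allclose(U.conj().T @ U, np.eye(2))

# Unitary matrices preserve the Euclidean norm of any vector.
x = np.array([3.0 + 0j, 4.0])
assert np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x))
```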

Related MOOCs (10)

Algebra (part 1)

A French-language MOOC on linear algebra, accessible to everyone, taught rigorously and requiring no prerequisites.

Algebra (part 2)

A French-language MOOC on linear algebra, accessible to everyone, taught rigorously and requiring no prerequisites.

Related lectures (796)

SVD: Singular Value Decomposition (MATH-212: Numerical Analysis and Optimization)

Covers the concept of Singular Value Decomposition (SVD) for compressing information in matrices and images.

Spectral Decomposition: Symmetric Matrices

Covers the decomposition of symmetric matrices into eigenvalues and eigenvectors.

Linear Transformations: Matrices and Applications

Covers linear transformations using matrices, focusing on linearity, image, and kernel.