Block matrix
In mathematics, a block matrix or a partitioned matrix is a matrix that is interpreted as having been broken into sections called blocks or submatrices. Intuitively, a matrix interpreted as a block matrix can be visualized as the original matrix with a collection of horizontal and vertical lines that break it up, or partition it, into a collection of smaller matrices. Any matrix may be interpreted as a block matrix in one or more ways, with each interpretation defined by how its rows and columns are partitioned.
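As an illustration (the partition and the block names $P_{ij}$ are chosen freely here), a 4-by-4 matrix can be partitioned into four 2-by-2 blocks:
\[
P = \left[\begin{array}{cc|cc}
1 & 2 & 3 & 4\\
5 & 6 & 7 & 8\\
\hline
9 & 10 & 11 & 12\\
13 & 14 & 15 & 16
\end{array}\right]
= \begin{bmatrix} P_{11} & P_{12}\\ P_{21} & P_{22} \end{bmatrix},
\qquad
P_{11} = \begin{bmatrix} 1 & 2\\ 5 & 6 \end{bmatrix},
\quad
P_{12} = \begin{bmatrix} 3 & 4\\ 7 & 8 \end{bmatrix}, \ \text{and so on.}
\]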
Residual neural network
A residual neural network (also known as a residual network or ResNet) is a deep learning model in which the weight layers learn residual functions with reference to the layer inputs. A residual network is a network with skip connections that perform identity mappings, merged with the layer outputs by addition. It behaves like a Highway Network whose gates are opened through strongly positive bias weights. This enables deep learning models with tens or hundreds of layers to train easily and to reach better accuracy as they grow deeper.
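A minimal sketch of the skip connection in NumPy (residual_block, F, and the toy two-layer transform are illustrative names chosen here, not from any framework): the block computes the residual F(x) and merges it with the input by addition.

import numpy as np

def residual_block(x, F):
    # The layer stack F learns a residual; the skip connection
    # carries the input through unchanged and merges by addition.
    return F(x) + x

rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 8))
W2 = rng.normal(size=(8, 8))
F = lambda x: W2 @ np.maximum(W1 @ x, 0.0)  # linear -> ReLU -> linear

x = rng.normal(size=8)
y = residual_block(x, F)  # y = F(x) + x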
Product (mathematics)
In mathematics, a product is the result of multiplication, or an expression that identifies objects (numbers or variables) to be multiplied, called factors. For example, 21 is the product of 3 and 7 (the result of multiplication), and $x \cdot (2 + x)$ is the product of $x$ and $(2 + x)$ (indicating that the two factors should be multiplied together). When one factor is an integer, the product is called a multiple. The order in which real or complex numbers are multiplied has no bearing on the product; this is known as the commutative law of multiplication.
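As a worked instance of the commutative law:
\[
3 \times 7 = 7 \times 3 = 21.
\]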
Hadamard product (matrices)
In mathematics, the Hadamard product (also known as the element-wise product, entrywise product or Schur product) is a binary operation that takes in two matrices of the same dimensions and returns a matrix of the multiplied corresponding elements. This operation can be thought of as a "naive matrix multiplication" and is different from the matrix product. It is attributed to, and named after, either French-Jewish mathematician Jacques Hadamard or German-Jewish mathematician Issai Schur.
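For example, for two 2-by-2 matrices (with $\odot$ denoting the Hadamard product), entries in matching positions are multiplied:
\[
\begin{bmatrix} 1 & 2\\ 3 & 4 \end{bmatrix} \odot \begin{bmatrix} 5 & 6\\ 7 & 8 \end{bmatrix}
= \begin{bmatrix} 1 \cdot 5 & 2 \cdot 6\\ 3 \cdot 7 & 4 \cdot 8 \end{bmatrix}
= \begin{bmatrix} 5 & 12\\ 21 & 32 \end{bmatrix}.
\]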
Skew-symmetric matrix
In mathematics, particularly in linear algebra, a skew-symmetric (or antisymmetric or antimetric) matrix is a square matrix whose transpose equals its negative. That is, it satisfies the condition $A^\textsf{T} = -A$. In terms of the entries of the matrix, if $a_{ij}$ denotes the entry in the $i$-th row and $j$-th column, then the skew-symmetric condition is equivalent to $a_{ji} = -a_{ij}$ for all $i$ and $j$. For example, the matrix
\[
A = \begin{bmatrix} 0 & 2 & -45\\ -2 & 0 & -4\\ 45 & 4 & 0 \end{bmatrix}
\]
is skew-symmetric because $-A = A^\textsf{T}$. Throughout, we assume that all matrix entries belong to a field whose characteristic is not equal to 2.
Gradient descent
In mathematics, gradient descent (also often called steepest descent) is an iterative optimization algorithm for finding a local minimum of a differentiable function. The idea is to take repeated steps in the opposite direction of the gradient (or approximate gradient) of the function at the current point, because this is the direction of steepest descent. Conversely, stepping in the direction of the gradient will lead to a local maximum of that function; the procedure is then known as gradient ascent.
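A minimal sketch in NumPy, assuming a hand-supplied gradient (gradient_descent, grad, and the test function $f(x) = \lVert x \rVert^2$ are illustrative choices, not from any library):

import numpy as np

def gradient_descent(grad, x0, learning_rate=0.1, steps=100):
    # Repeatedly step against the gradient to approach a local minimum.
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - learning_rate * grad(x)  # step in the direction of steepest descent
    return x

# f(x) = ||x||^2 has gradient 2x and a unique minimum at the origin.
grad = lambda x: 2.0 * x
print(gradient_descent(grad, [3.0, -4.0]))  # approaches [0, 0]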
Square matrix
In mathematics, a square matrix is a matrix with the same number of rows and columns. An n-by-n matrix is known as a square matrix of order $n$. Any two square matrices of the same order can be added and multiplied. Square matrices are often used to represent simple linear transformations, such as shearing or rotation. For example, if $R$ is a square matrix representing a rotation (rotation matrix) and $v$ is a column vector describing the position of a point in space, the product $Rv$ yields another column vector describing the position of that point after that rotation.
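For instance, counterclockwise rotation of the plane through an angle $\theta$ is represented by the order-2 rotation matrix (with $R$ and $v$ as in the example above):
\[
R = \begin{bmatrix} \cos\theta & -\sin\theta\\ \sin\theta & \cos\theta \end{bmatrix},
\qquad
Rv = \begin{bmatrix} \cos\theta & -\sin\theta\\ \sin\theta & \cos\theta \end{bmatrix}
\begin{bmatrix} x\\ y \end{bmatrix}
= \begin{bmatrix} x\cos\theta - y\sin\theta\\ x\sin\theta + y\cos\theta \end{bmatrix}.
\]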
Connection (mathematics)
In geometry, the notion of a connection makes precise the idea of transporting local geometric objects, such as tangent vectors or tensors in the tangent space, along a curve or family of curves in a parallel and consistent manner. There are various kinds of connections in modern geometry, depending on what sort of data one wants to transport. For instance, an affine connection, the most elementary type of connection, gives a means for parallel transport of tangent vectors on a manifold from one point to another along a curve.
Outer product
In linear algebra, the outer product of two coordinate vectors is the matrix whose entries are all products of an element in the first vector with an element in the second vector. If the two coordinate vectors have dimensions n and m, then their outer product is an n × m matrix. More generally, given two tensors (multidimensional arrays of numbers), their outer product is a tensor. The outer product of tensors is also referred to as their tensor product, and can be used to define the tensor algebra.
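Concretely, for $\mathbf{u} \in \mathbb{R}^3$ and $\mathbf{v} \in \mathbb{R}^2$, the outer product is the 3 × 2 matrix $\mathbf{u}\mathbf{v}^\textsf{T}$ whose $(i, j)$ entry is $u_i v_j$:
\[
\mathbf{u}\mathbf{v}^\textsf{T} =
\begin{bmatrix} u_1\\ u_2\\ u_3 \end{bmatrix}
\begin{bmatrix} v_1 & v_2 \end{bmatrix}
=
\begin{bmatrix} u_1 v_1 & u_1 v_2\\ u_2 v_1 & u_2 v_2\\ u_3 v_1 & u_3 v_2 \end{bmatrix}.
\]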
Orthogonal matrix
In linear algebra, an orthogonal matrix, or orthonormal matrix, is a real square matrix whose columns and rows are orthonormal vectors. One way to express this is $Q^\textsf{T} Q = Q Q^\textsf{T} = I$, where $Q^\textsf{T}$ is the transpose of $Q$ and $I$ is the identity matrix. This leads to the equivalent characterization: a matrix $Q$ is orthogonal if its transpose is equal to its inverse, $Q^\textsf{T} = Q^{-1}$, where $Q^{-1}$ is the inverse of $Q$. An orthogonal matrix $Q$ is necessarily invertible (with inverse $Q^{-1} = Q^\textsf{T}$), unitary ($Q^{-1} = Q^{*}$), where $Q^{*}$ is the Hermitian adjoint (conjugate transpose) of $Q$, and therefore normal ($Q^{*}Q = QQ^{*}$) over the real numbers.
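A simple concrete instance is the 2 × 2 permutation (reflection) matrix below, whose transpose is its own inverse:
\[
Q = \begin{bmatrix} 0 & 1\\ 1 & 0 \end{bmatrix},
\qquad
Q^\textsf{T} Q = \begin{bmatrix} 0 & 1\\ 1 & 0 \end{bmatrix}\begin{bmatrix} 0 & 1\\ 1 & 0 \end{bmatrix}
= \begin{bmatrix} 1 & 0\\ 0 & 1 \end{bmatrix} = I.
\]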