Parameterized complexity

In computer science, parameterized complexity is a branch of computational complexity theory that focuses on classifying computational problems according to their inherent difficulty with respect to multiple parameters of the input or output. The complexity of a problem is then measured as a function of those parameters. This allows the classification of NP-hard problems on a finer scale than in the classical setting, where the complexity of a problem is measured only as a function of the number of bits in the input.
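A classic illustration of this idea (not taken from the text above) is Vertex Cover parameterized by the solution size k: a simple branching algorithm runs in time exponential only in the parameter k, and polynomial in the size of the graph, making the problem fixed-parameter tractable. A minimal sketch:

```python
# Sketch: a fixed-parameter tractable (FPT) branching algorithm for Vertex
# Cover, parameterized by solution size k. Runtime is O(2^k * m): exponential
# only in the parameter k, polynomial in the number of edges m.
# Illustrative example; names and data are assumptions, not from the text.

def has_vertex_cover(edges, k):
    """Return True if the graph given by `edges` has a vertex cover of size <= k."""
    if not edges:
        return True          # no uncovered edge left: done
    if k == 0:
        return False         # edges remain but budget is spent
    u, v = edges[0]
    # Branch: any cover must contain u or v for this edge.
    for chosen in (u, v):
        rest = [e for e in edges if chosen not in e]
        if has_vertex_cover(rest, k - 1):
            return True
    return False

# A triangle needs 2 vertices to cover all three edges.
triangle = [(0, 1), (1, 2), (0, 2)]
print(has_vertex_cover(triangle, 1))  # False
print(has_vertex_cover(triangle, 2))  # True
```

The branching depth is bounded by k, which is the hallmark of a parameterized (FPT) algorithm: the combinatorial explosion is confined to the parameter.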
Non-negative matrix factorization

Non-negative matrix factorization (NMF or NNMF), also called non-negative matrix approximation, is a group of algorithms in multivariate analysis and linear algebra in which a matrix V is factorized into (usually) two matrices W and H, with the property that all three matrices have no negative elements. This non-negativity makes the resulting matrices easier to inspect. Also, in applications such as the processing of audio spectrograms or muscular activity, non-negativity is inherent to the data being considered.
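One standard way to compute such a factorization is the Lee–Seung multiplicative update rule, which preserves non-negativity at every step. A minimal pure-Python sketch for a tiny matrix (the data, rank, and iteration count are illustrative assumptions, not from the text):

```python
# Minimal sketch of NMF via multiplicative updates (Lee-Seung style).
# V (m x n) is approximated by W (m x r) times H (r x n); all entries of
# W and H stay non-negative because updates only multiply by non-negative
# ratios. Data and initialization below are illustrative.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

def nmf(V, r, iters=200, eps=1e-9):
    m, n = len(V), len(V[0])
    # Deterministic positive initialization (an illustrative choice).
    W = [[1.0 + 0.1 * (i + j) for j in range(r)] for i in range(m)]
    H = [[1.0 + 0.1 * (i + j) for j in range(n)] for i in range(r)]
    for _ in range(iters):
        # H <- H * (W^T V) / (W^T W H), elementwise
        WtV = matmul(transpose(W), V)
        WtWH = matmul(transpose(W), matmul(W, H))
        H = [[H[i][j] * WtV[i][j] / (WtWH[i][j] + eps)
              for j in range(n)] for i in range(r)]
        # W <- W * (V H^T) / (W H H^T), elementwise
        VHt = matmul(V, transpose(H))
        WHHt = matmul(W, matmul(H, transpose(H)))
        W = [[W[i][j] * VHt[i][j] / (WHHt[i][j] + eps)
              for j in range(r)] for i in range(m)]
    return W, H

V = [[1.0, 2.0], [2.0, 4.0]]        # rank-1 data, so r = 1 can fit it exactly
W, H = nmf(V, r=1)
WH = matmul(W, H)
err = max(abs(WH[i][j] - V[i][j]) for i in range(2) for j in range(2))
print(err < 1e-3)  # reconstruction error should be tiny
```

Because the updates multiply current entries by non-negative ratios, non-negativity is maintained by construction rather than enforced afterwards.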
Ricci decomposition

In the mathematical fields of Riemannian and pseudo-Riemannian geometry, the Ricci decomposition is a way of breaking up the Riemann curvature tensor of a Riemannian or pseudo-Riemannian manifold into pieces with special algebraic properties. This decomposition is of fundamental importance in Riemannian and pseudo-Riemannian geometry. Let (M, g) be a Riemannian or pseudo-Riemannian n-manifold, and consider its Riemann curvature as a (0,4)-tensor field.
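In index notation, for n ≥ 4 and in one common sign convention (a standard statement of the decomposition, supplied here for illustration rather than taken from the excerpt), the curvature splits into a scalar part, a traceless-Ricci part, and a fully traceless Weyl part:

```latex
R_{abcd} = \underbrace{\frac{R}{n(n-1)}\left(g_{ac}g_{bd} - g_{ad}g_{bc}\right)}_{S_{abcd}\ \text{(scalar part)}}
 + \underbrace{\frac{1}{n-2}\left(Z_{ac}g_{bd} - Z_{ad}g_{bc} + Z_{bd}g_{ac} - Z_{bc}g_{ad}\right)}_{E_{abcd}\ \text{(traceless-Ricci part)}}
 + \underbrace{C_{abcd}}_{\text{(Weyl part)}}
\qquad\text{where}\qquad
Z_{ab} = R_{ab} - \frac{R}{n}\,g_{ab}
```

Here R is the scalar curvature, R_{ab} the Ricci tensor, Z_{ab} its traceless part, and C_{abcd} the Weyl tensor, which is trace-free in every pair of indices.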
Tensor rank decomposition

In multilinear algebra, the tensor rank decomposition is the decomposition of a tensor as a sum of a minimum number of rank-1 tensors; computing it is an open problem. Canonical polyadic decomposition (CPD) is a variant of the rank decomposition that computes the best-fitting sum of rank-1 terms for a user-specified number of terms. The CP decomposition has found some applications in linguistics and chemometrics. The CP rank was introduced by Frank Lauren Hitchcock in 1927 and later rediscovered several times, notably in psychometrics.
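The object being summed can be made concrete: for a third-order tensor, each rank-1 term is an outer product of three vectors, so T[i][j][k] = Σ_r a_r[i]·b_r[j]·c_r[k]. A minimal sketch that builds a tensor from given factors (the factor data here is illustrative, not from the text):

```python
# Minimal sketch of a canonical polyadic (CP) representation: a third-order
# tensor written as a sum of R rank-1 terms,
#   T[i][j][k] = sum_r A[r][i] * B[r][j] * C[r][k].
# Factor vectors below are made-up illustrative data.

def cp_tensor(A, B, C):
    """Build the tensor from factor lists A, B, C (one vector per rank-1 term)."""
    I, J, K = len(A[0]), len(B[0]), len(C[0])
    T = [[[0.0] * K for _ in range(J)] for _ in range(I)]
    for a, b, c in zip(A, B, C):          # one rank-1 term per (a, b, c)
        for i in range(I):
            for j in range(J):
                for k in range(K):
                    T[i][j][k] += a[i] * b[j] * c[k]
    return T

# Two rank-1 terms (R = 2) building a 2x2x2 tensor.
A = [[1.0, 0.0], [0.0, 1.0]]
B = [[1.0, 2.0], [3.0, 1.0]]
C = [[1.0, 1.0], [2.0, 0.5]]
T = cp_tensor(A, B, C)
print(T[0][1][0])  # 1*2*1 + 0*1*2 = 2.0
```

Finding the smallest R for which such a representation reproduces a given tensor exactly is precisely the (hard) rank-determination problem the text refers to; CPD instead fixes R and fits the terms.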
Lie derivative

In differential geometry, the Lie derivative (/liː/), named after Sophus Lie by Władysław Ślebodziński, evaluates the change of a tensor field (including scalar functions, vector fields and one-forms) along the flow defined by another vector field. This change is coordinate-invariant, and therefore the Lie derivative is defined on any differentiable manifold. Functions, tensor fields and forms can be differentiated with respect to a vector field. If T is a tensor field and X is a vector field, then the Lie derivative of T with respect to X is denoted ℒ_X T.
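On the simplest objects the Lie derivative can be written down explicitly; the following standard identities (supplied for illustration, not stated in the excerpt) show how it acts on functions and vector fields and how it extends to tensor products:

```latex
\mathcal{L}_X f = X(f) = df(X), \qquad
\mathcal{L}_X Y = [X, Y], \qquad
\mathcal{L}_X (T \otimes S) = (\mathcal{L}_X T) \otimes S + T \otimes (\mathcal{L}_X S)
```

The first identity says the Lie derivative of a function is just the directional derivative along X; the second identifies it on vector fields with the Lie bracket; the Leibniz rule then determines it on all tensor fields.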