Multilinear subspace learning

Multilinear subspace learning is an approach for disentangling the causal factors of data formation and performing dimensionality reduction. The dimensionality reduction can be performed on a data tensor that contains a collection of observations that have been vectorized, or on observations that are treated as matrices and concatenated into a data tensor. Examples of data tensors whose observations are vectorized, or whose observations are matrices concatenated into a data tensor, include images (2D/3D), video sequences (3D/4D), and hyperspectral cubes (3D/4D).

The mapping from a high-dimensional vector space to a set of lower-dimensional vector spaces is a multilinear projection. When observations are retained in the same organizational structure as matrices or higher-order tensors, their representations are computed by performing linear projections into the column space, row space, and fiber space.

Multilinear subspace learning algorithms are higher-order generalizations of linear subspace learning methods such as principal component analysis (PCA), independent component analysis (ICA), linear discriminant analysis (LDA), and canonical correlation analysis (CCA). Multilinear methods may be causal in nature and perform causal inference, or they may be simple regression methods from which no causal conclusions are drawn. Linear subspace learning algorithms are traditional dimensionality reduction techniques that are well suited for datasets produced by varying a single causal factor; they often become inadequate when dealing with datasets produced by multiple causal factors.

Multilinear subspace learning can be applied to observations whose measurements were vectorized and organized into a data tensor for causally aware dimensionality reduction. These methods may also be employed to reduce horizontal and vertical redundancies, irrespective of the causal factors, when the observations are treated as a "matrix" (i.e., a collection of independent column/row observations) and concatenated into a tensor.
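To make the per-mode projection idea concrete, below is a minimal sketch, assuming a NumPy-only implementation, of an MPCA-style (multilinear PCA) procedure: each mode of the data tensor gets its own projection matrix, estimated by alternating eigendecompositions of mode-wise scatter matrices. The function names (`multilinear_pca`, `mode_unfold`, `mode_multiply`), the chosen ranks, and the simple alternating scheme are illustrative assumptions, not taken from any particular library.

```python
import numpy as np

def mode_unfold(x, mode):
    """Unfold tensor x along `mode` into a matrix of shape (x.shape[mode], -1)."""
    return np.moveaxis(x, mode, 0).reshape(x.shape[mode], -1)

def mode_multiply(x, mat, mode):
    """Mode-`mode` product: multiply the mode-`mode` unfolding by `mat`, then refold."""
    moved = np.moveaxis(x, mode, 0)
    out = np.tensordot(mat, moved, axes=(1, 0))
    return np.moveaxis(out, 0, mode)

def multilinear_pca(data, ranks, n_iter=5):
    """Alternating estimation of per-mode projection matrices (MPCA-style sketch).

    data  : array of shape (n_samples, I1, ..., IN); each observation keeps its
            natural matrix/tensor structure instead of being vectorized.
    ranks : target dimensions (P1, ..., PN), one per observation mode.
    Returns [U1, ..., UN], with Uk of shape (Ik, Pk).
    """
    centered = data - data.mean(axis=0)   # center across observations
    n_modes = centered.ndim - 1           # axis 0 indexes the samples

    # Initialize each Uk from the leading eigenvectors of the mode-k scatter matrix.
    factors = []
    for k in range(n_modes):
        unfolded = mode_unfold(centered, k + 1)
        scatter = unfolded @ unfolded.T
        eigvals, eigvecs = np.linalg.eigh(scatter)
        factors.append(eigvecs[:, np.argsort(eigvals)[::-1][:ranks[k]]])

    # Alternate: re-estimate each mode with the data projected in all other modes.
    for _ in range(n_iter):
        for k in range(n_modes):
            partial = centered
            for j in range(n_modes):
                if j != k:
                    partial = mode_multiply(partial, factors[j].T, j + 1)
            unfolded = mode_unfold(partial, k + 1)
            scatter = unfolded @ unfolded.T
            eigvals, eigvecs = np.linalg.eigh(scatter)
            factors[k] = eigvecs[:, np.argsort(eigvals)[::-1][:ranks[k]]]
    return factors

# Hypothetical usage: 100 grayscale 32x24 "images" reduced to 8x6 cores per sample.
rng = np.random.default_rng(0)
images = rng.normal(size=(100, 32, 24))
U1, U2 = multilinear_pca(images, ranks=[8, 6])
cores = np.einsum('nij,ip,jq->npq', images - images.mean(axis=0), U1, U2)
print(cores.shape)  # (100, 8, 6)
```

In this sketch the column-space and row-space projections U1 and U2 map each 32x24 observation to an 8x6 core, the multilinear analogue of projecting a vectorized observation onto a single PCA subspace.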

Related courses (11)
MATH-111(e): Linear Algebra
The objective of the course is to introduce the basic notions of linear algebra and its applications.
MATH-213: Differential geometry I - curves and surfaces
This course is an introduction to the classical differential geometry of curves and surfaces, mainly in the plane and in Euclidean space.
EE-566: Adaptation and learning
In this course, students learn to design and master algorithms and core concepts related to inference and learning from data and the foundations of adaptation and learning theories with applications.