Breaking the Curse of Dimensionality in Deep Neural Networks by Learning Invariant Representations
In this dissertation, we propose multiple methods to improve transfer learning for pretrained language models (PLMs). Broadly, transfer learning is a powerful technique in natural language processing, where a language model is first pre-trained on a data-r ...
In the current deep learning paradigm, the amount and quality of training data are as critical as the network architecture and its training details. However, collecting, processing, and annotating real data at scale is difficult, expensive, and time-consum ...
Explainable Artificial Intelligence (XAI) plays a crucial role in enabling human understanding and trust in deep learning systems, often defined as determining which features are most important to a model's prediction. As models get larger, more ubiquitous ...
The recent advance of large language models (LLMs) demonstrates that these large-scale foundation models achieve remarkable capabilities across a wide range of language tasks and domains. The success of the statistical learning approach challenges our unde ...
Quantum many-body control is a central milestone en route to harnessing quantum technologies. However, the exponential growth of the Hilbert space dimension with the number of qubits makes it challenging to classically simulate quantum many-body systems an ...
Unsupervised graph representation learning aims to learn low-dimensional node embeddings without supervision while preserving graph topological structure and node attribute features. Previous Graph Neural Networks (GNNs) require a large number of labeled ...
Species distribution models (SDMs) relate species occurrence data with environmental variables and are used to understand and predict species distributions across landscapes. While some machine learning models have been adopted by the SDM community, recent ...
Meta-learning is the core capability that enables intelligent systems to rapidly generalize their prior experience to learn new tasks. In general, optimization-based methods formalize meta-learning as a bi-level optimization problem, that is, a nes ...
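As a hedged illustration of the nested structure referred to in this snippet (the notation below is generic and not taken from that work): in optimization-based meta-learning, an outer loop updates the meta-parameters \theta across tasks, while an inner loop adapts them to each sampled task \tau_i, for example

\min_{\theta} \; \sum_{\tau_i \sim p(\tau)} \mathcal{L}^{\mathrm{val}}_{\tau_i}\!\left(\phi_i^{*}(\theta)\right)
\quad \text{where} \quad
\phi_i^{*}(\theta) = \arg\min_{\phi} \, \mathcal{L}^{\mathrm{train}}_{\tau_i}(\phi; \theta).

The outer (meta) objective depends on the solution of the inner (task-level) problem, which is what makes the formulation bi-level, i.e., nested.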
Incomplete labels are common in multi-task learning for biomedical applications due to several practical difficulties, e.g., expensive annotation efforts by experts, limits on data collection, and different sources of data. A naive approach to enable joint lear ...
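For illustration only (the snippet above is truncated, so the following is a commonly used baseline rather than necessarily the approach of that work): one simple way to train jointly under incomplete labels is to mask out missing labels in each per-task loss. A minimal Python sketch, in which all names and the choice of a per-task MSE are illustrative assumptions:

import torch
import torch.nn.functional as F

def masked_multitask_loss(predictions, targets, label_mask):
    # predictions / targets: dicts mapping a task name to a (batch, ...) tensor.
    # label_mask: dict mapping a task name to a boolean (batch,) tensor that is
    # True where a ground-truth label exists for that sample.
    total, n_terms = 0.0, 0
    for task, pred in predictions.items():
        mask = label_mask[task]
        if mask.any():
            # Only samples that actually carry a label for this task contribute.
            total = total + F.mse_loss(pred[mask], targets[task][mask])
            n_terms += 1
    # Average over the tasks that contributed at least one labeled sample.
    return total / max(n_terms, 1)

With this kind of masking, a sample with labels for only some tasks still contributes gradient signal to those tasks without corrupting the losses of the tasks for which its label is missing.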
Abstractive summarization has seen substantial improvements in recent years, mostly due to advances in neural language modeling, language model pretraining, and the scaling of models and datasets. While large language models generate summaries that are fluent, coherent, ...