Related publications (7)

Modeling Structured Data in Attention-based Models

Alireza Mohammadshahi

Natural language processing has experienced significant improvements with the development of Transformer-based models, which employ self-attention mechanisms and pre-training strategies. However, these models still present several obstacles. A notable issue ...
EPFL, 2023

Unsupervised Graph Representation Learning with Cluster-aware Self-training and Refining

Yichen Xu, Qiang Liu, Feng Yu

Unsupervised graph representation learning aims to learn low-dimensional node embeddings without supervision while preserving graph topological structures and node attributive features. Previous Graph Neural Networks (GNNs) require a large number of labeled ...
New York, 2023

Introducing dynamicity in JavaBIP

The JavaBIP framework allows the coordination of software components in an exogenous manner, while clearly separating the functional and coordination aspects of the system behaviour. JavaBIP implements the principles of the BIP component framework, rooted i ...
2016
