This lecture covers the Syntax-aware Graph-to-Graph Transformer architecture, which improves how graph relations are input into the self-attention mechanism of the Transformer model. This allows the model to condition effectively on syntactic dependency graphs when predicting both dependency-based and span-based semantic role labelling graphs.
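To make the idea of inputting graph relations into self-attention concrete, here is a minimal NumPy sketch in the spirit of relation-aware self-attention: each pair of tokens (i, j) carries a dependency-relation id, and a learned embedding of that relation biases the attention score between them. All names (`graph_attention`, `rel_ids`, `rel_emb`) are illustrative assumptions, not the lecture's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_attention(X, rel_ids, Wq, Wk, Wv, rel_emb):
    """Single-head self-attention whose scores are biased by
    embeddings of the graph relation between each token pair.

    X        : (n, d)  token representations
    rel_ids  : (n, n)  id of the dependency relation between tokens i and j
    rel_emb  : (n_rel, d)  learned embedding per relation type
    (hypothetical sketch, not the lecture's exact formulation)
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    # standard content-based attention logits
    scores = Q @ K.T / np.sqrt(d)
    # add a query-dependent bias from the relation embedding for each (i, j) pair
    rel = rel_emb[rel_ids]                      # (n, n, d)
    scores = scores + (Q[:, None, :] * rel).sum(-1) / np.sqrt(d)
    return softmax(scores) @ V
```

The key design choice, as in the lecture's architecture, is that graph structure enters the model inside the attention computation itself rather than as an extra input layer, so every head can weigh syntactic relations when mixing token representations.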