Lecture

Graph-to-Graph Transformers: Syntax-aware Graph Encoding

Description

This lecture covers the Syntax-aware Graph-to-Graph Transformer architecture, which improves how graph relations are input into the self-attention mechanism of the Transformer model. This lets the model condition effectively on syntactic dependency graphs when predicting both dependency-based and span-based semantic role labelling graphs.
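To make the idea of inputting graph relations into self-attention concrete, here is a minimal sketch (not the lecture's or the authors' code) of attention scores conditioned on pairwise relation labels: each token pair (i, j) carries a label from the input dependency graph, and a learned embedding for that label is added into the query-key score. All names (`graph_attention_scores`, `rel_emb`, the label inventory) are illustrative assumptions.

```python
import numpy as np

def graph_attention_scores(Q, K, rel_ids, rel_emb):
    """Sketch of graph-relation-conditioned attention.

    Q, K     : (n, d) query and key matrices
    rel_ids  : (n, n) integer relation label for each token pair,
               e.g. 0 = no edge, 1 = head, 2 = dependent (assumed labels)
    rel_emb  : (num_rels, d) learned relation embeddings (assumed name)
    Returns an (n, n) attention matrix whose rows sum to 1.
    """
    n, d = Q.shape
    scores = Q @ K.T                       # standard content-content term
    R = rel_emb[rel_ids]                   # (n, n, d) relation embedding per pair
    scores = scores + np.einsum('id,ijd->ij', Q, R)  # content-relation term
    scores = scores / np.sqrt(d)           # scaled dot-product attention
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))  # stable softmax
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
n, d, num_rels = 4, 8, 3
Q = rng.normal(size=(n, d))
K = rng.normal(size=(n, d))
rel_ids = rng.integers(0, num_rels, size=(n, n))
rel_emb = rng.normal(size=(num_rels, d))
attn = graph_attention_scores(Q, K, rel_ids, rel_emb)
print(attn.shape)
```

In this sketch the syntactic graph influences where each token attends via the added relation term; a full model would learn `rel_emb` jointly with the Transformer and use multiple heads.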
