François Fleuret, Nikolaos Pappas, Angelos Katharopoulos, Apoorv Vyas
Transformers achieve remarkable performance in several tasks, but due to their quadratic complexity with respect to the input's length, they are prohibitively slow for very long sequences. To address this limitation, we express the self-attention as a linear ...
Idiap, 2020
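The truncated abstract describes linearizing self-attention to avoid the quadratic cost in sequence length. Below is a minimal, hedged sketch of that idea under stated assumptions: attention weights are replaced by a dot-product of kernel feature maps (here φ(x) = elu(x) + 1 is an assumed choice, not necessarily the paper's), and associativity of matrix products lets the key–value summary be computed once so the overall cost scales linearly in N rather than quadratically.

```python
import numpy as np

def elu_feature_map(x):
    # phi(x) = elu(x) + 1: a positive feature map (assumed choice for this sketch)
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention(Q, K, V, eps=1e-6):
    # Q, K: (N, d_k), V: (N, d_v)
    Qf, Kf = elu_feature_map(Q), elu_feature_map(K)  # (N, d_k)
    KV = Kf.T @ V                                    # (d_k, d_v): key-value summary, computed once
    Z = Qf @ Kf.sum(axis=0) + eps                    # (N,): normalization term
    return (Qf @ KV) / Z[:, None]                    # (N, d_v): overall O(N * d_k * d_v)

# Toy usage
N, d = 8, 4
rng = np.random.default_rng(0)
Q, K, V = rng.standard_normal((3, N, d))
out = linear_attention(Q, K, V)
print(out.shape)  # (8, 4)
```

Because φ(K)ᵀV and the normalizer are fixed-size summaries independent of N, the per-query cost no longer grows with sequence length, which is the source of the claimed speedup on very long sequences.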