Related publications (72)

Driving and suppressing the human language network using large language models

Martin Schrimpf

Transformer models such as GPT generate human-like language and are predictive of human brain responses to language. Here, using functional-MRI-measured brain responses to 1,000 diverse sentences, we first show that a GPT-based encoding model can predict t ...
Berlin, 2024

Matchertext: Towards Verbatim Interlanguage Embedding

Bryan Alexander Ford

Embedding text in one language within text of another is commonplace for numerous purposes, but usually requires tedious and error-prone “escaping” transformations on the embedded string. We propose a simple cross-language syntactic discipline, matchertext ...
2022

Type-preserving compilation of (most of) FGJ into DOT

Guillaume André Fradji Martres

The Dependent Object Type (DOT) calculus was designed to put Scala on a sound basis, but while DOT relies on structural subtyping, Scala is a fundamentally class-based language. This impedance mismatch means that a proof of DOT soundness by itself is not e ...
2022

PARSINLU: A Suite of Language Understanding Challenges for Persian

Despite the progress made in recent years in addressing natural language understanding (NLU) challenges, the majority of this progress remains concentrated on resource-rich languages such as English. This work focuses on the Persian language, one of the wid ...
2021
