Utterance-level intent detection and token-level slot filling are two key tasks for spoken language understanding (SLU) in task-oriented systems. Most existing approaches assume that only a single intent exists in an utterance; in real-life scenarios, however, an utterance often contains multiple intents. In this paper, we propose a multi-intent SLU framework, called SLIM, to jointly learn multi-intent detection and slot filling based on BERT. To fully exploit the existing annotation data and capture the interactions between slots and intents, SLIM introduces an explicit slot-intent classifier to learn the many-to-one mapping between slots and intents. Empirical results on three public multi-intent datasets demonstrate (1) the superior performance of SLIM compared to the current state-of-the-art for SLU with multiple intents and (2) the benefits obtained from the slot-intent classifier.
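As a rough illustration of the framework described above (not the authors' implementation), the sketch below shows how three heads could sit on top of contextual encoder outputs: a multi-label intent head on the [CLS] vector, a token-level slot head, and a slot-intent classifier that assigns each token to exactly one of the detected intents (the many-to-one mapping). The encoder is replaced by random features, and all names and dimensions are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

SEQ_LEN, HIDDEN = 6, 16        # toy sizes; SLIM uses BERT hidden states
NUM_INTENTS, NUM_SLOTS = 3, 5  # hypothetical label-set sizes

# Stand-in for the BERT encoder: a [CLS] vector plus per-token hidden states.
cls_vec = rng.normal(size=HIDDEN)
token_vecs = rng.normal(size=(SEQ_LEN, HIDDEN))

# Multi-intent detection: a multi-label sigmoid head on [CLS],
# so an utterance can activate several intents at once.
W_intent = rng.normal(size=(HIDDEN, NUM_INTENTS))
intent_probs = 1.0 / (1.0 + np.exp(-(cls_vec @ W_intent)))
intents = (intent_probs > 0.5).astype(int)

# Token-level slot filling: an argmax over slot logits per token.
W_slot = rng.normal(size=(HIDDEN, NUM_SLOTS))
slot_labels = (token_vecs @ W_slot).argmax(axis=-1)

# Slot-intent classifier: each token is mapped to exactly one intent,
# restricted to the intents detected above (many-to-one mapping).
W_map = rng.normal(size=(HIDDEN, NUM_INTENTS))
map_logits = token_vecs @ W_map
map_logits[:, intents == 0] = -1e9  # mask out undetected intents
slot_intent = map_logits.argmax(axis=-1)

print(intents.shape, slot_labels.shape, slot_intent.shape)
```

In a trained model all three heads would share the encoder and be optimized jointly; here the point is only the shape of the interaction, in which the slot-intent classifier ties each slot token back to one detected intent.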