This lecture explores neuro-symbolic representations for commonsense knowledge and reasoning, focusing on the challenge of equipping machines with large-scale commonsense knowledge. It examines what commonsense knowledge is, how such knowledge can be represented, and the reasoning processes it supports. The instructor discusses the limitations of deep learning in natural language processing and contrasts the shortcomings of purely neural approaches with symbolic ones. The lecture also covers the application of neural and symbolic approaches to representing and reasoning about commonsense knowledge, emphasizing the role of structured knowledge graphs. Finally, it addresses the limitations of prior approaches and introduces a framework for training knowledge models from language models pre-trained on raw text.