This lecture explores the rise of computational linguistics in deep learning, discussing grammar formalisms, connectionism, variable binding, and the influence of computational linguistics on deep learning architectures. It also covers early neural network models of language, attention-based models, transformers, and future directions in inducing entities and levels from data.
This video is available exclusively on MediaSpace for a restricted audience. If you have the necessary permissions, please log in to MediaSpace to access it.