This lecture delves into the theoretical properties of Recurrent Neural Networks (RNNs), focusing on their computational power under infinite and finite assumptions about numerical precision and running time. The instructor discusses common RNN architectures such as LSTMs and GRUs, as well as less conventional ones such as Quasi-Recurrent Neural Networks (QRNNs). The lecture explores the relationship between RNNs and state machines, asking whether RNNs are equivalent to deterministic finite automata, and covers the Turing completeness of RNNs by comparing them to Turing machines. The practical power of RNNs is then examined through their ability to handle counting-based tasks, revealing the strengths and limitations of the different gating mechanisms. Finally, the lecture presents QRNNs as an alternative to LSTMs, discussing their potential strengths and weaknesses.
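As a brief illustration of why gating matters for counting (a minimal sketch using the standard LSTM and GRU update equations, which the summary references but does not spell out): the LSTM cell state is updated additively,

c_t = f_t \odot c_{t-1} + i_t \odot \tilde{c}_t,

so with the forget gate f_t saturated near 1 the cell can accumulate an unbounded count. The GRU hidden state, by contrast, is a convex combination of bounded quantities,

h_t = (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t, \quad \tilde{h}_t \in (-1, 1),

so it cannot grow without bound, which limits it on counting-based tasks.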
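For reference, the QRNN recurrence (assuming the f-pooling formulation of Bradbury et al., since the summary does not define it): the candidate z_t and forget gate f_t are computed by convolutions over the input sequence, and the only sequential step is the elementwise pooling

h_t = f_t \odot h_{t-1} + (1 - f_t) \odot z_t,

which removes the hidden-to-hidden matrix multiplication of an LSTM and lets most of the computation run in parallel across time steps.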