The ability to control a behavioral task or stimulate neural activity based on animal behavior in real time is an important tool for experimental neuroscientists. Ideally, such tools are noninvasive, low-latency, and provide interfaces to trigger external hardware based on posture. Recent advances in pose estimation with deep learning allow researchers to train deep neural networks to accurately quantify a wide variety of animal behaviors. Here, we provide a new DeepLabCut-Live! package that achieves low-latency real-time pose estimation (within 15 ms, >100 FPS), with an additional forward-prediction module that achieves zero-latency feedback, and a dynamic-cropping mode that allows for higher inference speeds. We also provide three options for using this tool with ease: (1) a stand-alone GUI (called DLC-Live! GUI), and integration into (2) Bonsai and (3) AutoPilot. Lastly, we benchmarked performance on a wide range of systems so that experimentalists can easily decide what hardware is required for their needs.
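As a rough illustration of the workflow the abstract describes, the sketch below uses the dlclive Python package to run pose estimation on camera frames and attach a custom Processor for posture-based feedback. The model path, the camera source, and the trigger condition inside the Processor are placeholders, not part of the original text.

```python
# Minimal sketch: real-time pose estimation with dlclive plus a custom
# Processor that could trigger external hardware based on posture.
# The model path, frame source, and trigger condition are illustrative.
import cv2
from dlclive import DLCLive, Processor

class TriggerProcessor(Processor):
    """Inspect each pose and (hypothetically) fire a hardware trigger."""
    def process(self, pose, **kwargs):
        # pose is an array with one (x, y, likelihood) row per keypoint
        x, y, likelihood = pose[0]
        if likelihood > 0.9 and x > 300:   # placeholder posture condition
            pass                           # e.g., send a TTL pulse here
        return pose

cap = cv2.VideoCapture(0)                  # any camera or frame source
ret, frame = cap.read()

dlc_live = DLCLive("/path/to/exported_model", processor=TriggerProcessor())
dlc_live.init_inference(frame)             # first call initializes the network

while True:
    ret, frame = cap.read()
    if not ret:
        break
    pose = dlc_live.get_pose(frame)        # low-latency inference per frame
```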
Alexander Mathis, Alberto Silvio Chiappa, Alessandro Marin Vargas, Axel Bisi