The ability to control a behavioral task or stimulate neural activity based on animal behavior in real time is an important tool for experimental neuroscientists. Ideally, such tools are (1) noninvasive, (2) low-latency, and (3) provide interfaces to trigger external hardware based on posture (i.e., not just object-based tracking). Recent advances in pose estimation with deep learning allow researchers to train deep neural networks to accurately quantify a wide variety of animal behaviors. Extending our efforts on the animal pose estimation toolbox DeepLabCut, here we provide a new DeepLabCut-Live! package that achieves low-latency real-time pose estimation (within 15 ms, at >100 FPS), with an additional forward-prediction module that achieves zero-latency feedback. We also provide three options for using this tool with ease: a stand-alone GUI (called DLC-Live! GUI) and integrations into Bonsai and AutoPilot. Lastly, we benchmarked performance on a wide range of systems so that experimentalists can easily decide what hardware is required for their needs.
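To illustrate the idea behind forward prediction for latency compensation, the sketch below linearly extrapolates recent keypoint estimates to the time at which feedback will actually be delivered. This is a minimal illustration only, not the DeepLabCut-Live! implementation (which is described in the paper); the function name and array layout are assumptions for this example.

```python
import numpy as np

def forward_predict(poses, timestamps, t_future):
    """Linearly extrapolate keypoint positions to a future time.

    poses: array of shape (n_frames, n_keypoints, 2), recent pose estimates
    timestamps: array of shape (n_frames,), capture times in seconds
    t_future: time (seconds) at which to predict the pose, e.g. the moment
              the feedback stimulus will reach the animal

    Note: names and layout are illustrative, not the package's actual API.
    """
    poses = np.asarray(poses, dtype=float)
    timestamps = np.asarray(timestamps, dtype=float)
    # Per-keypoint velocity estimated from the last two frames
    dt = timestamps[-1] - timestamps[-2]
    velocity = (poses[-1] - poses[-2]) / dt
    # Project the latest pose forward by the remaining latency
    return poses[-1] + velocity * (t_future - timestamps[-1])
```

With a constant-velocity keypoint moving 1 pixel per 10 ms, predicting 10 ms past the last frame yields the next expected position; in practice such extrapolation trades a small accuracy cost for the elimination of processing latency.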
Alexander Mathis, Alberto Silvio Chiappa, Alessandro Marin Vargas, Axel Bisi