Mathieu Salzmann, Yinlin Hu, Shuxuan Guo
Knowledge distillation facilitates the training of a compact student network by using a deeper teacher network. While this has achieved great success in many tasks, it remains completely unstudied for image-based 6D object pose estimation. In this work, we introduce ...
Los Alamitos, 2023