Recent years have witnessed significant advancement in face recognition (FR) techniques, whose applications have spread widely into everyday life and security-sensitive areas. There is a growing need for reliable interpretations of the decisions of such systems. Existing studies have investigated saliency maps as an explanation approach through various mechanisms, but each suffers from its own limitations. This paper first explores the spatial relationship between a face image and its deep representation via gradient backpropagation. Building on this, a new explanation approach, FGGB, is conceived, which produces precise and insightful similarity and dissimilarity saliency maps to explain the “Accept” and “Reject” decisions of an FR system. Extensive visual presentation and quantitative measurement show that FGGB achieves comparable results on similarity maps and superior performance on dissimilarity maps compared to current state-of-the-art explainable face verification approaches.
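To make the core idea concrete, below is a minimal sketch of gradient-backpropagation saliency for a verification score. It is not the FGGB method itself: the deep FR network is replaced by a hypothetical toy linear embedding `W`, and the map is simply the absolute gradient of the cosine matching score with respect to the input pixels — the abstract's actual similarity/dissimilarity map construction is not specified here.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def saliency_map(x, W, g):
    """Gradient of the cosine matching score w.r.t. the input pixels.

    x : flattened probe image, shape (d,)
    W : toy linear embedding, shape (k, d) -- a stand-in for a deep FR network
    g : gallery face embedding, shape (k,)
    Returns |d cos(Wx, g) / dx|, one saliency value per input pixel.
    """
    u = W @ x
    nu, ng = np.linalg.norm(u), np.linalg.norm(g)
    # Analytic gradient: d cos(u, g)/du = g/(|u||g|) - (u.g) u / (|u|^3 |g|)
    dcos_du = g / (nu * ng) - (u @ g) * u / (nu ** 3 * ng)
    # Chain rule back through the (linear) embedding to pixel space
    return np.abs(W.T @ dcos_du)

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 16))          # toy "network": 16 pixels -> 8-dim embedding
x = rng.normal(size=16)               # flattened probe image
g = W @ rng.normal(size=16)           # embedding of a hypothetical gallery face
s = saliency_map(x, W, g)             # per-pixel saliency, shape (16,)
```

For a real FR network the same gradient would be obtained by automatic differentiation rather than the closed form above; large values of `s` mark pixels to which the matching score is most sensitive.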