Because vision serves as an 'early alerting' sense, one of the primary tasks for the human visual system is to recognize distant objects. In the specific context of facial identification, this ecologically important task has received surprisingly little attention: most studies have investigated face recognition at short, fixed distances. Under these conditions, the photometric and configural information related to the eyes, nose and mouth is typically found to be the primary determinant of facial identity. Here we characterize face recognition performance as a function of viewing distance and investigate whether the primacy of the internal features continues to hold across increasing viewing distances. We find that exploring the distance dimension reveals a qualitatively different salience distribution across a face. Observers' recognition performance with whole faces significantly exceeds that obtained with the internal facial features alone, and also exceeds the computed union of the performances obtained with internal and external features separately, suggesting that, in addition to the mutual configuration of the eyes, nose and mouth, the relationships between these features and the external head contours are crucial for recognition. We have also conducted computational studies with convolutional neural networks trained on the task of face recognition to examine whether this representational bias could emerge spontaneously through exposure to faces. The results provide partial support for this possibility while also highlighting important differences between the human and artificial systems. These findings have implications for the nature of facial representations useful for a visual system, whether human or machine, that must recognize faces over large and varying distances.
Touradj Ebrahimi, Yuhang Lu, Zewei Xu