Perceptual learning is usually thought to be driven exclusively by the stimuli presented during training (and the underlying synaptic learning rules). In this sense, we are slaves of our visual experiences. However, learning can occur even when no stimuli are presented at all. For example, Gabor contrast detection improves when only a blank screen is presented and observers are asked to imagine Gabor patches. Likewise, performance improves when observers are asked to imagine the non-existent central line of a bisection stimulus to be offset either to the right or to the left. Hence, performance can improve without stimulus presentation. As shown in the auditory domain, performance can also improve when the very same stimulus is presented in all learning trials and observers are asked to discriminate differences that do not exist (observers are not told about the setup). Classic models of perceptual learning cannot handle these situations because they require proper stimulus presentation, i.e., variance in the stimuli, such as a left vs. right offset in the bisection stimulus. Here, we first show that perceptual learning with identical stimuli occurs in the visual domain, too. Second, we linked the two paradigms by telling observers that only the very same bisection stimulus was presented in all trials and asking them to imagine the central line to be offset either to the left or to the right. As in imagery learning, performance improved.
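To make the argument about classic models concrete, here is a minimal sketch (not the authors' model) of why a supervised learning rule cannot improve discrimination without stimulus variance. It assumes a hypothetical two-channel encoding of the bisection stimulus and a delta-rule (Widrow-Hoff) readout; when every trial presents the identical stimulus, the inputs carry no information about the arbitrary left/right label, so accuracy stays at chance, whereas a physical offset is learned readily.

```python
import numpy as np

rng = np.random.default_rng(0)

def delta_rule_accuracy(stimulus_variance, n_trials=2000, lr=0.1):
    """Train a linear readout with the delta rule and return trial accuracy.

    stimulus_variance = 0 corresponds to presenting the identical stimulus
    on every trial: the input is uncorrelated with the label, so the
    readout cannot rise above chance. A nonzero offset is learnable.
    """
    w = np.zeros(2)
    base = np.array([1.0, 1.0])          # identical bisection stimulus
    correct = 0
    for _ in range(n_trials):
        label = rng.choice([-1.0, 1.0])  # instructed / imagined offset
        # a physical offset shifts activity between the two channels
        x = base + label * stimulus_variance * np.array([1.0, -1.0])
        y = np.sign(w @ x) or 1.0        # decision (break ties arbitrarily)
        correct += (y == label)
        w += lr * (label - w @ x) * x    # delta-rule (Widrow-Hoff) update
    return correct / n_trials

print("identical stimuli:", delta_rule_accuracy(0.0))  # near chance (~0.5)
print("offset stimuli:   ", delta_rule_accuracy(0.2))  # well above chance
```

Note that this sketch only illustrates the stated limitation of stimulus-driven learning rules; it does not capture the imagery- or identical-stimulus learning effects reported in the abstract.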