Eye tracking is the process of measuring either the point of gaze (where one is looking) or the motion of an eye relative to the head. An eye tracker is a device for measuring eye positions and eye movement. Eye trackers are used in research on the visual system, in psychology, in psycholinguistics, marketing, as an input device for human-computer interaction, and in product design.
In addition, eye trackers are increasingly being used for assistive and rehabilitative applications such as controlling wheelchairs, robotic arms, and prostheses. There are several methods for measuring eye movement, with the most popular variant using video images to extract eye position. Other methods use search coils or are based on the electrooculogram.
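Most video-based trackers locate the pupil (and often a corneal reflection) in each camera frame and then map that position to a gaze point through calibration. The following is a minimal sketch of dark-pupil detection, assuming a cropped grayscale eye image and OpenCV; the fixed intensity threshold and the largest-dark-blob heuristic are illustrative simplifications, not the method any particular commercial tracker uses.

# Minimal sketch of dark-pupil detection in a cropped eye image (OpenCV).
# Assumes the pupil is the largest dark blob; real trackers add corneal
# glint detection, ellipse fitting, and calibration to map to gaze points.
import cv2
import numpy as np

def estimate_pupil_center(eye_gray: np.ndarray):
    """Return (x, y) of the pupil centroid in image coordinates, or None."""
    blurred = cv2.GaussianBlur(eye_gray, (7, 7), 0)                    # suppress sensor noise
    _, dark = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY_INV)   # keep dark pixels (threshold is an assumed value)
    contours, _ = cv2.findContours(dark, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)                         # largest dark region
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])                  # centroid of that region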
In the 1800s, studies of eye movement were made using direct observations. For example, Louis Émile Javal observed in 1879 that reading does not involve a smooth sweeping of the eyes along the text, as previously assumed, but a series of short stops (called fixations) and quick saccades. This observation raised important questions about reading, questions which were explored during the 1900s: On which words do the eyes stop? For how long? When do they regress to already seen words?
Edmund Huey built an early eye tracker, using a sort of contact lens with a hole for the pupil. The lens was connected to an aluminum pointer that moved in response to the movement of the eye. Huey studied and quantified regressions (only a small proportion of saccades are regressions), and he showed that some words in a sentence are not fixated.
The first non-intrusive eye trackers were built by Guy Thomas Buswell in Chicago, using beams of light reflected from the eye and recorded on film. Buswell made systematic studies of reading and picture viewing.
In the 1950s, Alfred L. Yarbus performed eye tracking research, and his 1967 book is often quoted. He showed that the task given to a subject has a very large influence on the subject's eye movement. He also wrote about the relation between fixations and interest:
All the records ...
The discipline of Human-Computer Interaction (HCI) aims to systematically place the human factor in the design of interactive systems.
The goal of VR is to embed the users in a potentially complex virtual environment while ensuring that they are able to react as if this environment were real. The course provides a human perception-ac ...
The course covers the fundamentals of bioelectronics and integrated microelectronics for biomedical and implantable systems. Issues and trade-offs at the circuit and systems levels of invasive microelec ...
Human–computer interaction (HCI) is research in the design and the use of computer technology, which focuses on the interfaces between people (users) and computers. HCI researchers observe the ways humans interact with computers and design technologies that allow humans to interact with computers in novel ways. A device that allows interaction between a human being and a computer is known as a "human–computer interface".
Eye movement in reading involves the visual processing of written text. This was described by the French ophthalmologist Louis Émile Javal in the late 19th century. He reported that eyes do not move continuously along a line of text, but make short, rapid movements (saccades) intermingled with short stops (fixations). Javal's observations relied on naked-eye observation of eye movement, made without the aid of recording technology.
A saccade (/səˈkɑːd/, French for "jerk") is a quick, simultaneous movement of both eyes between two or more phases of fixation in the same direction. In contrast, in smooth pursuit movements, the eyes move smoothly instead of in jumps. The phenomenon can be associated with a shift in frequency of an emitted signal or a movement of a body part or device. Controlled cortically by the frontal eye fields (FEF), or subcortically by the superior colliculus, saccades serve as a mechanism for fixation, rapid eye movement, and the fast phase of optokinetic nystagmus.
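In recorded gaze data, this difference in speed is what lets software separate saccades from fixations; a common approach is a simple velocity threshold (often called I-VT). Below is a minimal sketch under assumed conditions: gaze positions in degrees of visual angle at a fixed sampling rate, with a 30 deg/s threshold chosen only for illustration. Production classifiers add noise filtering and minimum-duration rules.

# Minimal sketch of velocity-threshold (I-VT) classification of gaze samples.
# Assumes gaze positions in degrees of visual angle, sampled at a fixed rate;
# the 30 deg/s threshold is an illustrative value, not a universal constant.
import numpy as np

def classify_saccades(x_deg, y_deg, sample_rate_hz, threshold_deg_s=30.0):
    """Return a boolean array: True where a sample belongs to a saccade."""
    vx = np.gradient(x_deg) * sample_rate_hz   # horizontal velocity (deg/s)
    vy = np.gradient(y_deg) * sample_rate_hz   # vertical velocity (deg/s)
    speed = np.hypot(vx, vy)                   # angular speed of the gaze point
    return speed > threshold_deg_s

# Example: a 500 Hz recording with a fixation, a fast jump, then another fixation.
t = np.arange(0, 0.2, 1 / 500.0)
x = np.where(t < 0.1, 0.0, 5.0)                # 5-degree jump at t = 0.1 s
y = np.zeros_like(t)
is_saccade = classify_saccades(x, y, 500.0)
print(f"{is_saccade.sum()} of {len(t)} samples classified as saccadic")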
Delves into the resolving power of the human eye and how it distinguishes between sources, with a practical application yielding a minimum resolution distance of 820 meters.
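A figure like this presumably rests on a resolution-limit calculation. A back-of-the-envelope sketch using the Rayleigh criterion is shown below; the wavelength, pupil diameter, and source separation are assumed values, so the printed distance will only match the 820-meter result for the particular inputs used in the lecture.

# Sketch of a diffraction-limit estimate behind such a calculation.
# Rayleigh criterion: theta_min ~= 1.22 * lambda / D (radians).
# Wavelength, pupil diameter, and source separation below are illustrative
# assumptions; the 820 m figure depends on its own inputs.
WAVELENGTH_M = 550e-9      # green light, roughly where the eye is most sensitive
PUPIL_DIAMETER_M = 3e-3    # assumed daytime pupil diameter
SEPARATION_M = 0.20        # assumed spacing between the two point sources

theta_min = 1.22 * WAVELENGTH_M / PUPIL_DIAMETER_M   # smallest resolvable angle (rad)
max_distance = SEPARATION_M / theta_min              # small-angle approximation
print(f"theta_min = {theta_min:.2e} rad, resolvable out to ~{max_distance:.0f} m")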
This bachelor project was conducted at the Experimental Museology Laboratory (eM+) at EPFL, which focuses on immersive technologies and visualization systems. The project aimed to enhance the Panorama+, a 360-degree stereoscopic interactive visualization system, b ...
Head tracking combined with head movements has been shown to improve auditory externalization of a virtual sound source and to contribute to performance in localization. With certain technically constrained head-tracking algorithms, as can be found in we ...
Viewers of 360-degree videos are provided with both a visual modality that characterizes their surrounding views and an audio modality that indicates the sound direction. Though both modalities are important for saliency prediction, little work has been done by joint ...