User interface
In the industrial design field of human–computer interaction, a user interface (UI) is the space where interactions between humans and machines occur. The goal of this interaction is to allow effective operation and control of the machine from the human end, while the machine simultaneously feeds back information that aids the operators' decision-making process. Examples of this broad concept of user interfaces include the interactive aspects of computer operating systems, hand tools, heavy machinery operator controls and process controls.
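The control-and-feedback loop described above can be sketched in a few lines of code. The toy Python example below is purely illustrative (the Machine class, its commands, and the "rpm" readout are invented for this sketch): the operator issues commands, the machine changes state, and feedback flows back to inform the operator's next decision.

```python
# Illustrative sketch of the human-machine interaction loop: commands go in,
# the machine acts, and feedback comes back. All names here are hypothetical.

class Machine:
    """A toy machine with a single controllable parameter."""

    def __init__(self):
        self.speed = 0

    def handle_command(self, command: str) -> str:
        # Control path: interpret the operator's input and change state.
        if command == "faster":
            self.speed += 10
        elif command == "slower":
            self.speed = max(0, self.speed - 10)
        elif command == "stop":
            self.speed = 0
        else:
            return f"unknown command: {command!r}"
        # Feedback path: report the new state so the operator can decide
        # what to do next.
        return f"speed is now {self.speed} rpm"


if __name__ == "__main__":
    machine = Machine()
    for cmd in ["faster", "faster", "slower", "stop"]:  # simulated operator input
        print(f"> {cmd}")
        print(machine.handle_command(cmd))
```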
Haptic perception
Haptic perception (from Greek haptós "palpable", haptikós "suitable for touch") literally means the ability "to grasp something". Perception in this case is achieved through the active exploration of surfaces and objects by a moving subject, as opposed to passive contact by a static subject during tactile perception. The term haptik was coined by the German psychologist Max Dessoir in 1892, when he suggested a name for academic research into the sense of touch in the style of "acoustics" and "optics".
Somatosensory system
In physiology, the somatosensory system is the network of neural structures in the brain and body that produce the perception of touch (haptic perception), as well as temperature (thermoception), body position (proprioception), and pain. It is a subset of the sensory nervous system, which also represents visual, auditory, olfactory, and gustatory stimuli. Somatosensation begins when mechano- and thermosensitive structures in the skin or internal organs sense physical stimuli such as pressure on the skin (see mechanotransduction, nociception).
Facial recognition system
A facial recognition system is a technology potentially capable of matching a human face from a digital image or a video frame against a database of faces. Such a system is typically employed to authenticate users through ID verification services, and works by pinpointing and measuring facial features from a given image. Development of similar systems began in the 1960s as a form of computer application. Since their inception, facial recognition systems have seen wider use on smartphones and in other forms of technology, such as robotics.
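As a rough illustration of the "measure facial features, then match against a database" pipeline, the sketch below uses the open-source face_recognition library. The image file names, identity list, and the 0.6 distance tolerance are placeholder assumptions for this example, not part of any particular system.

```python
# Minimal sketch: encode facial features as vectors, then match a probe
# image against a small "database" of enrolled faces. Data is hypothetical.
import face_recognition

# "Database": 128-d feature encodings computed from enrolled images.
known_names = ["alice", "bob"]
known_encodings = [
    face_recognition.face_encodings(
        face_recognition.load_image_file(f"{name}.jpg")
    )[0]  # assumes exactly one face per enrollment image
    for name in known_names
]

# Probe image, e.g. a still frame taken from a video stream.
probe = face_recognition.load_image_file("frame.jpg")
for encoding in face_recognition.face_encodings(probe):
    # Compare the measured features of the probe face against every
    # enrolled face; a smaller distance means a closer match.
    distances = face_recognition.face_distance(known_encodings, encoding)
    best = distances.argmin()
    if distances[best] < 0.6:  # commonly used default tolerance
        print("matched:", known_names[best])
    else:
        print("no match in database")
```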
Gesture recognition
Gesture recognition is a topic in computer science and language technology with the goal of interpreting human gestures via mathematical algorithms. It is a subdiscipline of computer vision. Gestures can originate from any bodily motion or state, but commonly originate from the face or hand. Focuses in the field include emotion recognition from the face and hand gesture recognition, since both are forms of expression. Users can make simple gestures to control or interact with devices without physically touching them.
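A minimal, rule-based sketch of such touchless interaction is shown below: it classifies a hand trajectory (a sequence of (x, y) positions, e.g. from a hand-tracking camera) as a directional swipe. The trajectory data and the threshold are hypothetical, and production systems typically rely on trained computer-vision models rather than hand-written rules like these.

```python
# Illustrative rule-based gesture classifier: decide whether a tracked hand
# trajectory is a left/right/up/down swipe. All values are made up.
from typing import List, Tuple


def classify_swipe(points: List[Tuple[float, float]], min_travel: float = 0.2) -> str:
    """Return the dominant direction of motion, or 'none' if it is too small."""
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if max(abs(dx), abs(dy)) < min_travel:
        return "none"
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"


# Simulated hand positions in normalized image coordinates (0..1).
trajectory = [(0.2, 0.5), (0.4, 0.52), (0.6, 0.49), (0.8, 0.5)]
print(classify_swipe(trajectory))  # -> "swipe_right", could e.g. advance a slide
```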
Brain–computer interface
A brain–computer interface (BCI), sometimes called a brain–machine interface (BMI) or smartbrain, is a direct communication pathway between the brain's electrical activity and an external device, most commonly a computer or robotic limb. BCIs are often directed at researching, mapping, assisting, augmenting, or repairing human cognitive or sensory-motor functions. They are often conceptualized as a human–machine interface that skips the intermediary component of the physical movement of body parts, although they also raise the possibility of erasing the distinction between brain and machine.
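The pathway can be caricatured in a few lines of code: measure a feature of the electrical activity and map it directly onto a device command, with no physical movement involved. The sketch below uses a synthetic, EEG-like signal and an invented alpha-band rule; it illustrates the concept only and does not describe any real BCI.

```python
# Toy illustration of the BCI idea: translate (synthetic) brain electrical
# activity into an external-device command. Sampling rate, frequencies,
# thresholds, and the "command" are all invented for this sketch.
import numpy as np
from scipy.signal import welch

fs = 250                                   # assumed sampling rate in Hz
t = np.arange(0, 2.0, 1 / fs)
# Synthetic signal: background noise plus a strong 10 Hz (alpha-band) rhythm.
signal = 0.5 * np.random.randn(t.size) + 2.0 * np.sin(2 * np.pi * 10 * t)

# Estimate the power spectrum and the share of power in the 8-12 Hz band.
freqs, psd = welch(signal, fs=fs, nperseg=fs)
alpha_share = psd[(freqs >= 8) & (freqs <= 12)].sum() / psd.sum()

# Map the measured activity onto a command for an external device.
command = "move_cursor_left" if alpha_share > 0.5 else "idle"
print(f"alpha-band share of power: {alpha_share:.2f} -> {command}")
```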