Computer mouse
A computer mouse (plural mice, also mouses) is a hand-held pointing device that detects two-dimensional motion relative to a surface. This motion is typically translated into the motion of a pointer (called a cursor) on a display, which allows smooth control of a computer's graphical user interface. The first public demonstration of a mouse controlling a computer system was in 1968. Mice originally used two separate wheels to directly track movement across a surface: one for the X dimension and one for the Y dimension.
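Conceptually, a mouse reports relative motion (dx, dy) rather than an absolute position, and the host system accumulates those deltas into cursor coordinates. The short Python sketch below illustrates that idea under simplified assumptions (a single rectangular screen and a plain sensitivity multiplier); the class and parameter names are illustrative and not part of any real driver API.

# Minimal sketch (not any OS's actual driver code): translating the relative
# (dx, dy) motion a mouse reports into an absolute cursor position on screen.

class Cursor:
    def __init__(self, screen_width: int, screen_height: int) -> None:
        self.width = screen_width
        self.height = screen_height
        self.x = screen_width // 2   # start in the centre of the display
        self.y = screen_height // 2

    def move(self, dx: int, dy: int, sensitivity: float = 1.0) -> tuple[int, int]:
        """Apply one relative motion report, clamping to the screen bounds."""
        self.x = min(max(self.x + round(dx * sensitivity), 0), self.width - 1)
        self.y = min(max(self.y + round(dy * sensitivity), 0), self.height - 1)
        return self.x, self.y

cursor = Cursor(1920, 1080)
print(cursor.move(15, -4))   # (975, 536): the pointer moved right and up

Real pointer pipelines add acceleration curves and multi-display handling, but the core accumulate-and-clamp step is the same.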
Multimodal interaction
Multimodal interaction provides the user with multiple modes of interacting with a system. A multimodal interface provides several distinct tools for the input and output of data. Multimodal human-computer interaction refers to "interaction with the virtual and physical environment through natural modes of communication". This implies that multimodal interaction enables freer and more natural communication, interfacing users with automated systems in both input and output.
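In practice, a multimodal system accepts input from several distinct channels and maps each to application commands. The Python sketch below is a deliberately simplified illustration of that structure; the modality names and payload fields are hypothetical and not drawn from any particular toolkit.

# Illustrative sketch: several distinct input modalities (here, hypothetical
# "speech" and "touch" events) are normalized into one stream of commands.

from dataclasses import dataclass
from typing import Callable

@dataclass
class InputEvent:
    modality: str   # e.g. "speech", "touch", "gesture"
    payload: dict

class MultimodalDispatcher:
    def __init__(self) -> None:
        self._handlers: dict[str, Callable[[dict], str]] = {}

    def register(self, modality: str, handler: Callable[[dict], str]) -> None:
        """Attach an interpreter for one input modality."""
        self._handlers[modality] = handler

    def dispatch(self, event: InputEvent) -> str:
        """Route an event to the handler for its modality, if one exists."""
        handler = self._handlers.get(event.modality)
        if handler is None:
            return f"unhandled modality: {event.modality}"
        return handler(event.payload)

dispatcher = MultimodalDispatcher()
dispatcher.register("speech", lambda p: f"voice command: {p['utterance']}")
dispatcher.register("touch", lambda p: f"tap at ({p['x']}, {p['y']})")

print(dispatcher.dispatch(InputEvent("speech", {"utterance": "open mail"})))
print(dispatcher.dispatch(InputEvent("touch", {"x": 120, "y": 45})))

A full multimodal system would also fuse modalities (for example, combining a spoken command with a pointing gesture), which this sketch omits.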
Sign language
Sign languages (also known as signed languages) are languages that use the visual-manual modality to convey meaning, instead of spoken words. Sign languages are expressed through manual articulation in combination with non-manual markers. Sign languages are full-fledged natural languages with their own grammar and lexicon. Sign languages are not universal and are usually not mutually intelligible, although there are also similarities among different sign languages.
Nonverbal communication
Nonverbal communication (NVC) is the transmission of messages or signals through a nonverbal platform such as eye contact, facial expressions, gestures, posture, use of objects and body language. It includes the use of social cues, kinesics, distance (proxemics), physical environment and appearance, voice (paralanguage) and touch (haptics). A signal has three parts: the basic signal itself, what the signal is intended to convey, and how it is interpreted.
Force Touch
Force Touch is a haptic technology developed by Apple Inc. that enables trackpads and touchscreens to distinguish between various levels of force being applied to their surfaces. It uses pressure sensors to add another method of input to Apple's devices. The technology was first unveiled on September 9, 2014, during the introduction of Apple Watch. Starting with the Apple Watch, Force Touch has been incorporated into many products within Apple's lineup. This notably includes MacBooks and the Magic Trackpad 2.
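Distinguishing levels of force amounts to mapping a continuous pressure reading onto discrete input actions. The short Python sketch below illustrates that idea with a made-up normalized pressure scale and thresholds; it is not Apple's API and the values are not Apple's actual calibration.

# Simplified sketch of pressure-sensitive input in the spirit of Force Touch:
# a normalized pressure reading (0.0 .. 1.0, an assumed scale for this example)
# is mapped to a distinct input action. Thresholds are illustrative only.

def classify_press(pressure: float,
                   light_threshold: float = 0.25,
                   force_threshold: float = 0.75) -> str:
    """Map one normalized pressure sample to an input action."""
    if pressure < light_threshold:
        return "hover / no press"
    if pressure < force_threshold:
        return "regular click"
    return "force click"      # a deeper press triggers the secondary action

for sample in (0.1, 0.4, 0.9):
    print(sample, "->", classify_press(sample))
    # 0.1 -> hover / no press, 0.4 -> regular click, 0.9 -> force click

Actual implementations also debounce readings over time and pair the force click with haptic feedback, which this sketch leaves out.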
Tangible user interface
A tangible user interface (TUI) is a user interface in which a person interacts with digital information through the physical environment. The initial name was Graspable User Interface, which is no longer used. The purpose of TUI development is to empower collaboration, learning, and design by giving physical forms to digital information, thus taking advantage of the human ability to grasp and manipulate physical objects and materials.