Robot locomotion
Robot locomotion is the collective name for the various methods that robots use to transport themselves from place to place. Wheeled robots are typically quite energy efficient and simple to control. However, other forms of locomotion may be more appropriate for a number of reasons, for example traversing rough terrain or moving and interacting in human environments. Furthermore, studying bipedal and insect-like robots may yield insights that benefit biomechanics.
Graphical user interface
The graphical user interface, or GUI (/ˌdʒiːjuːˈaɪ/ or /ˈɡuːi/), is a form of user interface that allows users to interact with electronic devices through graphical icons and audio indicators such as primary notation, instead of text-based UIs, typed command labels or text navigation. GUIs were introduced in reaction to the perceived steep learning curve of command-line interfaces (CLIs), which require commands to be typed on a computer keyboard. The actions in a GUI are usually performed through direct manipulation of the graphical elements.
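A minimal sketch can make the contrast with a CLI concrete. The example below uses Python's standard tkinter toolkit; the draggable rectangle and its callback are illustrative assumptions, not drawn from any particular GUI described above. Instead of typing a command to move an object, the user manipulates it directly with the pointer.

# A minimal sketch of GUI-style direct manipulation using Python's tkinter
# (standard library). Dragging the rectangle moves it, instead of typing a
# "move" command as a CLI would require.
import tkinter as tk

def on_drag(event):
    # Reposition the rectangle under the pointer: direct manipulation of
    # a graphical element rather than a typed command.
    canvas.coords(rect, event.x - 25, event.y - 25, event.x + 25, event.y + 25)

root = tk.Tk()
root.title("Direct manipulation demo")
canvas = tk.Canvas(root, width=300, height=200)
canvas.pack()
rect = canvas.create_rectangle(50, 50, 100, 100, fill="steelblue")
canvas.tag_bind(rect, "<B1-Motion>", on_drag)
root.mainloop()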
Tangible user interface
A tangible user interface (TUI) is a user interface in which a person interacts with digital information through the physical environment. The initial name was Graspable User Interface, which is no longer used. The purpose of TUI development is to empower collaboration, learning, and design by giving physical forms to digital information, thus taking advantage of the human ability to grasp and manipulate physical objects and materials.
Zooming user interface
In computing, a zooming user interface or zoomable user interface (ZUI, pronounced zoo-ee) is a graphical environment where users can change the scale of the viewed area in order to see more detail or less, and browse through different documents. A ZUI is a type of graphical user interface (GUI). Information elements appear directly on an infinite virtual desktop (usually created using vector graphics), instead of in windows. Users can pan across the virtual surface in two dimensions and zoom into objects of interest.
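The pan-and-zoom behaviour reduces to a small amount of coordinate bookkeeping. The Python sketch below is a hypothetical illustration of that idea, not code from any ZUI toolkit: a camera over the infinite virtual desktop is described by a pan offset and a zoom scale, and zooming is anchored so the point under the cursor stays over the same spot of the virtual surface.

# A minimal sketch of ZUI viewport bookkeeping. The Viewport class and its
# method names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Viewport:
    offset_x: float = 0.0   # pan position on the virtual desktop
    offset_y: float = 0.0
    scale: float = 1.0      # zoom level (>1 shows more detail, <1 shows more area)

    def to_screen(self, x: float, y: float) -> tuple[float, float]:
        # Virtual-desktop coordinates -> screen coordinates.
        return (x - self.offset_x) * self.scale, (y - self.offset_y) * self.scale

    def pan(self, dx: float, dy: float) -> None:
        # Move the view across the virtual surface in two dimensions.
        self.offset_x += dx / self.scale
        self.offset_y += dy / self.scale

    def zoom(self, factor: float, anchor_x: float, anchor_y: float) -> None:
        # Change scale while keeping the screen point (anchor_x, anchor_y)
        # fixed over the same spot of the virtual desktop.
        wx = anchor_x / self.scale + self.offset_x
        wy = anchor_y / self.scale + self.offset_y
        self.scale *= factor
        self.offset_x = wx - anchor_x / self.scale
        self.offset_y = wy - anchor_y / self.scale

For example, view = Viewport(); view.zoom(2.0, 150, 100) doubles the magnification while the object under screen point (150, 100) stays in place, which is the "zoom into objects of interest" behaviour described above.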
Organic user interface
In human–computer interaction, an organic user interface (OUI) is defined as a user interface with a non-flat display. After Engelbart and Sutherland's graphical user interface (GUI), which was based on the cathode ray tube (CRT), and Kay and Weiser's ubiquitous computing, which was based on the flat-panel liquid-crystal display (LCD), OUI represents one possible third wave of display interaction paradigms, pertaining to multi-shaped and flexible displays.
Rotating locomotion in living systems
Several organisms are capable of rolling locomotion. However, true wheels and propellers—despite their utility in human vehicles—do not seem to play a significant role in the movement of living things (with the exception of certain flagella, which work like corkscrews). Biologists have offered several explanations for the apparent absence of biological wheels, and wheeled creatures have appeared often in speculative fiction.
Central nervous system
The central nervous system (CNS) is the part of the nervous system consisting primarily of the brain and spinal cord. The CNS is so named because the brain integrates the received information and coordinates and influences the activity of all parts of the bodies of bilaterally symmetric and triploblastic animals—that is, all multicellular animals except sponges and diploblasts. It is a structure composed of nervous tissue positioned along the rostral (nose end) to caudal (tail end) axis of the body and may have an enlarged section at the rostral end which is a brain.
Cognitive robotics
Cognitive Robotics or Cognitive Technology is a subfield of robotics concerned with endowing a robot with intelligent behavior by providing it with a processing architecture that will allow it to learn and reason about how to behave in response to complex goals in a complex world. Cognitive robotics may be considered the engineering branch of embodied cognitive science and embodied embedded cognition, consisting of Robotic Process Automation, Artificial Intelligence, Machine Learning, Deep Learning, Optical Character Recognition, Process Mining, Analytics, Software Development and System Integration.
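As a rough, hypothetical illustration of what such a processing architecture amounts to, the Python sketch below runs a sense-reason-act loop toward a simple goal. Every name in it (Robot, sense, plan, the one-dimensional corridor) is an invented example, not part of any cognitive-robotics framework.

# A toy sense-reason-act loop: the robot observes the world with noise,
# updates an internal belief, and chooses the action that moves it toward
# its goal.
import random

class Robot:
    def __init__(self, goal: int):
        self.goal = goal
        self.belief = 0   # believed position along a 1-D corridor

    def sense(self, true_position: int) -> None:
        # Learn from a noisy observation of the world.
        observation = true_position + random.choice([-1, 0, 1])
        self.belief = round((self.belief + observation) / 2)

    def plan(self) -> int:
        # Reason about the goal: step toward it given the current belief.
        return (self.belief < self.goal) - (self.belief > self.goal)

robot, position = Robot(goal=10), 0
for _ in range(30):
    robot.sense(position)
    position += robot.plan()   # act on the world
print("final position:", position)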
Mobile interaction
Mobile interaction is the study of interaction between mobile users and computers. Mobile interaction is an aspect of human–computer interaction that emerged when computers became small enough to enable mobile usage, around the 1990s. Mobile devices are a pervasive part of people's everyday lives. People use mobile phones, PDAs, and portable media players almost everywhere. These devices are the first truly pervasive interaction devices that are currently used for a huge variety of services and applications.
Boston Dynamics
Boston Dynamics is an American engineering and robotics design company founded in 1992 as a spin-off from the Massachusetts Institute of Technology. Headquartered in Waltham, Massachusetts, Boston Dynamics has been owned by the Hyundai Motor Group since December 2020, although the acquisition was only completed in June 2021. Boston Dynamics develops a series of dynamic, highly mobile robots, including BigDog, Spot, Atlas, and Handle.