Concept

Tango (platform)

Tango (formerly Project Tango, during testing) was an augmented-reality computing platform developed by the Advanced Technology and Projects (ATAP) group, a skunkworks division of Google. It used computer vision to enable mobile devices, such as smartphones and tablets, to detect their position relative to the world around them without using GPS or other external signals. This allowed application developers to create user experiences that include indoor navigation, 3D mapping, physical-space measurement, environmental recognition, augmented reality, and windows into a virtual world.

The first product to emerge from ATAP, Tango was developed by a team led by computer scientist Johnny Lee, a core contributor to Microsoft's Kinect. In a June 2015 interview, Lee said, "We're developing the hardware and software technologies to help everything and everyone understand precisely where they are, anywhere." Google produced two devices to demonstrate the Tango technology: the Peanut phone and the Yellowstone 7-inch tablet. More than 3,000 of these devices had been sold as of June 2015, chiefly to researchers and software developers interested in building applications for the platform. In the summer of 2015, Qualcomm and Intel both announced that they were developing Tango reference devices as models for device manufacturers using their mobile chipsets.

At CES in January 2016, Google announced a partnership with Lenovo to release a consumer smartphone featuring Tango technology in the summer of 2016, with a price point under $500 and a form factor below 6.5 inches. The two companies also announced an application incubator so that applications would be available on the device at launch. On December 15, 2017, Google announced that it would end support for Tango on March 1, 2018, in favor of ARCore.

Tango differed from other contemporary 3D-sensing computer-vision products in that it was designed to run on a standalone mobile phone or tablet and was chiefly concerned with determining the device's position and orientation within the environment.
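In practice, a platform like Tango exposes this motion-tracking capability to applications as a stream of six-degree-of-freedom pose updates (a 3D translation plus an orientation). The sketch below is a minimal, hypothetical illustration of how an application might consume such pose updates to keep virtual content anchored to the physical world; the class and method names (Pose, PoseListener, MotionTrackingService) are placeholders invented for this example and are not the actual Tango SDK API.

```java
// Minimal sketch of consuming 6-DoF pose updates, in the style of a
// motion-tracking platform such as Tango. All names here (Pose,
// PoseListener, MotionTrackingService) are hypothetical placeholders,
// not the real Tango SDK classes.
public final class PoseTrackingSketch {

    /** A six-degree-of-freedom pose: position in metres plus an orientation quaternion. */
    static final class Pose {
        final double x, y, z;          // translation relative to where tracking started
        final double qx, qy, qz, qw;   // orientation as a unit quaternion

        Pose(double x, double y, double z,
             double qx, double qy, double qz, double qw) {
            this.x = x; this.y = y; this.z = z;
            this.qx = qx; this.qy = qy; this.qz = qz; this.qw = qw;
        }
    }

    /** Callback an application registers to receive pose updates. */
    interface PoseListener {
        void onPoseAvailable(Pose pose);
    }

    /** Stand-in for the device's visual-inertial tracking service. */
    static final class MotionTrackingService {
        private PoseListener listener;

        void connect(PoseListener listener) { this.listener = listener; }

        // In a real system this would be driven by camera/IMU fusion; here we
        // simply emit a fixed pose to show the data flow.
        void simulateUpdate() {
            if (listener != null) {
                listener.onPoseAvailable(new Pose(0.10, 0.00, -0.25, 0, 0, 0, 1));
            }
        }
    }

    public static void main(String[] args) {
        MotionTrackingService tracking = new MotionTrackingService();
        tracking.connect(pose ->
            // An AR app would use each pose to move its virtual camera so that
            // rendered content stays registered with the physical environment.
            System.out.printf("device at (%.2f, %.2f, %.2f) m%n", pose.x, pose.y, pose.z));
        tracking.simulateUpdate();
    }
}
```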

Related publications (10)

Investigating neural resource allocation in the sensorimotor control of extra limbs

Giulia Dominijanni

The rise of robotic body augmentation brings forth new developments that will transform robotics, human-machine interaction, and wearable electronics. Extra robotic limbs, although building upon restorative technologies, bring their own set of challenges i ...
EPFL, 2024

Z-REX uncovers a bifurcation in function of Keap1 paralogs

Yimon Aye, Jesse Poganik, Xuyu Liu, Yi Zhao, Kuan-Ting Huang

Studying electrophile signaling is marred by difficulties in parsing changes in pathway flux attributable to on-target, vis-à-vis off-target, modifications. By combining bolus dosing, knockdown, and Z-REX, a tool investigating on-target/on ...

eLife, 2022

Food Talks: visual and interaction principles for representing environmental and nutritional food information in augmented reality

Delphine Ribes Lemay, Nicolas Henchoz, Emily Clare Groves, Andreas Sonderegger

This user-centered design research project aimed to investigate visual and interaction principles for augmented reality (AR) in the context of environmental and nutritional food labelling. Nutritional information on existing food labels is often misunderst ...
2019
Related concepts (1)
Augmented reality
Augmented reality (AR) is an interactive experience that combines the real world and computer-generated content. The content can span multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory. AR can be defined as a system that incorporates three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects. The overlaid sensory information can be constructive (i.e. additive to the natural environment) or destructive (i.e. masking of the natural environment).
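The "accurate 3D registration" requirement reduces to a geometric step: transform a virtual object's world-space position into the camera frame using the tracked device pose, then project it onto the image with the camera intrinsics. The sketch below illustrates this with a standard pinhole-camera projection; the pose, focal length, and principal-point values are arbitrary numbers chosen for the example.

```java
// Sketch of the registration step in AR: place a virtual point defined in
// world coordinates onto the camera image, given the tracked camera pose.
// Pose and intrinsic values below are arbitrary example numbers.
public final class RegistrationSketch {

    public static void main(String[] args) {
        // Virtual anchor at a fixed position in the world frame (metres).
        double[] pWorld = {0.5, 0.0, 2.0};

        // Tracked camera pose: rotation (world -> camera) and camera position in world.
        // Identity rotation and a zero translation keep the example readable.
        double[][] R = {
            {1, 0, 0},
            {0, 1, 0},
            {0, 0, 1}
        };
        double[] camPos = {0.0, 0.0, 0.0};

        // p_cam = R * (p_world - camPos)
        double[] pCam = new double[3];
        for (int i = 0; i < 3; i++) {
            for (int j = 0; j < 3; j++) {
                pCam[i] += R[i][j] * (pWorld[j] - camPos[j]);
            }
        }

        // Pinhole projection with focal length f (pixels) and principal point (cx, cy).
        double f = 1000.0, cx = 640.0, cy = 360.0;
        double u = f * pCam[0] / pCam[2] + cx;
        double v = f * pCam[1] / pCam[2] + cy;

        // Redrawing the virtual object at (u, v) every frame, with an up-to-date pose,
        // is what keeps it visually "attached" to the real world.
        System.out.printf("virtual point projects to pixel (%.1f, %.1f)%n", u, v);
    }
}
```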
