February 21, 2014

Google wants to hear from developers who can create mobile apps that rely on precise awareness of users' surroundings.

On Thursday, the company's Advanced Technology and Projects (ATAP) group, not to be confused with its X Lab, invited developers to submit proposals for applications that take advantage of Project Tango, an experimental Android-based phone. The phone is built with custom hardware and software for tracking the device's position as it moves in real time and for generating a 3D model of its surroundings.


The Project Tango hardware can take a quarter million 3D measurements every second. Developers access its position, orientation, and depth data through a software development kit (SDK) with APIs for Android apps written in Java or C/C++, or through the Unity game engine, which can build for Android devices as well as other platforms.
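To give a feel for what an app consuming that data might look like, here is a minimal sketch in Java. All of the class and method names below (`Pose`, `OnPoseAvailableListener`, `distanceMoved`) are hypothetical illustrations, not the actual Project Tango SDK API, whose details Google had not published at the time; the sketch assumes a pose is delivered as a translation vector plus an orientation quaternion.

```java
// Illustrative sketch only: these names are hypothetical, not the real Tango SDK.
public class PoseSketch {

    /** Hypothetical pose sample: where the device is and how it is oriented. */
    static class Pose {
        final double[] translation; // metres, relative to a start-of-service origin
        final double[] rotation;    // unit quaternion (x, y, z, w)
        Pose(double[] translation, double[] rotation) {
            this.translation = translation;
            this.rotation = rotation;
        }
    }

    /** Callback an app might register to receive a stream of pose updates. */
    interface OnPoseAvailableListener {
        void onPoseAvailable(Pose pose);
    }

    /** Straight-line distance the device moved between two pose samples. */
    static double distanceMoved(Pose a, Pose b) {
        double dx = b.translation[0] - a.translation[0];
        double dy = b.translation[1] - a.translation[1];
        double dz = b.translation[2] - a.translation[2];
        return Math.sqrt(dx * dx + dy * dy + dz * dz);
    }

    public static void main(String[] args) {
        Pose start = new Pose(new double[] {0, 0, 0}, new double[] {0, 0, 0, 1});
        Pose end   = new Pose(new double[] {3, 4, 0}, new double[] {0, 0, 0, 1});
        System.out.println(distanceMoved(start, end)); // prints 5.0
    }
}
```

Even this toy version shows why precise tracking matters: an indoor-navigation or mapping app is essentially accumulating small pose deltas like this, so sensor drift compounds quickly without the dedicated hardware Google describes.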

Current smartphones can manage limited tracking of position and orientation, but lack the full range of sensors and precision to run the kinds of applications Google envisions. More significantly, they aren't designed to place the device within a 3D representation of the local environment.

Project Tango phones include a vision processing system, a depth sensor, and a motion tracking camera, along with the gyroscopes and orientation sensors found in other smartphones. Think of it as something like a mobile version of Microsoft's Kinect system.

[ Read the rest of this article on InformationWeek. ]