Starting next month, Google will begin real-world testing of its AR-powered glasses, which can perform real-time language translation. The glasses are one of several augmented reality prototypes that the Mountain View, California-based company will be testing outside its labs.
Google recently published a blog post revealing that the company is working on AR navigation experiences, and that real-world testing will help it account for factors such as weather and busy intersections, which are difficult, sometimes impossible, to recreate indoors. One of the early AR prototypes is a pair of simple glasses, which the company has been testing in its laboratories. The glasses offer real-time translation and transcription, displaying the results directly on the lens.
The AR prototype glasses are designed to test activities such as translation, transcription, and navigation. They are made of plastic, have an in-lens display, and carry audio and visual sensors, namely a microphone and a camera. The camera can capture photos and video, although image data will be used only for navigation, translation, and visual search.
If you are in a country where you do not understand the local language, you could use the camera to translate the written words on the signs in front of you, or get directions to a nearby restaurant displayed right in your line of vision on the glasses.
Image data is deleted once the experience is completed, except in cases where it is needed for analysis and debugging. In those cases, the data is stored on a secure server with access limited to a small number of Googlers, and is deleted after 30 days, according to Google.
Juston Payne, Google's Group Product Manager, announced that the company will begin small-scale testing in public spaces in the United States. Testing will exclude sensitive locations such as schools, government buildings, healthcare facilities, places of worship, social service facilities, areas meant for children, emergency response locations, rallies or demonstrations, and other similar locations.