Google’s keynote presentation at its annual developer conference closed out with a video showcasing a prototype live-translation service running on AR glasses.
The video shows Google product managers handing prototype glasses to research participants. “My mother speaks Mandarin and I speak English,” one of the participants explains. Using “a simulated point of view,” the video conveys how the glasses could display real-time subtitles as a translation service near a speaker’s face, theoretically allowing people to maintain more eye contact while talking.
While no details were revealed about the glasses’ actual specifications, the video continued a theme from the event: Google seeking to enhance or augment interactions in the physical world, in stark contrast to a few years ago, when Google supported the development of virtual worlds with Daydream. The language Google executives used during the presentation also seemed to contrast with Meta’s current push toward the “metaverse.”
“We’ve been building augmented reality into many Google products, from Google Lens to multisearch, scene exploration, and Live and immersive views in Maps,” Alphabet and Google CEO Sundar Pichai wrote. “These AR capabilities are already useful on phones and the magic will really come alive when you can use them in the real world without the technology getting in the way.”
AR glasses face severe constraints on battery consumption, heat dissipation, brightness, and field of view, constraints that seem to push the timeline for true consumer-oriented standalone AR glasses at least a couple of years into the future. Still, in a process that has been building for a long time, we’re seeing technology giants begin to ready their existing services to power this coming augmented reality platform, which Pichai called “the next frontier of computing.”