Google I/O: Using AR, AI, and Cool Gadgets to Be Control-F for the Real World
Developer conferences are still the best place to get a sneak peek at the future.
This year the Google I/O conference was heavy on Pixel devices, AI-enhanced services, and search engine refinements. Some observers have complained about the dearth of “wow” moments, but I see genuine innovations that are actually shipping. When did real-time language translation and transcription in a pair of glasses become ho-hum?
For Pixel fans, there are some new choices to consider. As expected, the company announced the Pixel 6A, a $449 handset that uses the latest Google Tensor processor. Preorders start July 21st, with deliveries starting a week later. A low-cost Pixel is excellent news, but Google also showed off the Pixel 7, which will arrive this fall. The company gave few details about the Pixel 7, but I suspect most Pixel buyers will wait for the flagship. The company also announced the Pixel Buds Pro, which will feature noise cancellation, and teased a Pixel tablet coming in 2023.
Google also promised new privacy features, including giving users a control panel for their Google Ads experience. You can log onto the new ad console, block certain ads, and select categories where you want to see more. User-controlled ad targeting is long overdue, but it is also sort of like asking the inmates to lock their own cell doors. I’ll keep my VPN and ad blocker handy.
Control-F for the Real World
Google has been pushing AR Search for years. Just yesterday I used Lens to identify the species of the purple trees that are blooming all over LA right now. (Jacarandas!) Now Google is promising to expand this capability to do “scene exploration.” Basically, the camera can scan a whole range of objects, search for their properties online, and then show those details as an AR overlay. Ok, maybe that isn't very basic.
Google offered this example: say you want to find “highly-rated, nut-free, dark chocolate,” so you point your camera at a store shelf loaded with options. The end result looks like this:
To me, that is a wow moment. Because it won’t just be store shelves. Reverse image lookups are going to be a part of everyday life. Point your camera at an object and you will be able to identify it, research it, and buy it in seconds.
"Scene explorer is a powerful ability in our devices' ability to understand the world the way we do, to see relevant information overlaid in the context of the world all around us," says Prabhakar Raghavan, Google’s SVP of Search. "This is like having a supercharged Control-F [find shortcut] for the world all around you."
But without doubt, the highlight of the conference was the demo of the translation and transcription eyeglasses. Google Translate does an amazing job of translating in real time, but it does require you to look at the phone while using it. Now Google has built the same functionality into a pair of glasses, so you can look at the person you are talking to. Also, they look way better than Google Glass.
Google has already built a universal translator, but now it is teasing one that actually works the way we want it to. There is no word on when, or even if, such a product will be released. But if that isn’t SciFi enough for you, you need to travel more.
Today’s Bits
Beyond immigration: ICE's massive surveillance system has info on most Americans, report says
Clearview AI agrees to a permanent ban on selling facial recognition to private companies
U.S. warns of discrimination in using artificial intelligence to screen job candidates
Automation: A Help or Hindrance for Employees?