When Google first showcased its Duplex voice assistant technology at its developer conference in 2018, it was both impressive and concerning. Today, at I/O 2024, the company may well stir up those same reactions again, this time by showing off another application of its AI smarts: something called Project Astra.
The company couldn’t even wait for today’s keynote to tease Project Astra, posting a video of a camera-based AI app to its social media channels yesterday. At the keynote itself, though, Google DeepMind CEO Demis Hassabis shared that his team has “always wanted to develop universal AI agents that can be helpful in everyday life.” Project Astra is the result of progress on that front.
What is Project Astra?
According to a video Google showed during a media briefing yesterday, Project Astra appears to be an app with a viewfinder as its main interface. A person holding up a phone pointed its camera at various parts of an office and said, “Tell me when you see something that makes sound.” When a speaker next to a monitor came into view, Gemini responded, “I see a speaker, which makes sound.”
The person behind the phone stopped and drew an onscreen arrow to the top circle on the speaker and said, “What is that part of the speaker called?” Gemini promptly responded “That is the tweeter. It produces high-frequency sounds.”
Then, in the video that Google said was recorded in a single take, the tester moved over to a cup of crayons further down the table and asked “Give me a creative alliteration about these,” to which Gemini said “Creative crayons color cheerfully. They certainly craft colorful creations.”
Wait, were those Project Astra glasses? Is Google Glass back?
The rest of the video goes on to show Gemini in Project Astra identifying and explaining parts of code on a monitor and telling the user what neighborhood they were in based on the view out the window. Most impressively, Astra was able to answer “Do you remember where you saw my glasses?” even though the glasses were completely out of frame and had not previously been pointed out. “Yes, I do,” Gemini said, adding, “Your glasses were on a desk near a red apple.”
After Astra located those glasses, the tester put them on, and the video shifted to the perspective of what you’d see on the wearable. Using an onboard camera, the glasses scanned the wearer’s surroundings, picking up things like a diagram on a whiteboard. The person in the video then asked, “What can I add here to make this system faster?” As they spoke, an onscreen waveform moved to indicate the assistant was listening, and as it responded, text captions appeared in tandem. Astra said, “Adding a cache between the server and database could improve speed.”
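Gemini’s suggestion is, for what it’s worth, the classic read-through caching pattern: answer repeat reads from a fast in-memory layer and only touch the database on a miss. We can’t see the actual system on the whiteboard, so everything below is a purely hypothetical sketch in Python, with functools.lru_cache standing in for that middle layer and made-up keys and data:

```python
import functools

# Toy stand-in for a database; the contents and key names are illustrative only.
DATABASE = {"user:1": "Ada", "user:2": "Grace"}

@functools.lru_cache(maxsize=1024)
def fetch(key: str):
    """Server-side read: answered from the in-process cache when possible,
    falling through to the (simulated) database only on a miss. This is
    the 'cache between the server and database' idea from the demo."""
    print(f"database read for {key!r}")  # prints only on a cache miss
    return DATABASE.get(key)

fetch("user:1")  # miss: reads the database
fetch("user:1")  # hit: served from the cache, no database read
```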
The tester then looked over at a pair of cats doodled on the board and asked, “What does this remind you of?” Astra said, “Schrödinger’s cat.” Finally, they picked up a plush tiger toy, put it next to a cute golden retriever and asked for “a band name for this duo.” Astra dutifully replied, “Golden Stripes.”
How does Project Astra work?
All of this means that not only was Astra processing visual data in real time, it was also remembering what it saw and working from an impressive backlog of stored information. According to Hassabis, that’s because these “agents” were “designed to process information faster by continuously encoding video frames, combining the video and speech input into a timeline of events, and caching this information for efficient recall.”
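Google hasn’t published Astra’s actual implementation, but the pattern Hassabis describes is easy to sketch: encode each frame as it arrives, merge video and speech into one time-ordered stream, and keep that stream in a cache you can query later (which is how the glasses question could be answered). Here is a minimal, purely illustrative Python version; the names (TimelineCache, add_frame, recall, and so on) are our own inventions, not anything from DeepMind:

```python
import time
from collections import deque
from dataclasses import dataclass

@dataclass
class TimelineEvent:
    """One entry on the agent's timeline: an encoded frame or an utterance."""
    timestamp: float
    kind: str        # "frame" or "speech"
    payload: object  # e.g. a frame embedding, or a transcript string

class TimelineCache:
    """Rolling, time-ordered cache of recent events for later recall."""

    def __init__(self, max_events: int = 10_000):
        # Old events fall off the back as new ones arrive.
        self.events = deque(maxlen=max_events)

    def add_frame(self, embedding) -> None:
        """Continuously called as each video frame is encoded."""
        self.events.append(TimelineEvent(time.time(), "frame", embedding))

    def add_speech(self, transcript: str) -> None:
        """Interleaves what the user said into the same timeline."""
        self.events.append(TimelineEvent(time.time(), "speech", transcript))

    def recall(self, matches) -> list:
        """Scan the cached timeline for events satisfying a query predicate,
        e.g. frames whose embedding is similar to 'glasses'."""
        return [event for event in self.events if matches(event)]
```

In a real system the per-frame payload would be a learned embedding and recall would be a similarity search rather than a linear scan, but the shape of the idea, one interleaved and queryable timeline of sight and sound, is the same.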
It’s also worth noting that, at least in the video, Astra responded quickly. Hassabis acknowledged in a blog post that “While we’ve made incredible progress developing AI systems that can understand multimodal information, getting response time down to something conversational is a difficult engineering challenge.”
Google has also been working on giving its AI a greater range of vocal expression, using its speech models to enhance “how they sound, giving the agents a wider range of intonations.” This sort of mimicry of human expressiveness is reminiscent of the pauses and utterances that led people to think Duplex might be a candidate for the Turing test.
When will Project Astra be available?
While Astra remains an early feature with no discernible launch plans, Hassabis wrote that in the future, these assistants could be available “through your phone or glasses.” There’s no word yet on whether those glasses are an actual product or the successor to Google Glass, but Hassabis did write that “some of these capabilities are coming to Google products, like the Gemini app, later this year.”