Project Astra 2025: Google’s universal AI assistant is now smarter and more proactive

Since its original launch at Google I/O 2024, Project Astra has become a kind of testbed for Google’s AI ambitions. The multimodal bot isn’t really a consumer product, and it won’t be available to anyone outside a small group of testers anytime soon. What Astra represents instead is a collection of Google’s biggest and most ambitious ideas about what AI might be able to do in the future. Greg Wayne of Google DeepMind calls Astra “kind of a concept car of a universal AI assistant.”
Ultimately, the things that work in Astra ship in Gemini and other apps. That has already included some of the team’s work on audio output, memory, and some basic computer-use features. As those features graduate, the Astra team finds something new to work on.
This year, at its I/O developer conference, Google announced some new Astra features that point to where the company sees its assistant going, and how capable it thinks that assistant can be. In addition to answering questions and using your phone’s camera to remember where you left your glasses, Astra can now complete tasks on your behalf. And it can do so without you even asking.
Astra’s most impressive new feature is its newfound proactivity. “Astra can choose when to talk based on the events it sees,” Wayne says. “It’s actually, in a continuous sense, observing, and then it can comment.” This is a big change: instead of you pointing your phone at something and asking your AI assistant about it, Astra’s plan is for the assistant to be constantly watching, listening, and waiting for its moment to jump in. (And that goes for smart glasses, too; you can imagine how glasses in particular might be useful for an assistant that can see and hear full-time.)
Astra’s plan is for its assistant to be constantly watching, listening, and waiting for its moment to jump in
If Astra is watching while you do your homework, Wayne offers as an example, it might notice that you’ve made a mistake and point out where you went wrong, rather than waiting for you to finish and ask the bot to check your work. If you’re intermittent fasting, Astra might remind you to eat before your window ends, or gently wonder whether you should really be eating right now, given your diet plan.
Demis Hassabis, the CEO of DeepMind, says teaching Astra to act of its own accord was part of the plan all along. He calls it “reading the room,” and says that as hard as it is for humans, it’s even harder to teach to a computer. Knowing when to butt in, what tone to take, how to help, and when to just stay silent is something humans do relatively well but is hard to specify or learn. And if the product doesn’t do it well and starts piping up unprompted and unwanted? “Well, no one will use it if it does that,” Hassabis says. Those are the stakes.
A great proactive assistant is still a long way off, but one thing it will certainly require is a great deal of information about you. That’s another new thing coming to Astra: the assistant can now access information from the web and from other Google products. It can look at your calendar to tell you when to leave; it can dig your confirmation number out of your email as you walk up to the front desk to check in. At least, that’s the idea. Making it work at all, much less consistently and reliably, will take some time.
Still, the last piece of the puzzle might be the most important: Astra is learning how to use your Android phone. Bibo Xiu, a product manager at DeepMind, gave a demo in which she pointed her phone’s camera at a pair of Sony headphones and asked which model they were. Astra said they were either the WH-1000XM4 or the WH-1000XM3 (honestly, how could anyone or anything tell the difference?), and Xiu then asked Astra to find the manual and explain how to pair them with her phone. After Astra explained, Xiu cut in: “Can you just go ahead and open the settings and pair my headphones for me?” Entirely on its own, Astra did exactly that.
The process wasn’t perfectly smooth: Xiu had to manually enable a feature allowing Astra to see her phone’s screen. She says the team is working on making that happen automatically, and the goal is for Astra to understand what it can and can’t see at any given moment. This kind of automated device use is the same thing Apple is working on for the next generation of Siri; both companies imagine an assistant that can navigate apps, change settings, respond to messages, and even play games without you ever touching the screen. It’s extremely hard to build, of course: Xiu’s demo was impressive, and it was about as simple a task as you can imagine. But Astra is making progress.
Right now, most so-called “agentic AI” doesn’t work very well, or at all. Even in the best-case scenario, it still requires you to do a lot of the lifting: you have to prompt the system at every turn, supply all the extra context and information the app needs, and make sure everything is going smoothly. Google’s goal is to start removing all that work, step by step. It wants Astra to know when it’s needed, know what to do, know how to do it, and know where to find what it needs to get it done. Each part of that will require technological breakthroughs, most of which haven’t happened yet. And then there will be thorny user interface problems, privacy questions, and more besides.
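The division of labor Google wants to eliminate can be sketched as a loop: today the user supplies the goal, the intermediate steps, and the checking; an agentic assistant would take over each of those in turn. The sketch below is a generic plan-and-execute loop, not anything Astra-specific; `plan` and `execute` are hypothetical stand-ins for the model and device-control calls a real agent would make.

```python
from typing import Callable

def run_agent(goal: str,
              plan: Callable[[str], list[str]],
              execute: Callable[[str], bool]) -> list[str]:
    """Generic plan-and-execute loop: decompose a goal into steps and
    run each one, stopping if a step fails. The agent, not the user,
    decides the steps and verifies progress."""
    done = []
    for step in plan(goal):
        if not execute(step):
            break  # a real agent would replan here instead of giving up
        done.append(step)
    return done

# Toy demo with canned planning/execution for the headphone-pairing example.
canned_plans = {"pair my headphones": ["open Settings",
                                       "open Bluetooth",
                                       "select WH-1000XM4",
                                       "confirm pairing"]}
completed = run_agent("pair my headphones",
                      plan=lambda g: canned_plans[g],
                      execute=lambda s: True)
print(completed)
```

In this toy version the hard parts are stubbed out with canned answers; the breakthroughs the article mentions are precisely what it would take to make `plan` and `execute` work on a real phone screen.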
If Google, or anyone, is really going to build a universal assistant, it will have to get these things right. “It’s another level of intelligence to achieve,” Hassabis says. “But if you can, it will feel categorically different from today’s systems. I think a universal assistant should just be really useful.”