Google, together with Everyday Robots, another Alphabet subsidiary, has developed experimental robots driven by AI language models. To study how the robots learn, the team tested them as “waiters” in Google offices.
Combined with Google's PaLM (Pathways Language Model), Everyday Robots' SayCan robot became PaLM-SayCan, a system that weighs its own capabilities and its environment against a spoken human request, then breaks that request into smaller subtasks to reach the desired goal.
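The core idea behind this kind of system can be sketched roughly as follows: the language model rates how relevant each low-level skill is to the request, the robot's own value function rates how feasible each skill is right now, and the skill with the best combined score is executed next. This is a minimal illustrative sketch, not Google's actual implementation; all skill names and scores below are made-up placeholders.

```python
def choose_skill(llm_scores, affordance_scores):
    """Pick the skill with the highest combined (LLM relevance x feasibility) score."""
    best_skill, best_score = None, float("-inf")
    for skill, p_llm in llm_scores.items():
        # Multiply "does the language model think this helps?" by
        # "can the robot actually do this in its current state?"
        combined = p_llm * affordance_scores.get(skill, 0.0)
        if combined > best_score:
            best_skill, best_score = skill, combined
    return best_skill

# Hypothetical request: "I spilled my drink, can you help?"
llm_scores = {                  # made-up language-model relevance scores
    "find a sponge": 0.50,
    "pick up the sponge": 0.20,
    "go to the table": 0.25,
    "bring a snack": 0.05,
}
affordance_scores = {           # made-up feasibility scores from the robot
    "find a sponge": 0.9,       # a sponge is visible and reachable
    "pick up the sponge": 0.1,  # robot is not next to the sponge yet
    "go to the table": 0.8,
    "bring a snack": 0.7,
}

print(choose_skill(llm_scores, affordance_scores))  # -> find a sponge
```

Repeating this selection after each completed step is what lets the robot turn one spoken request into an ordered chain of small actions.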
According to the company, the robots already plan a correct response to human commands in 84% of cases and execute those plans correctly in 74% of cases.
The robots can now handle simple requests such as bringing a drink or a snack, or cleaning up a spilled liquid, each involving only a few dozen simple steps. Such assistant androids are already being tested in Google canteens, where they bring snacks and drinks to employees and wipe tables with a sponge.
Even so, this is already a breakthrough in artificial intelligence: it paves the way for multi-purpose robots that can interpret a task and independently work out the order of its execution.
Of course, Google's robots are not yet ready for sale. The company said it would take some time to get a clear picture of their direct commercial impact.