Amazon has launched a new team inside Lab126 dedicated to agentic AI.
This group will build agentic AI frameworks for robotics. The aim: let robots execute complex, multi-step commands given in natural language. Think warehouse bots acting like flexible assistants instead of simple machines.
Lab126 is the secretive unit behind the Kindle and Echo. The move pushes Amazon deeper into “physical AI,” where robots hear and act on voice instructions.
Amazon already released an agentic AI web-browsing tool earlier this year, and its cloud division has a similar group. Alexa+, the AI-upgraded voice assistant unveiled in February, is expected to gain agentic functions too.
Amazon CEO Andy Jassy revealed the update during a company event in New York on Feb. 26, 2025.
The systems “enable robots to hear, understand and act on natural language commands, turning warehouse robots into flexible, multi-talented assistants,” Amazon said.
Agentic AI is widely seen as the next frontier. Amazon is betting it will push smart devices and robotics beyond basic chatbots and image generators.