When Boston Dynamics and Google DeepMind announced their partnership at CES in January 2026, the press release read like a standard research collaboration. Read it again six weeks later, after Agile Robots tied itself to DeepMind on March 24, and a different story emerges: Google is writing the operating system for the next generation of humanoid hardware, and the robots in question are already learning from each other across different bodies.
The partnership between Agile Robots, a Munich-based hardware maker founded in 2018, and Google DeepMind puts the Gemini Robotics framework, specifically the Gemini Robotics 1.5 architecture, onto Agile's ONE humanoid, which matches human walking speed at 2.0 meters per second and pairs it with what Agile describes as highly dexterous hands. Factory floor deployment begins in 2026. That part is in the press release.
What's not in the press release is the data relationship. DeepMind's robotics division, led by director Kanishka Rao, has identified a core bottleneck: unlike large language models, which trained on the accumulated text of the internet, robotics lacks internet-scale physical interaction data. The partnership with Agile is designed to generate that data — robot deployments in real factory environments producing the kind of embodied feedback that Gemini's models need to improve. The bidirectional flow is the point. Agile gets access to state-of-the-art foundation models; DeepMind gets physical world training signal it cannot generate in a lab.
This is the Android play, and it is accelerating. In February 2026, Google folded Intrinsic, its robotics software subsidiary that had been sitting in Alphabet's Other Bets portfolio, back into the main company, explicitly naming Android as the model to emulate, according to CNBC. In November 2025, DeepMind hired Aaron Saunders, the former CTO of Boston Dynamics, as VP of hardware engineering. The trusted tester list on DeepMind's Gemini Robotics page names seven robotics companies, while Agility Robotics and Apptronik appear separately in the Partner section alongside Universal Robots. Of the firms in DeepMind's orbit, two, Boston Dynamics and Agile Robots, have now signed formal partnerships within six weeks of each other.
The cross-embodiment capability makes this structurally different from a typical vendor relationship. DeepMind's ALOHA 2 project demonstrated that skills trained on one robotic body can transfer to a different platform with reduced fine-tuning — a capability the company says accelerates learning across multiple embodiments. DeepMind's own March 2025 blog post introducing Gemini Robotics notes that the model was trained primarily on ALOHA 2 data but that it can be specialized for more complex embodiments, such as the humanoid Apollo robot developed by Apptronik, with the goal of completing real world tasks. If that holds at factory scale, it means Google is not just building software for one hardware maker's robot — it is building a skill layer that works across all of them. The data Google collects from every deployment improves every other deployment. That is the Android comparison: one OS, many OEMs, and a data moat that compounds.
Zhaopeng Chen, formerly of the German Aerospace Center (DLR), founded Agile Robots in 2018 and has since shipped more than 20,000 robotic solutions globally. In the partnership announcement, he framed the opportunity as autonomous, intelligent production systems that can transform entire industries. Chen is not a hype-cycle founder. His 20,000 installations span industrial, medical, and research environments, a deployment footprint that gives DeepMind something harder to replicate than a clean demo.
There are open questions that the press release does not address. Who owns the data generated during Agile robot deployments: does it stay with Agile, flow exclusively to Google, or become shared intellectual property? The partnership announcement does not specify terms. Some DeepMind employees raised concerns at an all-hands meeting earlier this year about the lab's discussions with the Department of Defense, according to Business Insider. A company positioning itself as the neutral infrastructure layer for robotics cannot easily be a defense contractor and, at the same time, a platform trusted by every major hardware OEM. Google has not resolved this tension publicly.
The Figure AI trajectory is instructive context. In February 2025, Figure ended its collaboration with OpenAI, citing the need to build AI models in-house after what it described as an internal breakthrough, according to TechCrunch. Figure made the calculation many robotics companies are making right now: the AI layer is too important to leave to a third party. If Google succeeds in becoming the Android of robotics, the question for every hardware maker on that trusted tester list is whether it is becoming Motorola, which built good hardware on top of Android and captured a share of a massive market, or Apple, which kept the whole stack. Agile Robots has raised more than $270 million from SoftBank Vision Fund 2, Xiaomi, Foxconn Industrial Internet, and others. That investor roster suggests a company with ambitions that extend beyond being a hardware vessel for Google's models.
The real test will not be the demo. It will be what happens on a factory floor in Bavaria, or wherever Agile's ONE units go to work, when the robot encounters a situation Gemini has not seen. How Google and Agile structure the data rights around those failures will determine whether this partnership builds Google's Android or merely rents it.