Robot Runs at 3.3 m/s After One Human Stride
The Unitree G1 hit 3.3 m/s in Caltech's lab. The company claims 5+ m/s. Only one of those numbers has been verified.

Caltech researchers achieved 3.3 m/s running speed on a Unitree G1 humanoid using dynamic retargeting from a single human motion capture clip, enforced with full rigid-body physics constraints rather than simple kinematic retargeting. The team generated a complete gait library (1.2–3.6 m/s) from one demonstration, using control barrier functions to safely transition between walking and running regimes, and verified the result with 250+ meters of outdoor running. This represents the fastest independently verified running speed for a humanoid robot, highlighting the importance of hardware validation over vendor-reported benchmarks.
- Single human motion capture clip can seed a full humanoid gait library when optimized with hard dynamics constraints, not just kinematic joint-angle mapping
- 3.3 m/s is the fastest independently verified humanoid running speed; Unitree's claimed 5+ m/s remains unverified by third parties
- Control barrier functions enable reliable walking-to-running transitions, the failure point for most humanoid platforms
One human stride. That is all the Caltech researchers needed to teach a humanoid robot to run.
A team led by Zachary Olkin and William D. Compton, from Caltech's Department of Control and Dynamical Systems, posted a preprint to arXiv describing what they call dynamic retargeting: taking a single human motion clip, optimizing it with hard dynamics constraints, and generating an entire gait library, from a shuffle to a sprint, that a robot can use on the fly. The robot in question is a Unitree G1, a commercially available humanoid. The team hit 3.3 meters per second in testing, making it, they argue, the fastest independently verified running speed for a humanoid robot to date.
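To make "use on the fly" concrete: one common pattern for consuming a gait library (an illustration of the general idea, not the paper's actual implementation; every name and dimension below is invented) stores a reference joint trajectory per speed and blends the two nearest entries at the commanded speed.

```python
import numpy as np

def make_gait_library(speeds, n_phases=50, n_joints=12, seed=0):
    """Toy stand-in: one (n_phases x n_joints) reference trajectory per speed."""
    rng = np.random.default_rng(seed)
    return {s: rng.standard_normal((n_phases, n_joints)) for s in speeds}

def reference_gait(library, v_cmd):
    """Linearly interpolate reference joint trajectories between the two
    library speeds that bracket the commanded speed v_cmd."""
    speeds = sorted(library)
    v = min(max(v_cmd, speeds[0]), speeds[-1])  # clamp to the library's range
    hi = next(s for s in speeds if s >= v)      # smallest speed >= v
    lo = max(s for s in speeds if s <= v)       # largest speed <= v
    if hi == lo:
        return library[lo]
    w = (v - lo) / (hi - lo)
    return (1 - w) * library[lo] + w * library[hi]

# Speeds span the paper's reported 1.2-3.6 m/s range; trajectories are random placeholders.
library = make_gait_library([1.2, 1.8, 2.4, 3.0, 3.6])
gait = reference_gait(library, 2.1)  # blend of the 1.8 and 2.4 m/s gaits
```

The real system generates each entry by dynamically optimizing the single human demonstration; the sketch only shows why a library indexed by speed is convenient for a controller that changes speed at runtime.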
Unitree, the Hangzhou-based robotics company that makes the G1, claims its robot can run above 5 meters per second, a figure that has circulated widely. The Caltech result is a hardware-verified number from an outside team, not a vendor benchmark. The gap between the two is worth keeping in mind any time a robotics company cites its own spec sheet.
The co-authors — Ryan M. Bena and Aaron D. Ames, both also from Caltech's Department of Control and Dynamical Systems and Department of Mechanical and Civil Engineering — describe their method in straightforward terms. One human demonstration from the LAFAN motion capture dataset becomes the seed. The system optimizes that data dynamically, enforcing full rigid-body physics constraints rather than simply retargeting joint angles kinematically. The result is a running gait library with speeds from 1.2 to 3.6 meters per second, all generated from a single piece of human data. Control barrier functions keep the robot from falling during the transition from walking to running — the point at which most humanoids, even capable ones, become unreliable. The full method is in the arXiv preprint.
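The control-barrier-function idea behind those safe transitions can be shown with a deliberately simplified one-dimensional example (my sketch, not the paper's formulation): take forward speed v as the state with dynamics v_dot = a, define a barrier h(v) = v_max - v, and enforce the standard CBF condition h_dot >= -alpha * h, which reduces to clamping the commanded acceleration.

```python
def cbf_filter_accel(v, a_cmd, v_max=3.6, alpha=4.0):
    """Minimal 1-D CBF safety filter (illustrative only).

    State: forward speed v, dynamics v_dot = a.
    Barrier: h(v) = v_max - v >= 0 (stay inside the gait library's range).
    CBF condition: h_dot >= -alpha * h  =>  a <= alpha * (v_max - v).
    The nominal acceleration passes through unless it would violate
    that bound, in which case it is clamped."""
    return min(a_cmd, alpha * (v_max - v))

# Push hard toward top speed; the filter lets v approach v_max but not cross it.
v, dt = 0.0, 0.01
for _ in range(2000):
    v += cbf_filter_accel(v, a_cmd=2.0) * dt
```

The appeal of this construction is that the filter is minimally invasive: it leaves the nominal controller untouched whenever the safety condition already holds, which is why CBFs layer cleanly over a learned or MPC-based policy.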
The autonomy stack running underneath is not trivial: lidar odometry for pose estimation, model predictive control for real-time trajectory planning, and control barrier functions as a safety layer. Together they keep the robot upright and steering around obstacles at around 2 meters per second while running outdoors. The team ran the G1 more than 250 meters continuously on sidewalks. It went airborne between steps. Those outdoor results are described in the paper.
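The same barrier-function machinery extends to obstacle avoidance. A minimal planar sketch (again my illustration; the paper pairs CBFs with MPC and lidar odometry, not this toy model) treats the robot as a point p with velocity command u, uses the barrier h(p) = ||p - p_obs||^2 - r^2, and projects the nominal velocity onto the half-space of commands satisfying the CBF condition.

```python
import numpy as np

def cbf_safe_velocity(p, u_nom, p_obs, r, alpha=2.0):
    """Project a nominal velocity command onto the CBF-safe half-space.

    Single-integrator model p_dot = u; barrier h(p) = ||p - p_obs||^2 - r^2.
    CBF condition: grad_h . u >= -alpha * h, with grad_h = 2 (p - p_obs).
    This is the closed-form solution of the usual CBF-QP:
    minimize ||u - u_nom||^2 subject to that one linear constraint."""
    d = p - p_obs
    h = d @ d - r * r
    a = 2.0 * d        # constraint normal (barrier gradient)
    b = -alpha * h     # constraint offset: require a . u >= b
    if a @ u_nom >= b:
        return u_nom   # nominal command is already safe
    return u_nom + (b - a @ u_nom) / (a @ a) * a

# Heading straight at an obstacle of radius 1 m at the origin: the forward
# command is scaled back as the robot gets close.
u_safe = cbf_safe_velocity(np.array([-1.2, 0.0]), np.array([2.0, 0.0]),
                           p_obs=np.zeros(2), r=1.0)
```

In a real stack the MPC plans the nominal trajectory and a filter like this runs at high rate underneath it, which matches the layered architecture the article describes.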
The training environment is IsaacLab and IsaacSim, Nvidia's physics simulation platform, using PPO from RSL-RL with an asymmetric actor-critic setup. The simulation fidelity matters here — if the dynamics constraints in simulation don't match the real robot closely enough, the gait library becomes useless on hardware. Caltech's independent hardware verification is the point: this is not a simulation result dressed up as a robot result.
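The asymmetric actor-critic structure is worth unpacking, since it is what lets a sim-trained policy transfer to hardware. In sketch form (linear stand-ins rather than real MLPs, with invented dimensions; not the paper's network sizes): the actor sees only what the physical robot can measure, while the critic also consumes privileged simulator state that exists only during training.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions, not the paper's. The actor sees proprioception
# (joint positions/velocities, IMU); the critic additionally sees privileged
# sim-only signals (true base velocity, contact forces, terrain info).
OBS_ACTOR = 48   # proprioceptive observation, available on hardware
OBS_PRIV  = 24   # privileged simulator-only observation
N_ACTIONS = 12   # joint position targets

W_actor  = rng.standard_normal((OBS_ACTOR, N_ACTIONS)) * 0.01
W_critic = rng.standard_normal((OBS_ACTOR + OBS_PRIV, 1)) * 0.01

def actor(obs_proprio):
    """Policy: must be deployable on the robot, so proprioception only."""
    return np.tanh(obs_proprio @ W_actor)

def critic(obs_proprio, obs_priv):
    """Value function: used only during PPO training, so it may cheat
    with privileged state for a lower-variance learning signal."""
    return np.concatenate([obs_proprio, obs_priv], axis=-1) @ W_critic

obs = rng.standard_normal(OBS_ACTOR)
priv = rng.standard_normal(OBS_PRIV)
action = actor(obs)        # bounded joint targets
value = critic(obs, priv)  # scalar value estimate, discarded at deployment
```

The design choice is the point: only the actor crosses the sim-to-real gap, so nothing the critic was privileged to see needs to exist on the real G1.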
The research was supported by the Technology Innovation Institute, an Abu Dhabi-based advanced research organization that has been building out a robotics program. The funding and affiliation are disclosed in the paper. The paper was submitted to IEEE on March 26, 2026, and has not yet been peer reviewed — standard caveat for arXiv preprints.
So what does this actually mean for the humanoid deployment picture?
Unitree filed for a Shanghai Stock Exchange STAR Market IPO in March 2026, seeking a valuation of 42 billion yuan (about $5.8 billion), with 5,500 or more humanoid robots shipped in 2025 and 1.7 billion yuan in revenue. The robot is running. That is progress. The question the IPO filing raises — whether Unitree's commercial deployment pipeline justifies the valuation — is a separate one from whether the running gait works.
What the Caltech work shows is that running — with a meaningful autonomy stack, in outdoor environments, at speeds that matter for real navigation — is not a vaporware claim at this point. It is a demonstrated capability. The trick of building a full gait library from one human demo is genuinely useful: it sidesteps the data bottleneck that has made teaching humanoid robots diverse motor skills expensive and slow. Dynamically optimizing human motion data outperforms kinematic retargeting, and hand-crafted reference gaits without human data perform worse still, according to the team's ablation results.
The durable question for the field remains unchanged by this result: not whether the robot can run, but whether it can run reliably enough, in enough different environments, to be useful rather than a demo. The 250-meter sidewalk run is real. The obstacle avoidance at 2 meters per second is real. The gap between verified and claimed specs is also real, and it is a gap the entire industry keeps tripping over. The Caltech researchers did the work of running the robot outside rather than just simulating it. That is the part worth paying attention to.
The paper is on arXiv, submitted to IEEE.