WIRED saw a robot hand that looked unusually natural. Eka still has to prove it.
A respected robotics reporter just walked out of Eka's lab saying he saw a robot hand do something this field rarely shows in public: recover gracefully when contact goes wrong. That is new enough to matter. It is not enough to settle anything.
In WIRED's report today, Will Knight, who has covered robotics for years, wrote that he watched Eka's system unscrew a light bulb, handle a cup, recover from slips, and adjust grip force in real time. In plain English, the hand did not just complete a clean demo once. It appeared to feel trouble, correct itself, and keep going. For robot handling, that is the part that usually breaks.
That is also why the story has to stay narrow. Eka was cofounded by Pulkit Agrawal, an MIT professor, and Tuomas Haarnoja, a former Google DeepMind robotics researcher. Agrawal told WIRED that "a couple of years ago, we realized that dexterity can finally be cracked," and argued that "trillions of dollars flow through the human hand."
Maybe. The public record still amounts to Knight's eyewitness account, Eka's own homepage, and the founders' earlier research lineage. That is enough for a serious look. It is not enough to say the field has turned a corner.
That lineage is real. In 2021, MIT News reported that researchers at MIT's Computer Science and Artificial Intelligence Laboratory trained a simulated robotic hand with 24 degrees of freedom, meaning 24 independently controllable joint motions, to reorient more than 2,000 objects. The same report said the system reached close to 100 percent success on many small circular objects, while more complex shapes landed closer to 30 percent.
Other work attacked the same problem through touch. A 2023 paper at the Conference on Robot Learning described a tactile simulator that modeled touch forces closely enough for zero-shot sim-to-real transfer in peg insertion, meaning a policy trained in simulation worked on a real robot without extra tuning. Another 2023 paper at Robotics: Science and Systems reported an in-hand rotation policy that used touch-only sensing, with dense force sensors across the palm, finger links, and fingertips.
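Neither paper's code appears here, but the pattern they describe can be stated in miniature: fit a controller against a simulator, freeze it, then run it on a "real" system whose dynamics the training never saw. Below is a deliberately toy sketch of that idea; the one-dimensional task, the proportional controller, and every noise number are invented for illustration and have nothing to do with either paper's actual setup.

```python
import numpy as np

def rollout(gain: float, noise: float, rng) -> bool:
    """Drive a 1-D 'peg' from x = 1.0 toward the target x = 0 with a
    proportional controller; success means ending within 0.05 of target."""
    x = 1.0
    for _ in range(50):
        x += -gain * x + rng.normal(0.0, noise)
    return abs(x) < 0.05

rng = np.random.default_rng(0)

# "Train" in simulation: sweep the controller gain and keep whichever
# one succeeds most often under the simulator's optimistic noise model.
gains = np.linspace(0.05, 0.9, 18)
sim_success = [np.mean([rollout(g, noise=0.005, rng=rng) for _ in range(200)])
               for g in gains]
best_gain = float(gains[int(np.argmax(sim_success))])

# Zero-shot transfer: run the frozen policy under "real" dynamics with
# triple the noise, and no extra tuning of any kind.
real_success = np.mean([rollout(best_gain, noise=0.015, rng=rng)
                        for _ in range(200)])
print(f"gain={best_gain:.2f}  sim={max(sim_success):.2f}  real={real_success:.2f}")
```

The real papers close a far harder gap, with contact forces instead of a one-dimensional toy, but the zero-shot discipline is the same: nothing gets re-tuned after training.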
Those papers do not prove Eka works the same way. They do show that the company did not materialize out of thin air with a magic hand and a slogan. Its founders come from a part of robotics that has spent years trying to make machine handling less brittle when objects slip, twist, or hit back.
Eka's own homepage is much less satisfying than Knight's visit. The company says its "Vision-Force-Action" model, its term for a system that combines camera input, touch, and movement decisions, unites generality, performance, and safety. It also flashes a "25X speed" claim. Faster than what, on which task, under what setup, against which baseline? The page does not say.
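Eka has not published its architecture, so any code here is guesswork about the general shape of such a loop, not the company's method. What follows is a minimal sketch assuming a classic Coulomb-friction slip heuristic, in which slip begins when shear force exceeds the friction coefficient times normal force. `SensorFrame`, `adjust_grip`, and every number in them are hypothetical.

```python
# Hypothetical sketch of a vision-force-action control loop. This is NOT
# Eka's published method; the company has not released its architecture.
# All sensor readings below are simulated placeholders.
from dataclasses import dataclass

@dataclass
class SensorFrame:
    object_visible: bool      # from the vision system
    normal_force: float       # fingertip force, newtons (simulated)
    tangential_force: float   # shear force along the contact, newtons

def adjust_grip(frame: SensorFrame, grip_force: float,
                friction_mu: float = 0.5, margin: float = 1.2) -> float:
    """One step of a slip-avoidance heuristic.

    Classic grasp-stability reasoning: slip begins once tangential force
    exceeds friction_mu * normal_force. The controller keeps normal force
    high enough that the friction cone covers the shear load, with margin.
    """
    if not frame.object_visible:
        return 0.0  # vision lost the object: release rather than crush
    required = margin * frame.tangential_force / friction_mu
    if frame.normal_force < required:
        grip_force = min(grip_force + 0.5, 20.0)  # squeeze harder, capped
    elif frame.normal_force > 2.0 * required:
        grip_force = max(grip_force - 0.2, 0.5)   # relax toward a floor
    return grip_force  # inside the deadband: hold steady

# Toy run: shear load spikes mid-sequence, as if the object started to slip.
grip = 2.0
for shear in [0.4, 0.5, 1.8, 1.9, 0.6]:
    frame = SensorFrame(object_visible=True,
                        normal_force=grip, tangential_force=shear)
    grip = adjust_grip(frame, grip)
    print(f"shear={shear:.1f} N -> grip={grip:.2f} N")
```

In a real controller this loop would run hundreds of times per second, which is what "adjust grip force in real time" implies. Even the toy makes the missing denominator concrete: a "25X speed" claim only means something once you fix the task, the sensor loop rate, and the baseline controller it is measured against.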
That missing denominator matters because the person standing next to the robot is still the real benchmark. In a warehouse, on an assembly line, or at a packing table, automation often gets stuck on the awkward moment when an object shifts, the grip slides, or a fragile part needs less force than usual. Humans rescue that moment. If Eka can reduce that rescue work, even a little, it would matter beyond a polished lab demo.
But that is still an if. What WIRED uniquely adds today is not proof of deployment or a new industry standard. It is a credible eyewitness account that Eka's hand looked unusually fluent at the exact moment robot handling usually falls apart. Until the company shows named benchmarks, comparisons against standard baselines, or a customer willing to say this works outside a staged visit, that is the right ceiling for the story.