Sony's table-tennis robot finally beat a professional player. Not an elite amateur — a professional, the kind who earns a living at the game. Sony AI said Ace did it in March 2026, six months after a paper in Nature showed the same robot winning three of five matches against elite club players. The pro win is genuinely new. The earlier results are the foundation that makes it credible.
Table tennis is a useful robotics benchmark precisely because it is hard. The ball can exceed 20 meters per second, with spin above 160 revolutions per second. Every shot reshapes the next exchange. The opponent is not cooperating. The table is not a lab. In the Nature paper, Ace played under official International Table Tennis Federation rules against five elite players and two professionals, Minami Ando and Kakeru Sone. The paper's result: three wins against elite players, losses to both professionals. The pro win came later, reported by Reuters as Sony's own claim, and Sony AI says it happened again in December 2025.
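Those numbers imply a brutal time budget. A back-of-the-envelope check, assuming a shot near the 20-meter-per-second peak and the standard 2.74-meter table length (the table dimension is an assumption of this sketch, not a figure from the article):

```python
# Rough time budget for a near-peak-speed shot. Table length is the
# standard ITTF dimension; the latency figures are Sony's, as quoted above.
TABLE_LENGTH_M = 2.74
BALL_SPEED_MPS = 20.0

flight_ms = TABLE_LENGTH_M / BALL_SPEED_MPS * 1000   # ~137 ms end to end
human_latency_ms = 230    # elite human perception-to-action, per Sony
robot_latency_ms = 20.2   # Ace end-to-end, per Sony

# A shot this fast crosses the table in less time than an elite human's
# reaction latency; humans cope by anticipating. The robot's latency
# leaves it roughly 117 ms of the flight to actually move.
margin_ms = flight_ms - robot_latency_ms
```

The point of the arithmetic is the asymmetry the article describes: at these speeds a human is playing on prediction, while Ace still has most of the ball's flight left after perceiving it.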
What makes Ace physically competitive is a substantial sensing stack. On Ace's project page, Sony says the system uses 12 high-speed sensors. The Nature paper breaks that out: nine external cameras locate the ball in 3D space at 200 times per second with 3.0 millimeter average error and 10.2 milliseconds average latency, while three event-based vision sensors estimate the ball's spin (its angular velocity) at roughly 400 to 700 hertz. A custom arm with eight degrees of freedom (two sliding joints and six rotating ones) executes the response. The control policy, trained in simulation with deep reinforcement learning, is queried at 31.25 hertz, and its outputs are turned into motion trajectories sampled at 1 kilohertz. Sony says in a company blog post that Ace's end-to-end latency is 20.2 milliseconds, versus roughly 230 milliseconds for elite human players.
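The two-rate structure in that description, a policy queried at 31.25 hertz feeding a 1-kilohertz trajectory, can be sketched as follows. This is an illustrative sketch, not Sony's code: the linear interpolation, the function names, and the stub poses are hypothetical, and only the rates and the eight degrees of freedom come from the paper.

```python
import numpy as np

# Rates as reported in the Nature paper; everything else is assumed.
POLICY_HZ = 31.25          # learned policy query rate
CONTROL_HZ = 1000          # trajectory sampling rate
TICKS_PER_STEP = int(CONTROL_HZ / POLICY_HZ)   # 32 control ticks per policy query
DOF = 8                    # 2 sliding + 6 rotating joints

def upsample(q_prev: np.ndarray, q_next: np.ndarray,
             ticks: int = TICKS_PER_STEP) -> np.ndarray:
    """Linearly interpolate 1 kHz joint setpoints between two policy outputs."""
    alphas = np.arange(1, ticks + 1) / ticks              # fractions of the interval
    return q_prev + alphas[:, None] * (q_next - q_prev)   # shape (ticks, DOF)

# One policy step: move from the previous commanded pose toward the new one.
q_prev = np.zeros(DOF)
q_next = np.ones(DOF)
traj = upsample(q_prev, q_next)   # 32 setpoints; traj[-1] lands exactly on q_next
```

At these rates each policy output is held for exactly 32 control ticks; a real controller would likely smooth with splines or velocity limits rather than straight lines, but the division of labor is the same: the learned policy decides where, the high-rate loop decides how.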
That latency gap is a real machine advantage. Sony AI researcher Michael Spranger told AP News the goal was fairness and comparability, not a raw sensor advantage. John Billingsley, a robotics researcher at the University of Southern Queensland, gave AP the less charitable read: Sony approached the problem "mob-handed" with "sledgehammer techniques" no human gets to bring to the table. Both readings are accurate. The sensing stack is real. So is the real-time adaptation it enables.
The more useful question is what Ace is a proxy for. A warehouse robot that perceives a changing environment and adjusts on the fly is worth more than one that runs a fixed routine. A drone that reacts to wind and obstacles mid-flight rather than rerouting is the same stack, applied differently. Ace demonstrates a piece of that capability in a domain where success and failure are unambiguous and fast. The sensing overhead is an engineering constraint, not a verdict on the direction.
Fast perception, adversarial adaptation, real-time control: that is the part that matters beyond the demo.