It was a clear Plexiglas wall — the kind of obstacle that trips up almost every vision-based drone on the market. Fog machine cranked, fake snow swirling, lights killed to near-darkness. The tiny X-shaped quadrotor drifted forward, ultrasound sensors pinging, and then stopped. Backed up. Turned. Navigated around it anyway.
That moment, documented in a paper published this week in Science Robotics, is the whole argument for what Nitin Sanket and his team at Worcester Polytechnic Institute have built: a palm-sized drone that navigates the way bats do — by listening. Two ultrasonic sensors, no GPU, no cameras. In more than 180 tests across dense fog, darkness, and simulated snow, the drone found its way around transparent and thin obstacles with success rates between 72 and 100 percent, according to The Engineer.
The drone, called PeAR Bat, weighs less than 100 grams and measures under 100 millimeters. It draws just 0.6 milliwatts per sensor — a thousand times more power-efficient than a USB camera, per Dronelife's reporting on the project. Current manufacturing cost sits around $300 per unit, with the team estimating that could drop to roughly $50 at scale. The whole thing fits in the palm of a hand and uses the same kind of ultrasonic sensor you'd find in an automatic public restroom faucet.
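For context on how that kind of sensor works at all: a bare ultrasonic module measures the round-trip time of an echo and converts it to distance using the speed of sound. Below is a minimal sketch of that arithmetic. It is a generic illustration, not the team's firmware, and the timing value is an assumption chosen to land near the roughly 5-centimeter minimum range reported in testing.

```python
# Minimal sketch of ultrasonic time-of-flight ranging: the generic principle
# behind hobby-grade ultrasonic modules, not the PeAR Bat's actual pipeline.
# All numbers here are illustrative assumptions.

SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at ~20 C

def echo_delay_to_distance_m(echo_delay_s: float) -> float:
    """Convert a round-trip echo delay (seconds) into a one-way distance (meters)."""
    return SPEED_OF_SOUND_M_S * echo_delay_s / 2.0

# Example: an echo returning after ~0.29 milliseconds implies an obstacle
# roughly 5 cm away, about the minimum range reported for the drone.
print(f"{echo_delay_to_distance_m(0.29e-3) * 100:.1f} cm")  # -> 5.0 cm
```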
Sanket, an assistant professor of robotics engineering at WPI, got the idea the way a lot of good robotics ideas start: by staring at what nature already does better. Bats weigh less than two paper clips and navigate dark, damp caves using echolocation — a biological sonar system far more robust than anything engineers have replicated in silicon. "We are nowhere close to what nature has achieved," Sanket told the Associated Press. "But the goal is that one day in the future, we will be there and these will be useful for deployment in the wild."
The NSF apparently thinks he might get there. Sanket's project, "Sound Navigation: Enabling Tiny Robots to Find Their Way Through Smoke, Dust, and Darkness," won a $704,908 Foundational Research in Robotics grant starting September 1, 2025. The grant was notable because Sanket applied solo — robotics grants typically go to teams. That says something about how the NSF views the problem he is trying to solve.
The problem is real. Search-and-rescue situations happen in darkness, in smoke, in conditions where the first thing to fail is the power grid — and therefore the camera-based drones that depend on good visibility. "Currently, search and rescue robots are mainly operational in broad daylight," Sanket told the AP. "The problem is that search and rescues are dull, dangerous and dirty jobs that happen a lot of times in darkness."
Ryan Williams, an associate professor at Virginia Tech who has worked on autonomous drone search patterns, told the AP that truly autonomous drone swarms for search and rescue are "effectively nil" right now. That gap — between the conditions where disasters happen and the conditions where current drones can operate — is exactly what Sanket is trying to close.
There is, as always, a but. The milliwatt power system that makes the PeAR Bat so efficient also limits flight time to around five minutes. Five minutes is not a lot of time in a collapsed building or a flooded neighborhood. The team is working on extending it. Sanket told the WPI news office: "In a real search-and-rescue mission, a few more seconds of flight time could mean the difference between life and death for a survivor." That is not hyperbole — it is an honest acknowledgment that the current hardware is a proof of concept, not a field-deployable product.
The five-minute ceiling is the right frame for everything else the paper shows. This is a lab demonstration with real constraints, not a deployed system. The obstacle avoidance worked reliably — the drone detected obstacles as close as 5 centimeters with a 120-by-60-degree field of view. But those numbers come from controlled testing environments, not a collapsed parking structure after midnight with rain coming down. The gap between "works in the lab" and "works in the field" is where most drone research dies.
The engineering challenges Sanket's team solved to get even this far are worth understanding. Propeller noise is a nightmare for ultrasound sensors — the spinning blades create acoustic interference that makes the echo data unreadable. The team solved it with 3D-printed acoustic metamaterial shells designed to dampen that interference, a hardware fix paired with physics-informed deep learning to clean up the signal on the software side. Sanket described it to Dronelife as "like talking to your friend with a jet next to you" — the sensor trying to hear echoes while the rotors scream.
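The paper's signal-cleaning step depends on that physics-informed learning model, which a few lines of code cannot reproduce. As a much cruder baseline for intuition only, one could try to reject out-of-band rotor noise with a band-pass filter centered on the sensor's carrier. This is not the team's method, and the 40 kHz carrier and sample rate below are assumptions typical of hobby ultrasonic modules, not figures from the paper.

```python
# A crude baseline for intuition only: band-pass the received signal around the
# sensor's carrier frequency to attenuate out-of-band rotor noise. This is NOT
# the team's method (they pair 3D-printed acoustic metamaterial shells with
# physics-informed deep learning); all numbers below are assumptions.
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 250_000      # assumed sample rate, Hz (must exceed twice the carrier)
CARRIER = 40_000  # assumed ultrasonic carrier, Hz (typical of hobby modules)

def bandpass_around_carrier(signal: np.ndarray, half_width_hz: float = 5_000) -> np.ndarray:
    """Keep only energy near the carrier band, attenuating noise outside it."""
    nyquist = FS / 2
    low = (CARRIER - half_width_hz) / nyquist
    high = (CARRIER + half_width_hz) / nyquist
    sos = butter(4, [low, high], btype="bandpass", output="sos")
    return sosfiltfilt(sos, signal)

# Synthetic example: a faint 40 kHz echo buried under low-frequency rotor noise.
t = np.arange(0, 0.01, 1 / FS)
echo = 0.1 * np.sin(2 * np.pi * CARRIER * t)
rotor_noise = np.sin(2 * np.pi * 200 * t) + 0.5 * np.random.randn(t.size)
cleaned = bandpass_around_carrier(echo + rotor_noise)
```

A filter like this leaves any rotor noise that overlaps the carrier band untouched, which is part of why the team needed the metamaterial shells and the learned model in the first place.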
This is also a lesson in what not to throw money at. The standard answer to drone navigation in degraded conditions is better sensing — more sensors, more compute, LIDAR stacks that can cost more than the drone itself. Sanket's approach is the opposite: parsimonious AI, minimal hardware, a system that does exactly one thing and tries to do it extremely well. The tradeoff is that it cannot see color, cannot read signs, cannot do any of the things a camera enables. But in a smoke-filled room, none of that matters. What matters is: is there a wall in front of me?
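To make that single question concrete, here is a deliberately naive sketch of what a "wall ahead?" decision loop can look like. The function, thresholds, and action labels are hypothetical stand-ins, not the team's controller; only the roughly 5-centimeter minimum range echoes the reported results.

```python
# A toy "is there a wall in front of me?" loop, in the spirit of the
# minimal-sensing philosophy described above. Hypothetical and illustrative:
# read_range_m and both thresholds are stand-ins, not the team's controller.

STOP_DISTANCE_M = 0.30  # assumed threshold: react if an obstacle is this close
MIN_RANGE_M = 0.05      # roughly the reported minimum detection range (~5 cm)

def step(read_range_m) -> str:
    """Map a single forward range reading to a crude action."""
    distance = read_range_m()
    if distance is None or distance < MIN_RANGE_M:
        return "back_up"        # echo lost or dangerously close
    if distance < STOP_DISTANCE_M:
        return "stop_and_turn"  # wall ahead: stop, turn, go around
    return "fly_forward"

# Example with a fake sensor returning 0.12 m (an obstacle just ahead).
print(step(lambda: 0.12))  # -> stop_and_turn
```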
The co-authors on the project include Colin Balfour, an undergraduate researcher, and Deepak Singh, a PhD candidate — an undergrad and a grad student doing real engineering on a problem with a clear human application. That is the kind of work that does not always make the headlines but moves the field.
What happens next is the harder part: taking a system that works in a Worcester lab and making it work in the world. The team plans to develop smaller, lighter devices capable of longer flight times and to improve navigation speed. The broader principle — low-power, sound-based sensing — could also apply beyond drones: self-driving cars in fog, coral reef exploration, volcanic monitoring, anywhere light fails but sound can still reach.
For now, the PeAR Bat is a lab result, not a rescue tool. That distinction matters. But it is a lab result that points at a real gap in how we currently deploy aerial robots in the conditions where people most need them — dark, dirty, dangerous, and underpowered. The question is not whether bats are better engineers than we are. They are. The question is whether we can build something useful while we are still catching up.