When a gunman opened fire at Buford's on West 6th Street in Austin on the morning of March 1, an ambulance trying to reach the scene had to navigate around a Waymo robotaxi that had stopped in the road to pick up a passenger. The robot was in the wrong place at exactly the wrong moment. A bystander caught it on video. No one died because of the blockage, but in a mass shooting, every second is a variable.
That single moment captures something Waymo has spent years trying to avoid: the robotaxi being the story, rather than the solution. The company, a unit of Alphabet, has built one of the better safety records in autonomous driving — it says its serious crash rate is less than one-tenth that of human drivers — and it has trained more than 30,000 first responders globally to work around its vehicles. But incidents documented in public records and reported by TechCrunch show that as Waymo's fleet grows, its dependence on human beings to handle the moments its software cannot is becoming a systemic problem, not an isolated one.
The most clarifying example happened in San Francisco on December 20, when a citywide power outage knocked out traffic signals across the city. More than 1,500 Waymo robotaxis stalled at intersections without functioning lights, according to KQED. One of the city's 911 dispatchers sat on hold with Waymo's first responder hotline for 53 minutes trying to get a human on the line. Waymo vehicles traversed more than 7,000 dark intersections that night, the company said. During the outage, Waymo performed 62 manual retrievals of stalled vehicles through its own roadside assistance or tow trucks. In two instances, first responders had to physically move a Waymo themselves. Mayor Daniel Lurie said he called Waymo co-CEO Tekedra Mawakana directly and demanded the company get the stalled vehicles off the roads immediately.
"We traversed more than 7,000 dark signals on Saturday, but the outage created a concentrated spike in these requests," Waymo said in a statement after the outage. The company has since said it is improving its systems to handle mass-outage scenarios faster.
At any given time, Waymo's fleet of roughly 3,000 vehicles is monitored by around 70 remote assistance workers, about half based in the U.S. and half in the Philippines, according to TechCrunch. Median one-way latency for those workers is roughly 150 milliseconds for U.S.-based operations and 250 milliseconds for operations abroad — fast enough for routine decisions, but not always fast enough for situations where context is ambiguous and stakes are high.
That ambiguity has had real costs. In January, a Waymo in Austin asked a remote assistance worker to confirm whether a nearby school bus was loading or unloading children. The bus's stop sign and flashing lights were deployed, but the worker told the robotaxi it could proceed, and the Waymo drove past the bus as children were boarding. The National Transportation Safety Board is investigating that January 12 incident, as well as a January 14 incident involving a Waymo and a special-needs school bus on a different route. Both are part of a broader NTSB inquiry that also includes a January 23 collision in Santa Monica in which a Waymo struck a nine-year-old girl who ran into the street toward her school from behind a double-parked SUV; the vehicle slowed from 17 mph to under 6 mph before contact, the NTSB said.
The Austin Independent School District recorded 19 instances of Waymo robotaxis illegally passing stopped school buses since the start of the 2025-2026 school year, according to Reuters. Waymo issued a voluntary software recall affecting more than 3,000 vehicles after the incidents. The company said it issued an earlier software update that it believed would resolve the school bus issue; five more incidents occurred in November after that update, the district said.
In August, a Waymo robotaxi tried to pass stopped traffic on the shoulder of Interstate 280 near Redwood City, California, while firefighters were responding to an incident. It got stuck. Roughly 30 minutes after Waymo called 911, a California Highway Patrol officer got behind the wheel and drove the robotaxi to a nearby park-and-ride lot. During that wait, officers were mistakenly told for about 10 minutes that Waymo wanted the passenger to drive the vehicle away. The miscommunication was only resolved when Waymo called 911 a second time.
A first responder in Atlanta had to disengage a Waymo in February after it drove into an active crime scene, before one of the company's roadside assistance workers retrieved it. This week in Nashville, a police officer manually drove a Waymo robotaxi away after it got stuck in an intersection.
Mary Ellen Carroll, executive director of the San Francisco Department of Emergency Management, attended a March 2 hearing on the December blackout and was direct. "They're becoming a default roadside assistance for these vehicles, which we do not think is tenable," she said.
District Supervisor Bilal Mahmood, who oversaw the hearing, said Waymo did not provide the answer his office was looking for. "I was asking: how are you going to take more accountability to ensure that our first responders are not doing that?" he told TechCrunch.
Waymo has trained more than 30,000 first responders globally on how to interact with its robotaxis, the company said, and it has developed a dedicated first responder hotline. The company says its safety record is genuine and improving. Waymo provided more than 400,000 paid rides per week as of early 2026, according to TechCrunch, and served more than 14 million trips in 2025 — more than tripling the prior year's total and surpassing 20 million lifetime rides, according to its own year-in-review post. The company is expanding to 10 U.S. cities and is targeting more than 20 additional cities in 2026, including Tokyo and London, according to its blog.
The tension here is not really about whether Waymo's cars drive better than humans. By the numbers the company publishes, they probably do. The tension is about what happens at the edge cases — the dark intersection, the ambiguous school bus, the mass shooting, the stuck vehicle — and who absorbs the cost of those edge cases. Right now, the answer in too many documented instances is: a 911 dispatcher on hold, a police officer behind the wheel, a firefighter flagging down a robotaxi on a highway shoulder.
Waymo says it is improving. The December outage led to operational changes, the company said. The school bus software has been updated. The first responder training program is real and scaling. But 70 remote operators watching 3,000 vehicles across multiple cities, with a global expansion plan that could double or triple those numbers in the next year, is a ratio that will keep producing exactly the kind of moments described above — moments where the robot is in the way of the response, not part of it.
The Austin shooting did not become a story about the blocked ambulance. The Santa Monica collision did not become a story about the vehicle that hit a child. But as Waymo scales, the edge cases don't average out — they multiply. The question for the company's expansion plans is not whether its software is good enough. It may well be. The question is whether its human infrastructure is ready for what happens when it isn't.