Waymo Recalls Robotaxi Software After Vehicles Ignore School Bus Stop Signs
- Waymo is issuing a voluntary software recall after its robotaxis were caught illegally passing stopped school buses in Atlanta and Austin.
- Austin School District officials report 19 violations this year, several of which occurred even after Waymo deployed a software fix in November.
- US regulators at the NHTSA have intensified their investigation and are demanding detailed technical data on how the autonomous system detects and responds to school buses.
*Image from WSJ*
The intersection of autonomous technology and public safety has reached another critical flashpoint. Waymo has announced plans to voluntarily issue a software recall following alarming reports about how its robotaxis behave around school buses. The Alphabet-owned company confirmed that it will file the necessary paperwork with federal regulators early next week. The move comes as the company attempts to quell growing anxiety among parents and officials about the safety of self-driving vehicles in school zones.
The specific issue centers on the vehicles' ability to recognize and respect the traffic laws unique to school buses. Waymo states that it identified the problem and deployed a software update to its fleet on November 17. The company argues that this update has already improved performance to a level that exceeds human driving capability in similar scenarios. However, the formality of a federal recall is still required to satisfy regulatory obligations and ensure public transparency.
This decision was not made in a vacuum. It follows a period of intensified scrutiny from the National Highway Traffic Safety Administration, which has been watching the deployment of these vehicles closely. Criticism has also mounted from local officials in cities like Atlanta and Austin, where the robotaxis operate on public roads. Pressure from these municipalities has likely accelerated Waymo's decision to formalize the fix.
The catalyst for the current investigation appears to be a specific incident in Atlanta. The NHTSA's Office of Defects Investigation opened a file in October after reviewing footage of a Waymo vehicle maneuvering erratically around a stopped school bus. The bus had its stop arm extended and its lights flashing while unloading children. Despite these clear visual warnings, the autonomous vehicle crossed in front of the bus from the right side and then turned left around it.
This maneuver is a fundamental violation of traffic safety norms. Crossing the path of a stopped school bus is considered one of the most dangerous actions a driver can take. The computer vision system seemingly failed to interpret the "do not pass" command that is universally understood by human drivers. While no injuries occurred in this instance, the potential for tragedy was obvious to anyone who viewed the footage.
The situation is further complicated by data coming out of Texas. The Austin School District has been tracking the performance of these vehicles with growing frustration. District officials claim that Waymo robotaxis have illegally passed school buses 19 times this year alone. Even more concerning is the assertion that at least five of these incidents occurred after the company updated its software in November. This discrepancy suggests the problem may be harder to solve than initially thought.
Regulators in Washington are not taking these reports lightly. The NHTSA sent a detailed letter to Waymo on December 3 demanding more information about its fifth-generation self-driving system. The agency wants to understand exactly how the software identifies school buses and what logic dictates the vehicle's behavior when it encounters them. This level of granular oversight indicates that the "trust us" era of autonomous vehicle testing is ending.
Waymo Chief Safety Officer Mauricio Peña issued a statement defending the company's broader record. He noted that Waymo vehicles experience twelve times fewer injury crashes involving pedestrians than human drivers. However, he also admitted that maintaining the highest safety standards means recognizing when behavior falls short. He framed the recall as part of a commitment to continuous improvement rather than an admission of fundamental failure.
The concept of a "recall" in the modern automotive era has evolved significantly. In the past, a recall meant physically returning a car to a dealership for repairs. Today it often involves an over-the-air software update that happens while the vehicle sits in a garage. These digital fixes are becoming standard procedure for modern passenger vehicles and robotaxis alike.
This is not the first time Waymo has had to correct its code. The company issued a previous voluntary recall in 2024 following a bizarre incident in Phoenix, in which a driverless vehicle collided with a telephone pole in an alley during a low-speed pullover maneuver. These incidents highlight the difficulty of programming a machine to handle the infinite variability of the real world.
The lack of injuries in the school bus incidents is a relief, but it does not absolve the company of responsibility. Safety advocates argue that near misses are warning signs that must be heeded before a fatal accident occurs. The fact that these vehicles are struggling with large yellow buses with flashing lights raises questions about their ability to handle more subtle hazards.
The disconnect between Waymo's confidence and the Austin School District's data is a key point of friction. If the software was updated in November and violations continued in December, then the fix may be insufficient. That gap fuels public skepticism and gives ammunition to critics who believe the technology is being deployed too aggressively.
Waymo has promised to continue investigating the issue and to implement further updates as needed. The company is tracking the fleet's performance to ensure the new logic holds up in complex urban environments, and it remains steadfast that safety is its top priority even as it navigates this public-relations turbulence.
For now, the robotaxis will continue to operate. But they do so under a microscope. Every interaction with a school bus will be watched by cameras, regulators, and concerned parents. Waymo is betting that its code is now robust enough to handle the morning school run without causing another national headline.
