Why Waymo’s Self-Driving Cars Struggle with School Buses

Summary
– Waymo is under federal investigation after its robotaxis repeatedly failed to stop for school buses in Austin, a violation that occurred at least 24 times despite a software recall intended to fix the issue.
– The company’s safety record was further compromised when one of its vehicles struck a child in Santa Monica, resulting in minor injuries, the first of these incidents to involve an actual collision.
– Experts link the safety problems to Waymo’s shift toward “end-to-end” AI driving systems, which aim for more human-like behavior but may struggle with the complex judgment required in chaotic school zones.
– Despite requests from Austin school officials to halt operations near schools during bus hours, Waymo has refused to pause its robotaxi service in these sensitive areas.
– These incidents have sparked multiple federal investigations and could damage public trust, potentially leading other cities to reconsider allowing Waymo’s vehicles on their streets.
Alphabet’s autonomous vehicle subsidiary, Waymo, has long promoted a reputation for safety and careful operation. However, a persistent and troubling pattern of failures in school zones is challenging that image. The company is grappling with repeated incidents where its robotaxis fail to properly stop for school buses, a serious legal violation across the United States. This issue emerges just as Waymo plans significant expansions, raising urgent questions about the readiness of its technology for complex, real-world environments where children are present.
A federal safety investigation began in December after officials in Austin, Texas, reported numerous violations. Waymo responded with a software update intended to correct the problem, but the fix proved insufficient. The Austin Independent School District has documented at least four new incidents since the update, bringing the total to over two dozen reported violations this school year. In one January event, a Waymo vehicle was recorded driving past a bus with its stop arm extended as children waited to cross. While Waymo notes no collisions resulted from the Austin bus incidents, the company recently acknowledged a separate event in Santa Monica where one of its vehicles struck a child, causing minor injuries.
Safety experts find these events alarming, especially since Waymo has publicly aimed to make its vehicles drive in a more “confidently assertive” manner. This shift away from ultra-cautious behavior risks the AI adopting dangerous human driving habits. The unpredictable nature of school zones presents a formidable challenge: between 2000 and 2023, illegal passing of school buses led to 61 fatalities, nearly half of them pedestrians under 18. Navigating these areas requires a blend of strict rule-following and intuitive judgment, a combination that automated systems find difficult. Children may dart out unexpectedly, and buses can create visual obstructions, demanding a level of situational awareness that current technology may lack.
Some analysts suggest the root cause could be a technical shift in Waymo’s approach. There is speculation that the company is moving toward “end-to-end” learning systems. Unlike older, modular designs with explicit safety rules, these systems use a single AI model trained on vast datasets of human driving. While this can produce more natural-seeming behavior, critics argue it is a nascent technology that might increase risk in high-stakes scenarios like bus stops by making the vehicle’s decision-making process less transparent and predictable.
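The distinction the critics draw can be sketched in a few lines. The toy code below is purely illustrative and does not reflect Waymo’s actual software; every name in it is hypothetical. It shows why a modular design can hard-code a legal guarantee (stop for an extended stop arm) while an end-to-end policy stops only if its training data leads it to:

```python
# Illustrative sketch only: contrasting a modular planner that layers an
# explicit safety rule on top of a learned model with an end-to-end policy
# that has no such rule. All names are hypothetical, not Waymo's system.

from dataclasses import dataclass


@dataclass
class Scene:
    bus_stop_arm_extended: bool    # hypothetical perception output
    model_speed_suggestion: float  # what a learned policy proposes, in m/s


def modular_plan(scene: Scene) -> float:
    """Modular design: a hand-written rule overrides the learned suggestion."""
    if scene.bus_stop_arm_extended:
        return 0.0  # explicit, auditable rule: always stop for the bus
    return scene.model_speed_suggestion


def end_to_end_plan(scene: Scene) -> float:
    """End-to-end design: the single model's output IS the plan.

    Whether the vehicle stops depends entirely on what the training
    data taught the model; there is no separate rule to enforce it.
    """
    return scene.model_speed_suggestion


if __name__ == "__main__":
    scene = Scene(bus_stop_arm_extended=True, model_speed_suggestion=8.0)
    print(modular_plan(scene))     # 0.0 — the rule guarantees a stop
    print(end_to_end_plan(scene))  # 8.0 — nothing forces a stop
```

The point of the sketch is the transparency argument from the paragraph above: in the modular version, the stopping behavior lives in one inspectable line of code, whereas in the end-to-end version it is distributed across model weights and can only be verified empirically.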
In response to the ongoing problems, Austin school officials requested that Waymo voluntarily suspend its robotaxi operations near schools during morning and afternoon bus hours until a reliable solution is found. The company declined this request, a decision that safety specialists have called shortsighted and at odds with its safety-centric branding. Experts argue that pausing operations for roughly 90 minutes each school day would be a minor concession to prevent potential tragedies. One prominent researcher stated that by continuing to operate, Waymo is making a conscious choice to gamble with children’s safety.
The concentration of reported incidents in Austin may be partly due to the city’s installation of cameras on school bus stop arms, which automatically detect violations. Similar problems could be occurring in other cities without being documented. These persistent issues, combined with the California incident, have now triggered multiple federal investigations. While regulatory action in Texas remains uncertain, the negative publicity and ongoing probes could make other cities hesitant to approve Waymo’s services. The company’s current course risks not only regulatory repercussions but also a significant erosion of public trust, which it has spent years carefully building.
(Source: The Verge)

