Waymo Self-Driving Cars Still Fail to Stop for School Buses

▼ Summary
– Waymo’s self-driving vehicles in Austin repeatedly failed to stop for school buses with flashing red lights and extended stop arms, violating traffic laws.
– The company issued a federal software recall after acknowledging at least 12 such incidents to safety regulators, but the problematic passes continued afterward.
– The Austin school district hosted a special data-collection event for Waymo to study the issue, yet a month later, more violations were reported.
– An expert notes that autonomous systems have a known, persistent difficulty in recognizing flashing lights and long, thin objects like stop arms.
– School district officials expressed frustration, noting some incidents occurred after Waymo’s software fix, and are considering legal action to protect students.

A core promise of autonomous vehicle technology is the concept of fleetwide learning, where one vehicle’s experience can be applied to improve the entire system. However, in Austin, Texas, this principle faced a significant test as Waymo’s self-driving cars repeatedly failed to obey a fundamental traffic law: stopping for school buses. According to the Austin Independent School District (AISD), the vehicles illegally passed buses with flashing red lights and extended stop arms on at least 19 occasions, creating dangerous situations during student pick-up and drop-off.
The issue prompted a federal recall in early December, in which Waymo acknowledged at least 12 such incidents to the National Highway Traffic Safety Administration (NHTSA). Company engineers had already developed software updates intended to correct the behavior. Despite this official action and the deployed fixes, the problematic passes continued, as confirmed by school officials and a subsequent report from the National Transportation Safety Board (NTSB).
Internal communications reveal the extensive efforts made to address the failure. AISD even organized a special data-collection event in a school parking lot in mid-December, assembling buses and stop-arm signals so Waymo could gather specific information on the visual cues its vehicles were missing. Yet by mid-January, the district had reported at least four more violations. A school police official noted that while about 98% of human drivers who receive one violation do not repeat the offense, the automated driving system did not appear to be learning: infractions persisted even after the software updates and the recall.
This pattern highlights persistent challenges in autonomous vehicle software, particularly with recognizing certain safety signals. Experts point out that self-driving systems have historically struggled to interpret flashing emergency lights and long, thin objects such as gates and stop arms. “If [the company] didn’t fix this a few years ago, the more they drive, the more it’s going to be a problem,” said Missy Cummings, an autonomous vehicle researcher and former NHTSA safety adviser. “That’s exactly what’s happening here.”
The seriousness of the violations is underscored by specific incidents. In one case described in a letter from a district lawyer, a Waymo vehicle passed a bus “only moments after a student crossed in front of the vehicle, and while the student was still in the road.” Notably, five of the alleged incidents occurred after Waymo had informed the district that a software update had resolved the issue. With an NHTSA probe already underway, the district’s lawyer stated that AISD was evaluating all legal options to ensure student safety.
Waymo did not respond to requests for comment on the ongoing situation. Both AISD and the NTSB have declined to comment further while the federal investigation remains active.
(Source: Wired)