
Waymo’s robotaxis have ignited safety alarms by repeatedly ignoring stopped school buses in Texas and Georgia, prompting a voluntary software recall and a federal probe into the technology’s reliability.
Austin school bus cameras first captured the violations in August 2025, with the Austin Independent School District logging 12 by October and alerting the National Highway Traffic Safety Administration. These weren’t one-off errors: incidents piled up several times a week across multiple sites. By early December, Austin tallied 20 cases, while Atlanta schools reported six more. Even Waymo’s November 17 software update failed to halt them, with five additional violations in Austin after that date, the last on December 1.
A Fundamental Traffic Law Breached

Every U.S. state mandates a full stop for vehicles approaching a school bus with flashing red lights and an extended stop arm, a safeguard against injuries to children boarding or exiting. Waymo’s fifth-generation autonomous system flouted this core rule time and again, fueling doubts about its fitness for busy urban streets. No injuries resulted, but the pattern exposed gaps in the system’s grasp of basic road protocol.
Escalating Demands from Local Authorities

Austin officials fired off formal letters in October, pressing Waymo for fixes. Despite claimed improvements, violations continued through November 20. The district’s legal team urged a halt to operations during peak school hours (5:20 a.m. to 9:30 a.m. and 3 p.m. to 7 p.m. on weekdays), but Waymo declined. Austin ISD Police Chief Wayne Sneed voiced frustration, stressing the need to curb known risks to students. By early December, the district had issued 20 citations totaling $2,100 in fines and was weighing further legal steps.
Near-Misses and Clear Evidence

A November incident on Austin’s South First Street showed a Waymo vehicle slowing near a stopped bus, then accelerating past as a student crossed into its path. In Atlanta, a September 22 video captured a robotaxi passing a bus’s extended stop arm in daylight, red lights flashing, as students exited. The bus-camera footage left little room for dispute, amplifying fears that the system not only failed to stop but also misjudged how close children were.
Federal Scrutiny and Company Response

NHTSA launched a probe after Austin’s alert, sending Waymo a December 3 letter demanding detailed records by January 20, 2026, on bus detection, signal interpretation, and decision-making, along with violation videos and test data. On December 6, Waymo filed its third recall in under two years, a software recall covering its fleet. Chief Safety Officer Mauricio Peña noted that the company experiences “twelve times fewer injury crashes involving pedestrians than human drivers,” framing the recall as a proactive step to improve performance. Waymo maintained that no students were directly endangered, citing adequate distances. Yet its prior recalls (444 vehicles in February 2024 after truck collisions in Phoenix, and more than 670 in June 2024 after a pole strike) hinted at recurring object-recognition flaws.
Ongoing Concerns Beyond Buses

Austin police also flagged Waymo vehicles disregarding officers’ emergency lights and hand signals at intersections, pointing to broader lapses in emergency protocol recognition. Undeterred, Waymo eyed 2026 expansions to Dallas, Houston, San Antonio, Miami, and Orlando, even as regulators circled.
These episodes underscore persistent hurdles for self-driving tech in obeying unambiguous rules humans follow instinctively, testing public trust and regulatory patience. NHTSA’s review could spur wider recalls or curbs, while districts push enforcement—shaping whether Waymo’s ambitions advance or stall amid demands for ironclad child safety.
Sources:
USA Today, December 9, 2025
CBS News, December 5, 2025
PC Magazine, December 8, 2025
NHTSA preliminary investigation, October–December 2025
Historical analysis of autonomous vehicle incidents, 2016–2025
ABC News, CNN, and CBS News reporting, December 4–8, 2025