One of the many purported benefits of self-driving car technology is that each car can learn from one car's mistakes. Here's how Waymo puts it on its website: "The Waymo Driver learns from the collective experiences gathered across our fleet, including previous hardware generations."
But in Austin, Waymo's vehicles struggled for months to learn how to stop for school buses as drivers picked up and dropped off children. An official with the Austin Independent School District (AISD) alleged that the vehicles had, in at least 19 instances, "illegally and dangerously" passed the district's school buses while their red lights were flashing and their stop arms were extended, rather than coming to complete stops as the law requires.
In early December, Waymo even issued a federal recall related to the incidents, acknowledging at least 12 of them to federal regulators at the National Highway Traffic Safety Administration (NHTSA), which oversees road safety. According to federal filings, engineers at the self-driving car company had "developed software changes to address the behavior" weeks earlier.
But even after the recall, the school-bus-passing incidents continued, according to school officials and a report from the National Transportation Safety Board (NTSB), an independent federal safety watchdog that is also investigating the situation.
Now, email and text messages between school officials and Waymo representatives, obtained by WIRED through a public records request, show the lengths to which the Austin public school district and Waymo went to try to solve the problem. AISD even hosted a half-day "data collection" event in a school parking lot in mid-December, the documents show, with several employees pulling together school buses and stop-arm signs from across the fleet so the self-driving car company could gather information about the vehicles and their flashing lights.
Still, by mid-January, over a month later, the school district reported at least four more school-bus-passing incidents had taken place in Austin. "The data we collected from the beginning of the school year to the end of the semester shows that about 98 percent of people who receive one violation don't receive another," an official with the school district's police department told the local NBC affiliate that month. "That tells us that the person is learning, but it doesn't appear the Waymo automated driver system is learning through its software updates, its recall, what have you, because we're still having violations."
The situation raises questions about self-driving technology's curious blind spots and the industry's ability to compensate for them even after they have been spotted.
Self-driving software has long struggled with recognizing flashing emergency lights and road safety devices with long, thin arms, including gates and stop arms, says Missy Cummings, who researches autonomous vehicles at George Mason University and served as a safety adviser to the NHTSA during the Biden administration. "If [the company] didn't fix this a few years ago, the more they drive, the more it's going to be a problem," she says. "That's exactly what's happening here."
Waymo did not respond to WIRED's requests for comment. A spokesperson for the Austin Independent School District referred WIRED to the NTSB while the incidents are under investigation. A spokesperson for the NTSB declined to answer WIRED's questions while its investigation continues.
Illegal Passing
By midwinter of 2025, AISD officials were frustrated. In one of the 19 incidents alleged by a lawyer for the district in a letter later released by federal road safety regulators, a Waymo passed a school bus letting off children "only moments after a student crossed in front of the vehicle, and while the student was still in the road."
"Alarmingly," the lawyer wrote, five of the alleged incidents had occurred after Waymo had assured the district that it had updated its software to fix the problem. Federal regulators with the NHTSA had already launched a probe into the behavior. "Austin ISD is evaluating all potential legal remedies at its disposal and intends to take whatever action is necessary to protect the safety of its students, if required," the lawyer warned.