
To quote the ad campaign for Westworld, the 1970s science-fiction flick about an android amusement park gone haywire: “Nothing can go wrong.”
Sure.
Robot-taxi firm Waymo is facing fresh scrutiny after one of its driverless vehicles blocked an ambulance responding to Austin’s mass shooting early Sunday morning, Axios reports, citing bystander video of the incident.
The clip shows one of the company’s distinctive white taxis stopped and completely blocking the road as an EMS crew tries to get through. Eventually, the Waymo vehicle moves along after a police officer approaches and engages with its speaker system.
Austin-Travis County EMS spokeswoman Capt. Christa Stedman told Axios the officer’s actions “quickly” rerouted the vehicle, permitting the ambulance to get to the site of the shooting, which left three dead and more than a dozen injured.
Even so, it’s not the first time the tech company — which formally launched service in San Antonio and three other metros roughly a week ago — has racked up complaints for failing to yield to emergency response vehicles.
Waymo, owned by Google parent company Alphabet, also issued a voluntary software recall late last year in response to a federal inquiry into whether its rides were illegally passing school buses.
The Austin EMS spokeswoman may have appeared unperturbed by the latest Waymo incident, but plenty of folks online expressed alarm after seeing the clip.
“@Waymo y’all need to get out of Austin until you correct this shit,” tweeted one person who shared video of the malfunction. “During a mass shooting last night your vehicle was in the middle of the road sideways blocking an ambulance.”
Watchdog group Consumer Reports wasn’t amused either.
In an emailed statement, Consumer Reports Senior Policy Analyst Cooper Lohr said Waymo should be required to prove its autonomous cars stay out of the way of emergency responders before they’re deemed safe for the streets.
“Blocking an ambulance during any emergency scenario, and especially during a mass-casualty response, is an unacceptable operational failure that could lead to additional lives lost,” Lohr said. “If Waymo or another company’s autonomous driving system can’t handle flashing lights and sirens in a crisis, it isn’t ready for public roads, and it should be removed from service until the company proves it will handle the situation appropriately.”
