Self-Driving Uber SUV Kills Woman In Tempe

A self-driving Uber SUV struck and killed a woman in the early hours of Monday morning.

Tempe, AZ – A woman was struck and killed by a self-driving Uber SUV in the early hours of Monday morning.

The New York Times reported that Uber had suspended testing of its self-driving cars in Tempe, Pittsburgh, San Francisco, and Toronto, in the wake of the accident.

An Uber spokeswoman said the company was “fully cooperating” with the local authorities.

The autonomous Uber had a safety driver at the wheel, as required for Uber’s autonomous test vehicles operating on public roads, TechCrunch reported.

Police said that the Uber was driving northbound near Mill Avenue and Curry Road when a woman attempted to cross the street with her bicycle in front of the vehicle, outside of a crosswalk, KNXV reported.

The pedestrian was hit by the Uber, and transported to a nearby hospital where she died, police said.

There were no passengers in the self-driving Uber at the time of the crash, according to KNXV.

"Our hearts go out to the victim’s family. We are fully cooperating with local authorities in their investigation of this incident," Uber said in a statement the company released.

Uber CEO Dara Khosrowshahi tweeted his condolences about the tragic death.

Police have not yet released the name of the victim.

The New York Times reported that the collision was the first time anyone was known to have been killed by a self-driving vehicle on public roads.

This first fatal crash could have far-reaching implications for the future of self-driving vehicles, TechCrunch reported.

It’s not yet known whether the safety driver, who was in the Uber and was supposed to be watching for exactly this sort of problem, could be held legally responsible.

TechCrunch said that the results of the investigation into this incident will shape the future of autonomous vehicle regulation in the United States.

Comments (14)
KorbanDallas

Sometimes unexpected things happen so fast there is no time to react. If the self-driving car could not stop in time, how can you expect the passenger to?

PapawR1

The ONLY thing that might protect the "safety driver" from liability is whether the bicycle rider was riding the bike or pushing it as a pedestrian & what the state laws are in AZ regarding whether a bicycle on a roadway is considered a vehicle. Failure to reduce speed to prevent an accident may be a factor, AND the whole idea of having a "safety driver" at the wheel is to prevent such a thing from happening. If he was reading a book, texting on a phone, sleeping or whatever, then I believe they will find him criminally liable.
Regardless of how the criminal liability portion is ruled, Uber better bring their big checkbook with them to the civil trial. If this woman is a mother, especially to small children, they may need to bring several checkbooks.
Unless they have a lot of politicians in their pockets, don't look for laws allowing these vehicles with no drivers at all in them anytime soon. I realize that is the final goal of the technology, since manned vehicles increase the overhead, but this technology may never be ruled as "safe" & it will never relieve them of liability.

Betts1221

I will do my own driving, thank you!

Just-My-Thoughts

I guess that puts the skids on that idea. They are also trying to use buses. Horrible idea.

Hi_estComnDenomn

People get hit by other people every day. This is an unfortunate incident, but I'm still all for autonomous vehicles. It will make everyone safer and travel more efficient.

BlueMoose

This woman was completely at fault for crossing the street at night and not at the intersection, and from the picture I cannot see any lights on the bike. All the new cars now come with advanced braking systems that are thousands of times faster in responding to a threat than humans. If this system and the human driver failed to see this woman in time, odds are she stepped out right in front of the car with no warning.

Det_John_Kimble

Curious to see the crash report, vehicle speed, area lighting, etc. If the cyclist crossed the street abruptly, maybe even the computer-driven vehicle didn't have time to stop. Or the sensors failed to pick up the cyclist in time. And if it's proven the self-driving car was at fault, I wonder who pays? Uber? The software company? Insurance company sure, but in the event of a civil suit? I hope this technology succeeds and proves to be far safer than human drivers. The number of drivers DUI and texting these days is scary. And just think: no more worrying about traffic tickets, right? If the car is doing the driving, how can the passenger be cited? Gonna be interesting to see what happens with all this tech stuff up and coming. I prefer to drive myself for now, until this proves to be better.

charlesjandecka

The question will always be: Would a human driver have been able to avoid, or lessen, the crash?

OnionMan622

Just because the car is self-driving doesn't mean it's magical. If the woman rode out in front of the car quickly, the car physically may not have been able to stop in time, just like with a human driver. Matter of fact, it probably had a faster reaction time than any human driver would have had in that situation.

JMT33

Maybe it is time we get pedestrians out of the streets, as well as bicycles.

JakeTheCat

"Probably reacted faster" is relevant, but a real human may have seen her coming and may have been ready to adjust if need be. Machines cannot THINK outside their rules... which is what makes self-driving vehicles that are not on set tracks a problem.

ilia

True, but no one advertises humans as being the paragon of safety. On the contrary, self-driving vehicles are promoted as the solve-all of traffic problems; supposedly they will practically eliminate crashes, what with their superior sensors and their not being fallible, like us flawed humans. Obviously, it is all media hype meant to make us fork over the money. They still have a long way to go before they're as safe as they're supposed to be. And a human behind the wheel is no guarantee either; he/she may fall asleep, use the phone or be otherwise distracted, especially if his/her attention is not focused by the very act of driving.

tday1973

What??? That is crazy, thinking an unmanned car is safer than someone who can put on the brakes... Evidently the car didn't have a stop-for-pedestrian program. Unmanned cars are a risk because they don't think; they work on programs.