Autonomous Uber kill vehicle (Dash Cam)

This couldn't possibly be the pedestrian's fault though?

Absolutely agree, it's the pedestrian's fault for crossing the street without regard for traffic, and they paid the ultimate penalty.

However, this collision could've been very substantially mitigated, and possibly (probably) entirely prevented, if only the car had been paying proper attention.

So yes, it's the pedestrian's fault, but this is something that Uber should certainly learn from and adjust their cars' programming so that they don't needlessly kill more people.

The dead person here is easily written off-- a homeless person with dark clothing jaywalking at night, and they aren't around anymore to complain. But what if it was a kid chasing a ball, and the car never even hit the brakes? And the car kept driving, with a horrified Uber customer in the backseat, still taking them to their destination? Or a cyclist riding in the traffic lane, and the car simply failed to notice them, ran over them, and kept going...

With the car completely failing to notice this pedestrian, when it's got a full suite of sensors that can see in the dark and should have seen a slow-moving obstacle, the implications of what could happen with a large fleet of these let loose on the streets are horrific.

One article I was reading today said that human-driven cars in the USA average 1 death per 86 million miles. Self-driving cars are estimated to have driven about 10-15 million miles, and they've already got their first death.
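For what it's worth, here's the back-of-the-envelope arithmetic behind that comparison, as a rough Python sketch using only the figures quoted above (and with a single fatality, the sample is obviously far too small to be conclusive):

```python
# Rough comparison using the figures quoted above: 1 death per 86 million
# human-driven miles, versus roughly 10-15 million autonomous miles and 1 death.
# Illustrative only, not authoritative statistics.
human_miles_per_death = 86e6

av_miles_low, av_miles_high = 10e6, 15e6
av_deaths = 1

rate_low = av_deaths / av_miles_low      # deaths per mile, lower mileage estimate
rate_high = av_deaths / av_miles_high    # deaths per mile, higher mileage estimate

print(f"Human-driven: {1 / human_miles_per_death:.2e} deaths per mile")
print(f"Self-driving: {rate_high:.2e} to {rate_low:.2e} deaths per mile")
print(f"Roughly {human_miles_per_death / av_miles_high:.0f}x to "
      f"{human_miles_per_death / av_miles_low:.0f}x the human rate so far, "
      "based on one data point")
```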

I'm not even remotely anti-self-driving-car. But this is a type of collision that should never have happened. The suite of sensors should have been tested/calibrated against a large number of simulated obstacles moving quickly/slowly on a closed course. Tested day, tested night, tested with the sun ahead, tested with the sun behind, tested without car headlights, tested with various types of overhead lights... It was evidently not sufficiently tested. And that's what scares me the most.
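To be concrete about what "sufficiently tested" would mean, just enumerating the combinations shows how big that closed-course matrix gets. This is purely my own sketch; the condition names are made up for illustration, not anything from Uber's actual test plan:

```python
# Hypothetical closed-course test matrix: every combination of lighting,
# sun position, headlight state, overhead lighting and obstacle speed.
from itertools import product

time_of_day    = ["day", "night"]
sun_position   = ["ahead", "behind", "none"]        # "none" covers night/overcast
headlights     = ["on", "off"]
overhead_light = ["none", "sodium", "LED"]
obstacle_speed = ["stationary", "slow (walking)", "fast (running)"]

test_cases = list(product(time_of_day, sun_position, headlights,
                          overhead_light, obstacle_speed))
print(f"{len(test_cases)} combinations to validate the sensor suite against")
for case in test_cases[:3]:
    print("e.g.", case)
```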

I think Uber is rushing this onto the street, and they are probably aware that they haven't tested it properly, but they're doing it anyway. They rationalize that it's OK, because there's a human behind the wheel. But the human's bored, and not paying attention. As a result, an unproven, inadequately-tested car is driving itself without effective supervision. It's as safe as a drunk driver, but maybe with better lane-keeping abilities.
 
Beijing just gave the OK for testing driverless cars. Pretty scary, I think, with the number of people running around in that town (33 roads / 105 km of public road).

Next up Baidu.
 
Darwin
 
Beijing just gave the OK for testing driverless cars. Pretty scary, I think, with the number of people running around in that town (33 roads / 105 km of public road).
Trust me, the cars are likely to drive better than the people.
 
I agree, but it's also the human factor that worries me.
And knowing China, there will probably be little fuss about a car going rogue and mowing down a whole family... riding on a moped.
 
Beijing just gave the OK for testing driverless cars. Pretty scary, I think, with the number of people running around in that town (33 roads / 105 km of public road).

Next up Baidu.
I won't be surprised if we end up with all cars being driven by either the Google Autonomous Drive System or the Baidu ADS, just as all phones are either Google Android or Apple iPhone. Nobody is going to buy cars that use a system with poor crash statistics, so the car manufacturers are not going to fit a wide range. I don't think the Uber system has a chance.
 
I agree, but it's also the human factor that worries me.
And knowing China, there will probably be little fuss about a car going rogue and mowing down a whole family... riding on a moped.
I've heard that in China it is common, if you injure somebody, to finish them off so that you don't need to pay compensation. I wonder if the Uber car made that calculation: not possible to stop in time, so don't bother?
 
I've heard that in China it is common, if you injure somebody, to finish them off so that you don't need to pay compensation.

There are certainly cases where that has happened; maiming someone is a lot more expensive than killing them.
 
I think they need to test all autonomous vehicles first in India. If they pass the India test, they can be certified for use elsewhere.

 
Clearly the pedestrian has some responsibility, but it is still not acceptable for the car to kill the pedestrian; it should have attempted to avoid the accident, or at least slowed down to avoid the death. The pedestrian was occupying that bit of road first: the front of the car hit the pedestrian rather than the front of the pedestrian hitting the car.
That is a very poor analysis ... and I think you already know that.
The pedestrian created the dangerous situation with a stupid action. The car and driver failed to prevent the collision. The technology failed or was disabled and a serious investigation is needed, but that does not change the cause of the incident.
 
That is a very poor analysis ... and I think you already know that.
The pedestrian created the dangerous situation with a stupid action. The car and driver failed to prevent the collision. The technology failed or was disabled and a serious investigation is needed, but that does not change the cause of the incident.
Should Uber, or their driver, be charged with any offences, or do you think it was entirely the cyclist's fault and they should not even be investigated?
 
I think they need to test all autonomous vehicles first in India. If they pass the India test, they can be certified for use elsewhere.

I can't see any autonomous vehicles in the next 10 years being able to cope with that!

It does raise the question of how they would cope with less disciplined situations in other places.

For example, what if it met a herd of cows on the road here in the UK? Cows can be quite unpredictable when they are using the road to move to a new field, but if they are heading to the milking parlour they are not much of a problem. As a human driver I treat cows, horses and sheep very differently: often there is no problem passing sheep at the speed limit, but horses often require a maximum speed of 10 mph and stopping if they get spooked, and for a herd of cows I pull over and turn the engine off. Dogs? It depends on whether they are on a lead or not; they have zero road sense! Autonomous cars need to understand all these things before you can remove the driver.
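Just to show how many explicit rules those judgements boil down to, here is a toy sketch of them written out as a lookup table. The obstacle classes and numbers are mine, purely for illustration, not anything from a real autonomous-driving stack:

```python
# Toy per-obstacle rules, roughly as a human driver applies them.
POLICIES = {
    "sheep":       {"max_speed_mph": 60, "action": "pass normally"},
    "horse":       {"max_speed_mph": 10, "action": "pass wide, stop if spooked"},
    "cow_herd":    {"max_speed_mph": 0,  "action": "pull over, engine off, wait"},
    "dog_on_lead": {"max_speed_mph": 20, "action": "pass slowly"},
    "dog_loose":   {"max_speed_mph": 5,  "action": "be ready to stop; zero road sense"},
}

def plan_for(obstacle: str) -> str:
    policy = POLICIES.get(obstacle)
    if policy is None:
        # Unknown obstacle class: the only safe default is to slow right down.
        return "unknown obstacle: slow to walking pace and reassess"
    return f"limit speed to {policy['max_speed_mph']} mph, {policy['action']}"

print(plan_for("horse"))
print(plan_for("kangaroo"))   # untrained class -> conservative default
```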
 
Should Uber, or their driver, be charged with any offences,
Quite possibly. If the hardware didn't react and the 'driver' wasn't watching the road then something is dangerously wrong.
or do you think it was entirely the cyclist's fault
Not entirely. But mostly.
and they should not even be investigated?
There should be an investigation. All at fault should be named and shamed. The pedestrian should be at the top of the list. Denying the pedestrian's guilt because he died can only send the wrong message, encouraging other pedestrians to act irresponsibly and get killed.

I don't think it's nice at all to pretend a dead person was blameless, not if it leads to further deaths. And it does. It has become socially acceptable for pedestrians to go in the road without looking. We need to change this.


 
Should Uber, or their driver, be charged with any offences, or do you think it was entirely the cyclist's fault and they should not even be investigated?
IMO there was only one legal infraction in this scenario, and that was by the pedestrian. Were it not for her illegal action(s) the incident would never have happened. For sure an investigation should take place, but only to determine why the vehicle didn't respond.
 
In India a working autonomous car would just sit there going:

WTF
WTF
WTF
WTF

And after a while


Stack overflow... BSOD
And everywhere else in the world, once people learn to recognize the autonomous vehicles. Easiest ... bullying ... ever.
 
And everywhere else in the world, once people learn to recognize the autonomous vehicles. Easiest ... bullying ... ever.
Opens up a whole new world for politicians to create laws giving rights to machines. :eek:
 
The thing that concerns me most about self-driving vehicles is the possibility of a single component failure that undoes all the engineering/planning/programming/etc. I really don't think it's possible to have total redundancy of all components, and all it will take is one failure - say a steering or brake system servo - and total chaos will result. The same could happen with a human-controlled vehicle, but my sense is that a human would be better able to react properly than a logic path implemented in the programming.

Likewise I can envision situations where a human driver could easily determine how to proceed but AI could have difficulty. An obstruction (dead animal?) in a two-lane roadway with double yellow center line markings could easily cause the AI to go into 'does not compute' mode. JMO.
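To make the worry concrete, this is roughly the fallback behaviour I'd want to see, sketched in code. All the names and thresholds here are hypothetical, just to illustrate the "fail safe, not fail silent" idea, not any vendor's actual system:

```python
# Minimal sketch: if a critical component reports unhealthy, or perception
# can't classify what's ahead with confidence, degrade to a minimum-risk
# maneuver instead of carrying on. Hypothetical names and thresholds.
from dataclasses import dataclass

@dataclass
class SystemStatus:
    steering_ok: bool
    brakes_ok: bool
    perception_confidence: float   # 0.0 - 1.0 for the object ahead

def choose_action(status: SystemStatus) -> str:
    if not status.brakes_ok or not status.steering_ok:
        # Single component failure: come to a controlled stop and alert a human.
        return "minimum-risk maneuver: hazard lights, controlled stop, alert operator"
    if status.perception_confidence < 0.8:
        # The 'does not compute' case: an obstruction it can't classify.
        return "slow down, stop short of the obstruction, request human decision"
    return "continue, monitoring"

print(choose_action(SystemStatus(True, True, 0.35)))   # unclassifiable dead animal
print(choose_action(SystemStatus(True, False, 0.99)))  # brake servo failure
```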
 