---
Active Member
- Joined: Jun 23, 2014
- Messages: 371
- Reaction score: 215
- Location: Vancouver
- Country: Canada
- Dash Cam: A118C
> This couldn't possibly be the pedestrian's fault though?
Absolutely agree, it's the pedestrian's fault for crossing the street without regard for traffic, and they paid the ultimate penalty.
However, this collision could have been very substantially mitigated, and possibly (probably) prevented entirely, if only the car had been paying proper attention.
So yes, it's the pedestrian's fault, but this is something that Uber should certainly learn from and adjust their cars' programming so that they don't needlessly kill more people.
The dead person here is easily written off-- a homeless person with dark clothing jaywalking at night, and they aren't around anymore to complain. But what if it was a kid chasing a ball, and the car never even hit the brakes? And the car kept driving, with a horrified Uber customer in the backseat, still taking them to their destination? Or a cyclist riding in the traffic lane, and the car simply failed to notice them, ran over them, and kept going...
With the car completely failing to notice this pedestrian, when it's got a full suite of sensors that can see in the dark and should have detected a slow-moving obstacle, the implications of what could happen with a large fleet of these let loose on the streets are horrific.
One article I was reading today said that human-driven cars in the USA average 1 death per 86 million miles. Self-driving cars are estimated to have driven only about 10-15 million miles in total, and they've already got their first death.
I'm not even remotely anti-self-driving-car. But this is a type of collision that should never have happened. The suite of sensors should have been tested and calibrated against a large number of simulated obstacles moving quickly and slowly on a closed course. Tested by day, tested by night, tested with the sun ahead, tested with the sun behind, tested without car headlights, tested with various types of overhead lights... It was evidently not sufficiently tested. And that's what scares me the most.
I think Uber is rushing this onto the street, and they are probably aware that they haven't tested it properly, but they're doing it anyway. They rationalize that it's OK, because there's a human behind the wheel. But the human's bored, and not paying attention. As a result, an unproven, inadequately-tested car is driving itself without effective supervision. It's as safe as a drunk driver, but maybe with better lane-keeping abilities.