Autonomous Uber kill vehicle (Dash Cam)

For the most part, they don't even have self-driving trains and trams yet, so how can it be considered viable on the road?
One exception here is the DLR, but even there I wouldn't be surprised if a control centre is monitoring it remotely, ready to shut it down in an emergency.
 
self-driving trains

We've had fully automated trains here since 1985! https://en.wikipedia.org/wiki/SkyTrain_(Vancouver)

Mind you, it's grade-separated along its entire route, so it never has to deal with cars crossing, and it's got a full control center monitoring the network. If an incident causes the system to go into shutdown mode, it can require personnel to manually restart each train. But it's never had a crash.
 
Quite possibly. If the hardware didn't react and the 'driver' wasn't watching the road then something is dangerously wrong.

Not entirely. But mostly.

There should be an investigation. All at fault should be named and shamed. The pedestrian should be at the top of the list. Denying the pedestrian's guilt because she died can only send the wrong message, encouraging other pedestrians to act irresponsibly and get killed.

I don't think it's nice at all to pretend a dead person was blameless, not if it leads to further deaths. And it does. It has become socially acceptable for pedestrians to go in the road without looking. We need to change this.


In the UK we have a rule that if a pedestrian has already started to cross a road then the pedestrian has right of way, since a pedestrian can't give way to a car that is not visible. It's not quite the same situation, but it is mentioned in rule 170 of the Highway Code:
"watch out for pedestrians crossing a road into which you are turning. If they have started to cross they have priority, so give way"
I think the same applies if you come around a corner and a pedestrian has already started to cross, though I'm not sure where that is written.

In the case of the Uber car, the pedestrian had already started to cross at the start of the video. Both the pedestrian and the car could have avoided the accident, but it looks like the pedestrian, having seen the car coming, expected it to slow a little to give her time to complete the crossing, as a car normally would; by the time she realised it was continuing at full speed, it was too late.

I don't think this accident would have happened with a human driver; a human would have just slowed a little to give her extra time, so I see the fault more with the car than with the pedestrian. A pedestrian making a mistake doesn't give the car the right to kill them unnecessarily. The mistake was the pedestrian's fault (assuming the car was visible when she started to cross); the death was the car's fault.

Of course the law is different there so I have no idea what the outcome will be.
 
but it looks like the pedestrian, having seen the car coming, expected it to slow a little to give her time to complete the crossing, as a car normally would; by the time she realised it was continuing at full speed, it was too late.

Homeless people know that cars will yield to them. They are like crows; they won't get hit. That's why the bum was walking across so nonchalantly. They know people care more about their cars and wouldn't want the damage. Typical behavior by bums in the States.
 
Nigel, do you have any references on pedestrians always having right of way (priority?) when crossing?

The rule you mention ONLY applies at junctions, and motorists are ONLY required to yield AFTER pedestrians have started to cross.

Pedestrians do NOT have the right to walk into the road when it is unsafe. Far too many people believe they are entitled to. It isn't true, and it infuriates me when I see it happening.

 
Nigel, do you have any references on pedestrians always having right of way (priority?) when crossing?

As far as I am aware, the laws on road use do not generally affect pedestrians, with some exceptions like motorways. The law is mainly for motorised vehicles, which have a duty of care to all other road users. So pedestrians don't have right of way; however, the driver of the vehicle must not put them in danger under any circumstances.

Rules 205, 206 and 207 of the Highway Code give guidance (I've sketched rough numbers after the list), such as:
  • "Drive carefully and slowly when ... approaching pedestrians on narrow rural roads without a footway or footpath. Always slow down and be prepared to stop if necessary, giving them plenty of room as you drive past."
  • "children and older pedestrians who may not be able to judge your speed and could step into the road in front of you. At 40 mph (64 km/h) your vehicle will probably kill any pedestrians it hits. At 20 mph (32 km/h) there is only a 1 in 20 chance of the pedestrian being killed. So kill your speed"
  • "older pedestrians who may need more time to cross the road. Be patient and allow them to cross in their own time. Do not hurry them by revving your engine or edging forward"
You could try reading the Road Traffic Act: https://www.legislation.gov.uk/ukpga/1988/52/introduction
but it won't help much, since the important bits come from case law. The important thing is that all road users have a duty of care to each other, and a duty to do whatever they can to avoid an accident; however, the law does target vehicle drivers above all other road users.

In most of Europe it is presumed that the vehicle driver is at fault for an accident involving a pedestrian or cyclist, even if there is a lack of evidence. In the UK it is not presumed.
 
It has become socially acceptable for pedestrians to go in the road without looking. We need to change this.
It has become socially acceptable because pedestrians imposed that social acceptance on road users, backed by the implicit duty of pedestrian protection that road users are bound by, even when said pedestrians are in the wrong or breaking the law. We need to change this too.
 
Likewise, I can envision situations where a human driver could easily determine how to proceed but an AI could have difficulty. An obstruction (a dead animal?) on a two-lane roadway with double yellow center line markings could easily send the AI into 'does not compute' mode; see the sketch below. JMO.
There are far too many situations where AI starts to mean Artificial Idiocy for it to be trusted as infallible in motorized vehicles.
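As a minimal sketch of that failure mode, assuming a purely rule-based planner (everything below is invented for illustration and has nothing to do with any real AV stack):

```python
# Hypothetical, invented example of a rule-based planner deadlocking.
from dataclasses import dataclass

@dataclass
class Scene:
    obstacle_in_lane: bool    # e.g. a dead animal blocking our lane
    oncoming_traffic: bool
    double_yellow_line: bool  # crossing it is a hard "never" rule

def plan(scene: Scene) -> str:
    if not scene.obstacle_in_lane:
        return "continue"
    # A human would briefly straddle the markings when the road is clear;
    # a hard-coded rule forbids that option outright.
    if not scene.double_yellow_line and not scene.oncoming_traffic:
        return "nudge around obstacle"
    return "stop and wait"  # the 'does not compute' outcome

# Dead animal, empty oncoming lane, double yellow: the car just sits there.
print(plan(Scene(obstacle_in_lane=True,
                 oncoming_traffic=False,
                 double_yellow_line=True)))   # -> "stop and wait"
```

A human treats the double yellow as advisory in that one situation; a literal-minded rule engine has no such escape hatch unless someone anticipated the case.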
 
It has become socially acceptable because pedestrians imposed that social acceptance on road users, backed by the implicit duty of pedestrian protection that road users are obliged to, even if said pedestrians are in the wrong or breaking the law. We need to change this too.
+1. The politicians and courts have created an environment where someone can be rewarded for being in the wrong by making innocent people legally liable for the stupidity, negligence and inconsideration of others. :banghead:
 
All right, after this tragic accident there have been many articles about it. I highly recommend you read each and every one of them:

Video suggests huge problem with Uber's driverless car program

Police chief said Uber victim “came from the shadows”—don’t believe it

Leaked data suggests Uber self-driving car program may be way behind Waymo

Inside Uber’s self-driving car mess

The only TL;DR that I can make is that Uber cut so many corners on road safety while testing their autonomous cars that a tragedy like the one in Arizona was bound to happen at some point.

A good comment on ArsTechnica from a few years ago: "It takes a special kind of incompetence to make Unreal Engine 3 run so poorly on modern hardware" [when referring to Batman Arkham Knight PC Port]. I can only adapt it for this case: "It takes a special kind of incompetence to make a good AEB system from Volvo fail with tragic consequences".


That's why there are redundant systems, and pretty much every other autonomous car being developed takes them into consideration. Otherwise their life-saving efficiency won't be better than that of actual humans driving. This isn't a case in which just one system failed:

Had the RADAR not picked her up, there was LIDAR. Had the LIDAR failed, there were cameras below it to look for traffic and pedestrians. Had even the cameras failed, the safety driver was supposed to pay attention to any dangerous situation and prevent it, much like a human driving instructor supervising a rookie driver. Four lines of defense to prevent accidents and improve road safety, and all of them fell like dominoes. This is not the safe future we want from self-driving cars, and Waymo/GM/Ford/etc. seem to have it figured out, while Uber decided to compensate heavily with more miles covered in friendly places where they aren't obliged to report disengagements, reports that actually help the industry as a whole get better self-driving cars.
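A toy sketch of that layered idea (the detector names, thresholds and frame format are all invented; this reflects the concept, not Uber's actual stack):

```python
from typing import Callable, List, Optional

# Each layer independently reports a hazard (or None); any single hit
# should be enough to trigger braking.
Detector = Callable[[dict], Optional[str]]

def radar(frame: dict) -> Optional[str]:
    return "pedestrian" if frame.get("radar_return") else None

def lidar(frame: dict) -> Optional[str]:
    return "pedestrian" if frame.get("lidar_points", 0) > 50 else None

def camera(frame: dict) -> Optional[str]:
    return "pedestrian" if frame.get("camera_box") else None

def safety_driver(frame: dict) -> Optional[str]:
    return "pedestrian" if frame.get("driver_watching") else None

def first_hazard(frame: dict, layers: List[Detector]) -> Optional[str]:
    for layer in layers:
        hit = layer(frame)
        if hit:
            return hit   # one alert is enough to brake
    return None          # every layer missed: the domino scenario

frame = {"radar_return": False, "lidar_points": 0,
         "camera_box": None, "driver_watching": False}
print(first_hazard(frame, [radar, lidar, camera, safety_driver]))  # None
```

If the layers were truly independent, each missing with probability p, all four failing at once would happen with probability p^4; the crash suggests the failures were anything but independent.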

As such, unless they change their approach to how they test autonomous cars [including analysing their safety drivers' bad habits], the Bolty McBoltFace self-driving car will end up transporting passengers from point A to point B in a much safer manner. Yep, no joke: one of the autonomous cars in the GM fleet testing in California is nicknamed that way. You can find it by looking at their mileage/disengagement reports.
 
+1. The politicians and courts have created an environment where someone can be rewarded for being in the wrong by making innocent people legally liable for the stupidity, negligence and inconsideration of others. :banghead:
And car manufacturers went about the problem the wrong way: instead of putting pressure on law makers to make stricter rules and put harsher punishments on bad pedestrian behavior in place, they started developing systems to detect the "zombies" and prevent collisions, giving even more credit to the pedestrians' notion that it's the road users who have to watch out for them and not the other way around. :banghead::banghead:
 
The image of the accident scene shows that the cyclist probably couldn't see the car's headlights when she started to cross the road, and there was plenty of room to drive around her. The accident was just beyond the white sign in the centre of the image:

[attached image of the accident scene]
 
And so close to a pedestrian crossing
 
And so close to a pedestrian crossing
Yes, the car should have been slowing down and preparing to stop if necessary instead of continuing at lethal speed.
 
I think they should license Roomba's technology of stopping when you bump into something; the car didn't even stop :watching:

[gif: cat on a Roomba]
 
The image of the accident scene shows that the cyclist probably couldn't see the car headlights when she started to cross the road,...
All the more reason she should not have been crossing there.
 
More autonomous killing machines. This happened about 15 minutes away from the Tesla factory here in Silicon Valley.

Tesla's autopilot sent a dude on his way to work crashing into a highway barrier, then he was struck by two trailing vehicles. The victim had allegedly complained that the hexed autopilot machine was trying to send him to an early grave via the same barrier, day in, day out. Don't know why he continued to use the autopilot or if the autopilot took over automatically and doomed him.

 
Yeah... we all know that people are capable of being less than intelligent. This may be a good example of such (#77).
 
Let's just say that if our actual progress as humans were mirrored 1:1 by what we think of ourselves, we would be living in a Jetsons world by now.

Even Einstein outlined the problem by saying that asking the same question over and over and expecting different answers is not smart, and that's pretty much what we do over and over in some ways.
 
Don't know why he continued to use the autopilot or if the autopilot took over automatically and doomed him.

Turns out the autopilot was on and did launch him into the barrier. The machines are now using humans as crash test dummies :stig:


https://www.sfgate.com/business/article/Tesla-Says-Driver-s-Hands-Weren-t-on-Wheel-at-12795521.php

Tesla Says Driver's Hands Weren't on Wheel at Time of Accident

Tim Smith and Dana Hull, Bloomberg
Published 8:52 pm, Friday, March 30, 2018

(Bloomberg) -- Tesla Inc. said computer logs of the Model X vehicle involved in a fatal crash a week ago showed the driver didn’t have his hands on the steering wheel for six seconds before the accident.

“The driver had received several visual and one audible hands-on warning earlier in the drive,” Tesla said. “The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”
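For scale, a quick sanity check on those figures (assuming the five seconds and 150 meters both describe the run-up to the barrier):

```python
# 150 m of unobstructed view covered in ~5 s implies the closing speed.
v = 150 / 5                              # m/s
print(f"{v:.0f} m/s = {v * 3.6:.0f} km/h = {v / 0.44704:.0f} mph")
# -> 30 m/s = 108 km/h = 67 mph
```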
 