Originally posted by MarkJerling: According to the data extracted from the system, the car 'saw' the pedestrian.
"According to data obtained from the self-driving system, the system first registered radar and LIDAR observations of the pedestrian about 6 seconds before impact, when the vehicle was traveling at 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path. At 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision. According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action."
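The decision flow described in that quote can be sketched in code. This is a hypothetical illustration only, not Uber's actual software; the function name, the 1.3-second threshold, and the return values are assumptions drawn from the quoted report:

```python
# Hypothetical sketch of the decision flow the report describes: the software
# concludes an emergency braking maneuver is needed, but that maneuver is
# disabled while the computer is in control, so the human operator is
# relied on to intervene.
def plan_response(seconds_to_impact, computer_in_control,
                  emergency_braking_enabled=False):
    if seconds_to_impact > 1.3:
        return "track object"      # still classifying / predicting its path
    if computer_in_control and not emergency_braking_enabled:
        return "rely on operator"  # braking disabled under computer control
    return "emergency brake"

print(plan_response(6.0, computer_in_control=True))   # track object
print(plan_response(1.3, computer_in_control=True))   # rely on operator
```

The troubling part, of course, is the second branch: the software "knows" braking is needed but hands responsibility to a human it has no way of checking is actually watching.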
As this is an experimental system, and not a fully autonomous vehicle, the driver was supposed to intervene. She did not, as she was not looking at the road at the time of the collision.
This kind of testing -- with the computer watching the road but not fully controlling the car -- is common during development. It collects a lot of data on how the sensors and control algorithms would react to varying real-world conditions, without actually relying on those systems before they are ready to take full control.
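The value of this "shadow mode" style of testing is that every disagreement between the software and the human becomes a data point for engineers to review. A minimal sketch of the idea, with made-up names and a made-up disagreement threshold:

```python
# Hypothetical sketch of shadow-mode data collection: the autonomy stack
# computes what it *would* do each frame, but only the human's inputs
# drive the car. Frames where the two disagree get flagged for review.
from dataclasses import dataclass

@dataclass
class Frame:
    speed_mph: float
    planned_brake: float   # braking the software would command (0.0-1.0)
    driver_brake: float    # braking the human actually applied (0.0-1.0)

def log_disagreements(frames, threshold=0.3):
    """Return frames where software and driver differed enough to review."""
    return [f for f in frames
            if abs(f.planned_brake - f.driver_brake) > threshold]

frames = [
    Frame(43.0, planned_brake=0.0, driver_brake=0.0),  # open road: agreement
    Frame(43.0, planned_brake=0.9, driver_brake=0.0),  # software wants to brake
]
print(len(log_disagreements(frames)))  # 1 frame flagged
```

Run at scale, this surfaces both false positives (the software braking for shadows) and false negatives (the human braking for something the software missed) without the software ever touching the brakes.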
There was clearly serious human negligence behind this tragedy. Certainly the driver should have been paying more attention. But it's also possible that the manager(s) or engineer(s) overseeing the testing were negligent in not clearly instructing the driver about what the computer was controlling and what it was ignoring.
This event and some of the accidents involving Tesla's misnamed "Autopilot" highlight one of the challenges of developing these kinds of systems. Partial automation (SAE Levels 2 and 3) may be worse than no automation at all. If the car's systems are good, the driver quickly becomes habituated to the car handling the driving. But if the car's systems are not good enough, the driver won't be paying any attention to the road when the system fails to recognize a dangerous situation.