49-year-old Elaine Herzberg, Killed by Self-Driving Uber car
(Tempe, AZ) ABC15
The Accident Scene
Reverse projection was used to calibrate the video from the Uber car. The FARO (LIDAR) scanner is used in several places to make a 3D map of the scene. Because the officer is standing at that location, I *suspect* it is the point of impact.
Where the car and pedestrian ended up. The pedestrian was carried over 40 yards. I can’t tell whether there are skid marks. (Reportedly, the Uber vehicle was going 40 mph.)
Reverse view of the point of impact. The scene is darker at right, where presumably Ms. Herzberg was headed. This could be autoexposure on the video.
Notice the 3D FARO scanner on tripod (center) and the back of the Uber Volvo in the distance (left).
Aerial view of park at upper right corresponding to the location of the scanner in the previous image.
Ms. Herzberg entered from the left median and was apparently struck near the right side of the road.
(daylight) Street view of location with scanner, above.
What we know
- Audio from the very beginning of the video is not present; audio from the Q&A is. (Tempe Police Briefing)
- Video from Uber vehicle has not been released
- Nothing has been said about data from the Uber vehicle — LIDAR / SLAM data track
- LIDAR would not have been affected by low light levels — in fact, free of solar reflections, it would have performed better at night than in daytime.
- Approximate speed of Vehicle: 40mph
- Vehicle was still in autonomous mode. Driver had NOT taken over.
- At this time there is no evidence that the Uber vehicle slowed down prior to impact
- The officer giving the briefing has taken care to say “pedestrian,” not “cyclist.”
- She was in the lane of traffic at the time — this is also shown by the impact point on the Uber vehicle (below)
- There are several angles of video. The video shows the pedestrian approaching the vehicle.
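The LIDAR point above can be illustrated with a short sketch. Time-of-flight ranging depends only on the sensor’s own laser pulse, so ambient light never enters the calculation. The distances and timings below are illustrative assumptions of mine, not figures from the case:

```python
# Illustrative sketch of LIDAR time-of-flight ranging (assumed numbers,
# not case data). The sensor times its own laser pulse, so scene darkness
# is irrelevant to the distance measurement.

C = 299_792_458.0  # speed of light, m/s

def lidar_range(round_trip_seconds: float) -> float:
    """Distance to a target from a pulse's round-trip time."""
    return C * round_trip_seconds / 2.0

# A target 60 m ahead returns the pulse in roughly 400 nanoseconds:
round_trip = 2 * 60.0 / C
print(f"{lidar_range(round_trip):.1f} m")  # 60.0 m

# At the reported 40 mph (~17.9 m/s), a hypothetical 60 m detection would
# leave the vehicle over 3 seconds to brake, no matter how dark the
# dashcam video looks.
MPH_TO_MS = 0.44704
time_to_impact = 60.0 / (40 * MPH_TO_MS)
print(f"{time_to_impact:.1f} s")  # about 3.4 s
```

The 60 m detection range here is a hypothetical for illustration; the point is only that the math is independent of lighting.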
Point of impact on the Uber vehicle
NPR Story on Self-Driving Cars dealing with bicycles
www.npr.org/… Bikes May Have To Talk To Self-Driving Cars For Safety's Sake
"Cars have a very regular pattern with the way they move, whereas when people are riding bicycles they change between either acting like cars on the side of the road," says Rowe, an associate engineering professor at Carnegie Mellon University. "They might switch and become pedestrians and go up on the sidewalks. They tend to move in a slightly more erratic way. It's much harder to predict."
My Commentary
That something may be difficult to see in darkness is not a suitable argument for a car that mostly uses LIDAR and infrared to see. Don’t assume that a video that seems dark means that 1) a driver couldn’t see the pedestrian, 2) the built-in video cameras (yes, CAMERAS) couldn’t see her, especially in infrared, or 3) the LIDAR couldn’t see the pedestrian.
Looking at the aerial image of the area, the vehicle’s sensors would have had a significant amount of time, and collected a significant amount of data, locating Ms. Herzberg.