Thursday, July 7, 2016

Damn you, Tom Simonite!

Earlier this week, I sent out emails to some transportation researchers asking about Tesla's argument that Autopilot saves lives: the system had one fatality in 130 million miles, while the average for U.S. roads as a whole is one per 94 million. It appeared that those 130 million miles were driven disproportionately under safer-than-average conditions, making the comparison largely meaningless.
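For concreteness, here is the arithmetic behind the two figures quoted above, as a quick Python sketch (the 130-million and 94-million numbers are the only inputs; everything else is just unit conversion):

```python
# Fatality rates implied by the two figures in the comparison.
autopilot_rate = 1 / 130e6  # one fatality in 130 million Autopilot miles
us_rate = 1 / 94e6          # one fatality per 94 million U.S. vehicle miles

# Expressed per 100 million miles, the unit NHTSA typically reports:
print(round(autopilot_rate * 1e8, 2))  # 0.77
print(round(us_rate * 1e8, 2))         # 1.06
```

The gap between 0.77 and 1.06 looks meaningful until you remember the denominators come from very different kinds of driving, which is exactly the researchers' objection.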

Unfortunately, all of the points I wanted to make in my post (and then some) were covered yesterday in this excellent piece by Mr. Simonite of the MIT Technology Review.

Soon after, Tesla’s CEO and cofounder Elon Musk threw out more figures intended to prove Autopilot’s worth in a tetchy e-mail to Fortune (first disclosed yesterday). “If anyone bothered to do the math (obviously, you did not) they would realize that of the over 1M auto deaths per year worldwide, approximately half a million people would have been saved if the Tesla autopilot was universally available,” he wrote.

Tesla and Musk’s message is clear: the data proves Autopilot is much safer than human drivers. But experts say those comparisons are worthless, because the company is comparing apples and oranges.

“It has no meaning,” says Alain Kornhauser, a Princeton professor and director of the university’s transportation program, of Tesla’s comparison of U.S.-wide statistics with data collected from its own cars. Autopilot is designed to be used only for highway driving, and may well make that safer, but standard traffic safety statistics include a much broader range of driving conditions, he says.

Tesla’s comparisons are also undermined by the fact that its expensive, relatively large vehicles are much safer in a crash than most vehicles on the road, says Bryant Walker Smith, an assistant professor at the University of South Carolina. He describes comparisons of the rate of accidents by Autopilot with population-wide statistics as “ludicrous on their face.” Tesla did not respond to a request asking it to explain why Musk and the company compare figures from very different kinds of driving.

Google has in the past drawn similar contrasts between the track record of its self-driving cars and accident statistics for humans, says Smith. He, Kornhauser, and other researchers argue that companies working on autonomous driving technology need to drop such comparisons altogether. In April, a RAND Corporation report concluded that fatalities and injuries are so rare that it would require an automated car to drive as many as hundreds of billions of miles before its performance could be fairly compared with statistics from the much larger population of human drivers.
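The RAND argument can be illustrated with the statistical "rule of three" (my choice of method for this sketch; the report's own analysis is more detailed): if a fleet logs N miles with zero fatalities, the 95% upper confidence bound on its fatality rate is roughly 3/N. Even just to bound an automated fleet's rate at the human level of one per 94 million miles would require:

```python
# Rule of three: with zero fatalities observed over n miles, the 95%
# upper confidence bound on the fatality rate is approximately 3 / n.
human_rate = 1 / 94e6          # one fatality per 94 million miles
miles_needed = 3 / human_rate  # fatality-free miles needed to match that bound
print(miles_needed / 1e6)      # 282.0 (million miles)
```

And that only shows parity with humans; statistically demonstrating that the system is *better* than human drivers requires vastly more mileage, which is where the report's figure of hundreds of billions of miles comes from.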

Instead, researchers say that Tesla and others need to release more data on the limitations and performance of automated driving systems if self-driving cars are to become safe and understood enough for mass-market use.

You should read the whole thing.
