Autonomous Vehicle Safety Reports Leave a Lot to Be Desired

Madison, Wisconsin – On the heels of the California Department of Motor Vehicles’ annual autonomous vehicle disengagement report, Beijing’s Innovation Center for Mobility Intelligent (BICMI) published its 2019 survey of self-driving vehicles being tested on local roads.

Both reports, which hinge on tracking disengagements – the frequency with which human safety drivers are forced to take control of their autonomous vehicles – are a stark reminder of how few tools we have for measuring the safety and performance of autonomous vehicles.

“Comparing disengagement rates between companies is worse than meaningless. It creates perverse incentives,” said Bryant Walker Smith, associate professor at the University of South Carolina’s School of Law and an expert in self-driving cars.

Smith explained to The Verge that if he were to register in California and never test, for instance, he’d look good. “If I wanted to look even better, I’d do a ton of easy freeway miles in California and do my real testing anywhere else,” he continued.

California law requires that every company testing autonomous vehicles on public roads submit data on the number of miles driven and the frequency of disengagements. Beijing is one of the few cities globally to mandate that autonomous car companies disclose their disengagement results too.

According to The Verge’s coverage of the California report, the total number of autonomous miles driven in California last year rose 40%, to more than 2.87 million, thanks largely to a notable increase in public on-road testing by Baidu, Cruise, Pony.ai, Waymo, Zoox, and Lyft.

But the report has generated considerable discussion about whether disengagements communicate anything meaningful. Echoing sentiments similar to Smith’s, Waymo, which drove 1.45 million miles in California in 2019 and logged a disengagement rate of 0.076 per 1,000 self-driven miles, said that the metric “does not provide relevant insights” into its technology. Cruise, which for its part drove 831,040 miles last year and reported a disengagement rate of 0.082, said the “idea that disengagements give a meaningful signal about whether an [autonomous vehicle] is ready for commercial deployment is a myth.”
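For context, the rate both companies are quoting is simple arithmetic: disengagements divided by self-driven miles, scaled to a per-1,000-mile figure. A minimal sketch in Python (note that the roughly 110-takeover count for Waymo is back-calculated here from the published rate and mileage, not a figure stated in either report):

```python
def disengagement_rate(disengagements: int, miles: float) -> float:
    """Disengagements per 1,000 self-driven miles, the metric the DMV report uses."""
    return disengagements / miles * 1_000

# Waymo's 2019 California mileage comes from the DMV report; the ~110
# takeover count is our back-calculation from the published 0.076 rate.
print(round(disengagement_rate(110, 1_450_000), 3))  # -> 0.076
```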

Meanwhile, Venture Beat reported that a total of 77 autonomous vehicles from 13 China-based companies covered over 646,000 miles on Beijing roads during 2019, according to the BICMI. That’s up from the 95,000 miles eight firms drove in 2018.

The BICMI report doesn’t break out disengagement numbers by company or vehicle, but it said 86% of disengagements in 2019 stemmed from drivers taking over for reasons unrelated to the technology’s performance – for example, tinkering with data-recording equipment, changing planned routes, or “personal reasons” (like bathroom breaks). The remaining 14% were attributable to some form of mechanical or software system failure.

The inherent weaknesses of using disengagements to measure the success of autonomous vehicles are glaring.

For starters, the very nature of public road testing means that the environments are inconsistent. A mile in Palo Alto is very different from a mile in downtown San Francisco.

Also, in California, the vague DMV definition of disengagement means that car companies aren’t following the same protocols. The DMV defines disengagements as “deactivation of the autonomous mode when a failure of the autonomous technology is detected or when the safe operation of the vehicle requires that the autonomous vehicle test driver disengage the autonomous mode and take immediate manual control of the vehicle.” That leaves a lot of room for interpretation.

For example, a self-driving car owned by GM Cruise ran a red light in San Francisco after the safety driver took control to avoid blocking a crosswalk. But the company didn’t include the incident in its report because, according to Cruise’s interpretation, the human driver didn’t act out of safety concerns or a failure of the autonomous system.

And finally, nowhere in either report is it specified which advanced driver assistance system (e.g., adaptive cruise control) failed or precisely what event triggered the disengagement.

The result is that it is impossible to make an apples-to-apples comparison about the performance and safety of these vehicles.

And if you thought you could turn to the federal government for better data, think again. Most major players in the US have submitted safety self-assessments under the Department of Transportation’s voluntary guidance. But these reports read more like marketing documents.

The reality is that if the public wants to understand how safe self-driving vehicles are, one alternative is to borrow the methods and techniques currently used by insurance providers. After all, insurance groups have a lot at stake when it comes to understanding the quality of these systems.

When the Insurance Institute for Highway Safety (IIHS) set out to evaluate the success rates of active lane-keeping systems in 2018, for instance, it tracked the number of times a vehicle had to overcorrect after crossing clearly marked lane lines. Why? Because evaluating the safety of these vehicles requires collecting relevant, targeted data about specific advanced driver assistance systems.
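To make that concrete, here is a hedged sketch of what targeted data collection might look like: tallying events by the specific system and trigger involved, rather than folding everything into one undifferentiated disengagement count. The event structure and labels are our illustrative assumptions, not the IIHS’s actual methodology:

```python
from collections import Counter

# Hypothetical event log; system names and trigger labels are assumptions.
events = [
    {"system": "lane_keeping", "trigger": "overcorrection_after_lane_cross"},
    {"system": "lane_keeping", "trigger": "clean_correction"},
    {"system": "adaptive_cruise", "trigger": "unnecessary_hard_brake"},
    {"system": "lane_keeping", "trigger": "overcorrection_after_lane_cross"},
]

def counts_by_system_and_trigger(events):
    """Tally events per (system, trigger) so failures stay attributable."""
    return Counter((e["system"], e["trigger"]) for e in events)

for (system, trigger), n in counts_by_system_and_trigger(events).items():
    print(f"{system}/{trigger}: {n}")
```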

An alternative approach, according to Smith, would be for companies to release testing summaries with details and context about each disengagement. While not ideal, the thinking is that with more information about precisely which system failed and under what conditions, the public could conceivably start teasing out meaningful conclusions. But no company has done that to date.
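Because no such summaries exist, any schema is speculative. Still, a record along the following lines – which system, what trigger, under what conditions – suggests what a disclosure like the one Smith describes might contain. Every field name here is our assumption:

```python
from dataclasses import dataclass

# Purely illustrative: no company publishes per-event summaries today,
# and each field below is an assumption about what one might include.
@dataclass
class DisengagementRecord:
    timestamp: str       # when the takeover occurred
    road_context: str    # e.g., "urban intersection" vs. "divided freeway"
    subsystem: str       # which system was implicated, e.g., "lane keeping"
    trigger: str         # what prompted it: fault, planner hesitation, driver judgment
    initiated_by: str    # "driver" or "system"
    conditions: str      # weather, lighting, traffic density
    outcome: str         # e.g., "manual control for 20 s, no incident"

record = DisengagementRecord(
    timestamp="2019-06-14T09:32:00",
    road_context="urban intersection",
    subsystem="planning",
    trigger="hesitation at unprotected left turn",
    initiated_by="driver",
    conditions="clear, light traffic",
    outcome="manual control for 20 s, no incident",
)
print(record)
```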

And short of new regulations from public authorities, the industry may have to forge its own path by developing shared standards. Dmitry Polishchuk, head of Russian tech giant Yandex’s autonomous car project, noted that Yandex hasn’t released any disengagement reports because it is “waiting for some sort of industry standard” to overcome the discrepancies in how companies define and record disengagements. The right industry standard could also help end this test-but-don’t-report dynamic.

The good news is that SAE International has created a task force to provide the definitions, information, and best practices needed to support verification and validation of self-driving systems. It’s still unclear, however, when we can expect to see the fruits of its labor.

At this point, there is little reason to expect much will change in 2020, which leaves us in a frightful situation: There are currently millions of miles being driven by automated vehicles on public roads, and no one can agree about what data we should use to measure their safety.

If that’s not reckless driving, then what is?

 

Robert Fischer is President of GTiMA, a Technology and Policy Advisor to Mandli Communications, and an Associate Editor of the SAE International Journal of Connected and Automated Vehicles. Follow Rob on Twitter (@Robfischeris) and LinkedIn.

Eric Nutt is the Chief Technology Officer of Mandli Communications, Inc., and an Associate Editor of the SAE International Journal of Connected and Automated Vehicles. Follow Eric on LinkedIn.