Autonomous Vehicle Safety Reports Leave a Lot to Desire
March 9, 2020 • Robert Fischer, Eric Nutt
Madison, Wisconsin – On the heels of the California Department of Motor Vehicles’ annual autonomous vehicle disengagement report, Beijing’s Innovation Center for Mobility Intelligent (BICMI) published its 2019 survey of self-driving vehicles being tested on local roads.
Both reports, which hinge on tracking disengagements – or the frequency at which human safety drivers were forced to take control of their autonomous vehicles – are a stark reminder of how little we have to measure the safety and performance of autonomous vehicles.
“Comparing disengagement rates between companies is worse than meaningless. It creates perverse incentives,” said Bryant Walker Smith, associate professor at the University of South Carolina’s School of Law and an expert in self-driving cars.
Smith explained to The Verge that if he were to register in California and never test, for instance, he’d look good. “If I wanted to look even better, I’d do a ton of easy freeway miles in California and do my real testing anywhere else,” he continued.
California law requires that every company testing autonomous vehicles on public roads submit data on the number of miles driven and the frequency of disengagements. Beijing is one of the few cities globally to mandate that autonomous car companies disclose their disengagement results too.
According to The Verge’s coverage of the California report, the total number of autonomous miles driven in California last year rose 40%, to more than 2.87 million, thanks largely to a notable increase in public on-road testing by Baidu, Cruise, Pony.ai, Waymo, Zoox, and Lyft.
But the report has generated considerable discussion about whether disengagements communicate anything meaningful. Echoing Smith’s sentiments, Waymo, which drove 1.45 million miles in California in 2019 and logged a disengagement rate of 0.076 per 1,000 self-driven miles, said that the metric “does not provide relevant insights” into its technology. Cruise, which drove 831,040 miles last year and reported a disengagement rate of 0.082, said the “idea that disengagements give a meaningful signal about whether an [autonomous vehicle] is ready for commercial deployment is a myth.”
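For context, the metric itself is simple arithmetic: disengagements divided by autonomous miles, scaled to a per-1,000-mile rate. The following is a minimal Python sketch using the mileage and rates reported above; the disengagement counts it derives are back-calculated approximations for illustration, not official figures.

    # Minimal sketch of the per-1,000-mile disengagement metric used in the
    # California DMV reports. Mileage and rates come from the 2019 filings
    # cited above; the disengagement counts are approximations implied by
    # those rates, not official numbers.

    def rate_per_1000_miles(disengagements: int, miles: float) -> float:
        """Disengagements normalized per 1,000 autonomous miles."""
        return disengagements / miles * 1000

    def implied_disengagements(reported_rate: float, miles: float) -> int:
        """Approximate count implied by a reported per-1,000-mile rate."""
        return round(reported_rate * miles / 1000)

    fleets = {
        "Waymo": {"miles": 1_450_000, "reported_rate": 0.076},
        "Cruise": {"miles": 831_040, "reported_rate": 0.082},
    }

    for name, data in fleets.items():
        n = implied_disengagements(data["reported_rate"], data["miles"])
        print(f"{name}: ~{n} disengagements over {data['miles']:,} miles "
              f"= {rate_per_1000_miles(n, data['miles']):.3f} per 1,000 miles")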
Meanwhile, VentureBeat reported that a total of 77 autonomous vehicles from 13 China-based companies covered over 646,000 miles on Beijing roads during 2019, according to the BICMI. That’s up from the 95,000 miles eight firms drove in 2018.
The BICMI report doesn’t break out disengagement numbers by company or vehicle, but it said 86% of disengagements in 2019 resulted from deliberate human takeovers. Examples might include drivers tinkering with data-recording equipment, changes in planned routes, or “personal reasons” (like bathroom breaks). The remaining 14% of disengagements were attributable to some form of mechanical or software system failure.
The inherent weaknesses of using disengagements to measure the success of autonomous vehicles are glaring.
For starters, the very nature of public road testing means that the environments are inconsistent. A mile in Palo Alto is very different from a mile in downtown San Francisco.
Also, in California, the vague DMV definition of disengagement means that car companies aren’t following the same protocols. The DMV defines disengagements as “deactivation of the autonomous mode when a failure of the autonomous technology is detected or when the safe operation of the vehicle requires that the autonomous vehicle test driver disengage the autonomous mode and take immediate manual control of the vehicle.” That leaves a lot of room for interpretation.
For example, a self-driving car owned by GM Cruise ran a red light in San Francisco after the safety driver took control to avoid blocking a crosswalk. But the company didn’t include the incident in its report because, according to Cruise’s interpretation, the human driver didn’t act out of safety concerns or a failure of the autonomous system.
And finally, nowhere in either report is it specified which advanced driver assistance system (e.g., adaptive cruise control) failed and precisely what event triggered the disengagement.
The result is that it is impossible to make an apples-to-apples comparison about the performance and safety of these vehicles.
And if you thought you could turn to the federal government for better data, think again. Most major players in the US have also submitted voluntary safety reports to the federal government as part of the Department of Transportation’s voluntary guidance. But these reports read more like marketing documents.
The reality is that if the public wants to understand how safe self-driving vehicles are, one alternative is to borrow the methods and techniques insurance providers already use. After all, insurance groups have a lot at stake when it comes to understanding the quality of these systems.
When the Insurance Institute for Highway Safety (IIHS) set out to evaluate the success rates of active lane-keeping systems in 2018, for instance, it tracked the number of times the vehicle had to overcorrect after crossing clearly marked lane lines. Why? Because evaluating the safety of these vehicles requires the collection of relevant and targeted data about specific advanced driver assistance systems.
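As a rough illustration of what that kind of targeted data collection could look like, here is a hypothetical Python sketch that counts lane-keeping overcorrection events and normalizes them by distance driven. The event-log format is our own assumption, not the IIHS’s actual protocol.

    # Hypothetical sketch of a targeted ADAS metric in the spirit of the IIHS
    # lane-keeping evaluation: count overcorrection events for one specific
    # system and normalize by distance driven. The log format is assumed for
    # illustration and is not the IIHS's actual protocol.

    def overcorrections_per_100_miles(events, miles):
        """Lane-keeping overcorrections per 100 miles driven."""
        count = sum(
            1 for e in events
            if e.get("system") == "lane_keeping" and e.get("event") == "overcorrection"
        )
        return count / miles * 100

    # Toy event log
    log = [
        {"system": "lane_keeping", "event": "overcorrection"},
        {"system": "lane_keeping", "event": "nominal_correction"},
        {"system": "adaptive_cruise", "event": "hard_brake"},
    ]
    print(overcorrections_per_100_miles(log, miles=250))  # 0.4 per 100 miles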
An alternative approach, according to Smith, would be for companies to release testing summaries with details and context about each disengagement. While not ideal, the thought is that with more information about precisely which system failed and under what conditions, the public could conceivably start teasing out some meaningful conclusions. But no company to date has done that.
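What a disengagement-level summary might contain is an open question, but a structured record could capture the system involved, the trigger, and the surrounding conditions. The hypothetical Python sketch below is our own illustration; the field names and categories are assumptions, not any company’s or the DMV’s reporting format.

    # Hypothetical per-disengagement record with the kind of detail and
    # context Smith describes. Field names and categories are illustrative
    # assumptions, not an existing company, DMV, or industry format.
    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class DisengagementRecord:
        timestamp: datetime
        location: str             # e.g. "San Francisco, signalized intersection"
        initiated_by: str         # "safety driver" or "autonomous system"
        subsystem: Optional[str]  # e.g. "lane keeping", "adaptive cruise control"
        trigger: str              # what event prompted the takeover
        road_type: str            # freeway, arterial, residential, etc.
        weather: str
        safety_related: bool      # was the takeover a safety intervention?
        notes: str = ""

    # Example entry, loosely modeled on the Cruise crosswalk incident
    # described above; the timestamp and details are placeholders.
    example = DisengagementRecord(
        timestamp=datetime(2019, 6, 1, 14, 30),
        location="San Francisco, signalized intersection",
        initiated_by="safety driver",
        subsystem=None,
        trigger="vehicle stopped over a crosswalk; driver repositioned it",
        road_type="urban arterial",
        weather="clear",
        safety_related=False,
        notes="Not reported under Cruise's interpretation of the DMV definition",
    )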
And short of any new regulations set forth by public authorities, the industry may have to forge its own path by developing its own standards. Dmitry Polishchuk, head of Russian tech giant Yandex’s autonomous car project, noted that Yandex hasn’t released any disengagement reports because it is “waiting for some sort of industry standard” to overcome the discrepancies in how companies define and record disengagements. The right industry standard could help end this test-but-don’t-report approach.
The good news is the SAE has created a task force to provide definitions, information and best practices to support verification and validation of self-driving systems. It’s still unclear, however, when we can expect to see the fruits of their labor.
At this point, there is little reason to expect much will change in 2020, which leaves us in a frightful situation: There are currently millions of miles being driven by automated vehicles on public roads, and no one can agree about what data we should use to measure their safety.
If that’s not reckless driving, then what is?
Robert Fischer is President of GTiMA, a Technology and Policy Advisor to Mandli Communications, and an Associate Editor of the SAE International Journal of Connected and Automated Vehicles. Follow Rob on Twitter (@Robfischeris) and LinkedIn.
Eric Nutt is the Chief Technology Officer of Mandli Communications, Inc., and an Associate Editor of the SAE International Journal of Connected and Automated Vehicles. Follow Eric on LinkedIn.