Self-driving cars raise safety concerns

A Rinspeed Budii concept autonomous car. (Photo: Yauhen_D, Shutterstock)

On Valentine’s Day in Silicon Valley, one of Google’s experimental, self-driving cars sideswiped a city bus at 2 miles an hour. The incident marked the first time an autonomous car contributed to an accident on a public road, but did nothing to diminish the Obama administration’s enthusiasm for driverless vehicles.

A month after the crash, at an autonomous car conference in Dearborn, Mich., Mark Rosekind, the administrator of the National Highway Traffic Safety Administration, or NHTSA, said his agency and the federal Department of Transportation “are using all the tools we have available to advance what we see as a revolution in technology,” according to his prepared remarks. “Our goal is to hasten this revolution.”

Enthusiasts say autonomous cars will grant mobility to the elderly and the disabled, transform congested freeways and eliminate the human errors responsible for most traffic accidents, which kill about 33,000 people per year. “Automated vehicles open up possibilities for saving lives, saving time and saving fuel,” Transportation Secretary Anthony Foxx said in January at the North American International Auto Show, where he announced that the administration wants to spend $3.9 billion over 10 years to foster the development of driverless cars. “We are bullish on automated vehicles,” he said.

But some automotive safety advocates fear the government is embracing the technology too quickly, without carefully assessing its actual capabilities and practical implications. With billions of dollars at stake and aggressive lobbying by the tech and automotive industries, safety advocates worry that government regulators will allow themselves – and the public – to be steamrolled in the name of progress and innovation.

“These cars are not ready for prime time,” said Rosemary Shahan, the founder and president of Consumers for Auto Reliability and Safety, a Sacramento, Calif.-based advocacy organization best known for spearheading passage of the state’s automobile lemon law.

Autonomous cars, which have been in development since at least 2009, are known to struggle in inclement weather; rain, fog and snow disrupt their sensors. “We should be requiring them to prove that they’re really ready” before rushing self-driving cars to consumers, Shahan said.

She’s also worried about draft regulations in California that would make occupants responsible for all traffic violations that occur while a driverless car is operating in autonomous mode. Shahan said manufacturers “should be willing to assume the liability.”

At this point, NHTSA is not pursuing formal regulations for autonomous cars. Rather, the agency intends to release, in July, operational guidelines to help manufacturers with the safe deployment of the vehicles as well as model policies to advise states struggling with oversight of autonomous cars. Rosekind describes the agency’s approach as “deliberate” while rejecting the notion that the government needs to strike a balance between innovation and safety, “as if there is some trade-off between the two.”

But Joan Claybrook, a former NHTSA administrator under President Jimmy Carter, said anything short of mandatory standards, developed through the well-trod regulatory process, could potentially endanger the American public. She criticized the process being pursued by the government as vague, and potentially toothless. “I have no idea where they’re headed with this,” she said.

The industry desperately wants the federal government to take the lead on overseeing autonomous cars, to avoid a patchwork of laws emerging at the state level, and it’s well-positioned to influence the process. Several former NHTSA officials now work for the industry on autonomous cars. Former NHTSA Administrator David Strickland recently was named counsel and spokesman for the Self-Driving Coalition for Safer Streets, an industry group made up of Google, Ford, Lyft, Uber and Volvo that he describes as dedicated to making autonomous cars available to the public “as soon as possible.”

Also working for Google are Ron Medford, a former deputy administrator of NHTSA who is now director of safety for Google’s self-driving car project, and Daniel Smith, a former manager of NHTSA’s Office of Vehicle Safety who, according to a Reuters report, is now a Google consultant. Chan Lieu, the agency’s former director of government affairs, policy and strategic planning, is a registered lobbyist for Google and the Association of Global Automakers, which is also interested in autonomous car regulations. Both Lieu and Strickland work for the law firm Venable LLP, whose clients include the Alliance of Automobile Manufacturers, the top U.S. trade group for the automotive industry, representing 12 car and truck makers.

In May, Consumer Watchdog, an advocacy organization based in Santa Monica, Calif., wrote a letter to Rosekind, the current NHTSA chief, and Foxx, the transportation secretary, demanding that they promise not to work for autonomous car companies for at least seven years after leaving the government. “The revolving door between NHTSA and industry has become an embarrassment to the agency and the administration,” the letter says. Foxx and Rosekind have not responded to the letter; Google declined to comment.

Among other things, proponents of autonomous cars have been lobbying against rules that would require a licensed driver to be present while the cars are in operation. The draft regulations in California, for example, include that rule. But critics say requiring a licensed driver would nullify the benefits of autonomous cars for the elderly and the disabled.

Google wants to take it even a step further and block any rules that would require autonomous cars to have a steering wheel or pedals. In tests, the company found that users came to trust the technology so much that they became inattentive even when they were explicitly told they had to watch the road in case they needed to take control. In one instance, a test driver turned around and looked in the back seat for a laptop while the car traveled 65 miles per hour down the freeway in autonomous mode. Google officials say the tests taught them that autonomous car occupants can’t be trusted to suddenly take over a vehicle.

John Simpson of Consumer Watchdog believes this line of thinking is preposterous. He appeared at an NHTSA hearing last month at Stanford University carrying a steering wheel to underscore his point that driver controls should be mandatory even on autonomous cars.

“Deploying a vehicle on public roads today without a steering wheel, brake, accelerator and a human driver capable of intervening when something goes wrong is not merely foolhardy,” Simpson testified. “It is dangerous.”

As evidence, Simpson points to the reports that companies testing autonomous cars on California roads must file with the state, detailing the number of times the autonomous technology failed or a driver had to take over.

The first series of reports, covering September 2014 through November 2015, showed that self-driving cars tested in California had to disengage from autonomous mode, on average, once every 166 miles driven. The average American drives more than 13,000 miles each year, according to the Federal Highway Administration; at that rate, a typical driver would see roughly 78 disengagements a year. “What the disengagement reports show,” Simpson testified, “is that there are many everyday routine traffic situations with which the self-driving robot cars simply can’t cope.”

Video, courtesy of the Associated Press, of the self-driving Google car sideswiping a city bus on Feb. 14 in Silicon Valley.

Speaking later at the same hearing, Chris Urmson, director of the Self Driving Cars Project at Google, said it’s not fair to judge the performance of autonomous technology just by looking at disengagement reports, which only note when something goes wrong, or without comparing it to the performance of human drivers. He said a preliminary analysis indicates that Google’s driverless cars “are already performing, within error, comparable or better than human driving,” although he cautioned that better data is needed to build on that argument. “The concern,” Urmson said, “should not be that the technology comes too soon, but that it comes too late” to save lives.

Safety advocates also are worried that autonomous cars will be vulnerable to hacking. In July 2015, Fiat Chrysler recalled 1.4 million vehicles after researchers showed Wired magazine how they could remotely hack a 2014 Jeep Cherokee and take control of its radio, air conditioning and even its engine controls. Other researchers made a similar demonstration on the CBS program “60 Minutes,” while another hacker posted a video on YouTube showing how he could unlock and remotely start GM vehicles by infiltrating the OnStar RemoteLink mobile app.

While no malicious attacks have been reported on traditional automobiles, car companies are so concerned about hacking that they’re actually hiring so-called “white hat” hackers to penetrate their systems and find vulnerabilities. And the potential for mischief could be even greater for autonomous vehicles.

Self-driving cars are expected to rely on sensing technologies and vehicle-to-vehicle and vehicle-to-infrastructure communications, which could create more opportunities for hackers. Particular models may also wirelessly talk to a database housed on a computer server. In theory, a hacker could infiltrate that server and take control of every car that talks to that database.

“If proper security can’t be demonstrated, it’s going to be a hindrance to the promotion of self-driving vehicles,” said Mike Belton, vice president of applied research at the Denver security firm Optiv.

Also of concern is the potential threat that terrorists could use autonomous cars as a weapon, like a bomb on wheels. “The Department of Transportation and NHTSA need to require that all autonomous vehicles have sensors inside to perform a sniff test for hazardous and WMD material and, once detected, disable certain features of the autonomous vehicles,” testified James Niles of the New York think tank Orbit City Lab at an April 8 hearing in Washington, D.C.

At that hearing, where NHTSA gathered input for its forthcoming guidelines on autonomous vehicles, government officials repeatedly were warned to take a cautious approach to this new technology. “NHTSA must ensure that this technology is ready, reliable and safe before it’s deployed on public roads,” said Peter Kurdock, director of regulatory affairs at Advocates for Highway & Auto Safety, “or the results could be catastrophic.”

Ed’s Note: This story appears with the permission of FairWarning, which covers issues of safety, health and corporate conduct.   
