A virtual grin spread across the face of Harvey Rosenfield at the mention of Isaac Asimov’s Three Laws of Robotics.
The mention came during a phone call, but the satisfaction of a smile was easily detectable in his voice during a conversation that otherwise carried an ominous tone.
It was as if he had been waiting for someone to make the connection between a set of principles for autonomous vehicle technology he had authored and the famed robotics rules penned in 1942 by the science fiction author.
Rosenfield is a well-known consumer protection advocate and author of California’s long-standing and far-reaching auto insurance regulation, Proposition 103. Like Asimov, Rosenfield intends to issue a warning about the future. Asimov brought awareness to the potential impact of robots on society; Rosenfield is urging people to be wary of the promise of safety and luxury that self-driving vehicles seem to hold.
Worrying about self-driving vehicles seemingly puts Rosenfield on the same page as the insurance industry, which some believe will be hurt by a decline in the need for personal auto insurance when and if self-driving cars are the norm on the world’s streets and highways.
But Rosenfield, founder of Consumer Watchdog, a Santa Monica, Calif.-based group that’s often at odds with the insurance industry, is not even in the same ballpark. In fact, his warnings about autonomous vehicles take aim at the insurance industry and auto manufacturers.
In a recent report he authored, “Self-Driving Vehicles: The Threat to Consumers,” Rosenfield frets that John and Jane Driver will shoulder the blame, and often foot the bill, for autonomous vehicle mishaps, and that the insurance and auto industries see the advent of driverless cars as an opportunity to weaken consumer protections.
“The sparkly chimera of robots replacing human drivers—freeing people to spend their drive time more enjoyably and productively—has captivated the public and media, driven by self-interested auto manufacturers and software developers,” the report states. “But there has been very little public discussion of whether self-driving vehicles will coexist or collide with long-standing principles of accountability, transparency and consumer protection that collectively constitute the Personal Responsibility System.”
The system the report refers to is the body of state-based liability and insurance laws.
In an interview with our sister publication Insurance Journal, Rosenfield said he’s worried about the impact of self-driving vehicles on consumers, particularly lower-income consumers.
“Let’s start with the auto industry,” Rosenfield said. “No. 1, of course, cars will become safer, hopefully, but the more automated they get, the more expensive they’ll be to fix, the more vulnerable they’ll be to hacking, the more costly they’ll be to buy, because this technology is not going to be cheap. The auto industry is probably going to end up marketing the self-driving vehicles to very wealthy people, just the way that they do today with optional high-priced equipment that only the richest people can afford.”
In his report, Rosenfield argues that self-driving vehicles may be only as safe as people can afford and that the manufacturing and insurance industries are exploring ways to limit or shift their responsibility, given that safety-related costs and claims are likely to increase as a result of the new technologies.
Wade Newton, a spokesman for the Alliance of Automobile Manufacturers, said self-driving vehicle technologies hold great promise to transform mobility for everyone.
“The Alliance supports policy initiatives that facilitate safety innovations and remove legislative and regulatory hurdles to the advancement of self-driving vehicles,” Newton wrote via email in reply to a request for comment.
He added: “We have not reviewed this report, but given the fact that government figures show that driver behavior contributes to a full 94 percent of crashes, we can all agree that automated systems have the potential to further enhance road safety.”
Rosenfield believes the insurance industry will contend that since self-driving vehicles are just around the corner, drivers will no longer be needed and “therefore, we don’t need consumer protections against insurance rip-offs, fraud and abuse that the voters passed back in 1988, when they passed Proposition 103.”
As long as consumers can be blamed for a crash, whether the blame comes from the manufacturer when something goes wrong with the car or from the software company that programmed the vehicle, they will still need to buy insurance coverage and they’re going to need the protections of Prop. 103, he said.
Prop. 103, passed by California voters in November 1988, requires prior approval from the California Department of Insurance before insurance companies can change property/casualty rates. Rate filings from carriers get CDI review as well as public review.
Rosenfield said that Prop. 103 eliminated a “whole host of discriminatory practices that the insurance companies like to engage in when they don’t want to sell insurance to people without elite occupations,” or higher education.
According to him, the industry “has been dying to get rid of Prop. 103,” and autonomous vehicles may be the opportunity to do that.
California’s insurance industry probably wouldn’t be upset to see Prop. 103 go. However, Rosenfield is overstating the role Prop. 103 plays in the world of insurance, according to Mark Sektnan, president of the Association of California Insurance Companies.
“It is also unfortunate that Mr. Rosenfield continues to misrepresent and misunderstand how the advent of driverless cars is going to change the insurance market, and his comments do nothing to further the discussion about how we deal with a future that is almost here,” Sektnan said. “Sadly, he seems locked in the last century and unable to adapt to the changing future. The world has changed a lot since the voters passed the initiative.”
Sektnan did acknowledge that he wouldn’t mind seeing Prop. 103 revamped or rethought as we enter a new world of driverless vehicles.
“It’s almost 30 years old, and everything needs a review after 30 years,” Sektnan said. “I think it’s going to be very hard to fit a driverless vehicle into an insurance rating regimen that is based on a human driver.”
Driving record is one of the rating factors mandated by Prop. 103. When a policy is issued for an autonomous vehicle, whose driving record applies: the car’s or the software maker’s?
“When you don’t have a human at the wheel, what does driving record mean?” Sektnan said.
He said the industry has “been engaged in this issue” with the goal of ensuring there is appropriate liability.
Rosenfield believes the liability will fall in the laps of drivers.
“That’s what’s already happened,” he said. “Look, for example, at what happened with the crashes involving Tesla. Tesla has consistently blamed the motorists, the owners of the car for the crashes. I project we see the future being the same as it is now. If there’s a crash, the manufacturers aren’t going to step up and accept responsibility for it; they’re going to try to avoid responsibility and blame the consumer, and fight it out.”
In one of the most watched crash incidents involving an autonomous vehicle, the U.S. National Highway Traffic Safety Administration in early 2017 found that the owner of a Tesla Motors Inc. Model S sedan that drove itself into the side of a truck in 2016 had ignored the manufacturer’s warnings to maintain control even while using the driver-assist function.
Throughout his report, and throughout the IJ interview, Rosenfield used the phrase “robot cars” so often that one might suspect he was trying to drive home a point. Is he?
“I am, because one of the things that is very clear is that the software in these robot cars is going to take the place of human judgment, of human values, of human morality,” he said. “That software is being written…by Google and other Silicon Valley companies, and eventually of course the auto companies will buy that software. We as consumers, we as the public and we as human beings have no control over the software that’s being written. What happens when the software in the robot car detects that there are pedestrians that are about to jump in front of the car for whatever reason—a baby carriage, a stroller accidentally rolling down the street?”
The car is going to have to make a decision that is now made by human beings: whether to drive into a stroller or into a tree.
“We don’t know what decision the software is going to make,” Rosenfield said. “It’s going to be a life-and-death decision. Which way is it going to go?”
Asimov wrote his three laws of robotics to deal with decisions that the robots of his imaginary future would face:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
In his report, Rosenfield offers his own version of these laws: six principles he believes must be adopted to deal with the challenges of these “robot cars.”
Asimov’s three laws of robotics have been widely popularized in entertainment, but it’s not clear that the idea behind those laws, protecting people, will carry over, he said.
“That’s the philosophical conundrum that everybody’s going to have to reckon with,” Rosenfield said.
Rosenfield’s principles are as follows (they have been edited for brevity; see the full report for the complete text):
Protect the civil justice system. Disputes over who is at fault in a crash involving a self-driving vehicle will require the full power of the civil justice system to expose the cause of crashes, compensate victims, punish wrongdoers and force manufacturers to make changes in their products to prevent future harm. When autonomous technologies fail, hardware and software manufacturers must be held liable. Lawmakers should reject legislation to limit state consumer protection laws, and manufacturers must not be allowed to evade consumer protections by inserting arbitration clauses, “hold harmless” provisions or other waivers in contracts.
Enact stronger state consumer protections against insurance company abuses. Prop. 103 has protected against unjust insurance rates and discriminatory practices. The law’s emphasis on rewarding drivers with lower insurance premiums based on their safety record, annual mileage, driving experience and other rating factors will be critical in the new automotive era.
Enact auto safety standards. Private companies cannot be trusted to develop and deploy autonomous vehicles without rules. Federal and state auto safety agencies must develop standards for the testing and deployment of the multiple technologies required by these vehicles. Standards must address safety, security, privacy and the software that determines actions in the event of an impending collision.
Enact stronger laws to protect consumer privacy. Hardware and software manufacturers and insurance companies must be barred from using tracking, sensor or communications data, or transferring it to third parties, absent separate written consent.
Bar federal interference in state consumer protection laws. Neither Congress nor federal agencies should be permitted to preempt or override stronger state-based civil justice, insurance reform or auto safety laws.
Respect democratic and human values. The sponsors of self-driving vehicles have promoted the myth that machines are infallible to justify the wholesale departure from the founding principles of the nation, including the rule of law, individual and corporate responsibility, legal principles that distinguish between human beings and property, and the transparency of public officials and institutions.
Rosenfield offered up a bottom line on all of this: “No. 1, we think that we need to make sure that the industries involved respect our current human democratic values and cultural morals as they build these cars and deploy them.”