Can Self-Driving Cars Ever Really Be Safe?

Shelly Palmer

Analysts estimate that by 2030, self-driving cars and trucks (autonomous vehicles) could account for as much as 60 percent of US auto sales. That’s great! But autonomous vehicles are basically computers on wheels, and computers crash all the time. Besides that, computers get hacked every day. So you gotta ask, “Can self-driving cars ever really be safe?”

The Short Answer

No. Self-driving cars can never really be safe. But they will be safer than human drivers! So much safer that it’s worth a few minutes to understand why.

Humans Are Very Dangerous

First and foremost, according to the National Highway Traffic Safety Administration (NHTSA), 90 percent of all traffic accidents can be blamed on human error. Next, according to the AAA Foundation for Traffic Safety, nearly 80 percent of drivers expressed significant anger, aggression, or road rage behind the wheel at least once in the past year. Alcohol-impaired driving fatalities accounted for 29 percent of all vehicle traffic fatalities in 2015. And finally, of the roughly 35,000 annual traffic fatalities, approximately 10 percent (3,477 lives in 2015) are caused by distracted driving.

Remove human error from driving, and you will not only save a significant number of lives, you will also dramatically reduce the number of serious injuries associated with traffic accidents; there were more than 4.4 million such injuries in the United States in 2015.

Data Begins to Make a Case

In May 2016, a 40-year-old man named Joshua Brown died behind the wheel of a Tesla cruising in Autopilot mode on a divided highway in Florida. He was the first known fatality in a car driving under semi-autonomous control.

Rage against the machine quickly followed, along with some valid questions about whether Tesla had pushed this nascent technology too fast and too far. Everyone expected the accident to be the fault of a software glitch or a technology failure, but it was not.

The NHTSA investigation found that “a safety-related defect trend has not been identified at this time and further examination of this issue does not appear to be warranted.” In other words, the car didn’t cause the crash. But there was more to the story. The NHTSA’s report concluded, “The data show that the Tesla vehicles crash rate dropped by almost 40 percent after Autosteer installation.” In reality, while Mr. Brown’s death was both tragic and unprecedented, the investigation highlighted a simple truth: semi-autonomous vehicles crash significantly less often than vehicles piloted by humans.

What Do You Mean by “Safe”?

The same NHTSA report noted that automakers representing 99 percent of the US auto market had agreed to make Automatic Emergency Braking (AEB) systems standard in all new cars, a commitment projected to prevent 28,000 crashes and 12,000 injuries by 2025. The AEB program is limited to rear-end crashes, but there are a host of other semi-autonomous features in the works, and by the numbers, all of them will make us safer.
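
To make AEB concrete: the core of a rear-end system is a time-to-collision check. The sketch below is an illustration under stated assumptions, not any automaker’s implementation; real systems fuse radar and camera data and stage warnings before braking, and the 1.5-second threshold is invented for the example.

```python
def aeb_should_brake(gap_m: float, closing_speed_mps: float,
                     ttc_threshold_s: float = 1.5) -> bool:
    """Return True when time-to-collision with the lead vehicle falls
    below the threshold. gap_m is the distance to the lead vehicle;
    closing_speed_mps is how fast that gap is shrinking."""
    if closing_speed_mps <= 0:
        return False  # not closing on the lead vehicle; nothing to do
    ttc_s = gap_m / closing_speed_mps  # seconds until impact at current rate
    return ttc_s < ttc_threshold_s

# Example: a 20 m gap closing at 15 m/s gives a TTC of ~1.3 s -> brake.
print(aeb_should_brake(gap_m=20.0, closing_speed_mps=15.0))  # True
```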

That said, this is very new technology, and regulators will need to define what they mean by “safe.” Must our autonomous vehicles drive flawlessly, or do they just need to be better at it than we are? The RAND Corp think tank says, “A fleet of 100 cars would have to drive 275 million miles without failure to meet the safety standards of today’s vehicles in terms of deaths. At the time of the fatal May 2016 crash, Tesla car owners had logged 130 million miles in Autopilot mode.”
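
RAND’s 275-million-mile figure can be reproduced with a little statistics. Assuming fatal crashes arrive as a Poisson process, and using the human benchmark RAND’s researchers used (1.09 deaths per 100 million miles, the 2013 US rate, which is an input I’m supplying, not a number from this article), the sketch below shows the arithmetic.

```python
import math

# How many failure-free miles demonstrate, with 95 percent confidence,
# that a fleet's fatality rate is no worse than the human benchmark?
# Assumes fatal crashes follow a Poisson process.
human_rate_per_mile = 1.09 / 100_000_000  # 2013 US rate: 1.09 deaths/100M miles
confidence = 0.95

# With zero deaths in n miles, the chance of getting that lucky is
# exp(-rate * n); we need it below (1 - confidence).
miles_needed = -math.log(1 - confidence) / human_rate_per_mile

print(f"~{miles_needed / 1e6:.0f} million failure-free miles")  # ~275 million
```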

The Transition to Fully Autonomous Vehicles

In April 2016, Ford, Google, Lyft, Uber, and Volvo established the Self-Driving Coalition for Safer Streets to “work with lawmakers, regulators, and the public to realize the safety and societal benefits of self-driving vehicles.” They have their work cut out for them.

Self-Driving Cars Need to Be Trained

In January 2017, Elon Musk tweeted that a software update featuring Shadow mode was being pushed to all Teslas with HW2 Autopilot capabilities. Shadow mode lets the car’s autonomous driving AI “shadow” its human driver: it computes the decisions it would have made and compares them with the decisions the human driver actually made. Think of it as self-driving AI in training. The auto industry and several tech giants are working as fast as they can to make autonomous vehicles mainstream. To speed the process, they may need to share some data. Will they? My guess is: absolutely.
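
Here is a minimal sketch of the shadow-mode idea, assuming a loop in which the planner’s output is computed, compared against the human’s controls, and logged on disagreement. Every name and threshold below is hypothetical; Tesla has not published Shadow mode’s internals. The essential property is that the AI’s output never reaches the actuators.

```python
from dataclasses import dataclass

@dataclass
class Controls:
    steering_deg: float  # steering-wheel angle
    brake_pct: float     # brake pressure, 0-100

disagreements = []  # stands in for telemetry sent back for fleet learning

def ai_policy(speed_mph: float, gap_m: float) -> Controls:
    """Toy stand-in for the car's planner: brake hard when the gap
    to the vehicle ahead gets short relative to speed."""
    brake = 80.0 if gap_m / max(speed_mph, 1.0) < 1.0 else 0.0
    return Controls(steering_deg=0.0, brake_pct=brake)

def shadow_step(speed_mph: float, gap_m: float, human: Controls,
                steer_tol: float = 5.0, brake_tol: float = 20.0) -> None:
    ai = ai_policy(speed_mph, gap_m)  # what the AI *would* have done
    # Keep only meaningful disagreements; each becomes a labeled example
    # of "the human did X where the AI would have done Y."
    if (abs(ai.steering_deg - human.steering_deg) > steer_tol
            or abs(ai.brake_pct - human.brake_pct) > brake_tol):
        disagreements.append((speed_mph, gap_m, ai, human))
    # Note what is absent: nothing here ever touches the actuators.

# One frame: the human coasts where the AI would have braked -> logged.
shadow_step(speed_mph=60.0, gap_m=30.0, human=Controls(0.0, 0.0))
print(len(disagreements), "disagreement(s) logged")  # 1
```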

Fear and Assessment of Risk

Some people are afraid to fly. When you point out that flying is the safest form of travel by several orders of magnitude, the response is always some version of, “But when a plane crashes, everyone dies.” Human beings are not very good at assessing risk. If you don’t have a gas pedal, a brake pedal, or a steering wheel, and your car crashes, you will feel helpless and out of control. And you may die. But, by the numbers, tens of thousands of people will not die or be injured, because semi-autonomous driving and ultimately fully autonomous driving will be much safer than pure human driving. Some will counter that it’s cold comfort if you’re the one who is killed or injured, no matter how rare that is. I agree. But if you were going to make a policy decision for our society at large, you would have to agree, by the numbers, that saving tens of thousands of lives and preventing millions of injuries is a worthy endeavor.

I’m pretty sure that before 2030, if you are under the age of 25 or over the age of 70, you will need a special permit to manually drive a car. I’m also pretty sure that you will not be allowed to manually drive on certain streets and highway lanes, because you will pose too great a threat to the caravans of autonomous vehicles on those roads.

With any luck, the fear-mongers and bureaucrats will get out of the way, and we will all be much safer sooner.
