In My Opinion: Hit the Brakes – We’re Not Ready for Autonomous Vehicles

Imagine that as you drive down the road, you glance over at the driver next to you, but there is no driver. As confusion and panic start to overtake you, you notice the cameras and sensors on the car and think: cool, it’s a driverless vehicle! But do those feelings of confusion and panic disappear? Are you safe driving next to an autonomous car? Wade Trappe, IEEE Fellow and professor of electrical and computer engineering at Rutgers University, discusses his concerns about the push to bring fully automated vehicles to market quickly. Trappe believes researchers, industry, and governments need to do more research, testing, and planning before society widely adopts autonomous vehicles.

“My biggest concern is that while the technology is promising, for technologies like this where there are significant safety risks, one should proceed with caution. I am worried that people want to adopt this technology too early, before ‘all the kinks’ have been worked out. And there will be many kinks. Who is to say that, after a couple more years, all of the ‘corner cases’ have been mapped out?” says Trappe.

These concerns partially stem from the problems automated vehicles have experienced on public roads, including crashes and traffic violations. So far, automated vehicles have not measured up to the standards Trappe expects of the engineering profession. “When we think of safety requirements for certain engineering applications (e.g., the construction of a bridge), we want the reliability to go out several decimal places (i.e., 99.999% reliability),” says Trappe.

“I am a firm believer in conservative engineering—the designs need to have cautiousness in their decisions—after enough failures have happened, the engineers will go back to design-in fixes and more cautious decisions that say ‘when I encounter a situation where there are two possible choices to make, here’s how to choose the one with the least risk/danger,’” says Trappe.

One component of the issue is the gap between the technology’s current functionality and consumers’ expectations. Trappe believes that the industry should first focus on driver-assist technology rather than rushing into driver-replace vehicles, and that this difference needs to be emphasized to consumers. “We saw, with the Tesla incident in China, that drivers want to believe that they can use the ‘driver assist’ technology in a ‘fully automated (no driver needed)’ manner,” says Trappe.

Trappe thinks that fully automated vehicles can come safely to the market someday. But first, more preparation is needed, and engineers are crucial to that preparation, working in collaboration with governments and across the different industries involved.

More, and more reliable, in-vehicle signal processing is needed for vehicle automation, and this “signal processing is closely tied to more data and more trials. So, government needs to require more trials and higher requirements regarding quality. Meanwhile, industry needs to conduct more trials, collect more data, and conduct large-scale data analysis (where signal processing and machine learning come into play),” says Trappe.

So when will fully automated vehicles be ready to come to the market?

“I think the near future (1-2 years) is not realistic as technology needs time to develop, for all the various corner cases to be mapped out,” says Trappe.  “The technology will work eventually, but it needs more time… so my horizon would be more along the lines of 5-10 years. Perhaps there will be limited deployments sooner than that (e.g. deployments in cities that are naturally safer than other cities for such technologies to be tried out… for example, Manhattan has to be more complex than tiny-town USA…).”

Trappe believes that the technology is still developing and a slow integration into mainstream use is critical. This would allow engineers to build on early successes, and failures, and to develop favorable public opinion and buy-in of the technology.

“I think deployment will go in stages, with deployments in ‘easier to succeed towns than other harder to succeed towns’. Industries like this can’t handle too many negative press stories, so they will need to deploy first in scenarios where they will succeed… this will allow them to collect more data that can ultimately allow them to deploy in slightly harder settings… which will then allow them to deploy in slightly harder… etc.” says Trappe. “The minimum requirement to go to market would be success in a handful of increasingly difficult cities, and showing over long enough time that the failure rate is extremely low, and risks are comparably low,” says Trappe.

In addition to engineering solutions to the safety issues of automated vehicles, Trappe notes that there are other factors that need to be addressed.

“There are all sorts of legal/insurance questions to work out—such as who is liable under different conditions? Such a technology needs to have the basic legal framework started (at least) before it can really be deployed,” says Trappe.

“We need to move with caution, collect larger data sets, share them as a community so that we can all have better ground truths upon which to build algorithms for making automated vehicular decisions. (An agency like NIST could host such data for everyone to use),” notes Trappe. “And, we need users who are itching to use this technology to understand that it is a ‘driver-assist’ and should be used in that way long after companies say it is ‘driver-replace’. This means solving one of the hardest problems that has plagued technological innovation: getting users to read the user manual! In this case, though, it is really important that we solve this problem. We are, after all, dealing with human lives.”

Read more from Trappe in his IEEE Signal Processing Magazine article, “No Need for Speed: More Signal Processing Innovation is Required Before Adopting Automated Vehicles.”



  • I believe there are at least two countries in which self-driving cars will not be deployed at a commercial scale, but for very different reasons: the US and India.

    In the US, the problem is that there are at least 250M registered vehicles and at least 350M registered firearms. The reason I bring firearms into the discussion is that cars and firearms both signify American freedom. There are many responsible gun owners and far more responsible drivers. These folks won’t like someone dictating terms regarding who is, and who is not, allowed to drive.

    The primary reason we are pushing for SDCs is to reduce (or altogether eliminate) labor costs (for Uber, SDCs are an existential matter, according to Travis Kalanick).
    A typical truck driver makes between $70K and $80K, and truckers losing jobs won’t be good for them or for the economy. Since driving is one profession that doesn’t even require a college degree, I am very concerned about the lack of a transition plan for these folks.

    Regarding the number of vehicles on the road: I think there are far too many, and no one is talking about a practical approach to reducing that number.
    It takes a lot of tax dollars to maintain and repair public roads. If humans are banned from driving (which is a must for the success of SDCs), many of us won’t be buying cars in the first place. That means no excise tax revenue, no insurance, no repairs/maintenance, and no gas. In that situation, I am not sure how towns, states, and even the federal government will find the resources to maintain the roads regularly enough to enable SDCs.

    In India, SDCs simply won’t work (according to the CEO of Maruti India; look it up online), as there are no lanes, no lane discipline where lanes do exist, and people, in general, do not follow the rules.

  • When pigs fly. Who can afford these mythical self-driving cars? What about real people, poor people, people who work on their own cars, people who just plain enjoy driving cars? What city has the extra money to spend on the infrastructure upgrades needed to help these cars? Would you fill the potholes first?

    No matter how many people die on the road today, any self-driving car statistics will add to that number, not replace it. One difference? The rich guy in the auto-car will die – even if he is otherwise smart, responsible, alert, and, had he been driving, would have avoided the accident. Yeah, maybe he could have been drunk too, but still he’s now dead rather than asleep at the curb somewhere. In today’s accidents, usually at least one of the people involved was at fault, and maybe Darwin was taking home his own.

    My “smart” phone updates itself every couple of days, not always for the better, and it is brand #2. When it no longer acts the way I expect, it doesn’t cause me to drive into a telephone pole. Complexity is increasing constantly (exponentially?) in today’s circuits and software. Have you not looked at the errata sheets of today’s microcontrollers and microprocessors? “Debug” is such a sweet and simple word until you’re faced with the swarm of locusts that loom in anything as sophisticated as the driverless car.

    ADAS is the best use of such technology. But it leaves the human to make the ultimate decision – the way it should be. Usually ADAS is mostly an annoyance, and often you end up just ignoring it, like the wife griping at you. What did that beep mean? What was that buzz? Let me look over here at this screen where it tells me… crash.

    I’m not too good at math, so help me. First, give me six 9’s of perfect operation (in any and all conditions). How many car-miles are driven every day in America? Then, how many accidents will that be? What portion will result in death? Note that the six 9’s is typically applied to server farms, where failure is essentially a user-noticeable delay in service, not an “oh, hit!” crash of a $100,000 vehicle.
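The back-of-envelope arithmetic this comment asks for can be sketched in a few lines of Python. The daily-mileage figure and the reading of six 9’s as “one failure opportunity per driven mile” are illustrative assumptions, not official statistics:

```python
# Back-of-envelope: what does "six 9's per mile" imply at US scale?
# Assumed figures for illustration (not official statistics):
DAILY_VEHICLE_MILES = 8.8e9   # assumes roughly 3.2 trillion miles/year / 365
FAILURE_RATE = 1 - 0.999999   # six 9's => about 1e-6 failures per mile

failures_per_day = DAILY_VEHICLE_MILES * FAILURE_RATE
print(f"Expected failures per day: {failures_per_day:,.0f}")  # ~8,800

# For comparison, the human-driver fatal-crash rate cited later in this
# thread (one fatal crash per 100 million miles) at the same daily mileage:
human_fatal_per_day = DAILY_VEHICLE_MILES / 100e6
print(f"Fatal crashes per day at the human rate: {human_fatal_per_day:,.0f}")  # ~88
```

Under these assumptions, even six 9’s of per-mile reliability still implies thousands of failures per day fleet-wide, which is the commenter’s point: reliability targets borrowed from server uptime don’t translate directly to road safety.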

    If you want a self-driving car, next time call a taxi, and try to get one where the driver is driving politely, paying attention to the road as he’s paid to do, and not arguing with his wife or otherwise pecking away on his cell phone.

    Leave the pigs to making bacon.

  • If transportation regulators are convinced autonomous vehicle control software is safe, why not employ it in passenger trains? Given that trains operate in a well-defined and controlled environment, with gentle curves and slopes, that should be the first step toward gaining wide acceptance of autonomous cars and trucks. I don’t foresee motorcyclists embracing the technology, however.

  • I have no particular expertise in this area, but as an aging driver, I welcome the developments and promise of driver-assist/autonomous driving. The article and all of the comments make legitimate points. But it seems to me that it will be a long while before autonomous capability will be usable “everywhere/anywhere”. Perhaps an important task for DOT/NTSB would be to develop criteria for zones where autonomous operation is deemed acceptable (with corresponding indicators to all drivers as to the bounds of these zones, and controls on those who may wish to enter and use them).

  • ~Every fifteen minutes, another person is killed by a human driver. Even if we deploy SDCs by morning, 40 more will die. Could we resolve this without further delay?

    • What reason do you have to believe SDCs will do any better? Would you feel safer stepping off the curb in front of one of them?

  • The consequence of requiring an excessively high level of validation before deploying this technology is that more lives will be lost. Human drivers are not all that safe, and we have become numb to the carnage that takes place daily on our roads. To save lives, autonomous systems need only be measurably better drivers, overall, compared with humans. It’s hard to say, but this threshold might be reached within just a few years.

    • The traffic safety statistics for the U.S. show that human drivers are actually astonishingly safe — one fatal crash for every 100 million miles (3.4 million hours — or 390 years of 24/7 continuous driving at a 30 mph average) and one injury crash for every 1.8 million miles. Show me the software-based consumer product that can get anywhere near that safe when operating in an environment as complex as driving on public roads. Anybody have a smart phone or personal computer that can get within orders of magnitude of those operating statistics without a hiccup?
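The commenter’s time-and-distance conversion is easy to check; the 30 mph average speed is the figure assumed in the comment itself:

```python
# Convert "one fatal crash per 100 million miles" into hours and years,
# using the comment's own assumed 30 mph average speed.
MILES_PER_FATAL_CRASH = 100e6
AVG_SPEED_MPH = 30.0
HOURS_PER_YEAR = 24 * 365  # 8,760

hours = MILES_PER_FATAL_CRASH / AVG_SPEED_MPH  # ~3.3 million hours
years = hours / HOURS_PER_YEAR                 # ~380 years of 24/7 driving
print(f"{hours/1e6:.1f} million hours, about {years:.1f} years of continuous driving")
```

The exact result is about 3.3 million hours and 380 years; the comment’s “3.4 million hours / 390 years” comes from rounding the hours up before dividing, so the figures agree to within rounding.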

    • Here is another problem: the difference between autonomous and automated vehicles. What you presented relates more to ADAS and automated vehicles.

      Concerning autonomous vehicles, I fully agree with Prof. Trappe. There is a difference between the companies’ interest in capturing the market and the public interest in increasing safety.

    • Your opinion suggests that you are neither a cyclist nor a pedestrian. Personal injury and deaths can be mitigated in car-on-car crashes, but what about the cyclist in the roundabout, or the baby carriage on the sidewalk near a busy street? In fact, your opinion suggests that you don’t think very carefully at all. That, or you stand to gain financially from early adoption. And as for becoming “numb to the carnage”, you speak strictly for yourself!

  • I see no item about road standards and how they will affect the operation. As soon as you get off the main road, all markings may be missing. I believe this will be a big problem, plus a cost.

  • The lawyers will determine when driverless cars hit the road. These vehicles will never be perfectly safe, so it will come down to the acceptable level of risk for both the owner and the manufacturer. Early adopters will face a storm of litigation and (bad) publicity. Five years seems very optimistic to me.

    • Problem with that approach: What about the cyclists and pedestrians who are mowed down and killed when these machines malfunction? Victims in car-on-car accidents have safety systems to protect them, and they may fare no worse than being pitted against human drivers. Humans are certainly not perfect, but being on a bicycle, I’d take my chances with a -3 sigma human driver before I would a robot.

  • I fully agree with the opinion stated in the article. I am a software engineer and have been one for a long time. Software bugs are a reality. A software product released to the market without them is the exception rather than the rule. The more complex the product, the less likely it is to be bug-free. At most, it can be expected that the software product will operate correctly in the most common scenarios. An autonomous vehicle is a complex system, and it will take a lot of time to perfect. Unlike many other software products, bugs in this type of software can cost people their lives. Therefore, I advise caution rather than haste, especially when an autonomous car may hurt people other than the ones in it.
    One mitigation of this risk is the driver-assist mode. The driver is expected to be in his seat, supervising the operation of the autonomous car and intervening whenever he believes the car is not behaving correctly. The problem with this mode of operation is that it is not particularly appealing to the user. After all, the main reason for driving an autonomous car is to free our time to do something else: sleeping, reading, working, playing, or just relaxing, letting our minds wander. The driver-assist mode does not do that, so the benefits to the user are not that great, while the risk of treating the car as driver-replace is big. Many will be tempted not to supervise its operation and to let it do the work by itself, which at this stage may be dangerous.
    To summarize, personally, I will wait until I am convinced the technology is mature enough and safe before adopting it.

    • A voice of reason in the wilderness – thank you! The motivation for autonomous cars can be found by asking a simple question: “Who will benefit?” The answer, of course, is the mega-corporations, and shills like Elon Musk. I’d propose this test: put the CEOs of all of these companies in a situation where their lives are at stake and depend upon the quality of the machines they propose others buy and use.

  • One element that I see missing in most treatments of this subject is the psychological. Like it or not, there are a large number of immature, juvenile drivers out there (some 50 years old or more) who don’t want any sort of real “control” over their habits or actions. One of the major contributors to fatal accidents is excessive speed. The technology exists to remedy this problem, but no one seems to be interested. One reason might be the subsequent impact on the auto industry.
