
Why Self Driving Cars are an Incredibly Bad Idea

A Short Fable

A long time ago, in a faraway kingdom, let’s call it Autatia, a powerful mage wanted to help his fellow Autatians. One day, to their amazement, after the mage had cast his most powerful spell, the Autatians witnessed thousands of pants drifting down from the sky – from baby-sized to XXXL. When worn, these pants would make them fly!

The lifestyle of the Autatians changed dramatically. They promptly sold all their horses. Why use horses if the pants could get them everywhere faster? The King’s soldiers got rid of their swords. In a fight, their magical pants let them fly over their enemies and drop rocks from the sky. The kingdom blossomed. The imperial armies ruled the skies, and soon the previously struggling kingdom had taken flight.

One day, however, the powerful mage was overcome by two necromancers, who cast their spells at a congregation of magi. The pants suddenly and catastrophically stopped working. Surprised Autatians fell from the sky like sharks in a B-movie. For those countrymen who managed to survive, life became a lot harder. They had no horses or swords. Young Autatians, who had never known life without flying pants, had no way to get from point A to point B. The kingdom disintegrated. Their enemies set upon them, and everybody and their pants were killed.

It’s Safer, Or Is It?

In some ways a self-driving car seems like a dream come true. What could be better than punching in a destination, sitting back and letting the car take you there? Plenty. In theory, self-driving cars would be safer because they remove the human element, which is prone to mistakes caused by distraction and plain laziness.

This is the dream. Three decades from now, when AI has advanced far enough, it might even be quite possible.

But are today’s self-driving cars really safe? A study spanning three years, from 2012 to 2015, found that the accident rate for autonomous cars was five times higher than for conventional vehicles.

For the Love of Driving

While there does seem to be an upside to the idea that the roads will eventually be safer if computers control our means of transportation, the biggest problem with self-driving cars is that they take us almost completely away from the driving experience itself.

Technological developments designed to make us safer may well do so, but they also take us a step away from what it means to own a car and to drive a car. If getting somewhere is all you care about, you may be fine with a car that drives itself. But if you grew up loving cars and driving the great American roads, you will understand how this automation takes away the enjoyment of driving itself.

Full or Semi-Autonomous?

In 2014 the Society of Automotive Engineers defined six levels of self-driving autonomy in cars (a short code sketch of the taxonomy follows the list):

  • Level 0: No autonomy; the driver is in complete control.
  • Level 1: Minor driver assistance, like speed or steering control, but not both. An example is adaptive cruise control, where the car can brake by itself. A blind spot monitor or warning device is also a pretty cool idea, and something that really could prevent an accident even if you are a careful, attentive driver.

A concern is that all this safety equipment may give a false sense of security and make people pay even less attention than they already do.

  • Level 2: Speed and steering assistance at the same time. The car still expects you to sit at the wheel. It can take over for short periods of less than a minute, but higher-grade functions, like merging into a lane, are not possible.

Even up to this level we can still see a benefit. Features like lane-departure warning, which keeps us in our lane, or front and rear collision warning systems, could prevent late-braking impacts. But they could also encourage us to pay less attention and rely on the technology instead of embracing our responsibility on the road and keeping our minds on what we are doing with our cars.

  • Level 3: The car does most of the driving but hands control back to the driver if it runs into a situation it cannot handle.

Good heavens, this is a bad idea. The driver has been watching a movie up to that point; suddenly the autopilot cuts out and, with no situational awareness of the traffic or the road, the driver has to take control of a situation the car itself could not handle. This is as big a recipe for disaster as an orange-colored buffoon in the White House. An excellent article on what can go wrong when an autopilot disengages, and how automation can set us up for disaster, can be found here.

  • Level 4: The car does all of the driving and will stop itself in an emergency. Driver controls might still be present, and the driver can take over and drive manually should they wish.

  • Level 5: At this level the human is a complete passenger. The steering wheel and all driver controls are absent.
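
For readers who prefer code to prose, the taxonomy above can be restated as a short sketch – plain Python, not any real automotive API; the enum names and the helper function are my own shorthand rather than official SAE wording:

    from enum import IntEnum

    class SAELevel(IntEnum):
        """The six SAE levels of driving autonomy, as summarized in the list above."""
        NO_AUTOMATION = 0       # driver does everything
        DRIVER_ASSISTANCE = 1   # speed OR steering assist, never both
        PARTIAL_AUTOMATION = 2  # speed AND steering assist; driver stays at the wheel
        CONDITIONAL = 3         # car drives, but hands control back when it cannot cope
        HIGH_AUTOMATION = 4     # car drives and can stop itself; manual controls optional
        FULL_AUTOMATION = 5     # human is purely a passenger; no driver controls at all

    def driver_must_stay_alert(level: SAELevel) -> bool:
        """Below Level 4 a human must be ready to take over at any moment –
        which is exactly the hand-over problem described at Level 3."""
        return level <= SAELevel.CONDITIONAL

    for level in SAELevel:
        print(f"Level {level.value} ({level.name}): alert human required = {driver_must_stay_alert(level)}")

The dividing line sits between Levels 3 and 4: everything below it still assumes an alert human, which is where the trouble described above begins.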

Who to Kill?

A self-driving car is coming down the street. The brakes have failed. A group of kids flocks into the street, right in front of the car. The computer is faced with a choice: swerve into a shop and (probably) kill the occupant, or plough a path through the kids? Which choice is ethically right?

An executive from Mercedes recently stated that their cars will protect occupants, not bystanders. People would rather buy cars that protect them and their children. If no regulations to the contrary are enforced by the government, then good luck to all the pedestrians, cyclists and kids playing on the side of the road.

In effect, developers are training cars who to kill – a very concerning prospect, any way you look at it.
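
No manufacturer publishes its decision logic, so the following is a purely hypothetical Python sketch of how an "occupants first" policy could collapse into a few lines of code; every name and number here is invented for illustration only:

    from dataclasses import dataclass

    @dataclass
    class Outcome:
        occupants_harmed: int   # people inside the car
        bystanders_harmed: int  # people outside the car

    def choose_maneuver(options: dict) -> str:
        # "Occupants first": pick the maneuver that harms the fewest people
        # inside the car, ignoring everyone outside it.
        return min(options, key=lambda name: options[name].occupants_harmed)

    # The brakes-have-failed scenario from above, in these toy terms:
    scenario = {
        "swerve_into_shop": Outcome(occupants_harmed=1, bystanders_harmed=0),
        "stay_on_course":   Outcome(occupants_harmed=0, bystanders_harmed=5),
    }
    print(choose_maneuver(scenario))  # -> stay_on_course

Change one line – the key function – and the car "chooses" differently, which is exactly the point: somebody has to write that line.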

The Computer You Sit Inside Can Be Hacked

DEFCON, the world’s largest hacking conference, added a car-hacking village to its line-up in 2015. At the conference, Charlie Miller and Chris Valasek managed to take remote control of a Jeep from 10 miles away and stop it on a highway. This caused an uproar in motor-manufacturing ranks. Several models across several brands were recalled, and self-driving car hacking became a thing. And it is a risk.

Below is a video of how the Segway MiniPro can be hacked using its paired mobile app:

The traffic across a whole country could be disrupted – yet another way we are setting ourselves up technologically for a bigger fall. What if all cars were programmed to cause a collision at a certain synchronized moment?

Hackers will look for ways to make money from this. A possible scenario is installing ransomware that brings a car to a stop, or speeds it up past a safe point, unless money is wired.

Terrorists could load a car with explosives and program it to drive, unmanned, to a strategic target.

As with encryption algorithms, governments could insist on a back door in the automotive software, which could be exploited if the details ever leaked out.

Liability

The issue of liability will also be huge, and it may require a complete rethinking of the insurance industry as we know it. Who would be at fault in an accident? We know accidents will still happen, because there has never been a perfect machine. And what happens when a self-driven car is in an accident with a human-driven car? Who is liable then?

With big businesses like the insurance industry, the auto industry and technology companies all involved, you may have a really tough time getting justice on damages and health claims when you do have an accident.

The Transition

We may have self-driving cars sooner rather than later, with some manufacturers saying they will be on the road as soon as 2020. There are a lot of issues to work out, and hopefully they will be resolved before self-driving cars arrive.

The transition period will be interesting when you have both types of cars on the road.

The purists will want to keep control of their driving experience, and perhaps they will always want that control.

If you are going to have a self-driving car, you might as well take the bus.

Sources

http://www.cracked.com/blog/why-self-driving-cars-are-tremendously-dumb-idea

https://www.theguardian.com/technology/2016/aug/28/car-hacking-future-self-driving-security
