Why Self-Driving Cars Must Be Programmed To Kill

Years ago, self-driving cars were a dream. Today, they're almost a reality. Within a few years, these vehicles could fill the streets and might even become more popular than ordinary cars.

Self-driving cars promise to be cleaner and more fuel-efficient, which could appeal to a broad audience. Safety is the one major open question: in the wrong circumstances, the lives of passengers or pedestrians could be in grave danger.

Should self-driving cars be programmed to kill? That's the big question. Read on to learn more about self-driving cars and how that question might be answered.

What Are Self-Driving Cars?

Imagine owning a high-tech car that drives you to work every day. All you do is get ready and hop in. You don't lift a finger, nor do you need a driver. Your car takes complete care of your commute. Sounds good, right? That's the self-driving car for you.


These high-tech machines are designed to take driving out of human hands. What's more, they handle the task much like professional drivers.

Self-driving cars go by numerous names. They’re called robocars, autonomous vehicles, robot cars, and driverless cars.

Think about the comfort we already get from partially automated cars, then compare it to a fully autonomous car. You'll find that self-driving cars are miles ahead of ordinary cars in many areas.

Ordinary cars already offer several advanced technologies, including adaptive cruise control, automatic overtaking, and parallel-parking assistance. With such features, drivers can briefly hand parts of the driving task over to the computer.

But despite these sophisticated features, self-driving cars are still miles ahead. They can sense their environment and drive with little or no human help, using advanced software and sensors to control the vehicle, navigate, and deliver passengers to their destinations.

Will Fully-Autonomous Vehicles Be Safe?

Every self-driving automaker knows that safety is the priority. If self-driving cars aren't safe, regulators won't consider them roadworthy, and the effort and resources automakers have invested will be in vain.

If self-driving cars prove safer than ordinary cars, they'll attract far more interest. The 2018 Uber self-driving test crash that killed a pedestrian in Tempe, Arizona is still fresh in people's minds, even though it happened years ago.

At the moment, automakers are striving to deliver a level-5 automobile: a fully autonomous car that can drive without any human intervention. They are getting closer to a safer, highly sophisticated fully autonomous vehicle, so let's hope they achieve a real breakthrough in safety.

Note – Self-driving vehicles include both cars and trucks, and the technology keeps expanding, so we may see self-driving heavy-duty trucks before long.

Self-Driving Cars – Layers Of Autonomy

The level of driving automation differs from one automobile to another. SAE International has grouped these levels on a scale of 0 to 5.

The bottom line: 0 means no automation, while 5 is the highest level of automation. At the highest level, self-driving vehicles need no human intervention at all to get passengers to their destinations.

Here's what each level of autonomy means:

  • Level 0 – Manual control. The human driver performs all driving tasks, such as braking, accelerating, and steering. There's no automation in such vehicles.
  • Level 1 – Driver assistance. The car controls a single feature, such as automatic braking or cruise control, but only one at a time.
  • Level 2 – Partial automation. Two automated functions can run simultaneously, such as steering while accelerating, but a human driver must still monitor the road and ensure safety.
  • Level 3 – Conditional automation. These vehicles can sense their environment and handle the driving and most safety-related functions, but a human driver must stay ready to take over when the system requests it.
  • Level 4 – High automation. Such cars are fully autonomous within specific driving conditions or areas.
  • Level 5 – Full automation. Cars at this level can drive themselves in all situations and never need human assistance.
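For readers who think in code, here is a minimal sketch of that scale as a Python enum. The names and the helper function are purely illustrative and not part of any standard or automaker API.

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    """The 0-5 driving-automation scale described above (illustrative names)."""
    NO_AUTOMATION = 0           # human does all the driving
    DRIVER_ASSISTANCE = 1       # one assist feature at a time, e.g. cruise control
    PARTIAL_AUTOMATION = 2      # steering and speed together, driver still supervises
    CONDITIONAL_AUTOMATION = 3  # car drives, human must take over when asked
    HIGH_AUTOMATION = 4         # fully self-driving within specific conditions
    FULL_AUTOMATION = 5         # self-driving everywhere, no human needed

def driver_must_supervise(level: AutomationLevel) -> bool:
    """Levels 0 through 2 still need a human watching the road at all times."""
    return level <= AutomationLevel.PARTIAL_AUTOMATION

print(driver_must_supervise(AutomationLevel.PARTIAL_AUTOMATION))  # True
print(driver_must_supervise(AutomationLevel.FULL_AUTOMATION))     # False
```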

How Self-Driving Vehicles Work

Uber, Google, Tesla, Nissan, Waymo, and others have delivered a range of impressive self-driving cars. Nevertheless, none of them had reached level 5 as of the time of writing.

Waymo currently leads the field and is closer than other automakers to developing a level-5 automobile. That progress is significant, and we can hope it translates into safer, more robust self-driving cars.

Furthermore, each of these automakers has developed sophisticated technologies for its self-driving vehicles. Design details vary, but all of the vehicles rely on technology for building and maintaining an internal map of their surroundings.

These cars have a range of sensors on different parts of their bodies, enabling them to sense the environment.

Uber's famous self-driving prototypes, for example, used sixty-four laser beams together with other sensors to build their internal maps.

Google's prototypes, meanwhile, have used various approaches to build their internal maps at different stages, drawing on lasers, radar, sonar, and even high-powered cameras.

Once the data is gathered, the software processes it immediately and plots a path. The resulting instructions are then sent to the actuators, which control systems such as the steering, brakes, and accelerator.
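That loop, reading the sensors, updating the internal map, planning a path, and sending commands to the actuators, can be sketched in a few lines of Python. The class and method names below are hypothetical placeholders for illustration, not any automaker's real API.

```python
import time

class SelfDrivingPipeline:
    """Toy sense -> plan -> act loop, mirroring the description above."""

    def __init__(self, sensors, planner, actuators):
        self.sensors = sensors      # e.g. lidar, radar, camera objects (hypothetical)
        self.planner = planner      # builds the internal map and plots a path
        self.actuators = actuators  # steering, brake, and throttle controllers

    def step(self):
        readings = [s.read() for s in self.sensors]       # 1. sense the environment
        world_map = self.planner.update_map(readings)     # 2. maintain the internal map
        path = self.planner.plan_path(world_map)          # 3. plot a safe path
        self.actuators.execute(path)                      # 4. hand instructions to actuators

    def run(self, hz=10):
        # Real systems repeat this cycle many times per second.
        while True:
            self.step()
            time.sleep(1 / hz)
```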

The way self-driving cars operate is what makes the technology behind them so notable. Isn't it something to watch an autonomous car navigate obstacles and obey traffic rules? It sure is.

Avoiding obstacles and obeying traffic lights is possible thanks to highly sophisticated technology, including obstacle-avoidance algorithms, hard-coded traffic rules, smart object discrimination, and predictive modeling.

Object discrimination lets these cars tell the difference between, say, a motorbike and a bicycle without human assistance, which helps them prevent accidents to a certain degree.

A fully autonomous vehicle doesn't require human assistance at all; some level-5 designs may not even include a steering wheel.

Will Self-Driving Vehicles Change The World?

The technology behind self-driving vehicles is too significant to ignore. They will undoubtedly affect lives and change the world; the only question is how.

Automakers in the self-driving car business view it as a promising venture, which is why they're willing to invest massively without worrying about short-term profit.

Self-driving automakers are optimistic about producing a true level-5 automobile before long. Once those vehicles exist and their safety concerns are addressed, the automakers will smile all the way to the bank.

Self-driving cars will also change how passengers and car owners commute. With such a vehicle, you simply get ready for work and take the back seat; you can nap, read, use your devices, or handle other tasks while the car does the driving.

Individuals with disabilities who struggle to get a driver's license approved will also benefit from these high-tech automobiles. They'll be able to commute wherever they want with ease, since level-5 driverless cars don't need human assistance at all.

Safety is another reason the future of self-driving cars looks promising. According to the US Department of Transportation's NHTSA traffic fatality projections for 2017, 37,150 people died in motor vehicle traffic crashes that year.

NHTSA's research attributes about 94% of serious crashes to poor choices or human error; in other words, drivers who were distracted, impaired, or made some other critical mistake. Taken at face value, that would put human error behind roughly 0.94 × 37,150 ≈ 34,900 of those deaths.

Though autonomous vehicles aren't entirely safe as of the time of writing, they could change that narrative once the right safety measures are in place. Since they can't be distracted or drunk, self-driving cars could prevent or reduce the accidents those failings cause.

The economic impact would also be significant if autonomous vehicles lower the number of crashes each year. According to NHTSA, injuries and deaths from motor vehicle accidents cost the US economy billions of dollars annually.

What about traffic congestion? It should drop noticeably once a reasonable number of self-driving cars hit the road, or once they outnumber ordinary cars. That means less time spent commuting, time people can put toward more productive uses.

Is Hacking A Self-Driving Car Possible?

Cybersecurity is everybody's concern, and no one is entirely safe from hacking. As long as you're using an internet-enabled device, attackers may find a way in, and how they slip past the defenses of so many devices often remains a mystery.

So, is hacking a self-driving car possible? It might sound surprising, but yes, it is. All it takes is a competent hacker, and those aren't in short supply; they just have to find a loophole that lets them take control of the vehicle's systems.

The consequences of a hacked self-driving vehicle could be grave for passengers and owners, and the automaker will take part of the blame and see its name tarnished.

Imagine being in a hacked vehicle, with the hacker having the power to control the vehicle’s acceleration, steering, and other systems. In this case, anything can happen.

Can automakers keep their vehicles from being hacked? The answer is yes. Not every system is easy to compromise, because cybersecurity experts are working just as hard to make life difficult for attackers.

One reasonable step automakers can take to ensure that their self-driving cars are safe from hacking is to change their security architecture. There’s a need for them to partner with security experts and government agencies to address the security concerns of their vehicles.

Automakers can adopt a unified security architecture verified by independent security experts, while government agencies set guidelines that every automaker must follow.

So cybersecurity experts clearly have a lot to do to ensure self-driving cars' safety and success. The information assets and critical systems of these cars need protection at all times.

Why Should Self-Driving Vehicles Be Programmed To Kill?

The technology behind these machines is commendable. These cars can do virtually everything that a professional driver does. Additionally, they’re cleaner, safer, and more fuel-efficient.

However, all the hype is nothing if safety is still a concern. What good is a car that is prone to accidents?

The truth is that self-driving cars aren't entirely safe. There are ethical questions about how they should act when an accident becomes unavoidable. What decision should they make? That's the question.

Should the passenger or owner be sacrificed, or should the vehicle be allowed to plow into innocent pedestrians? You can see how dicey the question is.

Automakers have big decisions to make about the algorithmic morality of self-driving cars, bearing in mind that those decisions will shape how widely these cars are accepted.

Now let's be logical: would you spend your hard-earned cash on a vehicle that's programmed to sacrifice you? That's the situation we find ourselves in with self-driving cars.

So the issue of algorithmic morality is a serious one. Thankfully, Jean-Francois Bonnefon of the Toulouse School of Economics in France has made things easier for everyone: he and a group of colleagues designed a survey on the issue.

According to the researchers, the public survey provides insight into the moral algorithms people want in self-driving vehicles. Bonnefon and his team believe there's no single right or wrong answer; everyone's opinion counts.

Using the methods of experimental ethics, Bonnefon and his colleagues gauged public opinion on the issue. They posed several ethical dilemmas and asked people how they would react in each scenario.

Here's an example. You've bought an autonomous vehicle. One day, while the car is driving you to work, an unavoidable situation arises: the vehicle is suddenly heading toward a group of roughly ten pedestrians.

Assume there's nothing the car can do other than swerve into a wall or continue straight and hit the ten people.

Now here's the tricky part: hitting the wall will kill the passenger, the car's owner.

On the other hand, all ten pedestrians will die if the car doesn't swerve.

In this scenario, what would you do?

In the survey, most people preferred sacrificing the passenger to spare the ten pedestrians. Yet only a handful of people said they would be comfortable spending their money on a vehicle programmed to sacrifice its own occupants.
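To make the phrase "algorithmic morality" concrete, here is a deliberately simplified, hypothetical sketch of a purely utilitarian rule: choose whichever action is expected to cost the fewest lives. No automaker has said it uses anything like this; it only illustrates the kind of rule the survey respondents were reacting to.

```python
from dataclasses import dataclass

@dataclass
class Action:
    """One possible maneuver and its expected human cost (illustrative numbers)."""
    name: str
    expected_passenger_deaths: int
    expected_pedestrian_deaths: int

    @property
    def total_deaths(self) -> int:
        return self.expected_passenger_deaths + self.expected_pedestrian_deaths

def utilitarian_choice(actions):
    """Pick the action with the fewest expected deaths, regardless of who dies."""
    return min(actions, key=lambda a: a.total_deaths)

# The scenario from the survey: swerve into the wall or stay on course.
swerve = Action("swerve into wall", expected_passenger_deaths=1, expected_pedestrian_deaths=0)
stay = Action("stay on course", expected_passenger_deaths=0, expected_pedestrian_deaths=10)

print(utilitarian_choice([swerve, stay]).name)  # -> "swerve into wall"
```

A rule like this always sacrifices the passenger in the scenario above, which is exactly the outcome most respondents endorsed in the abstract yet would not accept in a car they had paid for.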

Bonnefon and his fellow researchers also varied the details to get a more robust picture, changing the number of pedestrians and whether the decision was made by the onboard computer or by the passenger.

From the survey, the team concluded that most people favor minimizing the death toll. Many, however, rejected the idea of autonomous vehicles being programmed to sacrifice their owners.

Now here's the shocker: most of the people who favored reducing the death toll by sacrificing the passenger supported the idea only as long as they weren't the ones in the vehicle.

Conclusion

Self-driving cars could change the world for good. But as they prepare to hit the streets in numbers, the question of algorithmic morality needs to be thrashed out. We're seeking a solution to road accidents, not a way to make matters worse.
