University Assignment: Ethical Analysis of Self-Driving Car Crashes

Case Study

Summary
This case study analyzes the ethical complexities surrounding a crash involving a self-driving car, specifically a Google-modified Lexus SUV. The accident, which occurred in 2016, raises critical questions about assigning responsibility when autonomous vehicles are involved in collisions. The study explores ethical frameworks such as utilitarianism and deontology to evaluate the moral implications of such incidents, considering factors like the car's decision-making process, sensor functionality, and the role of backup systems. It examines the impact on passengers, car manufacturers, and the potential for algorithmic biases. The analysis further investigates options for mitigating such accidents through improved coding, backup mechanisms, and human intervention, ultimately aiming to identify the most ethical and practical solutions for the future of autonomous vehicles. The assignment concludes by emphasizing the importance of ethical considerations in the development and deployment of self-driving technology.
Who is Responsible When a Self-Driving Car Crashes?
Name of the Student
Name of the University
Author Note
Doing ethics technique
1. What is going on?
On Valentine's Day 2016, one of Google's self-driving cars, a modified Lexus SUV, caused a crash. While on its journey, the car detected a pile of sandbags surrounding a storm drain and moved to avoid them. It avoided that hazard successfully, but within a few seconds it collided with the side of a public bus (Iozzio, 2016). Although this was not the first crash of the project, it was the first in which the fault lay with the vehicle rather than with a human. The incident raises a major ethical issue: who is to be blamed when an autonomous vehicle crashes? According to the utilitarian theory of ethics, whether an action is morally right or wrong depends entirely on its effects (Hayry, 2013). Since the effect of the crash was negative, it can be argued that accidents caused by driverless cars are unethical (Thierer & Hagemann, 2015).
2. What are the facts?
According to the deontological theory of ethics, the morality of an action is based on adherence to certain rules. The driverless SUV was able to detect the sandbags surrounding the storm drain and changed its course accordingly, so its sensors were working correctly. On the basis of deontology, it can therefore be said that the car was following its commands as given. Furthermore, the car was able to sense its proximity to the bus. The test driver in the Lexus saw the bus but assumed its driver would slow down to let the SUV merge. This raises an important ethical question: if such incidents occur in the future, who is to be blamed for the harm caused? The self-driving pioneers, however, are starting to make a shift. Last year, Volvo declared that it would pay for any injuries or property damage caused by its fully autonomous IntelliSafe Autopilot system, which is expected to debut in the company's cars by 2020 (Iozzio, 2016). The system will incorporate multiple backup systems so that a human driver never needs to intervene, which would largely eliminate the risk of the human driver being at fault. It will also incorporate safety measures so that, even if the system fails, the car can bring itself to a safe stop. In the present case, however, it remains difficult to identify who should be blamed for the accident (Bonnefon, Shariff & Rahwan, 2015).
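The "safe stop" behaviour described above can be illustrated with a minimal sketch. This is a hypothetical simulation, not Volvo's actual IntelliSafe implementation; the function name, deceleration rate, and time step are all assumed values chosen for illustration.

```python
def safe_stop(speed_mps: float, decel_mps2: float = 3.0, dt: float = 0.1):
    """Simulate a backup controller braking the car to a standstill.

    Returns the speed trajectory (m/s) sampled every `dt` seconds.
    """
    trajectory = []
    while speed_mps > 0.0:
        # Apply a constant, moderate deceleration until fully stopped.
        speed_mps = max(0.0, speed_mps - decel_mps2 * dt)
        trajectory.append(round(speed_mps, 2))
    return trajectory
```

However fast the car starts, the trajectory decreases monotonically to zero, which is the essential guarantee a backup system must provide: the fallback always terminates with the vehicle stopped.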
3. What are the issues?
The incident highlights that if a self-driving car is not equipped with a proper backup system, a crash can cause serious injuries to its passengers. In this case, although the self-driving car could sense its proximity to the bus, it failed to make the correct decision; an incorrect decision, or an incorrect command, led to the crash. The crash could also have been avoided if the car had a proper emergency braking system: data from the Insurance Institute for Highway Safety indicate that crash-avoidance braking can reduce rear-end collisions by about 40 percent. By the ethical theory of virtue, the key element of ethical thinking is an individual's character rather than the rules emphasized by deontology (Willis, 2014). The issues associated with the crash of a driverless car are particularly complex because it is very difficult to determine who should be blamed for the accident.
4. Who is affected?
This question identifies the stakeholders affected by the incident. When a self-driving car crashes, a number of parties are affected: above all the passengers of the car, and along with them the company that manufactures it. Because a self-driving car requires no human intervention, the passengers are harmed through no fault of their own (Crockett, 2013).
This leads to the ethics analysis:
5. What are the ethical issues and implications?
When a driverless car crashes, an ethical issue arises over who should be blamed: the carmaker or the car owner. The algorithm that controls a driverless car should be embedded with moral principles to guide its decisions in situations of unavoidable harm. In this incident, if the bus had slowed down to make way for the SUV, the crash could have been avoided; the fault therefore does not lie entirely with the driverless car. The bus driver may also be at fault, as he could have avoided the accident by making way for the SUV. It is difficult to say whether the bus driver saw the SUV, but if he did, he presumably also anticipated the collision. The ethical dilemma is therefore who should be blamed for the crash: the driverless SUV or the bus (Hevelke & Nida-Rümelin, 2015)? Blaming the car owner for such accidents would run contrary to the deontological theory of ethics (Goodall, 2014), and if owners were held liable, people would stop buying such cars and the concept of the driverless car would eventually die out. On the other hand, blaming the carmaker would expose the whole company to reputational risk. Car manufacturers therefore need to put proper safety requirements in place for such incidents.
6. What can be done about it?
First, it is necessary to ensure, as far as possible, that driverless cars do not cause accidents. To that end, the algorithm controlling a driverless car should be coded intelligently enough to avoid loss of human life even in cases where an accident is inevitable. It is not ethical to blame the car owner for the accident, as the operation of a driverless car is not in his hands. The carmaker can be blamed, since it is the company's responsibility to ensure correct decision-making by the car. In this case the test driver might also have been blamed, but it was the driverless car itself that changed course into the centre lane to avoid the pile of sandbags, and this manoeuvre led to the crash with the bus. Therefore, the main remedy is to code the car's algorithms properly so as to prevent avoidable accidents, of which this particular case was one (Lin, 2016).
7. What options are there?
There are two main options for avoiding such accidents. The first is to code an algorithm that ensures correct and rapid decision-making in situations like this one. The second is to fit a proper backup mechanism, such as an emergency braking system. Accidents caused by a machinery or decision-making fault in a driverless car could further be avoided through human intervention: if a human occupant is given control of the car when required, such incidents may be prevented. However, if an accident occurs in that situation, it is the car owner who will be blamed (Bonnefon, Shariff & Rahwan, 2016).
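These options can be sketched as a simple decision hierarchy. The function, its inputs, and the response labels below are purely hypothetical, chosen to illustrate the ordering of fallbacks rather than any real vehicle's control logic.

```python
def respond(hazard: bool, algorithm_confident: bool, human_ready: bool) -> str:
    """Choose a response to a road hazard, preferring automated handling."""
    if not hazard:
        return "continue"
    if algorithm_confident:
        # Option 1: the driving algorithm makes a quick, correct decision.
        return "maneuver"
    if human_ready:
        # Human intervention: hand control to the occupant
        # (in which case the owner bears responsibility).
        return "handover"
    # Option 2: the backup mechanism, e.g. emergency braking.
    return "emergency_brake"
```

The ordering matters ethically as much as technically: each step down the hierarchy shifts responsibility, from the carmaker's algorithm, to the human occupant, to the last-resort braking system.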
8. Which option is best – and why?
With the emergence of new technology, it has become possible to design and implement a fully autonomous car. Proper research into the causes of failed test drives can be used to redesign the algorithm that controls the car's decision-making and actions, which will help avoid accidents caused by incorrect decisions. Alongside this, car manufacturers should also provide proper backups for safe driving, so that the passengers can survive if an accident does occur. The use of an emergency braking system can likewise help avoid certain accidents (Litman, 2014).
References
Bonnefon, J. F., Shariff, A., & Rahwan, I. (2015). Autonomous vehicles need experimental ethics: are
we ready for utilitarian cars?. arXiv preprint arXiv:1510.03346.
Bonnefon, J. F., Shariff, A., & Rahwan, I. (2016). The social dilemma of autonomous
vehicles. Science, 352(6293), 1573-1576.
Crockett, M. J. (2013). Models of morality. Trends in cognitive sciences, 17(8), 363-366.
Goodall, N. (2014). Ethical decision making during automated vehicle crashes. Transportation
Research Record: Journal of the Transportation Research Board, (2424), 58-65.
Hayry, M. (2013). Liberal utilitarianism and applied ethics. Routledge.
Hevelke, A., & Nida-Rümelin, J. (2015). Responsibility for crashes of autonomous vehicles: an ethical
analysis. Science and engineering ethics, 21(3), 619-630.
Iozzio, C. (2016). Who's responsible when a self-driving car crashes?. Retrieved from https://www.scientificamerican.com/article/who-s-responsible-when-a-self-driving-car-crashes/#
Lin, P. (2016). Why ethics matters for autonomous cars. In Autonomous Driving (pp. 69-85). Springer
Berlin Heidelberg.
Litman, T. (2014). Autonomous vehicle implementation predictions. Victoria Transport Policy
Institute, 28.
Thierer, A., & Hagemann, R. (2015). Removing roadblocks to intelligent vehicles and driverless
cars. Wake Forest JL & Pol'y, 5, 339.
Willis, J. E. (2014). Learning analytics and ethics: A framework beyond utilitarianism. Educause
Review.