Ethical Dilemma: Deontological Ethical Theory


Deontological ethical theory focuses on rights, duties, obligations, and rules (Brinkman & Sanders, 2013). The theory does not judge actions by their consequences, because a focus on consequences can allow minorities to suffer horribly and fails to treat the moral worth of every human as equal. Alexander & Moore (2007) describe it as a duty-based theory: what makes a choice right is its conformity with a moral norm, even if it does not lead to maximum happiness. Deontology is also referred to as Kantianism, since in its most widely recognized form it was developed by Immanuel Kant in the late 18th century.

According to Tavani (2015), deontology is concerned only with the morality of actions, not with the happiness they produce. In deontology, moral decisions should be based on one's duties and on the rights of others. Its guiding principle is to treat others as you would want to be treated.


Deontology is of two types:

Rule deontology: According to Tavani (2015), in rule deontology an action is evaluated by whether it is consistent with goodwill, and that consistency is established by applying the categorical imperative.

Act deontology: Act deontology applies when two duties conflict; the individual situation must then be examined to determine which duty outweighs the other (Tavani, 2015).

So, the dilemma here is whether a group of homeless people should be killed or the passenger's life risked instead. Autonomous vehicles will soon be moving around us everywhere, and since they are not human, different laws and moral theories must be programmed into them to handle emergency situations in which the vehicle must choose who will be killed or harmed. According to Thornton et al. (2016), automated vehicles need to satisfy societal expectations, such as accident avoidance and adherence to traffic laws, in order to mix with human behavior in traffic. Kirkpatrick (2015) likewise discusses the moral dilemmas of self-driving cars and how their decision-making in accident situations can affect everyone, including soft targets and defenseless pedestrians.

Bonnefon et al. (2016) surveyed a group of people about a scenario in which, in an unavoidable accident, a self-driving car must either kill its passenger or a group of ten pedestrians. The majority chose that the car should save the pedestrians rather than the passenger, making utilitarianism their moral choice. However, when the same group was asked whether they would buy such a car, most declined, saying it would not be safe for them because the car would try to save the people outside rather than the person inside. Nahra (2013) similarly found through his research that most people agree that killing an innocent person is wrong, but that if taking innocent lives is unavoidable, they would try to save the greatest number of lives possible.

So, as deontological ethical theory is based on following moral rules, Lazar (2018) notes that in deontology killing can be permissible when the good achieved is, in most cases, great enough to justify the act. However, he further suggests that in most cases it is against moral rules to kill innocent people for one's own benefit, and such killing is therefore unacceptable under deontological theory. For our dilemma, then, we can conclude that under the deontological theory of ethics the self-driving car should crash into the concrete barrier, even though, as the passenger, my chance of survival is unknown. Because deontology is rule-based, it does not matter whether the other people are homeless: our duty is to save them rather than sacrifice them for our own safety. I think applying deontology to this dilemma is justified, since in my opinion innocent people should not be sacrificed; even if the self-driving car hits the barrier, I, as the passenger, still have a chance of survival, whereas the homeless people would not, and it is against our duty and our rules to kill anyone for our own benefit. Thus, as a software developer, this reasoning helps justify why the algorithms of a self-driving car should save the homeless people instead of the passenger, which also coincides with current traffic law: if we kill a group of homeless people to save our own life, we break the law and can be imprisoned.
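The rule-based decision described above can be sketched in code. This is a minimal illustrative sketch only: the `Scenario` structure, function names, and return values are all hypothetical assumptions for this essay, not a real autonomous-vehicle API.

```python
# Hypothetical sketch of a deontological emergency-decision rule for a
# self-driving car, as argued above. All names here are illustrative.
from dataclasses import dataclass


@dataclass
class Scenario:
    pedestrians_in_path: int  # innocent people the car would otherwise hit
    barrier_available: bool   # whether the car can swerve into a barrier


def deontological_choice(s: Scenario) -> str:
    """Duty-based rule: never deliberately sacrifice innocent pedestrians,
    even at risk to the passenger."""
    if s.pedestrians_in_path > 0 and s.barrier_available:
        # The duty not to kill innocents overrides passenger self-interest,
        # regardless of who the pedestrians are or how many there are.
        return "swerve_into_barrier"
    return "stay_on_course"
```

Note the contrast with a utilitarian controller, which would count the expected deaths on each path and minimize the total; the deontological rule ignores that arithmetic and follows the duty directly.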

