Legal Situation (PhÜD)


jus ad bellum, jus in bello

Excerpts from: Johansson, Linda: Autonomous Systems in Society and War: Philosophical Inquiries [1]

It might be argued that ethical evaluations of weapons used in war – such as UAVs, irrespective of their level of autonomy – are meaningless since war is unethical in itself. The ethical evaluation in paper I is made against the backdrop of the laws of war (LOW), as codified in, for instance, the Geneva and Hague conventions. The rules of jus ad bellum specify what criteria must be fulfilled in order to start a war, where “just cause” is the most important one. The rules of jus in bello establish criteria for ethical means of fighting once at war. The rules of jus ad bellum and jus in bello are summed up below:

Jus ad bellum:

Just cause: The reason for going to war needs to be just and cannot therefore be solely for recapturing things taken or punishing people who have done wrong; innocent life must be in imminent danger and intervention must be to protect life. Examples: self defense from external attack, punishment for a severe wrongdoing which remains uncorrected. This is the first and most important rule.

Right intention: The state must intend to fight the war only for the sake of its just cause. Force may be used only in a truly just cause and solely for that purpose – correcting a suffered wrong is considered a right intention, while material gain or maintaining economies are not.

Last resort: All peaceful and viable alternatives have been seriously tried and exhausted or are clearly not practical.

Legitimate authority: War is only between states.

Reasonable chance of success: The idea is that a state’s resort to war must be considered to have a measurable impact on the situation.

Proportionality: The anticipated benefits of waging war must be proportionate to its expected evils or harms. (Also known as the principle of macroproportionality to separate it from the jus in bello principle of proportionality.)

Jus in bello:

Proportionality/Excess: An attack cannot be launched on a military objective if the civilian damage would be excessive in relation to the military advantage – the value of an attack must be in proportion to what is gained.

Discrimination: Only military targets and enemy combatants can be attacked.

Necessity: The attack must be necessary (just war should be governed by the principle of minimum force). This principle is meant to limit excessive and unnecessary death and destruction.

Weapons: All international laws on weapons prohibitions must be obeyed, such as the ban on chemical and biological weapons. Nuclear weapons are considered taboo.

There is also jus post bellum, which concerns regulations connected to war termination, intended to ease the transition from war to peace (Orend 2008). There is little international law regulating jus post bellum, so one must turn to the moral resources of just war theory (Orend 2008). The effects of robots on jus post bellum are not treated in this thesis. It might be argued that it would be sufficient to look solely at jus in bello, since the UAV is something that is used once at war. In paper I it is argued that the possession of UAVs might affect the interpretation of jus ad bellum as well, since UAVs might increase the inclination to start a war. The reason for this, it is argued in paper I, is that UAVs have advantages in terms of reducing casualties for the UAV possessor, and may make war seem more like a risk-free enterprise – in extreme cases even like a computer game – thereby lowering the threshold for starting a war. The possession of UAVs may – more than other weapons – also affect the interpretation of the LOW, for it may determine which normative moral theory the interpretation of the LOW will be based on.

When looking at weapons used today, it might be argued that there is no morally relevant difference between UAVs and other weapons. According to Asaro, it is important to note that “even if robots did make it easier for a nation to go to war, this in itself does not decide whether that war is just or unjust” (Asaro 2008, p. 48). In paper I, it is argued that there are relevant differences between UAVs and other weapons. First of all: compared to other weapons that might give one country the option to win a war without losing any lives of its own soldiers – like chemical, biological or nuclear weapons – UAVs are permitted according to the LOW. Nuclear weapons are not formally prohibited (unlike chemical and biological weapons, which are), but are considered taboo and have not been used in war since World War II. A complete ban of nuclear weapons is being considered by the United Nations. Among permitted weapons today, UAVs may, more than other weapons, provide the owner with a severely increased inclination to start a war against a country that does not possess the same technology. UAVs are also different from long-range missiles in being more flexible. A UAV may go closer to a target without risking the life of the “pilot” – that is, the operator, who is often situated on the other side of the globe. This is another aspect of UAVs that makes warfare dangerously similar to a computer game and therefore increases the inclination to fire. Distance is one of the most important factors when it comes to firing at other human beings, and with UAVs this is combined with experiencing no risk to one’s personal safety (Grossman 1996). One problem with the LOW, pointed out in paper I, is that they are too open to interpretation, and that different normative moral theories might provide conflicting results – to the advantage of the UAV possessor. Three terms – “just” (“just cause”), “necessary” and “excessive” – are discussed in paper I, with utilitarianism, deontology and virtue ethics as a backdrop. The conclusion indicates the importance of revising the LOW or adding some rules that focus specifically on UAVs. For instance, if a country that possesses UAVs intends to start a war against a country that does not, then it is particularly important to determine that the cause really is just.

The ethical issues regarding UAVs in war today – which paper I attempts to systemize – concern the implications of remoteness rather than autonomy, whereas paper II, “Autonomous Robots in War: Undermining the Ethical Justification for Killing?”, has a more futuristic approach. There it is argued that where large parts of – or even entire – armies would be replaced by robots that are autonomous to a high degree, the justification for killing, as interpreted by just war theory, would be substantially undermined. A necessary criterion, it is argued, is based on reciprocity of risk, something that may be eliminated with autonomous robots. The main difference between traditional war theory – which the LOW are based on – and the challenging views is the implicit assumption of moral equality of combatants in the traditional view. In paper II it is argued that reciprocal imposition of risk is a necessary condition for the moral equality of combatants. This is supported by the fact that the LOW have a strict division between jus ad bellum and jus in bello, and by quotes from traditional war theorists regarding threat and harm. It is also argued that advanced, autonomous robots violate the principle of reciprocal imposition of risk, and thereby substantially undermine the ethical justification for killing. It is investigated whether autonomous robots create a special type of asymmetry (risk imposition) – aside from asymmetry due to strength or asymmetry regarding goals – but concluded that it is a subcategory of the asymmetry of strength. If one belligerent uses unmanned robots, the ethical assumptions that the LOW rest on become substantially undermined, and the ethical justification cannot be transferred.

Paradoxically (even though there is a sharp division between jus ad bellum and jus in bello), the traditional view, which bases the justification for killing in war on the moral equality of combatants, for which the reciprocal imposition of risk is a necessary condition, may find it difficult to permit the use of autonomous robots in war. The justification for a robot to kill a human according to the implicit assumptions in the LOW is substantially undermined, as the LOW stand today. The question is more open with the challenging views – something that has to be established case by case, depending on whether the cause is just, and whether the robot-combatants are just or unjust combatants. To prevent or even prohibit the development of military robotics may not be a viable option, but there may be a need to revise or make additions to the LOW. Another suggestion for dealing with the emergence of robots in war is to consider challenging views regarding the justification for killing in war, on which more extensive revisions of the LOW might be based.

Another aspect connected to the use of robots in war is the fact that even though the UAVs of today are not autonomous in the sense of pulling the trigger themselves, they are autonomous to a certain degree and can assist the operator in different ways. This means that there is a relationship between one autonomous and one at least semi-autonomous agent, which potentially affects the conduct in war. This will not be discussed further in this thesis.

Legally compliant drone deployment?