Legal Situation (PhÜD)

From Philo Wiki
Revision as of 10:24, 19 May 2016 by Anna (talk | contributions) (Nature of armed drones: edit)

jus ad bellum, jus in bello

Excerpts from: Johansson, Linda: Autonomous Systems in Society and War: Philosophical Inquiries [1]

It might be argued that ethical evaluations of weapons used in war – such as UAVs, irrespective of their level of autonomy – are meaningless since war is unethical in itself. The ethical evaluation in paper I is made against the backdrop of the laws of war (LOW), as codified in, for instance, the Geneva and Hague conventions. The rules of jus ad bellum specify what criteria must be fulfilled in order to start a war, where “just cause” is the most important one. The rules of jus in bello establish criteria for ethical means of fighting once at war. The rules of jus ad bellum and jus in bello are summed up below:

Jus ad bellum:

  • Just cause: The reason for going to war needs to be just and cannot therefore be solely for recapturing things taken or punishing people who have done wrong; innocent life must be in imminent danger and intervention must be to protect life. Examples: self defense from external attack, punishment for a severe wrongdoing which remains uncorrected. This is the first and most important rule.
  • Right intention: The state must intend to fight the war only for the sake of its just cause. Force may be used only in a truly just cause and solely for that purpose – correcting a suffered wrong is considered a right intention, while material gain or maintaining economies are not.
  • Last resort: All peaceful and viable alternatives have been seriously tried and exhausted or are clearly not practical.
  • Legitimate authority: War is only between states.
  • Reasonable chance of success: The idea is that a state’s resort to war must be considered to have a measurable impact on the situation.
  • Proportionality: The anticipated benefits of waging war must be proportionate to its expected evils or harms. (Also known as the principle of macroproportionality to separate it from the jus in bello principle of proportionality).

Jus in bello:

  • Proportionality/Excess: An attack cannot be launched on a military objective if the civilian damage would be excessive in relation to the military advantage – the value of an attack must be in proportion to what is gained.
  • Discrimination: Only military targets and enemy combatants can be attacked.
  • Necessity: The attack must be necessary (just war should be governed by the principle of minimum force). This principle is meant to limit excessive and unnecessary death and destruction.
  • Weapons: All international laws on weapons prohibitions must be obeyed, such as the ban on chemical and biological weapons. Nuclear weapons are considered taboo.


A New Kind of Weapon?

When looking at weapons used today, it might be argued that there is no morally relevant difference between UAVs and other weapons. According to Asaro, it is important to note that “even if robots did make it easier for a nation to go to war, this in itself does not decide whether that war is just or unjust” (Asaro 2008, p. 48). In paper I, it is argued that there are relevant differences between UAVs and other weapons.

First of all: compared to other weapons that might give one country the option to win a war without losing any of its own soldiers’ lives – like chemical, biological or nuclear weapons – UAVs are permitted under the LOW. Nuclear weapons are not formally prohibited (as chemical and biological weapons are), but are considered taboo and have not been used in war since World War II. A complete ban on nuclear weapons is being considered by the United Nations. Among permitted weapons today, UAVs may, more than any other, severely increase the owner’s inclination to start a war against a country that does not possess the same technology.

UAVs also differ from long-range missiles in being more flexible. A UAV may go closer to a target without risking the life of the “pilot” – that is, the operator, who is often situated on the other side of the globe. This aspect of UAVs makes warfare dangerously similar to a computer game and thereby increases the inclination to fire. Distance is one of the most important factors when it comes to firing at other human beings, and with UAVs it is combined with experiencing no risk to one’s personal safety (Grossman 1996).

The Disadvantages of the Advantages of Drones

Excerpts from: Craig Martin: A means-methods paradox and the legality of drone strikes in armed conflict, in: The International Journal of Human Rights, 2015, Vol. 19, No. 2, 142–175 [2]

Illustrative drone strikes

The first lethal drone strike was likely in Afghanistan, in February 2002, when a Predator drone was used by the CIA in a Hellfire missile strike targeting a tall man and two other men who were acting deferentially towards him – leading the operators to believe it might be Osama Bin Laden – at an old Mujahedeen base called Zhawar Kili. The target was, of course, not Osama Bin Laden, but the three men were killed in the strike. Speaking for the Pentagon, Rear Admiral John D. Stufflebeem later acknowledged that the target had not been Bin Laden after all, but suggested that the targets were ‘not innocent’, and that ‘initial indications afterward would seem to say that these [were] not peasant people up there farming’. Another spokesperson later added that the Pentagon was ‘convinced that it was an appropriate target’ but that ‘we do not know yet exactly who it was’.

There was no indication of the basis for such conviction, or indeed what criteria had been used for determining that the three men were legitimate targets. The New York Times later identified the three men, and determined that they had been civilians from nearby villages scavenging for scrap metal.

New York Times: Villagers Say U.S. Should Have Looked, Not Leapt
DONALD RUMSFELD: "Someone has said that these people were not what the people managing the Predator believed them to be. We'll just have to find out. There's not much more anyone could add, except there's one version and there's the other version." (NYT)
JAMIE MCINTYRE, CNN MILITARY AFFAIRS CORRESPONDENT: Well, Marty, the question has become not so much did the United States kill Osama bin Laden, but did the United States kill innocent people. That's the claim of villagers in the area near Zhawar Kili where this Hellfire missile strike took place one week ago today. They claim, at least they've told newspaper reporters, that three peasants foraging for scrap metal in the mountains were hit by a rocket that took their lives. The Pentagon continues to insist that it was a good target.
A CIA drone firing a hellfire missile, by remote control, took out three people they suspect to be members of the al Qaeda organization. And a U.S. military recovery team is bringing back evidence, the Pentagon says, bolsters its case.
STUFFLEBEEM: Things like weapons and ammunition -- include things like communications systems or at least things that would give you the impression that there might have been communication devices, documents in English, having to do with like with applications for credit cards possibly or maybe for airline schedules. So the intelligence that was garnered to be able to facilitate the strike, the initial indications afterwards would seem to say that these are not peasant people up there farming. (CNN) [3]
Victoria Clarke, a civilian spokeswoman for the Pentagon, added: 'We're convinced that it was an appropriate target', although 'we do not know yet exactly who it was'.
Along with herding goats, driving camels and minding wheat fields down on the plain, scavenging metal is about the only cash-yielding work around. Shortly after the American bombing ended, scores of villagers began heading up to the caves, ready to haul a day's yield down to traders in Khost who pay the equivalent of about 60 cents for a camel-load of scrap. From Khost, the scrap goes to Pakistan, where it is fed to steel mills and gun-making workshops. (NYT)

This strike illustrates a number of features common to other drone strikes that have been documented over time in Afghanistan, and which raise significant questions about compliance with IHL and IHRL. To take a couple more examples, in September 2013 a drone strike in Watapur district, Kunar province, targeted a vehicle thought to be carrying insurgents. It was later determined that there were indeed six insurgents in the vehicle, but also eleven civilians, including four women and four children. Along with the six insurgents, ten of the civilians were killed, leaving a young girl seriously injured. The NATO-led International Security Assistance Force (ISAF) media liaison initially denied the presence of civilians, and would not disclose what pre-engagement measures were taken to verify the identity or status of the targets, or whether the insurgents targeted were of strategic value, saying only that one of the insurgents had ‘most likely’ been ‘high level’.

Applicable Laws

It is undisputed that the primary legal regime governing drone strikes in a traditionally defined armed conflict is IHL. It is worth recalling, at the outset, that IHL is animated by two fundamental but somewhat conflicting rationales and purposes. One of these is to require armed forces to engage in hostilities in accordance with specific limits and constraints, in order to reduce human suffering, and in particular to minimise harm to civilians and civilian objects. The second is to provide legal authority for the conduct of such hostilities, and to thereby immunise the lawful combatants from prosecution or other action under different legal regimes, and to immunise the states on whose part they fight, for conduct that is undertaken in accordance with the principles and rules of IHL.

IHL thus both limits and legitimises the conduct of hostilities in armed conflict. The bulk of the IHL regime applies primarily to international armed conflict – that is, conflict between or among sovereign states. This body of IHL comprises a host of treaties, the most important of which are the Hague Conventions of 1899 and 1907, the four Geneva Conventions of 1949, and Additional Protocol I of 1977, together with an extensive body of customary international law principles. Only a subset of these rules and principles applies to conduct in a non-international armed conflict, by which is meant hostilities of a sufficiently intense nature between the armed forces of the state and well-organised armed groups (or hostilities among such groups), within some geographically limited theatre of conflict (the exact parameters of which are the subject of some debate).

Nonetheless, the customary international law principles relating to the law of targeting, and to weapons law, which in turn reflect the core principles of IHL, apply in both international and non-international armed conflict. In particular, the principles of necessity, distinction, humanity, discrimination, proportionality, and precautions in attack, all apply to drone strikes whether in non-international or international armed conflict.

Principles of necessity and distinction

The principle of military necessity reflects the duality of IHL’s rationales, both authorising belligerents to use the requisite force to achieve any military advantage that will advance the cause of winning the conflict, while inherently limiting the use of force to lawful means, and to the extent that such force is actually necessary to achieve a specific military objective.

More specifically central to targeting issues, at least for the purposes of our analysis, is the principle of distinction, which is also one of the core principles of IHL. Codified in both Additional Protocols, it provides that armed forces must distinguish between combatants and civilians, and between military objectives and civilian objects. In particular, the principle of distinction requires that armed forces refrain from making civilians or civilian objects the direct object of targeting or attack. This does not mean that the killing of civilians in a strike in and of itself violates the principle of distinction. This is so even when it was known at the time of the targeting decision that the killing of civilians would be a likely or even a sure consequence of the strike. So long as the killing is incidental to a strike in which the primary target is a legitimate military objective, it does not violate the principle of distinction. Such killing would be ‘collateral damage’, which is the focus of the principle of proportionality, to which we will turn presently.

Principle of proportionality

We can now return to the principle of proportionality, mentioned above and closely related to both necessity and distinction. This principle provides that armed forces are prohibited from launching attacks that would be expected to cause incidental death or injury to civilians, or damage to civilian objects, which would be excessive in relation to the concrete and direct military advantage anticipated.


There continues to be some debate over the scope and precise definition of ‘military advantage’, and even leaving that aside, the calculation of how much civilian life is to be considered excessive in relation to the rather incommensurate notion of military advantage is a difficult business. It should also be noted that any ex post assessment of compliance with the principle of proportionality focuses on what was known and anticipated at the time the decision was made.

Principle of precautions in attack

The final principle directly relevant to our analysis of drone strikes is that of precautions in attack. This builds on the rationale underlying the principles of distinction, discrimination, and proportionality, creating a positive obligation to take care to spare the civilian population, civilians, and civilian objects from harm. It provides that ‘all feasible precautions’ must be taken to avoid or to minimise incidental harm to civilians and civilian objects.

Nature of armed drones

We turn next to examine the nature of drones as a weapons system, to assess whether there is something inherent to drones that is likely to cause violations of international law. Drones are not, of course, the only weapons system used in targeted killing operations. Cruise missiles, airstrikes with traditional manned aircraft, and even hunter-killer teams have all been used by the US for the targeting of identified individuals, in Afghanistan and elsewhere. But one of the primary questions addressed in this article is whether there are unique features of the drone that contribute to illegality. It is thus important to begin by examining the attributes of the armed drone as a weapons system.

Drones have a number of features that combine in ways that reinforce one another so as to confer a significant comparative advantage over both cruise missiles and manned fixed-wing aircraft, not only in terms of military tactical advantage, but arguably also in terms of enabling optimal compliance with IHL. On the other hand, perhaps somewhat paradoxically, some of these same features may facilitate, or make more likely, certain violations of IHL.

Positive features of drones

To begin, drones such as the MQ-1 Predator and the MQ-9 Reaper can be deployed over a target for comparatively long periods of time – for as long as 22 hours at a time, as compared with perhaps 90 minutes for an F-16 – for observation and intelligence acquisition, thus providing operators with a longer evaluation and decision-making period before lethal force is employed. This feature of ‘persistence’ is reinforced by stealth, arising from the size and low sound of the drone at altitude – up to 50,000 feet – making it difficult to detect in the absence of sophisticated air-defence systems.

As well, since they are typically on-site directly over the target during the decision-making process, they provide for more rapid implementation of a strike once the decision is made, as compared with, for instance, a combined use of drones for surveillance but manned air-strikes or cruise missiles for the final attack.

In addition to this persistence and stealth, a defining feature of the drones is the intelligence gathering and targeting system, which includes ever more sophisticated sensors and video feeds. The most recent innovation is called the ‘Gorgon Stare’, a system of cameras that will deliver video of a five-mile-diameter area at one time, while allowing operators to zoom into any one segment, or multiple segments at a time.

What is more, a signal advantage of the drone, relative to a manned aircraft such as an F-16, is how the intelligence from such sensors and videos is analysed. Each Air Force drone has a team of at least three operators, including a pilot, a sensor operator, and a mission intelligence coordinator. Moreover, while the mission intelligence coordinator is responsible for overseeing the collection and immediate analysis of the intelligence being gathered, there are other individuals, including intelligence analysts, who may participate in the assessment of incoming data from other remote locations, and be part of the decision-making process via dedicated voice-line or on-line ‘chat rooms’. As compared to a pilot in a manned aircraft, the decision-making process involves more people, assessing far greater volumes of sensor information and intelligence, operating under fewer time constraints and without the stress caused by imminent personal risk to themselves. It has been argued, therefore, that this decision-making process is sounder, less prone to errors, and far more likely to comply with legal obligations.

It is also argued by the defenders of drone strikes that the weapons employed by drones are both highly accurate, and characterised by relatively tight blast areas, thus making the drones a high-precision weapon system.

All of these features may be said to combine in ways that make the armed drone weapons system, as it currently exists and is deployed, one that is likely to enhance compliance with IHL and IHRL – indeed, far more so than other weapons systems used for targeted killing and air strikes. One last point should be made regarding the features of drones in the context of IHL.

Negative features of drones

There are, however, also some corresponding weaknesses or disadvantages flowing from these very same features.

Operators and decision-makers, sitting somewhere thousands of miles away, are limited in large measure to the video and other sensory intelligence being provided by the drone itself. It has been suggested that decision-makers are prone to a so-called ‘soda-straw effect’ – meaning that operators tend to ‘zoom in’ to focus on an increasingly narrow area around the target, with a resulting loss of information regarding the surrounding context – particularly during the final stages prior to firing.

It has been similarly suggested that as the video and sensor feeds become ever more sophisticated and extensive – as evidenced by the new Gorgon Stare system – the operators are prone to suffer from a ‘data crush’, in which there is simply so much data streaming in during the targeting process, with too little time and too few people to analyse it, that crucial evidence regarding civilian presence, to take one example, is more likely to be missed.

Concerns have also been expressed that the operators’ distance and detachment from the conflict zone and their targets, together with the complete absence of reciprocal risk, may somehow increase the likelihood of targeting errors. This is expressed and explained in different ways. There is, for instance, the so-called ‘PlayStation’ effect: the concern that operators who are killing by video-feed in the afternoon and home for a BBQ with their families by evening may simply not have a sufficiently grave appreciation of the moral nature and consequences of their actions.

Many of these concerns tend to get brushed aside by defenders of drones. Thus, the ‘PlayStation’ effect is argued to be somewhat speculative, and in any event can be addressed by strict adherence to IHL and compliant ROE. Similarly, the distance and detachment concern is given short shrift on the grounds that it is actually more of a strength than a weakness, given that it creates conditions for more stress-free decision-making. Moreover, these concerns can be seen as lying with the operators as much as with the nature of the drone system itself. But the concerns may lack salience in part because they have not been developed in a systematic fashion, organised within a theoretical framework.

Uruzgan strike

Another strike, which became famous because it was one of the few strikes to be made subject to a publicly disclosed investigation, which resulted in administrative action, occurred in Uruzgan province in February 2010. The missiles were actually fired from a helicopter, but a Predator drone and its crew played an integral part in the operation, and the drone crew was determined to have been responsible for serious targeting errors that resulted in the strike. The targets were a group of over 20 people who gathered in the pre-dawn hours and set out in a convoy of vehicles across the province. They were later determined to have been civilians, among whom were women and children, who were travelling together for security in order to traverse a dangerous region. The Predator drone observed them as they set out, and shadowed them for more than three hours, providing data on the group to an American ground commander who was leading a unit that was moving to engage a Taliban force in the area. We will return to this incident in the discussion below, but in short the drone crew misinterpreted the data being received from the drone, leading to the conclusion that the group comprised targetable insurgent men. The entire group comprised civilians, including several women and children, and 23 of them were killed in the strike. The military conducted a rare publicly disclosed investigation, and several senior officers, along with the Predator crew operating out of an Air Force base in Nevada, received administrative sanctions. (This paragraph was inserted from p. 145 of the printed article. h.h.)

We return to the context of drone strikes, and the question of whether the operators’ distance from the theatre of conflict could lead to misinterpretation. The consequences of this detachment are not only that the operators lack familiarity with the target culture and environment, such that they might make straightforward mistakes in interpreting behaviour due to ignorance of local conditions. Rather, it may contribute to more systemic problems of misperception. As a result of the detachment, the drone crew are entirely immersed in their own particular institutional sub-culture back home, as well as living and operating within the home culture far from the front. Quite apart from the possibility that this may interfere with their understanding of the moral implications of their work – the so-called ‘PlayStation effect’ criticism discussed earlier – there is the prospect that it may be highly conducive to the development of inappropriate and premature assumptions or hypotheses about potential target populations. That will in turn lead to operators misinterpreting ambiguous information and ignoring contrary evidence in a manner that is consistent with and reinforces the assumptions, with resulting targeting errors. Moreover, this tendency would likely be further exacerbated by the ‘data crush’ and ‘soda straw’ concerns that were discussed earlier, providing a sounder theoretical foundation for those criticisms of the drone operations.

A study of the 2010 Uruzgan targeting incident suggests that the tragic targeting error may be explained at least in part by precisely this kind of pattern of misperception. The findings of the formal investigation noted that the drone operations team received evidence that was inconsistent with the hypothesis that the group was a Taliban force, but that this evidence had been ‘ignored or downplayed’ by the operators. Portions of the dialogue among the pilot, the sensor operator, and the intelligence coordinator who were operating in Nevada, the screeners who were reviewing intelligence at a location in Florida, and the ground force in Afghanistan that the drone was supposed to be protecting, have been publicly disclosed. The dialogue suggests shared attitudes, mind-sets, and perspectives about the local population (perceptual sets), and reveals that the team held assumptions, hypotheses, and mutually reinforcing mind-sets (evoked sets) about the group of men under observation from virtually the moment they came under observation. The team’s dialogue from the beginning exhibits a collective desire to find evidence of hostility. There were several instances in which information that was ambiguous or even inconsistent with the team’s starting assumption – that the group under observation comprised insurgent fighters – was incongruously interpreted to actually confirm the presumption. For instance, when the trucks stopped and passengers disembarked to pray at one point early in the operation, the camera operator commented ‘this is their force … Praying? I mean, seriously, this is what they do.’ Praying became evidence of belligerence. At other times, there was frustration when the operators were unable to find more conclusive confirmation, or when evidence that was clearly inconsistent with the presumption was suggested by the intelligence screeners in Florida. Thus the pilot at one stage states, during a discussion of whether the screeners could see any evidence of weapons: ‘I was hoping we could make a rifle out … never mind’. A little later, when one of the screeners raised the possibility of a child having been spotted among the group, the pilot protests: ‘why didn’t he say “possible” child? Why are they so quick to call kids but not to call rifle?’ Here we see the potential for group pressure being brought to bear, with the possible modification and distortion of judgment. The suggestion that there might be children present was then quickly reinterpreted as evidence of possible adolescents. That in turn morphed into ‘possibly military age males’. Military age males is, as we have seen, one of the ‘legally inadequate’ criteria for signature strikes thought to have been employed by US forces in Afghanistan (it has been reported that as a result of this incident General McChrystal issued an order banning the use of the criteria). Information indicating the presence of protected persons was thus assimilated to existing assumptions and hypotheses, and thereby incrementally transformed to become information confirming the presence of targetable fighters. This was, arguably, due in large measure to flawed and potentially prematurely established initial assumptions, resulting in cognitive closure. As one of the team later recounted: ‘we all had it in our head, “Hey, why do you have 20 military age males at 5 a.m. collecting each other?” There can only be one reason, and that’s because we’ve put [U.S. troops] in the area.’ Here laid bare are indications of the premature formation of assumptions and consequent cognitive closure, bolstered by classic egocentric interpretation of the actions of others. It was only after the strike, when over 20 people lay dead and dying, that the operators finally recognised the presence of women and children, several of whom were younger than six years old. It may be that a more rigorous adherence to legally valid criteria for signature strikes could have helped prevent this tragedy, but as we will explore next, it may be that features of the armed drone weapon system facilitate misperception that makes such errors more likely.

Systemic Problem

To the extent that the above account may reflect an example of a more systemic problem of misinterpretation and misperception, it would not appear to be caused by features of the armed drone itself, but rather by the operators and the targeting criteria being employed.

Indeed, once we are talking about the psychology of operators we would seem to be, by definition, out of the realm of the weapon itself. In other words, the problems would appear to relate to the ‘methods of warfare’ according to which the weapon is being used, rather than to anything apparently inherent to the ‘means of warfare’ comprising the weapons system. And yet upon closer consideration this may not be the case. To the extent that systemic targeting errors are being caused by misperception and other cognitive problems, these may be caused or facilitated by a combination of features relating, on the one hand, to the nature of the operators and the policy they operate under, and, on the other, to features of the drone as a weapon system that may systematically influence how the operators behave. The misperception is, in the final analysis, a function of the operators. And it may be enabled and exacerbated by the policies and rules of engagement they are operating under. But the proposition that requires further study is whether features of the drone itself feed into and facilitate such misperception as well. These features may be interwoven in ways that can be difficult to disentangle and assess individually.