Digital Theses Archive


Thesis etd-10162019-122542

Type of thesis
First-level university master's degree (Master univ. I liv.)
Humanised Weapons or Killer Robots? Brief analysis of unmanned and autonomous weapons compliance with International Humanitarian Law
Istituto di Diritto, Politica e Sviluppo
Master of Arts in Human Rights and Conflict Management
Chair: Prof. DE GUTTRY, ANDREAS M.T.
  • autonomous lethal weapons
  • autonomous robots
  • autonomous weapons
  • drones
  • humanitarian law
  • IHL
  • international humanitarian law
  • weapon systems
Ever since wars have been fought, humans have tried to engage each other from as great a distance as possible, building weapons capable of hitting targets from afar. The idea of reducing risks for soldiers by keeping them away from battlefields lay behind the invention of bows, catapults, rifles and missiles. Today, the very same imperative pushes governments to invest in the development of unmanned and autonomous weapon systems. The defining feature of both kinds of robotic weapons is that they operate without a human pilot on board. As a consequence, they can get extremely close to enemy facilities and gather large amounts of high-quality intelligence, or perform particularly dangerous offensive operations, without putting soldiers' lives at risk. Furthermore, autonomous machines, i.e. weapons capable of using lethal force without remote human input, have an additional advantage: since they are not influenced by emotions, they do not commit human errors, suffer post-traumatic psychological damage, or perpetrate war crimes in revenge against enemy troops.

Nevertheless, unmanned and autonomous weapons also present several critical issues. In particular, before an attack, international humanitarian law (IHL) requires the careful evaluation of several factors to determine whether the use of force would be legitimate. In theory, robotic weapons, whether remotely controlled or fully autonomous, could comply with these legal requirements, and consequently existing treaties do not prohibit the use of such weapon systems per se. In practice, however, operational circumstances and the current state of technology cannot ensure that these machines are actually capable of correctly weighing variable factors such as the military advantage that a precise strike could entail, the risk of collateral damage, and other complex proportionality assessments.
In addition, IHL obliges every State to take full responsibility for the acts committed by its armed forces. In cases of malfunction or breaches of the law committed by autonomous weapon systems, however, there would be an accountability gap, as it would be almost impossible to identify a person responsible for such incidents.

Following these considerations, this paper will assess whether unmanned and autonomous weapon systems ultimately comply with the requirements set by the law of armed conflict. To answer this question, it will first clarify the differences between these two kinds of robotic systems, explaining the categories to which they might belong. It will then recall IHL's most relevant principles, examining whether, and why, they might pose a problem for unmanned or autonomous weapons. Subsequently, it will describe in greater depth the functioning of some of these systems, to better understand their characteristics and how they work. Finally, it will provide some examples of operations involving unmanned and autonomous weapons, highlighting the aspects that raise concern.