Defining semi-autonomous, automated and autonomous weapon systems in order to understand their ethical challenges
ORIGINAL ARTICLE
Jean-François Caron¹,²

Accepted: 10 November 2020
© Springer Nature Limited 2020

Abstract
There is a lot of misunderstanding when it comes to what have been labelled "autonomous killing robots". Indeed, the robotization of weapons is a very complex issue and requires a clear conceptualization of the various types of weapons currently being used by the military. This article offers a typology of these weapon systems by distinguishing between semi-autonomous, automated and autonomous weapons. This distinction allows for a better understanding of the ethical challenges associated with these systems.

Keywords Autonomous weapon systems · Automated weapon systems · Semi-autonomous weapon systems · War · Military technologies · Just war theory

If we are to believe some reports, the world of warfare is about to be profoundly changed. Not only are we to witness the deployment of super soldiers on the battlefield (Caron 2018), but autonomous killing robots will also replace human combatants. In light of the various Hollywood scenarios produced over the last decades, this prospect is for many of us very troubling. Indeed, how can we think objectively about this possibility when our minds are influenced by movies in which mankind loses control over its machines, such as War Games or the Terminator franchise? Yet, setting that cultural legacy aside is essential if we are to assess the appropriateness of using these weapon systems and to face the ethical questions connected with this new reality. The other challenge is to have a clear understanding of what we are talking about when we speak of autonomous machines. There is a lot of confusion in this regard, as many people tend to place technologies such as drones or the Israeli Iron Dome defense system in the same category as HAL in 2001: A Space Odyssey, namely a computer able to make decisions on its own. This is a serious mistake that needs to be overcome. If the latter system, which came out of Stanley Kubrick's and Arthur C. Clarke's minds, can be labelled an autonomous system, the former fall within two different categories, namely semi-autonomous and automated weapon systems. This begs the question of how we can differentiate between these types of weapons. It can be argued that one way of distinguishing between them is through the relationship humans have with the machines when they are performing tasks; in other words, whether there is a human in the loop, as well as how their lethal capacities operate. From this perspective, some weapons' autonomy is solely preprogrammed, and their lethal capacities remain entirely the prerogative of a human operator. This is the case with many military t…

This text is an extract from Caron (2019).

* Jean-François Caron
  jean‑[email protected]

¹ Nazarbayev University, Nur-Sultan, Kazakhstan
² University of Opole, Opole, Poland