The killer robot, embodied by the figure of the Terminator, fuels fantasies of immortality, transhumanism and apocalyptic scenarios. This inflammatory imagery continues to dominate the debate over the autonomy of weapon systems. In the name of morality and law, civil society denounces the opening of a Pandora's box, predicts the arrival of robots on the battlefield and calls for a preventive ban on lethal autonomous weapon systems (Sala).
For their part, arms manufacturers and the military have long been working on the use of artificial intelligence and robotics in weapon systems. Automation already applies to functions as diverse as navigation, observation, target recognition and acquisition, and fire control. Depending on the level of automation, weapon systems fall along a continuum running from the fully teleoperated armed system to the autonomous armed system without human supervision, the latter still a matter of science fiction.
In a study (1) published on Tuesday, May 24, Laure de Rochegonde, a researcher at the French Institute of International Relations (Ifri), invites readers to “overcome any form of Manichaeism”, while noting “the existence of two parallel debates” that are increasingly divergent. The first, ethical and political, concerns the preventive regulation or prohibition of Sala. The second, technical, concerns the degree of functional autonomy that is possible and desirable from a military point of view. Moreover, the semantic confusion around the notion of autonomy is detrimental to a rational approach.
“Tales suggest that an autonomous weapon system would ‘self-select’ to engage one target over another and set its own mission outside of human control,” emphasizes the researcher. According to specialists, however, it is rather a form of autonomy under human supervision that should emerge in the coming decades with regard to the decision to fire. “Technological developments will be such that we can increasingly do without humans in the loop,” emphasizes Laure de Rochegonde. “With the accelerating pace of war, a human intelligence will not necessarily be fast enough to react adequately.”
An intermediate category
In France, the ethics committee created in 2019 by the Minister for the Armed Forces, Florence Parly, has drawn a distinction between Sala – a red line set by the authorities, which excludes their development and use – and an intermediate category of weapons, called “lethal weapon systems incorporating autonomy” (Salia). This paves the way for weapons that can certainly kill but will be “unable to act alone, without human control, to modify their rules of engagement and to take lethal initiatives”.
Three programs place great emphasis on autonomous weapon systems: the Future Air Combat System (Scaf), the future battle tank (MGCS) and the future mine countermeasures system (Slam-F), launched in 2020. These programs are part of a “race for autonomy” dominated by the United States, China and Russia, in which Israel, South Korea, Turkey, Iran, Pakistan, the United Kingdom, France and Estonia also seek to hold their own.
On the regulatory front, Paris favors a multilateral approach within the UN framework of the Convention on Certain Conventional Weapons (CCW). Between states in favor of the preventive ban on Sala – in the wake of the NGO coalition behind the “Stop Killer Robots” campaign – and partisans of laissez-faire like the United States, France defends a middle position aimed at reaching agreement on a definition of Sala and a code of conduct.
By contrast, other states advocate an ad hoc process, deemed more favorable to a preventive ban, along the lines of the Ottawa convention banning anti-personnel mines and the Oslo convention on cluster munitions.
Neither Rambo, nor Terminator: Report debunks ‘killer robot’ fantasies