Srl | Item
1 | ID: 177041 | Summary/Abstract:
How might nuclear deterrence be affected by the proliferation of artificial intelligence (AI) and autonomous systems? How might the introduction of intelligent machines affect human-to-human (and human-to-machine) deterrence? Are existing theories of deterrence still applicable in the age of AI and autonomy? The article builds on the rich body of work on nuclear deterrence theory and practice and highlights some of the variegated and contradictory – especially human cognitive psychological – effects of AI and autonomy for nuclear deterrence. It argues that existing theories of deterrence are not applicable in the age of AI and autonomy and that introducing intelligent machines into the nuclear enterprise will affect nuclear deterrence in unexpected ways, with fundamentally destabilising outcomes. The article speaks to a growing consensus calling for conceptual innovation and novel approaches to nuclear deterrence, building on nascent post-classical deterrence theorising that considers the implications of introducing non-human agents into human strategic interactions.
2 | ID: 147554 | Summary/Abstract:
There is growing concern in some quarters that the drones used by the United States and others represent precursors to the further automation of military force through the use of lethal autonomous weapon systems (LAWS). These weapons, though they do not generally exist today, have already been the subject of multiple discussions at the United Nations. Do autonomous weapons raise unique ethical questions for warfare, with implications for just war theory? This essay describes and assesses the ongoing debate, focusing on the ethical implications of whether autonomous weapons can operate effectively, whether human accountability and responsibility for autonomous weapon systems are possible, and whether delegating life-and-death decisions to machines inherently undermines human dignity. The concept of LAWS is extremely broad, and this essay considers LAWS in three categories: munitions, platforms, and operational systems.
3 | ID: 165632 | Summary/Abstract:
In the past three or four years, a movement to ban "autonomous combat robots," which in Russia are called "lethal autonomous systems" (in Western literature, LAWS), has been gaining strength in the world but remains almost unnoticed in Russia. Their prohibition is being advocated by the nongovernmental organizations Stop Killer Robots, Article 36, and the International Committee for Robot Arms Control; prominent business leaders like Elon Musk and Steve Wozniak; Nobel laureates; scientists and programmers working in the field of artificial intelligence; and even entire corporations. Some believe that fully autonomous weapons will not be able to comply with International Humanitarian Law (IHL) and could create confusion when it comes to identifying individuals responsible for the illegal actions of robots. Others believe that even if "terminators" could one day perform "combat functions" more precisely and judiciously than human fighters, their autonomous use must still be prohibited in the interest of the highest values of human dignity.
4 | ID: 177751 | Summary/Abstract:
Many see the advent of lethal autonomous weapon systems as the next revolution in military affairs. Currently, some 30 countries share the view that these weapons should be preemptively banned, but we know relatively little about their motivations. This study contributes to the growing literature on “killer robots” by theorizing preventive arms control as an anticipatory response to military innovation. I suggest that states prefer preventive arms control when they lack capacities or incentives to pursue innovation in the first place. I analyze a cross-sectional dataset on national positions toward the ban on autonomous weapons and demonstrate that the probability of support for preventive prohibition decreases with increasing financial and technological capacities. Both democracies and autocracies are less likely to support the ban than mixed regimes. Conversely, states with strong humanitarian orientation and high socialization within specific arms control regimes are more likely to support the ban.
5 | ID: 170140 | Summary/Abstract: