Srl 1 | ID: 160117
Summary/Abstract:
Autonomous weapons systems (AWS) are emerging as key technologies of future warfare. So far, academic debate has concentrated on the legal-ethical implications of AWS, but these do not capture how AWS may shape norms by defining diverging standards of appropriateness in practice. In discussing AWS, the article formulates two critiques of constructivist models of norm emergence: first, constructivist approaches privilege the deliberative over the practical emergence of norms; and second, they overemphasise fundamental norms rather than also accounting for procedural norms, which we introduce in this article. Elaborating on these critiques allows us to respond to a significant gap in research: we examine how standards of procedural appropriateness emerging in the development and usage of AWS often contradict fundamental norms and public legitimacy expectations. Normative content may therefore be shaped procedurally, challenging conventional understandings of how norms are constructed and considered relevant in International Relations. In doing so, we outline the contours of a research programme on the relationship between norms and AWS, arguing that AWS can have fundamental normative consequences by setting novel standards of appropriate action in international security policy.
Srl 2 | ID: 166084
Summary/Abstract:
It may have been the strangest christening in the history of modern shipbuilding. In April 2016, the U.S. Navy and the Defense Advanced Research Projects Agency (DARPA) celebrated the initial launch of Sea Hunter, a sleek, 132-foot-long trimaran that one observer aptly described as “a Klingon bird of prey.” More unusual than its appearance, however, is the size of its permanent crew: zero.
Srl 3 | ID: 169829
Summary/Abstract:
“The progression from semiautonomous, unarmed supply robots to fully autonomous weapons systems is likely to occur rapidly and with limited public scrutiny.”
Srl 4 | ID: 145680
Summary/Abstract:
This article presents an initial discussion of the political and legal challenges associated with weaponised technologies in three interconnected areas that may impinge upon the ability to protect civilian populations during peace and war and imperil international security: armed unmanned combat aerial vehicles (commonly known as drones); autonomous weapons systems (known as ‘killer robots’); and the potential militarisation of cyberspace, or its use as a weapon, and the operation of drones and killer robots in the cyber domain. Supporting the argument that the world is ‘facing new methods of warfare’ and that international security governance and law are not keeping up, the article provides an overview and interpretation of these three technologies in connection with aspects of five branches of law: state responsibility, use of force, international humanitarian law, human rights law, and law of the commons. I therefore argue that ‘preventive security governance’ could be a strategy to curtail uncertainty in the preservation of stability and international order. I define ‘preventive security governance’ as the codification of specific or new global norms, arising from existing international law, that will clarify expectations and universally agreed behaviour in a given issue-area. This is essential for a peaceful future for humanity and for international order and stability.
Srl 5 | ID: 179375
Summary/Abstract:
The ‘weaponisation’ of artificial intelligence and robotics, especially their convergence in autonomous weapons systems (AWS), is a matter of international concern. Debates on AWS have revolved around (i) the identification of hallmarks of AWS with respect to other weapons; (ii) what it is that makes the destructive force of AWS especially troublesome from a normative standpoint; and (iii) steps the international community can take to allay these concerns. Of particular concern is the need to preserve the ‘human element’ in the use of force. A differentiated approach to this latter issue, which is also principled and prudential, may pave the way to a legally binding instrument to regulate AWS by establishing meaningful human control over all weapons systems.
Srl 6 | ID: 176613
Summary/Abstract:
The prospect of increasingly autonomous systems has seized the military imagination and rapidly generated an international debate surrounding the merits of a potential preemptive ban under international law. What has been missing to this point has been an in-depth consideration of how artificial intelligence, autonomous systems, and unmanned platforms would be perceived by the junior officers who will play a core role in their integration into future militaries. Drawing on a broad survey of officer cadets and midshipmen at the Australian Defence Force Academy conducted in 2019, this article provides an analysis of how perceived risks and benefits of autonomous weapon systems are influencing the willingness of these future defense leaders to deploy alongside them.