Summary/Abstract
Advanced surveillance systems can autonomously identify military targets. A consequent automated decision to attack, without human assessment and authorisation of the action, will almost certainly breach international law. Separating decisions from actions clarifies the role of machine-made decisions and the human ability to assess them and to authorise action. High levels of autonomy in a weapon system place new responsibilities on organisations and personnel at all stages of procurement and use. In this article, Tony Gillespie builds on recent UN expert discussions to propose that detailed legal reviews at all procurement stages, including pre-development, are needed to ensure compliance with international law. Similar reviews are also needed for automated systems in the decision-making process.