Item Details
Journal Article
ID: 196004
Title Proper: How the process of discovering cyberattacks biases our understanding of cybersecurity
Language: ENG
Author: Oppenheimer, Harry
Summary / Abstract (Note): Social scientists do not directly study cyberattacks; they draw inferences from attack reports that are public and visible. Like human rights violations or war casualties, there are missing cyberattacks that researchers have not observed. The existing approach is either to ignore missing data and assume they do not exist or to argue that reported attacks accurately represent the missing events. This article is the first to detail the steps between attack, discovery and public report to identify sources of bias in cyber data. Visibility bias presents significant inferential challenges for cybersecurity – some attacks are easy to observe or are claimed by attackers, while others take a long time to surface or are carried out by actors seeking to hide their actions. The article argues that missing attacks in public reporting likely share features of reported attacks that take the longest to surface. It builds on datasets of cyberattacks by or against Five Eyes governments (the intelligence alliance composed of Australia, Canada, New Zealand, the United Kingdom and the United States) and adds new data on when attacks occurred, when the media first reported them, and the characteristics of attackers and techniques. Leveraging survival models, it demonstrates how the delay between attack and disclosure depends on both the attacker’s identity (state or non-state) and the technical characteristics of the attack (whether it targets information confidentiality, integrity or availability). The article argues that missing cybersecurity events are least likely to be carried out by non-state actors or to target information availability. Our understanding of ‘persistent engagement’, relative capabilities, ‘intelligence contests’ and cyber coercion relies on accurately measuring restraint. This article’s findings cast significant doubt on whether researchers have accurately measured and observed restraint, and inform how others should consider external validity. The article has implications for our understanding of data bias, empirical cybersecurity research and secrecy in international relations.
'In' Analytical Note: Journal of Peace Research Vol. 61, No. 1; Nov 2024: p. 28-43
Journal Source: Journal of Peace Research Vol. 61, No. 1
Key Words: Information Security; Cybersecurity; Event Data; Data Bias
Media / Other Links: Full Text