Item Details
Journal Article
ID: 195267
Title Proper: Trusting machine intelligence
Other Title Information: artificial intelligence and human-autonomy teaming in military operations
Language: ENG
Author: Mayer, Michael
Summary / Abstract (Note): Continuous advances in artificial intelligence have enabled higher levels of autonomy in military systems. As the role of machine intelligence expands, effective co-operation between humans and autonomous systems will become an increasingly relevant aspect of future military operations. Successful human-autonomy teaming (HAT) requires establishing appropriate levels of trust in machine intelligence, which can vary according to the context in which HAT occurs. The expansive body of literature on trust and automation, combined with newer contributions focused on autonomy in military systems, forms the basis of this study. Various aspects of trust within three general categories of machine intelligence applications are examined. These include data integration and analysis, autonomous systems in all domains, and decision-support applications. The issues related to appropriately calibrating trust levels vary within each category, as do the consequences of poorly aligned trust and potential mitigation measures.
In (analytical Note): Defense and Security Analysis Vol. 39, No. 4; Dec 2023: p. 521-538
Journal Source: Defense and Security Analysis 2023-12 39, 4
Key Words: Artificial Intelligence; Trust; Future Battlefield; Autonomous Platforms; Decision-Centric Warfare