Query Result Set
Search: CHANG, WELTON (2 results)
1
ID:   164194


Restructuring structured analytic techniques in intelligence / Chang, Welton; Berdini, Elissabeth; Mandel, David R; Tetlock, Philip E   Journal Article
Summary/Abstract: Structured analytic techniques (SATs) are intended to improve intelligence analysis by checking the two canonical sources of error: systematic biases and random noise. Although both goals are achievable, no one knows how close the current generation of SATs comes to achieving either of them. We identify two root problems: (1) SATs treat bipolar biases as unipolar. As a result, we lack metrics for gauging possible over-shooting, and have no way of knowing when SATs that focus on suppressing one bias (e.g., over-confidence) are triggering the opposing bias (e.g., under-confidence). (2) SATs tacitly assume that problem decomposition (e.g., breaking reasoning into rows and columns of matrices corresponding to hypotheses and evidence) is a sound means of reducing noise in assessments. But no one has ever actually tested whether decomposition is adding or subtracting noise from the analytic process, and there are good reasons for suspecting that decomposition will, on balance, degrade the reliability of analytic judgment. The central shortcoming is that SATs have not been subject to sustained scientific scrutiny of the sort that could reveal when they are helping or harming the cause of delivering accurate assessments of the world to the policy community.
2
ID:   147583


Rethinking the training of intelligence analysts / Tetlock, Philip E; Chang, Welton   Journal Article
Summary/Abstract: Despite intense scrutiny and promised fixes resulting from intelligence ‘transformation’ efforts, erroneous analytic assessments persist and continue to dominate news coverage of the US intelligence community. Existing analytic training teaches analysts about common cognitive biases and then aims to correct them with structured analytic techniques. On its face, this approach is eminently reasonable; on close inspection, it is incomplete and imbalanced. Current training is anchored in a mid-twentieth-century understanding of psychology that focuses on checking over-confidence and rigidity but ignores the problems of under-confidence and excessive volatility. Moreover, it has never been validated against objective benchmarks of good judgment. We propose a new approach: (a) adopting scientifically validated content and regularly testing training to avoid institutionalizing new dogmas; (b) incentivizing analysts to view training guidelines as a means to the end of improved accuracy, not an end in itself.