Query Result Set
MUMMOLO, JONATHAN (2 answers)
1
ID: 174450
Administrative Records Mask Racially Biased Policing / Knox, Dean; Mummolo, Jonathan; Lowe, Will. Journal Article
Summary/Abstract: Researchers often lack the necessary data to credibly estimate racial discrimination in policing. In particular, police administrative records lack information on civilians police observe but do not investigate. In this article, we show that if police racially discriminate when choosing whom to investigate, analyses using administrative records to estimate racial discrimination in police behavior are statistically biased, and many quantities of interest are unidentified—even among investigated individuals—absent strong and untestable assumptions. Using principal stratification in a causal mediation framework, we derive the exact form of the statistical bias that results from traditional estimation. We develop a bias-correction procedure and nonparametric sharp bounds for race effects, replicate published findings, and show the traditional estimator can severely underestimate levels of racially biased policing or mask discrimination entirely. We conclude by outlining a general and feasible design for future studies that is robust to this inferential snare.
Key Words: Police; Racial
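To make the abstract's core claim concrete, the sketch below (a hypothetical simulation, not the authors' replication code) shows how conditioning on being stopped biases a race-effect estimate downward when race also affects who gets stopped. All parameter values and variable names are assumptions for illustration.

```python
# Minimal simulation sketch (not the authors' code): all names and
# parameters are hypothetical, chosen only to illustrate the selection
# problem described in the abstract.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

minority = rng.binomial(1, 0.5, n)    # civilian race indicator
suspicion = rng.uniform(0, 1, n)      # latent behavior police observe

# Stage 1: discriminatory stops -- minorities are investigated at a
# lower suspicion threshold than whites.
stopped = suspicion > np.where(minority == 1, 0.5, 0.8)

# Stage 2: a true race effect on use of force among the stopped.
true_effect = 0.10
p_force = np.clip(0.05 + true_effect * minority + 0.3 * suspicion, 0, 1)
force = rng.binomial(1, p_force)

# Naive estimator available from administrative records: compare force
# rates only among stopped civilians.
naive = (force[stopped & (minority == 1)].mean()
         - force[stopped & (minority == 0)].mean())
print(f"true effect: {true_effect:.3f}  naive estimate: {naive:.3f}")
# Stopped minorities have lower average suspicion than stopped whites,
# so conditioning on being stopped understates the race effect.
```

In this toy setup the naive stop-conditional estimate comes out well below the true effect, matching the abstract's warning that traditional estimators can understate or entirely mask discrimination.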
2
ID: 165425
Demand effects in survey experiments: an empirical assessment / Mummolo, Jonathan. Journal Article
Summary/Abstract: Survey experiments are ubiquitous in social science. A frequent critique is that positive results in these studies stem from experimenter demand effects (EDEs)—bias that occurs when participants infer the purpose of an experiment and respond so as to help confirm a researcher’s hypothesis. We argue that online survey experiments have several features that make them robust to EDEs, and test for their presence in studies that involve over 12,000 participants and replicate five experimental designs touching on all empirical political science subfields. We randomly assign participants information about experimenter intent and show that providing this information does not alter the treatment effects in these experiments. Even financial incentives to respond in line with researcher expectations fail to consistently induce demand effects. Research participants exhibit a limited ability to adjust their behavior to align with researcher expectations, a finding with important implications for the design and interpretation of survey experiments.
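The test described in the abstract can be pictured as a cross-randomized design: the substantive treatment and a disclosure of experimenter intent are assigned independently, and the demand-effect question becomes whether their interaction is nonzero. The sketch below is a hypothetical simulation of that logic, not the study's code; the variable names and effect sizes are assumptions.

```python
# Minimal sketch (illustrative, not the study's code): cross-randomize
# the substantive treatment with disclosure of experimenter intent and
# test the treatment x disclosure interaction.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 12_000  # scale comparable to the participant count in the abstract

df = pd.DataFrame({
    "treat": rng.binomial(1, 0.5, n),   # substantive treatment arm
    "reveal": rng.binomial(1, 0.5, n),  # experimenter-intent disclosure arm
})
# Outcome with a real treatment effect and, by construction, no demand
# effect (zero treat x reveal interaction).
df["y"] = 0.5 * df["treat"] + rng.normal(0, 1, n)

# If revealing intent changed responses to treatment, the treat:reveal
# coefficient would depart from zero.
fit = smf.ols("y ~ treat * reveal", data=df).fit()
print(fit.summary().tables[1])
```

Under this design, a treat:reveal coefficient indistinguishable from zero is the pattern the abstract reports: telling participants the experimenter's intent does not move the treatment effect.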