Srl 1
ID: 131529
Publication: 2014
Summary/Abstract:
As an abstract idea, openness is difficult to oppose. Social scientists from every research tradition agree that scholars cannot simply assert their conclusions; they must also share their evidentiary basis and explain how those conclusions were reached. Yet practice has not always followed this principle. Most forms of qualitative empirical inquiry have taken a minimalist approach to openness, providing only limited information about the research process and little or no access to the data underpinning findings. What scholars do when conducting research, how they generate data, and how they make interpretations or draw inferences from those data are rarely addressed at length in their published research. Even in book-length monographs, which have an extended preface and footnotes, it can take considerable detective work to piece together how the authors arrived at their conclusions.
Srl 2
ID: 131531
Publication: 2014
Summary/Abstract:
The number of people conducting scientific analyses and the number of topics being studied are higher than ever. At the same time, there are questions about the public value of social scientific endeavors, particularly of federally funded quantitative research (Prewitt 2013). In this article, we contend that data access and research transparency are essential to the public value of the enterprise as a whole and to the credibility of the growing number of individuals who conduct such research (also see Esterling 2013).
Srl 3
ID: 162860
Summary/Abstract:
Do researchers share their quantitative data, and are the quantitative results published in political science journals replicable? We attempt to answer these questions by analyzing all articles published in the 2015 issues of three political behavior journals (Electoral Studies, Party Politics, and the Journal of Elections, Public Opinion & Parties), none of which had a binding data-sharing and replication policy as of 2015. We found that authors are still reluctant to share their data; only slightly more than half of the authors in these journals do so. For the articles whose data were shared, we confirmed the initially reported results in roughly 70% of cases. Only about 5% of the articles yielded significantly different results from those reported in the publication. However, we also found that roughly 25% of the articles organized their data and/or code so poorly that replication was impossible.
Srl 4
ID: 166487
Summary/Abstract:
Modern grid monitoring equipment enables utilities to collect detailed records of power interruptions. These data are aggregated to compute publicly reported metrics describing high-level characteristics of grid performance. The current work explores the depth of insights that can be gained from public data, and the implications of losing visibility into heterogeneity in grid performance through aggregation. We present an exploratory analysis examining three years of high-resolution power interruption data collected by archiving information posted in real-time on the public-facing website of a utility in the Western United States. We report on the size, frequency and duration of individual power interruptions, and on spatio-temporal variability in aggregate reliability metrics. Our results show that metrics of grid performance can vary spatially and temporally by orders of magnitude, revealing heterogeneity that is not evidenced in publicly reported metrics. We show that limited access to granular information presents a substantive barrier to conducting detailed policy analysis, and discuss how more widespread data access could help to answer questions that remain unanswered in the literature to date. Given open questions about whether grid performance is adequate to support societal needs, we recommend establishing pathways to make high-resolution power interruption data available to support policy research.
Srl 5
ID: 146822
Summary/Abstract:
Data access and research transparency (DA-RT) is a growing concern for the discipline. Technological advances have greatly reduced the cost of sharing data, enabling full replication archives consisting of data and code to be shared on individual websites, as well as in journal archives and institutional data repositories. But how do we ensure that scholars take advantage of these resources to share their replication archives? Moreover, are the costs of research transparency borne by individuals or by journals? This article assesses the impact of journal replication policies on data availability and finds that articles published in journals with mandatory provision policies are 24 times more likely to have replication materials available than articles in journals with no requirements.
Srl 6
ID: 131535
Publication: 2014
Summary/Abstract:
Calls for greater data access and research transparency have emerged on many fronts within professional social science. For example, the American Political Science Association (APSA) recently adopted new guidelines for data access and research transparency, and has appointed the Data Access and Research Transparency (DA-RT) ad hoc committee to continue exploring these issues. DA-RT sponsored this symposium. In addition, funding agencies such as the National Institutes of Health (NIH) and the National Science Foundation (NSF) have expanded requirements for data management and data distribution. These pressures present challenges to researchers, but they also present opportunities.