Knowledge Production Processes: An Analysis of Research Perseverance and the File Drawer Bias in Social Science Survey Experiments (WP-23-41)
Philip Moniz, James Druckman, and Jeremy Freese
The scientific process is difficult to evaluate because many of its stages typically evade observation. These include whether one succeeds in obtaining funding for data collection, whether one perseveres should the funding application fail, whether one writes up the results of data analyses and submits a manuscript to a journal, and, for submitted manuscripts, whether publication occurs. Using data from applicants to a unique grant program that funds probability-sample survey experiments in the U.S. (Time-sharing Experiments for the Social Sciences), Moniz, Druckman, and Freese identify factors that influence each step. They find that research time, and not resources, plays a substantial role in determining whether the grant is funded and, if it is not, whether the applicant proceeds with the project. The latter result likely reflects the availability of cheaper non-probability-sample data sources that still require time to collect. Additionally, they document the substantial influence of obtaining statistically significant results on whether a scholar writes up and submits a paper (a variation of file drawer bias). Once a manuscript is submitted, however, statistical significance does not influence the likelihood of publication. Thus, file drawer bias emerges from researcher rather than editorial choices. The bias is also substantially smaller than it was a decade ago (Franco et al. 2014), suggesting increased recognition of the importance of null results. Overall, the researchers' findings identify how research time and statistical significance shape science, at least in the broad domain of survey-experimental research, providing guidance for potential interventions in the scientific process.