Many political scientists agree that we need more scholars doing replication studies. Replication, the use of an original study's data and code to conduct re-analyses and robustness checks, is an important step in the scientific process. Yet most journals still hesitate to publish replications, with rare and notable exceptions.
The most recent issue of PS: Political Science &amp; Politics included a wide-ranging discussion about research transparency and replication. The editor of the American Political Science Review, John Ishiyama, argues that the lack of a forum for sharing replications creates a disincentive to conduct them. At the same time, Thomas Carsey notes the “growing trend” of assigning replications in graduate methods courses. Putting these ideas together, we are exploring the idea of sharing these assignments as a starting point for advancing replication in the discipline.
We sent out a survey to the Political Methodology mailing list to gather more information about replication projects assigned in graduate courses. Our sample includes a total of 75 respondents from more than 10 countries. Roughly half of the respondents are professors, a third are graduate students, and the remainder are post-docs or research fellows in the field. About three-quarters of respondents either teach or have taken a course in their graduate curriculum that includes a replication assignment. In addition, more than a third of respondents indicated that they regularly either supervise or complete a replication project outside of these classes. While these initial results are encouraging, they are not necessarily representative of the entire political science community because respondents self-selected based on interest in replication and methodology more broadly.
First, respondents more often assign or complete replications in advanced courses than in first-year courses. Furthermore, it appears that some respondents work with replications in both types of courses (see Figure 1). The most common assignment cited by respondents allows students to choose the original studies that they wish to replicate, rather than asking them to replicate a particular paper or select one from a list. We also asked respondents to report problems encountered while completing these replications. The most commonly reported problems associated with these attempts were the lack of replication data and code, followed by insufficient documentation, both of which likely constrain the set of published articles to be replicated.
Second, replications done in graduate courses may be an under-utilized resource. Nearly three-quarters of respondents who either assigned or completed a replication project indicated that the instructor reviewed a draft of the study. This suggests that these replications have the potential to be of value to the wider research community. However, the replications are rarely shared more broadly: 72 percent of these respondents noted that these replication assignments are never (or rarely) shared outside of class, and only 13 percent reported that students post their replication projects on a course website or other repository.
Taken together, these preliminary results suggest that one way to advance replication in political science might be to share the work that has already been done in graduate courses. More specifically, we envision a site where users can upload their replication write-ups and materials, as well as link to the original data and code. Site submissions would not be limited to those from graduate students, but we believe that graduate student work is a promising place to start. Crucially, the site would not prohibit contributors from later submitting a replication for publication in a journal, similar to repositories such as arXiv and the NBER working paper series in other disciplines. Nearly two-thirds of respondents indicated that such a resource would be valuable (see Figure 2) and would either encourage their students to post their replication assignments or share their own work on the site. An additional third of respondents would need more information about the site, and fewer than 3 percent appear to reject the idea completely.
One important consideration is how to address quality control on the replication site. Many submissions would undoubtedly flag purported errors in published results. However, these replications themselves could either contain errors or unjustly describe minor discrepancies as major mistakes, which could have substantial reputational consequences. Soliciting referees to accept or reject submissions would presumably increase the quality of all submissions while also increasing operating costs. Indeed, our survey results show that respondents are divided on how to approach this trade-off. About a third of respondents felt that referees were needed to carefully evaluate student submissions before they were made publicly available. However, a majority of respondents (53 percent) indicated that the site could still be useful without referees, especially if it included commenting, labels indicating whether replications had been reviewed in class, and a notification system inviting original authors to respond to replications of their work. Additionally, the site could require all contributors to submit their own replication code.
Many questions remain for continued exploration and discussion, including how to incentivize submissions and how to link replications with original publications. We will continue to grapple with these questions and welcome input from other researchers on these important issues. Certainly, there are no easy answers. Yet we hope that taking these first steps will shed light on best practices for sharing replications in political science as well as in the wider scientific community.