Scholars have addressed this question by drawing upon personal experience and anecdotal evidence. Joe Nye and Steve Walt have argued that academic research is increasingly irrelevant and inaccessible to policy practitioners. Others, such as Peter Feaver and Mike Horowitz, offer a more qualified take but provide no systematic evidence. So we still need to do what Kate Weaver has suggested: “mind — and measure — the gap” between what scholars are researching and what policymakers are demanding.
Our Teaching, Research & International Policy (TRIP) project at the College of William & Mary can help. TRIP tracks the types of research published in the top 12 peer-reviewed journals of international relations (IR) from 1980 to the present. TRIP also periodically surveys IR scholars and asks them what kind of research they produce.
For example, the 2014 survey asked U.S.-based respondents about their primary research methods.
Qualitative analysis remains the primary method employed by a large majority (58 percent) of IR scholars at U.S. colleges and universities. Quantitative methods came in a distant second place, selected by only 25 percent of respondents. However, a larger fraction of younger scholars (those under 36 years of age), 48 percent, chose quantitative methods.
So, even if the IR scholarly community is not currently a quantitative monoculture, it may be trending in that direction. Further, analysis of published research in the “top” IR journals suggests a clear movement toward quantitative methods over time.
The TRIP survey also asks scholars what type of research is most useful to policymakers. Those results are largely consistent with the conventional wisdom advanced by Walt and Nye. Scholars believe that policymakers value area studies most, followed by policy analysis, contemporary case studies and historical case studies. Scholars expect that quantitative analysis, pure theory and formal models are the least useful.
Of course, asking scholars what policymakers want is a terrible way to measure demand. The obvious thing to do is ask policymakers what kinds of research they find most useful.
Enter Paul Avey and Mike Desch, whose new article reports on a survey of policymakers (see a symposium on the article here). The study is limited in some respects. For example, the respondents were overwhelmingly from the Defense Department and fairly senior, even though the uptake of scholarship occurs across the government and often at the junior level. Yet it is the best existing effort to provide a systematic measure of demand for IR research within the policy community. As the figure below shows, senior national security policymakers most often rate area studies, and least often formal models, as “very useful.”
Our own narrow experience in international development research and development policy suggests that government agencies involved in the conduct of foreign policy increasingly demand data, analysis and evaluations that are the product of quantitative and experimental research, rather than area studies and case study research. A quick look at the kinds of projects funded by the DOD Minerva Initiative, DARPA, IARPA, USAID’s Higher Education Solutions Network, and other solicitations from DOD, DOS and DHS made the Avey and Desch results surprising to us and caused us to wonder about other ways of measuring demand. In addition to conducting elite surveys with broader sample frames, one might analyze job postings from federal agencies involved in foreign policy to see what skills, expertise or disciplinary training are in high demand. Alternatively, one could analyze grant solicitations or federal grants awarded — that would give you a measure of what policymakers intend to pay for and what they actually buy!
One convenient, if unrepresentative, sample of demand within the policy community can be found by analyzing recent solicitations from the State Department’s Diplomacy Lab. The Diplomacy Lab aims to harness the creativity, expertise and skills of undergraduate researchers in U.S. universities to inform and assist the State Department on complex global problems. Since William & Mary was one of the two founding university partners of the Diplomacy Lab, we saw the specific projects and skill sets requested by policymakers within the State Department. Every semester the State Department solicits 15 to 60 discrete research projects that will be conducted by undergraduate students supervised by a faculty member. The last two solicitations are here and here. (And one example of a completed report by students at William & Mary working on the AidData project can be found here.) Despite the quantitative/GIS mapping work done by AidData, taken as a whole the solicitations from the State Department illustrate a pattern of demand that is broadly consistent with much of what Avey and Desch found in their survey of senior security policymakers.
Like Avey and Desch, we found that the Diplomacy Lab project solicitations disproportionately demanded descriptive and qualitative research methods, with 84 percent of projects calling for descriptive, historical or qualitative analysis. Quantitative skills, conversely, were requested in only 12 percent of projects. Formal models and experimental approaches appeared in none of the 75 solicitations from the State Department. We did observe a small number of methods that are uncommon in most IR research and have no direct analogue in the TRIP data or the Avey and Desch survey, including computer programming to create a video game, survey writing, and database creation/management for a digital history of U.S. foreign relations. Since practitioners were turning to undergraduate researchers for this work, it is not terribly surprising that the methods requested are less technical than those appearing in contemporary peer-reviewed journals or in RFPs directed toward faculty members at these same universities.
While Avey and Desch intentionally focused on policymakers involved in security policy, the State Department solicits projects from all of the various bureaus at Foggy Bottom. As a result, we see a great deal of variation in the issue area or substantive focus of these solicitations. We analyzed all 75 requests in the two PDFs linked above and found that traditional security issues and other prominent issue areas in “IR” were not always well represented. Many requests asked for substantive knowledge in multiple issue areas (so totals add to more than 100 percent), but the breakdown looked like this: 32 percent of project requests asked for expertise in crime and law enforcement; 28 percent asked for regional expertise in the culture/language of a particular country; 24 percent requested knowledge of the domestic politics of a particular country; and 19 percent wanted knowledge of environmental issues, followed (in order) by international development, money and finance, human rights, gender, public health, diplomacy and education. Conversely, only three of 75 projects requested knowledge about traditional security issues: one each for interstate war, intra-state war and terrorism. So the State Department appears to want to know many different things from undergraduate researchers, but most of what it wants to know is not covered in a traditional course on international security. Perhaps unsurprisingly, where you look within the foreign policy community has a big effect on what knowledge is in demand.
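Because a single solicitation can be coded into several issue areas, the percentages describe projects, not a partition of them, which is why they sum to more than 100 percent. A minimal sketch of that kind of multi-label tally, using a handful of hypothetical hand-coded projects (the category names and data here are illustrative, not the actual Diplomacy Lab coding):

```python
from collections import Counter

# Hypothetical hand-coded solicitations: each project may request
# expertise in more than one issue area.
projects = [
    {"crime and law enforcement", "domestic politics"},
    {"regional expertise"},
    {"crime and law enforcement", "environment"},
    {"environment"},
]

# Count how many projects mention each issue area at least once.
counts = Counter(tag for project in projects for tag in project)

# Express each count as a percentage of all projects. Because projects
# carry multiple tags, these shares can legitimately sum to over 100.
total = len(projects)
shares = {tag: 100 * n / total for tag, n in counts.items()}

for tag, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{tag}: {share:.0f}% of projects")
```

With these toy data, "crime and law enforcement" and "environment" each appear in 50 percent of projects, and the shares sum to 150 percent, mirroring the over-100 totals in the real breakdown.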
Aside from methods and substantive focus, both the Diplomacy Lab solicitations and the Avey and Desch study address demand for researchers from various disciplines. Avey and Desch ask security-oriented policymakers about the relative utility of different disciplinary training and, consistent with Kristof’s lament, political science finishes dead last, with public policy and then “international affairs” faring only slightly better.
The Diplomacy Lab categories for disciplinary training are not identical to those offered in the TRIP and Avey and Desch studies; each solicitation from the State Department provided a different description and number of “areas of expertise or interest.” We list the top nine in the graph below. These serve as a reasonably close proxy for “academic discipline,” since they were the nouns most commonly mentioned in the DOS solicitations. The results below suggest that the Diplomacy Lab is looking for students from a broad range of disciplines. If this were the only data point, one might conclude that quantitative skills are in great demand. But as a proportion of the total number of entries for this variable, math, statistics and data analysis combined equal less than 20 percent.
The Avey and Desch paper, like our own modest effort here, relies on a narrow sample frame, and the findings are shaped by the questions we ask and/or what we choose to count — and there are many different things one could count when trying to measure demand. Looking just at the Diplomacy Lab initiative suggests that officials in the State Department are looking for a very diverse set of skills, disciplines and issue areas. Other organizations are probably more narrowly focused on specific issue areas or methods. We speculate that practitioners in the State Department demand less technical work through this program because they know the researchers will be undergraduate students rather than professors; however, we have no evidence to support this guess.
Whether the goal is to accurately describe the relationship between the academy and the policy community or to change that relationship, we need accurate and systematic measures of supply and demand. The Avey and Desch study is salutary and represents a small step in the right direction, but there is much work left to be done.
Ana O’Harrow is a student at the College of William & Mary who has worked on the TRIP project as a Research Assistant. She will graduate in May. Mike Tierney is the George and Mary Hylton Associate Professor of Government and International Relations. He is a PI on the TRIP project and is the Chair of the AidData Steering Committee.
Edit: Minor corrections in titles and placement of figures.