A science in the shadows
Controls on ‘gain of function’ experiments with supercharged pathogens have been undercut despite concerns about lab leaks
A decade ago, scientists funded by the National Institutes of Health used ferrets to engineer a highly lethal flu virus. The purpose of the research — known as “gain of function” — was to better understand how viruses evolve and to help devise medicines to combat the potential disease threats.
It also came with a risk: A laboratory mishap could unleash a devastating pandemic.
The research, conducted in the Netherlands and at the University of Wisconsin, sparked an international controversy and led to new safeguards for such experiments. But over the past four years, NIH leaders and other U.S. officials have weakened key aspects of those controls, a Washington Post examination found.
The high-risk research has reemerged as a focal point because of speculation that such experiments in Wuhan, China, may have accidentally triggered the coronavirus pandemic. While Chinese virologists deny that their work is to blame, accidents have occurred on rare occasions in labs elsewhere in the world, leading to inadvertent releases of pathogens.
“The risks are absolutely real. They’re not intellectual constructs or hypotheticals,” said David A. Relman, a Stanford University physician and microbiologist who has advised NIH and other federal agencies on biosecurity. Eventually, he said, “something that you make or information that you release will result in an accident of some kind.”
Speculation about the work in Wuhan has focused new attention on gain-of-function research. This report details the U.S. support for such experiments and the secrecy undergirding them. It does not illuminate whether the coronavirus pandemic resulted from gain-of-function research.
In the United States, NIH Director Francis S. Collins and Anthony S. Fauci, director of the agency’s National Institute of Allergy and Infectious Diseases, have led the federal funding and oversight of gain-of-function research.
Eight years ago, Collins and Fauci helped put in place high-level reviews and other safeguards in response to concerns raised by Relman and aides to President Barack Obama, who were alarmed by what they saw as insufficient scrutiny of the research with ferrets. The NIH leaders and the Department of Health and Human Services pledged to subject the work to increased transparency and vetting. This included forming a review group of federal officials — known informally as a “Ferrets Committee” — to vet proposed projects for safety and worthiness.
However, Collins and Fauci in recent years have helped shape policy changes, directly and through their aides, that undercut the committee’s authority, according to federal documents, congressional testimony and interviews with dozens of present and former officials and science experts.
In 2017, a change made under their watch removed the committee’s power to block the projects, recasting the panel as strictly an advisory body.
Another change at that time redefined gain-of-function research, giving NIH leaders greater leeway to approve projects without referring them to the review committee. Some researchers had complained that far-reaching reviews would slow NIH approvals and scientific progress.
Since then, the experiments have continued to unfold amid secrecy, and HHS, which administers the review committee, has kept its work confidential: No agendas, meeting minutes or other records of its proceedings are public. Even the names of the federal officials assigned to serve on the committee, which has spanned the Obama, Trump and Biden administrations, are kept secret.
In interviews for this report, Collins, Fauci and their senior aides disputed the notion that the policy changes had weakened oversight of the research. Both NIH leaders pointed to safeguards that remain in place.
“Reasonable people do not all completely agree on the ideal way to frame the oversight of these very sensitive experiments,” Collins said, adding: “There are some who see the risks as greater and the benefits as less. And vice versa.”
Lab accidents, Collins said, “are certainly a concern. … You want to mitigate that by having the highest possible containment for any kind of experiment that might lead to trouble.”
As for loosening the controls in 2017, Collins and Fauci defended the resulting policy, which is formally known as the “Framework” for guiding gain-of-function research.
Collins lauded “the very intense, deliberative process” adhered to by NIH and other federal agencies that generated the changes. The policy was shepherded by Collins’s aides, who met with him and other participants in his personal office, he said.
Collins also said he is now open to making public the names of the review-committee officials to help achieve “the kind of transparency that the public expects.”
Fauci said that all of the research projects are first evaluated by expert managers within his institute, who check whether the work will be conducted by properly trained personnel within secure facilities. The experiments, he said, are “done with the highest degree of oversight.”
“To the extent that we can be transparent, that the system would allow us to be transparent, we go overboard to be transparent,” Fauci said.
‘Grading their own homework’
Officials at NIH declined to say how many gain-of-function projects they have funded since the controversy over the experiments with ferrets peaked in 2012, other than confirming grants for two projects in 2018.
Asked to provide the number of projects funded, Collins and Fauci suggested the answer would hinge on how the work was defined in a given year.
An agency spokesperson said that relevant information could be found in an agency database that archives tens of thousands of grants each year. But the database, NIH Reporter, does not designate which grants are for gain-of-function research.
The Post identified at least 18 projects funded from 2012 to 2020 that appeared to include gain-of-function experiments. Reporters examined research summaries in the database, along with articles published in scientific journals, and conducted interviews with experts.
NIH funding for the 18 projects totaled about $48.8 million, and the work unfolded at 13 institutions. Eight of the projects were approved after the review committee’s power was weakened in 2017.
From 2017 to 2020, no more than “three or four” projects were forwarded to the review committee, said Robert Kadlec, who oversaw the panel and served as the Trump administration’s assistant HHS secretary for preparedness and response.
“They were grading their own homework,” Kadlec said.
Kadlec, a physician who earlier had held biodefense roles with the Pentagon, the White House and the Senate, said that the high-risk research has not been adequately vetted.
“Frankly, we didn’t have the scientific wherewithal,” Kadlec said of HHS, adding that the review committee’s capabilities were not “robust enough to make sure that bad things don’t happen.” (On June 7, Kadlec rejoined the Republican staff of the Senate Health, Education, Labor and Pensions Committee, where he said he will focus on biosecurity policy.)
At an NIH meeting last year, Christian Hassell, a senior aide to Kadlec at HHS, complained about how the research is vetted.
“We’ve only completed two reviews,” Hassell told members of NIH’s National Science Advisory Board for Biosecurity on Jan. 23, 2020, adding that a third project had been received by the committee for review.
Hassell suggested the lack of reviews reflected how narrowly the revised policy defines gain-of-function research, according to a video of the meeting.
“I’ll just probably be more frank than may be appropriate — I think that’s too narrow,” Hassell said. “My view on this thing is, don’t use too fine a filter.”
Hassell continues to serve as a senior science adviser at HHS in the Biden administration. He declined to be interviewed for this report.
Relman and other scientists said the federal policy governing the research is opaque and needs strengthening.
“If you’re going to ask society to take on a higher-than-normal level of risk, then I think there’s got to be more openness,” said Michael J. Imperiale, a University of Michigan virologist who served from 2005 to 2012 on the biosecurity board and who now is editor in chief of mBio, a journal of the American Society for Microbiology.
Skeptics of gain-of-function research question whether it is worth the risk.
“Everyone on the planet now knows what a pandemic is, what that means for their families, their communities, their incomes,” said Richard H. Ebright, a professor of chemical biology at Rutgers University who has studied biosecurity risks. “This kind of research can give rise to a pandemic.”
New attention on research
The mystery of how the novel coronavirus emerged and triggered the worst pandemic in a century has refocused attention on gain-of-function research.
In May, President Biden directed U.S. intelligence agencies to “redouble their efforts” to investigate the pandemic’s origins, saying the agencies had “coalesced around two likely scenarios.” One of those possibilities is that the initial outbreak stemmed from human contact with an infected animal. The other possibility is that a lab accident in China released the culprit pathogen. The agencies provided a classified report to Biden on Tuesday, concluding it was not yet possible to pinpoint the source of the pandemic, according to officials familiar with the matter.
Some who suspect that the novel coronavirus jumped from an animal to humans note that this happened with two other coronaviruses: severe acute respiratory syndrome (SARS), in late 2002, and Middle East respiratory syndrome (MERS), 10 years later.
Other scientists question whether lab workers in Wuhan could have become infected while experimenting with the novel coronavirus and wound up spreading it.
[Biden asks intelligence community to redouble efforts to determine definitive origin of coronavirus]
Since the natural outbreak of SARS nearly two decades ago, three accidental releases of the virus have occurred, at labs in Singapore, in Taiwan and at China’s National Institute for Viral Disease Control and Prevention. The mishap in Beijing resulted in one confirmed death and the quarantining of at least 600 people, according to the World Health Organization.
The possibility of a lab origin was initially dismissed by many scientists assessing the ongoing pandemic. But the absence of evidence tying the novel coronavirus to infected animals, along with the Chinese government’s refusal to provide international inspectors access to lab notebooks and virus samples, has helped fuel suspicions.
The Wuhan Institute of Virology has come under scrutiny because the lab complex is in the city where serious illness from the strain was first reported in late 2019. The institute has conducted gain-of-function research on bat-borne coronaviruses, according to a journal article co-authored by Chinese researchers there.
The lead Chinese virologist in Wuhan, Shi Zhengli, and one of her NIH-funded partners, a New York City-based nonprofit group called EcoHealth Alliance, have disputed the possibility of a lab release, as has the Chinese government. EcoHealth’s president, Peter Daszak, has also publicly denied that his organization participated in gain-of-function research in Wuhan. (Shi and Daszak did not respond to emails seeking comment.)
The research can involve techniques that combine genetic material from different pathogens to create a new, lab-generated “chimeric” virus.
In the United States, such experiments are typically funded by NIH, the nation’s primary steward for biomedical research. All but one of the 18 projects identified by The Post have been funded by Fauci’s infectious-disease institute.
Fauci and Collins have served as gatekeepers to the work.
Both are iconic figures in American health and medicine: Collins helped lead the government’s mapping of the human genome, and Fauci has counseled presidents and congressional leaders for four decades on health crises from AIDS to covid-19. Fauci also serves as Biden’s chief medical adviser.
Fauci and Collins have long defended the research, saying that the knowledge gained could aid the development of vaccines and therapeutic drugs.
“There’s disagreement as to the scientific and/or public health value of these experiments,” Fauci said at a gathering of international researchers invited to NIH on Dec. 17, 2012, according to video of the session. “But I believe the people who feel that they shouldn’t be conducted are in the minority.”
By then, NIH leaders were grappling with the firestorm ignited by two gain-of-function projects.
The Rotterdam research
The experiments, conducted separately at the Erasmus Medical Center in Rotterdam and at the University of Wisconsin, had altered a strain of highly virulent avian flu in a way that enabled it for the first time to cause airborne infections among mammals.
In fall 2011, manuscripts describing the confidential results of the research had been submitted to separate scientific journals. Relman, serving as a peer reviewer for one of the journals, became alarmed, fearing that the details, if published, could provide a recipe for terrorists.
He privately notified a White House biosecurity official, biologist Lawrence D. Kerr, who warned others in the Obama administration and contacted the NIH director’s office, according to scientists familiar with the previously unreported details.
Collins’s staff assigned the agency’s biosecurity board to assess the risk. The board advises NIH regarding how to mitigate public health and other biosecurity risks posed by some research. In consultation with the secretary of health and human services, Collins appoints the board members, and his office administers their functions.
“We had never seen any papers like this with the gain-of-function idea,” said Lynn W. Enquist, who was then a member of the biosecurity board and editor in chief of the Journal of Virology. “The idea of increasing virulence, or increasing transmissibility, was not really something that most scientists had ever thought about doing. It was a concern.”
The research involved the H5N1 strain of avian influenza, which had a fatality rate of about 60 percent, compared with less than 1 percent for seasonal flu. Those stricken were typically Southeast Asian poultry farmers who contracted it from their birds. The strain was not known for human transmission.
One of the NIH-funded researchers, Ron Fouchier in Rotterdam, had altered H5N1 to make it more dangerous, so that it spread through respiratory droplets among caged ferrets, the mammals considered the best model for human susceptibility. Fouchier and his counterpart in Wisconsin, Yoshihiro Kawaoka, were seeking to learn more about the H5N1 strain, especially how it mutated.
Paul Keim, a Northern Arizona University geneticist who was then chairman of the NIH biosecurity board, recalled that his colleagues were concerned about the risk of publishing.
“We were saying, ‘Wow — it’s highly transmissible with a 60 percent mortality rate,’ ” Keim said. “You could kill 4 billion people in a flash, because these viruses go around the world.”
On Nov. 30, 2011, the board unanimously recommended that key research methodologies should be withheld from publication.
The board’s vote directly challenged the stewardship of Fauci and Collins, because the pathogen-altering research had been approved by NIH with no external review or publicity.
“They had made the decision to fund this work,” said Imperiale, who was on the board at that time. “It was awkward for them.”
[From 2011: Federal panel asks journals to censor reports of lab-created ‘bird flu’]
Fauci and Collins responded by working privately to reverse the biosecurity board’s recommendation — while publicly defending the need for the research, according to interviews and records.
Publicly, Fauci and Collins, along with a third NIH official, co-authored an essay, published by The Post on Dec. 30, 2011, concluding that the risks of the Rotterdam and Wisconsin experiments were worth taking.
The three men wrote that “important information and insights can come from generating a potentially dangerous virus in the laboratory.” The experiments with ferrets, they said, were aimed at filling “important gaps in knowledge” regarding human transmissibility.
Many influenza researchers in the United States and abroad, who depend on the sharing of information about shifting strains of seasonal flu, supported the NIH leaders’ position. Some bristled at government intrusion on scientific decision-making.
Privately, the biosecurity board reconvened behind closed doors on March 29 and 30, 2012, amid what participants described as tense circumstances: Fauci and Collins attended and reiterated their support of the research and its publication. The board members were required to sign a nondisclosure agreement, according to those who participated.
The board reversed its earlier recommendation, voting 12 to 6 in favor of publishing the research led by Fouchier, and the members unanimously backed publishing the separate results from Wisconsin. There are no public records of the meeting. Fouchier and Kawaoka have defended their work as responsible and worthwhile.
The papers summarizing the research results were published in separate scientific journals, and none of the controversial details were redacted.
[From 2012: Biosecurity advisory board reverses decision on ‘engineered bird flu’ papers]
In interviews, six scientists who served on the biosecurity board said concerns surrounding the research should have been aired before NIH approved funding for the Rotterdam and Wisconsin experiments.
“Why didn’t someone in NIH, when these grants were being reviewed, look at them and say, ‘Hey wait — there’s a potential problem here,’ ” Imperiale said.
Fauci continued in 2012 and beyond to marshal support for NIH’s handling of the experiments with the H5N1 strain and for other gain-of-function projects.
He wrote that fall in the journal mBio that “all decisions regarding such research must be made in a transparent manner.”
The ‘Ferrets Committee’
Following the controversy over the flu experiments, the Obama administration on Feb. 21, 2013, created the HHS committee to review other proposed high-risk research projects before NIH approved them. The policy at first applied only to H5N1 — it would be expanded the next year to include other flu strains and coronaviruses.
The policy also gave the committee the authority to veto research proposals referred to it by NIH. It was Collins who privately dubbed the HHS review panel the “Ferrets Committee.”
“The department-level review will determine the appropriate risk-mitigation measures and whether a given proposal is acceptable for HHS funding,” Fauci, Collins and three other officials wrote in an article published online that day.
From the outset, however, the committee struggled to obtain adequate information from the NIH director’s office about projects, said an Obama administration science policy official.
The scientist, who still holds a government position and spoke on the condition of anonymity because of the sensitivity of that role, recalled receiving incomplete details from NIH, adding that “we would not be aware of the final decision that was made” about the grant proposals in question.
Meanwhile, a series of dangerous blunders began unfolding at elite U.S. research institutions — ultimately shaking the Obama White House’s confidence in the ongoing gain-of-function research.
In March 2014, a Centers for Disease Control and Prevention worker sent a small container of what was presumed to be a nonlethal influenza strain from Atlanta to an Agriculture Department counterpart who was conducting research with chickens at a federal lab in Athens, Ga. After the birds began dying unexpectedly, officials discovered that the CDC had shipped material contaminated by a virulent strain.
Then on June 5 of that year, CDC researchers working in high-level biocontainment to refine techniques for detecting anthrax accidentally sent active samples of the bacterium to another CDC lab in Atlanta that was equipped to handle only nonlethal material. Forty-one CDC workers who might have been exposed underwent antibiotic treatment.
And on July 1, federal workers discovered 12 long-abandoned cardboard boxes in a cold room at NIH’s main campus in Bethesda, Md. Of the more than 300 vials of infectious agents found in the boxes, six contained variola virus, the source of smallpox, a scourge that had been eradicated worldwide as of 1979. The samples, used decades earlier by a Food and Drug Administration researcher, were supposed to have been destroyed or stored in a high-level biocontainment facility at the CDC.
[From 2014: Smallpox vials, decades old, found in storage room at NIH campus in Bethesda]
Lisa Monaco, Obama’s deputy national security adviser, and John Holdren, director of the White House Office of Science and Technology Policy, urged all federal and nonfederal labs on Aug. 28, 2014, to conduct a “Safety Stand-Down” to “review laboratory biosafety and biosecurity best practices and protocols.”
In mid-October — citing the “recent biosafety incidents at Federal research facilities” — the Office of Science and Technology Policy and HHS jointly announced a “pause” in funding for any newly proposed gain-of-function experiments with influenza and the feared coronavirus strains MERS and SARS.
The announcement also encouraged “those currently conducting this type of work, whether federally funded or not, to voluntarily pause their research while risks and benefits are being reassessed.”
The research resumes
The increased federal scrutiny triggered pushback from some virologists, including coronavirus researchers Ralph S. Baric of the University of North Carolina and Mark R. Denison of Vanderbilt University.
“We argue that it is premature to include the emerging coronaviruses under these restrictions, as scientific dialogue that seriously argues the biology, pros, cons, likely risks to the public, and ethics of [gain-of-function research] have not been discussed in a serious forum,” Baric and Denison wrote to the biosecurity board on Nov. 12, 2014.
Referring more broadly to highly pathogenic flu and coronavirus strains, their letter added: “The pandemic potential of these viruses is clear, but they also are vulnerable in the early stages of an outbreak to public health intervention methods. . . . GOF [gain of function] experiments are a documented, powerful tool.”
Within weeks, NIH officials informed Baric and an undetermined number of other researchers that their work had been exempted from the pause.
Baric is a recognized leader in studying how coronaviruses can leap from bats to other mammals, including humans. His research has sought to identify which strains pose the biggest threats and approaches that might lead to the development of vaccines or therapeutics. The research has been funded with about $11.9 million in NIH grants, records show.
Neither Baric nor Denison, whom Collins appointed to the NIH biosecurity board in November 2016, responded to written questions about their work.
Baric has collaborated with another leading coronavirus researcher: Shi, of the Wuhan Institute of Virology, whose prominence in the field garnered her a nickname, China’s “Bat Woman.” Shi provided genetic sequences and DNA molecules derived from horseshoe bats in China, and Baric designed the experiments, conducted at his lab in Chapel Hill, N.C.
Baric, Shi and other co-authors summarized their respective roles in a December 2015 article in the journal Nature Medicine titled, “A SARS-like cluster of circulating bat coronaviruses shows potential for human emergence.”
The article described techniques that were hallmarks of gain-of-function research, including the use of reverse genetics to create a chimeric virus derived from bats and implanted in laboratory mice.
By 2016, the prospect of more gain-of-function research projects was generating concern among some federal scientists, according to interviews and previously unpublicized government emails.
On April 17, 2016, Kerr, the biologist who had sounded an alarm about the first gain-of-function experiments while serving in the Obama White House, emailed six federal colleagues, warning of a boomlet of high-risk research with coronavirus strains.
Kerr, who by that time was a senior biosecurity official at HHS, cited MERS, the coronavirus that had emerged in 2012 and that over the next seven years would kill 866 people abroad.
“The continuing nature of MERS outbreaks has brought scientists back into the corona-biology world and more are using genetically-synthesized infectious viruses in their work,” Kerr wrote. “GOF work on MERS is not a pretty subject to consider.”
The Post obtained a copy of the email.
An HHS spokeswoman did not respond to The Post’s request to interview Kerr, who has headed the department’s Pandemics and Emerging Threats Office since 2016.
The potential danger of a lab-altered coronavirus strain was also foreshadowed by an April 2016 report from a consulting firm, commissioned by NIH. The 1,016-page report, by Gryphon Scientific, warned that increasing the transmissibility of coronaviruses could lead to a pandemic and increase the risk of deaths by “several orders of magnitude.”
Away from the spotlight, officials from NIH, elsewhere within HHS and other federal departments conferred from 2015 to 2017 about how to revise the policy for gain-of-function research, and when to lift the pause. The efforts were guided by Collins, who held sole authority to grant exemptions from the pause for NIH projects, and his chief of staff, according to those familiar with what unfolded. At Collins’s behest, the biosecurity board also convened several public meetings regarding gain-of-function research policy.
The private meetings included White House officials from the Office of Science and Technology Policy and the National Security Council, along with Collins, his aides and other federal staffers, those familiar with the sessions said.
This obscure interagency process produced the two changes that gave NIH officials greater discretion to fund gain-of-function research projects.
In addition to stripping the HHS committee of its power to veto proposed projects administered by NIH, the revised policy also limited the scope of projects the committee would review.
As established in October 2014, the policy had required NIH to forward for the committee’s review experiments expected to generate certain flu and coronaviruses that would be “transmissible among mammals” and that might accidentally cause human infections.
But in December 2017, the policy was narrowed to cover only altered pathogens “likely capable of wide and uncontrollable spread in human populations.” The sweeping reference to mammals was eliminated. A review by the committee was not required, the policy said, unless the pathogen to be constructed is “reasonably judged” by NIH “to be a credible source of a potential future human pandemic.”
In written responses to The Post, NIH and HHS media representatives said the policy changes were based on an extensive process that took into account comments from government experts and others.
As to why the reference to mammals was deleted, NIH said the revised policy identified “the subset of research” that could pose the greatest pandemic risk for humans.
Collins, pressed on why the change was made, said that he was “not able to fully reconstruct” the details but added that agency staffers evaluate the research proposals “from the most sophisticated perspective.”
Baric, meanwhile, ultimately broke ranks with other proponents of gain-of-function research who for more than a year rejected the possibility that a lab accident led to the pandemic.
In a letter published on May 14 by the journal Science, Baric, along with Relman and 16 other scientists, called for a rigorous investigation of the pandemic’s origin: “Theories of accidental release from a lab and [natural] spillover both remain viable.”
Muller is a recent graduate of Northwestern University’s Medill School of Journalism and a fellow with the Medill Investigative Lab. Alice Crites contributed to this report.