Soon after the novel coronavirus began circulating on the West Coast, a team of researchers began an urgent quest to find out how the pathogen might affect pregnant women, beginning their work before they had figured out how to pay for it. Concerned that it might take months to go through the grant-application process, with no guarantee of success, the team at the University of California at San Francisco turned to philanthropy and crowdfunding.

Stephanie Gaw, one of the co-principal investigators, said she was astonished by the response.

“Within a week, we had raised $25,000,” she said. “In a month, $70,000.”

The pandemic has upended the norms of the scientific process, from the way studies are funded to the way findings are published. Researchers have been presenting their results online or sending them directly to media outlets rather than awaiting publication in prestigious academic journals. And the stodgy process of peer review has evolved into forthright — and sometimes acrimonious — assessments in the unbridled atmosphere of the Internet.

“We are watching science in real time,” said Julie Schafer, chief technology officer of Flu Lab, a nonprofit organization dedicated to vanquishing influenza. “And it’s incredibly uncomfortable.”

The need for speed has put a spotlight on the messiness of the scientific process, in which breakthroughs are rare and single studies are typically just starting points or contributions to an evolving body of knowledge. In today’s highly partisan atmosphere, those individual studies skip straight to the Twittersphere or the popular media, where they can land like misguided missiles.

In August, researchers at Duke University hustled to publish their findings on a low-cost way to test the efficacy of masks, choosing a peer-reviewed, open-access journal in the hope of making the information quickly available to anyone who might want to reproduce the technique. The study went viral after readers distorted one finding — that neck gaiters might be worse than wearing no mask at all.

“It certainly was taken differently from what we had intended,” said Martin Fischer, an associate research professor in Duke’s Department of Chemistry. “It was never intended as a mask recommendation.”

In many ways, the virus has accelerated a shift already underway, reflecting some scientists’ dissatisfaction with what they regard as excessive gatekeeping — from the months-long process of reviewing grant applications to the constraints imposed by academic journals, including lengthy peer reviews, paywalls and embargoes.

The underlying question, Schafer said, is how to move science forward — and change clinicians’ behavior — in the most expeditious way.

Under the traditional norms of research, “it can tend to get a little inbred,” said Irene Eckstrand, a retired evolutionary biologist who was scientific director of the federally funded Models of Infectious Disease Agent Study (MIDAS).

Modelers who predict the course of an outbreak play a particularly important role during a pandemic by increasing understanding of infectious-disease dynamics through computational, statistical and mathematical models. The MIDAS scientists Eckstrand oversaw had a forward-looking mission — to drop everything and unite against the common threat in the event of an outbreak like today’s.

“The idea was to have standing groups of modelers available . . . like a center available to the U.S. government if there is an emergency,” said Natalie Dean, a biostatistics expert at the University of Florida.

In recent years, funding for that visionary mission was cut.

“We’ve lost that flexibility and a bridge between research and front-line public health,” said Lauren Ancel Meyers, a specialist in network epidemiology at the University of Texas at Austin.

As a result, modelers, like other scientists, have been scrambling to fund and ramp up their operations. The upshot, said Donald S. Burke, a professor of health science and policy at the University of Pittsburgh, is a lack of agility that has always haunted public health.

“Things that aren’t yet a problem don’t get attention,” he said. “Yesterday’s problem is funded for a few years.”

In response to what he calls “ossified and bureaucratic” traditional funding mechanisms, Tyler Cowen, an economist at George Mason University, created Fast Grants through the school’s Mercatus Center, a free market-oriented nonprofit organization.

He secured philanthropic funding and assembled a panel of 20 anonymous reviewers, half women, half people of color, who promise decisions within two weeks. Out of about 5,000 applications, Fast Grants has been able to underwrite almost 200 projects, Cowen said, typically with awards of $10,000 to $500,000.

Katherine Seley-Radtke, a medicinal chemist at the University of Maryland Baltimore County, was one of the recipients. Seley-Radtke, who said she discovered compounds several years ago that act against coronaviruses, received six months of funding that helped her start new research.

The grant was a boon for Seley-Radtke, who said she was stymied a few years ago when she applied for a grant from the National Institutes of Health.

“I would love to have a fast answer as to whether or not I got money and then move on,” she said.

But the Fast Grant program has become a victim of its own success, having to put operations on hold twice as it sought additional funding. Cowen said he hopes to have it up and running again soon.

The urgency of starting research is matched by the urgency of making results available — and reliable — at far faster speeds than peer-reviewed academic journals historically allow.

Steve Kirsch, a tech entrepreneur and medical philanthropist who founded the Covid-19 Early Treatment Fund to identify and repurpose existing drugs, described his frustrations with a system that he thinks is slowing access to potentially lifesaving treatment.

Kirsch outlined promising results from a small clinical trial of the drug fluvoxamine that was conducted at Washington University in St. Louis and is awaiting publication in a major medical journal. The trial was supported by Kirsch’s fund.

“Five thousand people are dying every day,” said Kirsch, referring to the global toll from the pandemic.

Eric Lenze, the study’s principal investigator, confirmed in an email that researchers are taking a “traditional approach via journal.”

“My co-investigators and I thought it was important to have the study peer reviewed,” wrote Lenze, who said the paper was embargoed but declined to say where.

This past weekend, Kirsch published his own piece about the study online.

“The landscape has changed,” said Eric Rubin, editor in chief of the New England Journal of Medicine, which in the past discouraged researchers from submitting preprints — papers that are published more expeditiously online — or even talking about their research.

The relaxation of those kinds of rules has accelerated with the coronavirus. But almost everything the New England Journal publishes is still embargoed, Rubin said, in part to allow reporters time to receive expert input on information that could be misinterpreted, even by doctors.

The stakes in medicine are “super high,” he said. Editors have to ask, “If we publish this, are we going to do more harm than good?”

Michael Johansson, an adjunct lecturer on epidemiology at the Harvard T.H. Chan School of Public Health, began developing a swifter online publication-and-review system several years ago, convinced that a global health crisis would occur before long. He came up with Outbreak Science, which went online in January, just as the coronavirus was being recognized as an international threat. The nonprofit website uses algorithms to identify relevant studies and feed them to Twitter.

“It came out of the recognition that science was too slow to be helpful during outbreaks,” Johansson said.

The widespread availability of preprints has allowed some bad science to win undue attention, as evidenced by a spate of prominent retractions. Outbreak Science tackles that problem by providing a structured format for peer review, based on a dozen yes-or-no questions, so that experts can chime in quickly, providing assessments that allow influential papers to surface.

Even traditional outlets have shown that their peer-review process is not foolproof, particularly amid the pressure of a pandemic. In June, the Lancet and the New England Journal of Medicine retracted papers — the Lancet’s on the much-touted drug hydroxychloroquine — after a U.S. company, Surgisphere, declined to share the raw data the studies were based on for an independent audit.

Those papers struck some researchers as resulting from haste.

“In true peer review, we’d scrub the data,” said Davey Smith, a virologist at the University of California at San Diego.

The outcry not only increased distrust in the scientific process but sparked delays in other hydroxychloroquine research, including work Smith was conducting.

“It was impossible to sign people up for trials,” he said. “We still don’t know about the effects of hydroxy on early covid,” he added, emphasizing the importance of rigor even as he appreciated the need for speed.

Rubin said that large databases are often proprietary, making them inaccessible for review, and that the journal is making greater efforts to find trustworthy sources to vouch for them.

“Since that time, we have rejected papers based on not being confident about underlying data,” he said.

Such highly visible errors may have multiplied, but they are not new — and their costs can last for years.

It took 12 years for the Lancet to retract a now-notorious 1998 paper that implied a link between autism and the vaccine against measles, mumps and rubella. But the damage lingered, creating widespread distrust of vaccinations that continues — and may feed worries about the coronavirus vaccines when they become available.

In today’s turmoil, some scientists send their preprints directly to reporters who in turn ask experts to evaluate the research.

“I get so many peer-review requests,” said Dean, the biostatistics expert. “I hate to turn most of them down. But it’s just bonkers.”

Marc Lipsitch, an epidemiologist at the Harvard T.H. Chan School of Public Health, received funding in September through an accelerated NIH program after submitting a proposal in July — a benefit for scientists who, he said, spend an “insane” amount of time writing grants.

Lipsitch described being on both sides of online critiques since the pandemic began. He was one of many experts who found fault with an early antibody study from Santa Clara, Calif., suggesting that the number of people infected with the virus was 50 to 85 times higher than previously thought. Lipsitch thought the paper improved from the feedback.

He is revising a paper of his own on the relationship between age and susceptibility to the virus, based on comments he has received from online reviews and through traditional peer review. Some seemed overly personal, said Lipsitch, who lamented the tone and sexism of much online criticism.

He is not convinced that accelerating peer review is always a good thing. “Nobody can do it fast as well as they do it slow,” he said.

The challenge of evaluating research promises to grow as the virus it is designed to combat redoubles its attacks worldwide.

“Science has moved immensely fast,” Eckstrand said. “The problem is figuring out what you can believe and what you can’t believe.”