SAN FRANCISCO — Within days of social media companies taking down a viral video touting conspiracy theories about the novel coronavirus, a clip popped up on YouTube telling viewers about another way they could still access the banned footage: through a link to the video on the file-sharing service Google Drive.

Google Drive is not a social media platform, nor is it set up to tackle the problems that social media companies face: the weaponization of their services to amplify dangerous content. But the use of the Drive link, to the trailer for a documentary called “Plandemic,” reflects a wave of seemingly countless workarounds employed by people motivated to spread misinformation about the virus — efforts that continue to thwart social media companies’ attempts at preventing hoaxes and conspiracy theories from spreading in the midst of the greatest public health crisis in decades.

During the pandemic, Facebook, Twitter and YouTube have adopted a more aggressive approach to policing misinformation than in the past. They have introduced new rules, such as removing posts that contradict guidance from public health agencies, deny that the virus exists or promote bogus scientific claims.

That has prompted those spreading covid-19 misinformation to try new methods, including using social media services that have not historically been platforms for news, such as the short video app TikTok, and productivity tools such as Google Drive and Google Docs. They have even used the digital library Internet Archive. These services have more limited systems for policing content compared with the major social media platforms, which have spent years investing in moderation efforts in response to criticism.

Borrowing techniques from other illicit industries, including porn, the people behind many of the remaining "Plandemic" posts on YouTube and Facebook have edited out the most inflammatory content to avoid detection. Instead, the posts direct people to a link where they can see the entire film.

The “Plandemic” video is a trailer for a documentary that recasts populist conspiracy theories about how elites are suppressing information about the virus. In it, a doctor, Judy Mikovits, says Anthony S. Fauci, director of the National Institute of Allergy and Infectious Diseases, buried her research showing that vaccines damage the immune system. She also says that Bill Gates and others are spreading the virus to profit off an eventual vaccine and that wearing a mask increases a person’s risk of catching the virus.

It went viral last week, eventually becoming one of YouTube’s top trending videos, according to social media researcher Erin Gallagher. The video surged on YouTube as people clicked through from embedded links in Facebook groups dedicated to opposing vaccines and the conspiracy theory QAnon, Gallagher said.

Soon after it went viral, Facebook, YouTube and Twitter banned the video on the grounds that it contained misinformation about the virus that could cause "real-world harm" and an immediate threat to public health. TikTok, which also prohibits harmful misinformation about covid-19, banned it as well, a spokeswoman said. The companies say they will take down content that disputes the existence of covid-19, as well as content that discourages people from seeking medical treatment, promotes medically unsubstantiated treatment methods, or disputes the efficacy of social distancing guidelines. Other types of misinformation the companies tend to leave up but demote in search rankings or, in Facebook's case, flag with fact-checking labels.

In addition, the companies are actively directing people who search for information about covid-19 to the sites of the World Health Organization, the Centers for Disease Control and Prevention and high-quality news sources.

Less than two weeks after the company bans, however, researchers are finding that the video and references to it are resurfacing across social media, particularly on YouTube. At least 40 versions of the trailer were uploaded on YouTube over the past week and were easily found using a simple hashtag search, according to Eric Feinberg, vice president of content moderation at Coalition for a Safer Web, who shared his findings with The Washington Post. Some of them have more than 40,000 views and stayed up for days.

The account for the “Plandemic” movie is currently live on Instagram, along with roughly 60 other videos, many with cut-up versions of the movie that omit its most problematic claims. One Instagram post featuring the film has more than 500,000 likes.

“The social media companies are playing a giant game of whack-a-mole,” Feinberg said.

Google took down the Drive file featuring the movie after the company was contacted by The Post. YouTube removed five out of the 12 videos found by Feinberg that The Post shared with the company. The remaining videos were allowed to stay up because the posters had edited out parts that made false claims, such as that wearing a mask can cause covid-19.

“We quickly remove flagged content that violates our Community Guidelines, including content that includes medically unsubstantiated diagnostic advice for COVID-19 and re-uploads of the original clip if they contain segments that we deem to be violative of YouTube’s Community Guidelines,” said YouTube spokesman Farshad Shadloo. “From the very beginning of the pandemic, we’ve had clear policies against COVID-19 misinformation and are committed to continue providing timely and helpful information at this critical time.”

He added that the company had removed thousands of “Plandemic” videos but would not share a specific number. He said that more than 90 percent of those removed had 100 or fewer views. He said the company had improved some of its detection capabilities to flag re-uploads and alterations of content.

Facebook removed nine of 61 Facebook posts and Instagram links after being contacted by The Post, spokeswoman Andrea Vallone said. The others were edited to remove the most problematic claims, though in many cases, they directed people to sites where they could see the full video.

On TikTok, the term “Plandemic” has more than 1.9 million searches, Feinberg said. He found two dozen uploads of the video over the past three days. TikTok said it removed most of the videos after The Post shared them.

On several YouTube videos, the Google Drive link was in the caption below the video. One video that remains up, called “PANDEMIC THE COVID-19 AND CORONAVIRUS COVER-UP,” is not the documentary itself but simply a screen that tells people to look in the caption for a Google Drive link where they can see the full documentary.

The video’s caption also includes a prominent link to the CDC website, a tactic, Feinberg said, that is meant to game YouTube’s algorithms, which are currently prioritizing content that links to the CDC.

This is not the first time that Google’s productivity tools have been used to spread misinformation during the pandemic. In March, Tesla chief executive Elon Musk tweeted a link to a Google Doc touting a questionable study purporting to demonstrate the efficacy of the drug hydroxychloroquine. The Doc has since been blocked by Google.

Google’s content policies prohibit health or medical content on Google Docs and Drive that promotes or encourages practices that may lead to serious physical or emotional harm to individuals, or to serious public health harm. A spokeswoman for Google, Alex Krasov, would not say whether the company uses technology to scan Google Drive files to enforce those policies and said Google “doesn’t go into details” about how those policies are enforced.

The Internet Archive, a nonprofit library that hosts historical content, has also been used to amplify misinformation banned by major social media companies. An Internet Archive link for an article falsely alleging that 21 million people had died of covid-19 in China was shared widely on Facebook after Facebook blocked the article from Medium, where it was originally published, according to Joan Donovan, director of the Technology and Social Change Research Project at the Shorenstein Center on Media, Politics and Public Policy at Harvard’s Kennedy School.

Following those reports, the Internet Archive said this month that it would alert users when they’ve clicked on stories that were debunked or taken down on the live Web, including “Plandemic.” The Internet Archive did not immediately respond to a request for comment.

Editing out the most problematic parts of a video, or linking people to a banned video hosted on another platform that permits it, is not a new strategy. It has been widely deployed by far-right figures and during mass shootings, such as with footage of the Christchurch mosque attack in New Zealand. It routinely trips up technology companies.

But some of the newer workarounds appear to be a response to companies taking a tougher line during the pandemic, Donovan said.

“What is striking about media manipulation during the pandemic is that more and different actors are participating in tactical innovation, accelerating the networked distribution of content that breaks platforms’ rules,” she said.