This is the latest installment of a weekly feature on this blog — lessons from the nonprofit News Literacy Project. Each installment offers new material for teachers, students and everyone else who wants a dose of reality.

You can learn about the News Literacy Project and all of the educational resources it provides in this piece, but here’s a rundown:

Founded more than a decade ago by Alan Miller, a Pulitzer Prize-winning reporter at the Los Angeles Times, the News Literacy Project aims to teach students how to distinguish between what’s real and fake in the age of digital communication and a president who routinely denounces real news as “fake.”

Now the leading provider of news literacy education, it creates digital curriculums and other resources and works with educators and journalists to teach middle school and high school students how to know what news and information to trust — and provides them with the tools they need to be informed and engaged participants in a democracy. It uses the standards of high-quality journalism as an aspirational yardstick against which to measure all news and information. Just as important, it provides the next generation with an appreciation of the First Amendment and the role of a free press.

The following material comes from the project’s newsletter, the Sift, which takes the most recent viral rumors, conspiracy theories, hoaxes and journalistic ethics issues and turns them into timely lessons with discussion prompts and links. The Sift, which publishes weekly during the school year, has more than 10,000 subscribers, most of them educators.

Here are lessons from the Monday edition of the Sift, as provided by the News Literacy Project:

Practice information hygiene

The parallels between the spread of the new strain of coronavirus and the spread of misinformation and confusion about it — between the actual pandemic and what the World Health Organization called an “infodemic” — offer a number of important and urgent lessons in news and information literacy.

Just as covid-19 has thrown the weaknesses of our global health infrastructure into stark relief and dramatically raised the stakes of our personal choices and habits, the outbreak has underscored defects in our information infrastructure and is emphasizing the potential impact of our choices and habits online.

Much of the public health messaging in the last week has focused on the importance of practicing good hygiene (for example, the #WashYourHands hashtag that trended across social media platforms) and on the need to “flatten the curve” by engaging in “social distancing.”

At the same time, we all need to focus on “information hygiene” and flattening the curve of dangerous falsehoods online by taking proactive steps to reduce their spread. Our decisions about which pieces of information to “like” and share can have a surprising impact on others.

For example, a false claim about ways to avoid the virus or cure covid-19, however well-intentioned, may cause someone to downplay the seriousness of the outbreak or the recommendations of public health officials. Or something posted to social media as a joke might, after being “liked” and shared a number of times, be taken seriously and exacerbate public confusion and panic about the crisis. (Remember, “likes” are also known as “passive sharing,” because many platforms’ algorithms suggest things you “like” to your followers.)

As this crisis unfolds, more and more people will be asked to stay home, meaning they will be online more than ever before: searching for answers and trying to make sense of the events around them. It is essential that we bring the same seriousness and sense of responsibility to our roles as creators and sharers of information as we do to our roles as conscientious stewards of public health.

For more examples of misinformation about the pandemic, see this issue’s viral rumor rundown below and this running list from Jane Lytvynenko of BuzzFeed News.

Note: Discussing the covid-19 pandemic with students is also an opportunity to explore the ways the crisis has activated a familiar cast of “bad actors” in the information ecosystem. Hucksters are peddling bogus supplements and miracle cures, some of which are dangerous; disinformation agents and conspiracy theorists are pushing elaborate falsehoods; trolls are sowing confusion; extremists are promoting agendas of hate; opportunists are using misinformation to generate social media engagement; hackers are exploiting public fear and uncertainty to compromise accounts and install malware; and millions of ordinary people are inadvertently amplifying misinformation out of a well-intentioned attempt to help their friends, family members and followers online.

Also note: Social media companies are struggling to combat covid-19 misinformation — partially because many of their systems for policing misinformation were created to address coordinated campaigns, not a global viral misinformation outbreak from ordinary people (which means monitoring misinformation in dozens of languages and national contexts). Still, they are also taking unprecedented steps in the right direction.

Related:

Discuss: What responsibilities come with free speech? What kinds of speech shouldn’t be permitted in a free society? Should all social media platforms ban and remove medical misinformation? Why or why not? Should social media companies treat misinformation about climate change (an issue where there is scientific consensus) the same as they do misinformation about covid-19? Why do you think crises and tragic events tend to spark conspiracy theories?

Resource: “Sifting Through the Coronavirus Pandemic,” an information literacy hub created by Mike Caulfield, a digital literacy expert at Washington State University. (Note: While we fully endorse Caulfield’s SIFT method, it is not affiliated with this newsletter.)

Also note: The News Literacy Project is working on a resource response for educators teaching news literacy during this outbreak. In the meantime, you can start by reinforcing these five tips with students:

1. Recognize the effects of your information decisions. Just as your decisions and actions can inadvertently spread the virus itself, your conduct online can influence others and have consequences in the real world.

2. Take 20 seconds to practice good information hygiene. Like the time recommended for effective hand-washing, 20 seconds is all that is needed to eliminate a significant chunk of the misinformation we encounter: Scan comments for fact checks, do a quick search for the specific assertion, look for reliable sources and don’t spread any unsourced claims.

3. Filter your information sources. The World Health Organization cited the “over-abundance of information” (PDF) as a cause of the current “infodemic.” While a diverse and varied information diet is generally important, so is the ability to focus your attention on credible sources.

4. Learn to spot misinformation patterns. Rumors about this virus often cite second- and third-hand connections to anonymous people in positions of authority, such as health or government officials. Don’t be fooled by “copy-and-paste” hearsay.

5. Help sanitize social media feeds. Flag misinformation when you see it on social media. Failing to do so leaves behind an infected post that will influence those who see it after you.

Viral Rumor Rundown

NO: Covid-19 does not cause pulmonary fibrosis (scarring of the lungs).

NO: Holding your breath for 10 seconds is not a reliable test for pulmonary fibrosis — or for covid-19.

YES: Drinking water is generally good for you, and proper hydration is important during treatment for any infection.

NO: Frequently drinking water does not prevent infection from the current strain of coronavirus by washing the virus into your stomach.

Note: Like many viral rumors, this one urges readers to “send and share” the falsehood with “family, friends and everyone.” You should be skeptical of user-generated material that cites anonymous or unfamiliar sources, especially if it explicitly asks you to share it widely.

Also note: There are numerous “copy-and-paste” style viral rumors — many of them citing second- or third-hand advice from an authoritative source — circulating via social media, email and text message about the virus.

Also note: There are at least a dozen iterations of this “advice” circulating online, including one that says it is from an “internal message” to the “Stanford Hospital Board.” In a post on March 11, Stanford Health Care debunked this (scroll to the bottom of the Web page).

Troll farm in Ghana fueled racial discord in U.S.

Investigative reporters at CNN, working with two researchers from Clemson University, uncovered a Russian-backed disinformation operation in West Africa that ran more than 270 social media accounts and pages.

A “troll farm” in Ghana — based at a compound outside Accra, the capital — consisted of 16 people who worked exclusively on mobile phones to exacerbate racial divisions in the United States, CNN reported on March 13, while at least eight people were doing the same in Lagos, Nigeria. They were hired by a purported nongovernmental organization, Eliminating Barriers for the Liberation of Africa (EBLA), and the first accounts were created in July.

Focusing on racial issues in the United States, the workers, most of whom were in their 20s, shared links to stories about controversial topics such as the use of excessive force by police and posted inflammatory comments on Facebook, Instagram (which is owned by Facebook) and Twitter. The accounts and pages on the three platforms had a combined total of more than 340,000 followers.

Ghanaian police raided the compound in Accra on Feb. 6, and posts stopped appearing that day. On March 12, Facebook removed 49 Facebook accounts, 69 Facebook pages (public profiles created for businesses, causes, celebrities and organizations) and 85 Instagram accounts associated with the operation, and Twitter took down 71 accounts.

Seth Wiredu, a Ghanaian who lives in Russia, told CNN he financed EBLA with his own money; Ghanaian authorities said the funding came from Russia, and both Facebook and Twitter noted Russian involvement.

One of the workers in Ghana who spoke with CNN said Wiredu had encouraged his employees to open new accounts.

Discuss: How could Russia benefit from deepening racial divisions in the United States? What portion of the social media accounts you interact with on a regular basis are operated by people whose identities you know for certain — and what portion are operated by people you don’t know? Can this type of activity alter public dialogue and disrupt U.S. elections? What other effects might these kinds of campaigns have? Is this campaign consistent with previous Russian disinformation efforts?

Idea: Have students divide into groups and spend 15 minutes researching the history of Russian propaganda. Then have each group share their findings.