The material comes from the project’s newsletter, the Sift, which takes the most recent viral rumors, conspiracy theories, hoaxes and journalistic ethics issues and turns them into timely lessons with discussion prompts and links. The Sift, which is published weekly during the school year, has more than 10,000 subscribers, most of them educators.
The News Literacy Project also offers a program called Checkology, a browser-based platform designed for students in grades six through 12 that helps prepare the next generation to easily identify misinformation. During the coronavirus pandemic, the project is offering access to Checkology Premium at no cost to educators and parents in the United States. More than 1,100 educators and parents in 49 states and the District of Columbia have registered to use the platform with as many as 90,000 students.
You can learn more about the News Literacy Project and all of the educational resources it provides in this piece, but here is a rundown:
Founded more than a decade ago by Alan Miller, a Pulitzer Prize-winning former reporter at the Los Angeles Times, the News Literacy Project is the leading provider of news literacy education.
It creates digital curriculums and other resources and works with educators and journalists to teach middle and high school students how to identify news and information they can trust — and it provides them with the tools they need to be informed and engaged participants in a democracy. It uses the standards of high-quality journalism as an aspirational yardstick against which to measure all news and information. Just as important, it gives the next generation an appreciation of the First Amendment and the role of a free press.
Here’s material from the Oct. 19 Sift:
Big week for big tech
As the U.S. presidential election draws near, social media companies are taking action against falsehoods and questionable content posted on their platforms, sparking fresh controversy over the timing and scope of such efforts.
YouTube announced on Oct. 15 that it is banning QAnon and other “harmful conspiracy theories” that target individuals. The decision follows similar recent efforts by Facebook, Twitter and other platforms to curb content related to QAnon, a sprawling system of conspiratorial beliefs. Other social media decisions restricting content also made headlines in rapid succession.
On Oct. 12, Facebook cracked down on “any content that denies or distorts the Holocaust” — a reversal of Facebook founder and chief executive Mark Zuckerberg’s previous stance that the platform should allow for a wide range of free speech. A day after the Holocaust announcement, the company banned ads that discourage vaccine use.
Both decisions unfolded less than a week after the company said it would temporarily stop running political ads once polls close on Nov. 3. Twitter and Facebook also each took steps to slow the spread of a widely disputed Oct. 14 report by the New York Post, which included unverified claims based on purportedly hacked materials involving Democratic presidential nominee Joe Biden’s son Hunter. Twitter prevented users from sharing certain links and related images. (Amid pushback, the company soon reversed course and said it was changing its hacked-material policy.) Meanwhile, Facebook opted to reduce the reach of the New York Post piece while the company’s third-party fact-checkers reviewed it.
Note: There’s growing concern that online falsehoods could foment real-world violence around the election.
- “Without methodology or transparency, Facebook and Twitter become the ‘arbiters of the truth’” (Cristina Tardáguila, Poynter).
- “‘We Can’t Only be Mad at Facebook’: Nonprofits, Newsrooms Team Up Against Misinformation” (Catherine Buni, Nieman Reports).
- “Four in five Americans concerned misinformation will influence election” (Megan Brenan, Knight Foundation).
Discuss: Do you agree with the steps that social media companies have taken recently to prevent the spread of misinformation? Is banning Holocaust denial, QAnon content, anti-vaccination ads or post-election political ads censorship or responsible moderation? If you were the CEO of YouTube, Facebook, Twitter or another major platform, how would you respond to misinformation on your platform?
Viral rumor rundown
NO: This aerial photo does not show crowds at a rally for President Trump in Ocala, Fla.
YES: The photo shows a crowd of more than 1 million people at the 2018 Street Parade music festival in Zurich.
YES: More than 5,000 people attended a rally for Trump at the Ocala International Airport on Friday.
Tip: Be wary of aerial photos showing crowds from user-generated sources of information (especially anonymous people or strangers online). Viral rumors often present images of large crowds in false contexts to try to exaggerate support for a given position.
★ Featured rumor resource: Use these classroom-ready slides to turn this viral rumor into an engaging, fact-checking challenge with your students.
NO: If a poll worker were, for some reason, to write on a ballot, it would not invalidate that ballot.
NO: Poll workers generally do not write on ballots except, in some states, to verify the ballot’s authenticity with a signature or stamp.
YES: Several iterations of this copy-and-paste rumor recently went viral on Facebook, gaining traction with voters outside South Carolina, where the claim appears to have originated.
Note: These kinds of rumors, in which blocks of text are copied and pasted — sometimes with slight alterations — are called “copypasta” in internet parlance.
Also note: Aside from lacking evidence for this claim, the version of this rumor shown above contains two additional red flags: It attributes the shared text to “a very reliable good friend” who is unnamed, and encourages people to “PLEASE PLEASE PASS THIS ON!”