Anant Goel, Nabanita De, Qinglin Chen and Mark Craft at Princeton’s hackathon (Anant Goel)

When Nabanita De scrolled through her Facebook feed recently, she felt afraid. There were so many posts with competing information and accusations about Donald Trump and Hillary Clinton that she didn't know how to begin separating the fearmongering from the reality.

The social media site has faced criticism since the presidential election for its role in disseminating fake and misleading stories that are indistinguishable from real news. Because Facebook’s algorithm is designed to determine what its individual users want to see, people often see only that which validates their existing beliefs regardless of whether the information being shared is true.

So when De, an international second-year master’s student at the University of Massachusetts at Amherst, attended a hackathon at Princeton University this week with a simple prompt to develop a technology project in 36 hours, she suggested to her three teammates that they try to build an algorithm that authenticates what is real and what is fake on Facebook.

And they were able to do it.


De, with Anant Goel, a freshman at Purdue University, and Mark Craft and Qinglin Chen, sophomores at the University of Illinois at Urbana-Champaign, built a Chrome browser extension that tags links in Facebook feeds as verified or not verified by taking into account factors such as the source’s credibility and cross-checking the content with other news stories. Where a post appears to be false, the plug-in will provide a summary of more credible information on the topic online.
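The students have not published FiB's actual scoring logic, but the approach described above can be sketched as a simple heuristic: check the link's domain against a credibility list, and cross-check whether other outlets are reporting the same story. Everything here is an assumption for illustration: the function name, the domain list, and the corroboration threshold are all hypothetical, not FiB's real implementation.

```python
# Hypothetical sketch of FiB-style link tagging. The domain list and the
# two-source corroboration threshold are illustrative assumptions, not
# the team's actual algorithm.
from urllib.parse import urlparse

CREDIBLE_DOMAINS = {"apnews.com", "reuters.com", "bbc.com"}  # illustrative only


def classify_link(url: str, corroborating_stories: int) -> str:
    """Tag a shared link as 'verified' or 'not verified'.

    corroborating_stories: how many other news outlets were found
    reporting the same story (cross-checking step).
    """
    domain = urlparse(url).netloc.lower()
    if domain.startswith("www."):
        domain = domain[4:]
    # A link passes if its source is on the credibility list, or if the
    # story is corroborated by at least two independent outlets.
    if domain in CREDIBLE_DOMAINS or corroborating_stories >= 2:
        return "verified"
    return "not verified"
```

A real extension would run logic like this in a content script over each link in the feed and, for failing links, fetch a summary of more credible coverage to display alongside the post.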

They’ve called it FiB.

Since the students developed it in only a day and a half (and have classes and schoolwork to worry about), they’ve released it as an “open-source project,” asking anyone with development experience to help them improve it. The plug-in is available for download to the public, but the demand was so great that their limited operation couldn’t handle it.

So while FiB isn’t currently up and running, screenshots show what it looks like in action: links it can confirm are tagged as verified, and links it cannot confirm are flagged as not verified.

Goel said that ideally, Facebook would team up with a third-party developer such as FiB so that the company could control all news feed data but then let the developers verify it so Facebook couldn’t be accused of “hidden agendas or biases.”

The sponsors of the hackathon included Facebook and other major technology companies. FiB was awarded “Best Moonshot” by Google, but neither Facebook nor Google, which has its own problems with promoting fake news, has reached out about helping the team.

Both companies have said this week that they will take steps to address the spread of fake news.

This presidential election year has shown how the lines have blurred between fact and lies, with people profiting off the spread of fake news. More than 100 pro-Trump sites publishing made-up content were traced to Macedonia, according to a BuzzFeed News investigation. The Washington Post interviewed Paul Horner, a prolific fake-news creator, who said, “I think Trump is in the White House because of me. His followers don’t fact-check anything — they’ll post everything, believe anything.”

Melissa Zimdars, a communications professor at Merrimack College in Massachusetts, said she’s seen a similar problem with her students, who cite sources that are not credible. So she created a list of fake, misleading or satirical sites as a reference for her students. She created it not as a direct response to the postelection fake-news debate but simply to encourage her students to become more media literate by checking what they read against other sources.

The list, which she has continued to add to since she made it public earlier this week, has gone viral. She also included tips for analyzing news sources:

  • Watch out if known/reputable news sites are not also reporting on the story. Sometimes lack of coverage is the result of corporate media bias and other factors, but there should typically be more than one source reporting on a topic or event.
  • If the story makes you REALLY ANGRY it’s probably a good idea to keep reading about the topic via other sources to make sure the story you read wasn’t purposefully trying to make you angry (with potentially misleading or false information) in order to generate shares and ad revenue.

Zimdars said media literacy has become a challenge because people have grown so distrustful of institutional media that they turn to alternative sources. A recent Pew Research Center survey found that only 18 percent of people have a lot of trust in national news organizations; nearly 75 percent said news organizations are biased.

It doesn’t help, she said, that news media, to be profitable, rely on “click-bait” headlines that are sometimes indistinguishable from the fake stories.

Another problem, said Paul Mihailidis, who teaches media literacy at Emerson College in Boston, is that many people sharing links on Facebook don’t care whether it’s true.

“I don’t think a lot of people didn’t know; I think they didn’t care. They saw it as a way to advocate,” he said. “The more they could spread rumors, or could advocate for their value system or candidate, that took precedence over them not knowing. A large portion of them didn’t stop to critique the information. One of the things that has happened is people are scrolling through [Facebook] and the notion of deep reading is being replaced by deep monitoring. They see a catchy headline, and the default is to share.”

And even if they do care, the way people consume news, by a flick of their thumb on their smartphone, means they are less likely to take the time to cross-check what they are reading against other sources.

That’s where the plug-in presents a simple solution.

“A few days back, I read an article telling people they can drill a jack in the iPhone 7 and have an earphone plug, and people started doing it and ruining their phones,” De said. “We know we can search on Google and research it, but if you have five minutes and you’re just scrolling through Facebook, you don’t have time to go verify it.”

