The tweet begged the Internet for help: “My son was in the Manchester Arena today. He’s not picking up my call! Please help me.” Attached was a picture of a teenager in a suit. More than 15,000 people retweeted this image, while others tried to help with advice.
But this viral image is not just an example of the Internet’s capacity for generosity after an attack like the one Monday at an Ariana Grande concert in Manchester, England. It’s also a hoax. The teenager pictured is a YouTube personality, and he was nowhere near the concert.
The YouTuber, TheReportofTheWeek, posted a video overnight clarifying that he is alive, and in the United States. “This unfortunately was an effort done by various trolls and website users, certain website users, just to try and mislead the general public with fake news,” he said.
This hoax wasn’t the only one of its kind after the terrorist attack in Manchester, which left 22 dead and 59 injured. And it’s not the first or last time that something like this will happen in the wake of an attack or disaster. This is part of the whiplash of how we experience tragedies online now. Just as the Internet can be a collective force for good — like the #RoomForManchester hashtag that spread overnight, or the real strangers who tried to help desperate parents find their kids — it is also a source of confusion and hoaxes.
teens are making up fake 'missing' friends at the ariana grande concert to get RT's. this is so dystopian pic.twitter.com/ghA8HLyydE
— jack wagner (@jackdwagner) May 23, 2017
One such tweet received more than 16,000 retweets, but the replies are filled with users pointing out that the photo of the poster’s “twin” was an image he had tweeted as his new profile picture in February:
Another, also with about 16,000 retweets, showed a child taken from a promotional image for a clothing line for people with Down syndrome.
There’s more. Collages like this were all over Twitter in the hours after the attack, in various forms, often reposted by well-meaning Twitter users who believed they were helping to spread word of the missing:
You’ll recognize the YouTuber behind TheReportofTheWeek in one of the photos. But many of the other photos in this collage are also not genuine photos of people missing after the attack; many show famous YouTubers or memes. One YouTuber, pictured with glasses in the second row, noticed her photo was included and started the ironic hashtag #Pray4Eva.
One version of the collage, tweeted out by the Daily Mail and screenshotted by BuzzFeed, contained an image of 4chan founder Chris Poole.
And as often happens, attempts to debunk these viral photos — even by the people who are actually in them — are not getting quite as much traction online as the original hoaxes.
Before these hoaxes, there were others. There’s one man whose face always seems to appear as a “victim’s” after a terrorist attack. Trolls have framed the same Sikh man as a terrorist multiple times. The next time there’s an attack or disaster, this will happen again. Hoaxes, created for amusement, malice or chaos and spread by a well-meaning Internet looking for a way to help or inform, have become an inevitable part of our response to tragedy.