Pages, groups and videos pushing steroids and other performance-enhancing drugs could be found by searching for keywords such as “Human Growth Hormone” or “Humatrope.” Once users land on one of those pages, dealers promote the drugs and may include a way to contact them, such as an email address or WhatsApp number.
Even as recently as this week, more than a dozen Facebook pages, YouTube videos and an Instagram account were selling or promoting prescription steroids and other appearance-enhancing drugs. After a Washington Post inquiry, the social media companies removed the pages and posts for violating their terms of service, which prohibit illegal drug sales.
The posts turn devices into drug dealers, said Tom Galvin, Digital Citizens Alliance executive director. “Parents should know that,” he said. Their kids are “gaining access to this online on sites that are mainstream.”
The sale of these types of drugs is just the latest example of harmful and illicit content proliferating on social networks. Disinformation, hate speech and illegal sales continue to plague the sites, despite efforts to better moderate content both with thousands of humans and improved algorithms.
Amazon and Google were recently found selling gun accessories on their sites in apparent violation of their own policies. YouTube, Facebook and Twitter played Whac-a-Mole after the Christchurch shooting in New Zealand, when the shooter’s video surfaced and was disseminated across the Web. And Twitter, Facebook and YouTube recently said they removed hundreds of accounts that appeared to be a concerted Chinese effort at spreading political discord regarding protests in Hong Kong.
That has led some industry critics to question whether the companies are doing enough.
“The model of moderation that platforms use is structurally inadequate to the task,” said Roger McNamee, a Facebook investor who has become one of the most prominent critics of Big Tech. “It appears that the moderation is not actually designed to eliminate those things, it’s designed to eliminate the political blowback.”
Facebook, which also owns Instagram, said it removes content that violates its policies as it is found. “Our Community Standards make it very clear that buying, selling or trading drugs, which include steroids, is not allowed anywhere on Pages, in advertising, or anywhere else on Facebook,” said Facebook spokeswoman Crystal Davis.
YouTube said it removed 90,000 videos for violating its “harmful or dangerous policy” in the second quarter of 2019, and the company works closely with experts, including emergency room doctors and pediatricians, to develop its policies. “We’ve been investing in the policies, resources and products needed to live up to our responsibility and protect the YouTube community from harmful content,” YouTube spokesman Farshad Shadloo said.
Twitter spokeswoman Katie Rosborough pointed to the company’s existing policies, which state Twitter can’t be used for “any unlawful purpose or in furtherance of illegal activities.”
Steroids have previously surfaced as a social media problem. Digital Citizens Alliance first researched the issue in 2013, finding that steroids were being sold on YouTube. Immediately following the report, it appeared the company cracked down. (DCA receives funding from telecommunications, pharmaceutical and tech organizations, as well as some members of the Motion Picture Association of America.)
YouTube, which is part of Google, said it has taken a number of steps to reduce the spread of “borderline content and videos” on its site in recent years.
The researchers, including Eric Feinberg, the chief executive of New York-based GiPEC, decided to revisit the topic after noticing last year that steroids continued to be sold on the platforms. “They continue to turn a blind eye,” he said.
In some cases, the content surfaced to researchers as “Suggested Pages” or “Recommended Videos” they might like based on their searches. While YouTube has recently launched features intended to reduce illicit content, they work only on English-language videos. The researchers found suggested steroid videos alongside videos in foreign languages such as Arabic, according to images reviewed by The Post.
In one example, a Facebook page called “Landmarkchem Raw steroid powders, HGH, peptides & semifinished for sale” offered a variety of steroids via Facebook’s “shop” function, which is intended to help merchants sell products. The button directed people to the Facebook page of “Lucky Li,” whose email address and Skype name were also listed on the page.
The researchers contacted Landmark through an email address found on Facebook, and they were offered a wide range of drugs that are supposed to require a prescription, according to emails reviewed by The Post. The emails also gave instructions for administering the drugs.
The researchers purchased two vials labeled Human Growth Hormone, a substance that is often abused to enhance muscle mass or athletic performance, and three vials of an anabolic steroid known as Deca Durabolin, which is available only with a prescription due to potential side effects such as liver damage. The researchers paid about $360 for the drugs via the PayPal-owned platform Venmo. (Such a transaction violates Venmo’s terms of service.)
Drug testing showed that the Deca Durabolin appeared to be legitimate, but the HGH was not — leaving questions about what the researchers were actually sold.
Steroids can be used to enhance appearance as well as athletic performance.
“It seems like a cruel irony that the same platforms that are fueling a desire to look like celebrities in the social media age are also the platforms that people turn to to reshape their bodies to look like celebrities,” Galvin said.
Advocates against steroid abuse are calling on the tech companies to do more to stamp out the sales on their platforms.
Donald Hooton Sr., the executive chairman of the Taylor Hooton Foundation, fights steroid abuse in memory of his son. The new research illustrates just how much the issue of steroid abuse has evolved, he said. His son Taylor, a successful high school student athlete whose death was linked to steroid abuse more than 15 years ago, found his dealer while working out at the local YMCA.
Hooton Sr. said he was concerned that parents don’t realize such sales could now be happening on their children’s devices, and he called on the tech giants to do more.
“There is no doubt in my mind that they’ve got the capability, engineering skills and the wherewithal to get this crap stopped, to prevent this from going on their platform,” he added.