More than 1,500 complaints of unwanted sexual approaches, many targeting children, have been made against popular social networking apps in Apple’s App Store, in contrast to what Apple prominently markets as a “safe and trusted place,” according to a Washington Post investigation.
Using a machine learning algorithm to identify App Store reviews containing reports of unwanted sexual content, racism and bullying, The Post sifted through more than 130,000 reviews of six random chat apps, all but one of which were ranked in the top 100 for social networking by Apple earlier this month. The Post manually inspected the more than 1,500 reviews that made mention of uncomfortable sexual situations.
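The triage approach described above — automatically flagging reviews that mention topics of concern, then inspecting the flagged subset by hand — can be sketched minimally as a keyword pre-filter. This is a simplification (The Post used a machine learning model, not keyword matching), and the terms and sample reviews below are illustrative, not drawn from The Post's data:

```python
# Illustrative triage sketch: flag reviews containing any term of concern,
# so a human can manually inspect the (much smaller) flagged subset.
# The term list and sample reviews are hypothetical examples.

FLAG_TERMS = {"predator", "inappropriate", "explicit", "harass", "bully", "racist"}

def flag_reviews(reviews):
    """Return the subset of reviews containing any flagged term."""
    flagged = []
    for review in reviews:
        text = review.lower()
        if any(term in text for term in FLAG_TERMS):
            flagged.append(review)
    return flagged

sample = [
    "Fun way to meet people at sleepovers",
    "Full of predators, do not let your kids use this",
    "Crashes constantly on my phone",
]
print(flag_reviews(sample))  # → ['Full of predators, do not let your kids use this']
```

A trained classifier improves on this sketch by catching reports that describe an incident without using any predictable keyword, which is why the flagged reviews still required manual inspection.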
About 2 percent of all iOS reviews of Monkey, ranked 10th most popular in Apple’s social networking category earlier this month, contained reports of unwanted sexual experiences, according to The Post’s investigation. Despite that, the app was approved for users 12 and older. The other apps included in the investigation were Yubo, ChatLive, Chat for Strangers, Skout and Holla. At least 19 percent of the reviews on ChatLive mentioned unwanted sexual approaches.
Apple, which says on its website it “carefully [reviews] every app,” has long distinguished itself from competitors that exercise less control. But the prevalence of unwanted sexual content involving minors raises questions about whether Apple can continue to offer a protective cocoon to its customers as its platform grows. Apple has a financial interest in a bigger platform: It earns a cut of all revenue generated by apps.
Apple says it reviews 100,000 apps a week using a mix of software and humans. “We created the App Store to be a safe and trusted place for our customers to get apps and we take all reports of inappropriate or illegal contact extremely seriously,” Apple spokesman Fred Sainz said in a statement. “If the purpose of these apps is not inappropriate, we want to give developers a chance to ensure they are properly complying with the rules, but we don’t hesitate to remove them from the App Store if they don’t.” The age rating on Monkey was raised to 17 and older this week after inquiries from The Post.
But the random chat apps examined by The Post have been available on the App Store in some cases for years and are among its most popular. Apple’s practice has been to not monitor user reviews, according to a former Apple executive.
Interviews with parents, teens and experts show that the reviews reflect a broad problem on Apple’s platform. Because only a tiny fraction of overall users actually write reviews, what is visible on Apple’s App Store may represent a much larger number of real-life cases. The ages of the reviewers could not be learned, but many identified themselves as being under 18 or said they were concerned about underage users.
One of those reviews came from Katie Brandner, a mother of three in New Orleans. Last year, Brandner confiscated her 14-year-old daughter’s iPhone after she wouldn’t stop chatting late into the night. Brandner thought she would find messages with her daughter’s friends. Instead, she found the Yubo app and hundreds of messages from older men, many of whom sent sexually explicit photos of themselves, pressuring her daughter to reciprocate.
She complained to Yubo, an 18-person company based in Paris that is the 22nd most popular social networking app on the App Store, and posted a review warning other parents of the dangers of the app. “I hoped people would read it,” she said. “I hoped Apple might see it.” She didn’t hear anything from either company.
Yubo’s chief operating officer, Marc-Antoine Durand, called Brandner’s experience unacceptable and said the company will now respond to App Store reviews. He said the company has recently implemented better user protections. In the past six months, Yubo has removed 20,000 profiles of users who are under 13 by using an age estimation algorithm, said Annie Mullins, an independent safety adviser for Yubo.
Skout spokesman Robert Rendine said in a statement that minors are not permitted on the app. “Our number one priority is to provide a safe environment for our millions of users to interact and connect, and we are continuously working to advance these efforts,” he said.
Allen Loh, head of global expansions for the Holla Group, which owns Monkey, declined to comment on either app. ChatLive didn’t respond to requests for comment. Chat for Strangers, owned by FunPokes, didn’t respond to requests for comment.
Unlike traditional social networks, which start by connecting people who already know each other, random chat apps are designed to put people together who may have nothing in common, including age and interests. With a single tap, two people are matched in a video call. Then they’re matched again with a new person, and so on. For many users, the main purpose of these apps is to make a romantic connection. But some younger users view them as a way to kill time or combat loneliness. Logging on is popular at sleepovers.
Monkey, Holla, Chat for Strangers and ChatLive employ roulette-style chats, where people are automatically placed in conversations with a person at random. Yubo and Skout connect people with strangers but offer users more control over with whom they speak.
“In my mind, these have to go,” Phillip Shoemaker, who was Apple’s director of App Store review from 2009 to 2016, said of the entire category of random chat apps. He said “chat roulette” apps were banned during his time at the company.
The reason apps like Monkey, Holla, Chat for Strangers and ChatLive are allowed, according to someone familiar with Apple’s guidelines who spoke on the condition of anonymity because that person was not authorized to speak on the record, is that they use some content moderation and other safeguards.
Sainz said Apple will work with developers who don’t comply with specific rules to “tighten their moderation practices to avoid future occurrences.”
Apple promises to remove apps that it says contain “over the line” content, “especially when it puts children at risk,” according to its website. It calls out pornographic material in particular. “It’s our store and we take responsibility for it,” the company site says.
Apple, however, doesn’t read App Store reviews for clues about whether it is upholding that standard, Shoemaker said. “Ideally what you want is a bot to go through the reviews,” Shoemaker said. “If they did, we’d be seeing a lot more apps getting pulled off the store.”
Except for ChatLive, all of the apps that The Post examined are also available on Google’s smartphone operating system, Android, where reviews also mention unwanted sexual approaches. But Android bills itself as a more open platform with fewer restrictions, even allowing users to install apps outside of the Google Play Store, something Apple doesn’t allow. It’s one of the reasons Apple’s phones are so popular with parents. In an April research report, Piper Jaffray estimated that 83 percent of U.S. teens use iPhones rather than Android phones.
Google spokesman Dan Jackson declined to comment.
Many of these apps have age restrictions of 17 and older, but that doesn’t stop kids from logging on. An iPhone set up with the profile of a 9-year-old was able to download adult apps without any restrictions, The Post found. Developers set the age guidelines for their apps based on Apple’s age rating guidelines. Apple can adjust the ratings if the company deems it necessary.
Once logged in, moderating content falls to the app. Skout has 350 people, more than half its staff, devoted to safety and moderation, Rendine said. Yubo said it scans texts sent via the app to remove keywords that are commonly used by predators or in inappropriate conversations. It also prohibits nudity or posing in one’s underwear during live streams.
“There’s no silver bullet. It’s hard to achieve 100 percent,” said Mullins, pointing to some younger users trying to “outwit” the technology to get around the rules.
Apple controls who can download an app, and policing the platform is not getting easier. A new generation of young smartphone users is fueling an explosion in the number of new social network apps. When the majority of young people were concentrated in just a few apps, like Facebook, Twitter and Snapchat, it was easier to place responsibility with the large, U.S.-based companies that ran them.
Now, kids are scattering to smaller, niche apps with just a few employees, which makes policing the content more difficult, for Apple and for law enforcement as it tries to track down and subpoena owners of apps across borders. Monkey, for instance, was acquired last year by Holla, founded by Chinese teenager Eric Tao. The parent company of ChatLive appears to be based in Ireland.
“What we’re encountering with the small app companies is we have no way of finding out who built it, what records they retain, for what period of time, what country they’re located in, where the servers are,” said Chuck Cohen, a captain with the Indiana State Police’s Office of Intelligence & Investigative Technologies. Often, “it ends up being a series of shell corporations in various countries,” he said.
Cohen said his investigations have led him to ask Google and Apple for information on the developers of mobile apps, but that he’s come up empty-handed. “The information Google and Apple are collecting from them is very limited,” he said. Investigators have used apps like these to conduct investigations and prosecute people for child pornography and other crimes.
Apple has largely escaped the criticism other companies with big platforms have endured. Amazon has been criticized for facilitating the sale of stolen and counterfeit goods. Facebook’s platform has amplified the voices of white nationalists and right-wing conspiracy theorists. Uber and Lyft have tried to avoid liability for sexual assaults and other crimes that have occurred in rides they have set up.
“We take action on bad actors that attempt to abuse our store,” Amazon spokeswoman Cecilia Fan said in a statement. (Amazon chief executive and founder Jeff Bezos owns The Post.)
Uber, Lyft and Facebook declined to comment.
Apple has argued that its approach of tight control is better, even in the face of scrutiny from antitrust lawmakers investigating whether that control is stifling competition. Apple, which says it vets all 2 million apps on its App Store and rejects 40 percent of entries, says that its walled garden helps keep its customers safe and happy.
That is the rationale Apple used to remove from its store HKMap.live, which helped pro-democracy demonstrators in Hong Kong avoid police. Last week, it removed Like Patrol, an app that allowed Instagram users to track what certain people were doing on the app, and 181 apps related to vaping in the wake of health concerns about e-cigarettes. Apple said the apps violated its guidelines, citing evidence from the Centers for Disease Control and Prevention in the case of the vaping apps.
“When does Apple step in? That’s not clear,” says Eric Internicola, a longtime iOS app developer who offers consulting services to other developers. Internicola said that for apps that include user-generated comments, the rules about what exactly is over the line are fuzzy. He said he was skeptical that Apple could adequately monitor whether apps were doing a good job policing themselves. “How do you police the police?” he asked. (Internicola is a former mobile developer for The Post.)
Apple has taken a comparatively hands-off approach with chat apps that connect people, including teens, with strangers, even as parenting groups and law enforcement have sounded alarms.
Some watchdog groups like Protect Young Eyes say parents have been complaining to Apple about these apps and their propensity to connect sexual predators to underage victims. Police task forces have warned parents and teachers about the apps in presentations at schools around the country.
“If you knew there were predators at the mall or park, you wouldn’t drop your kids off at the mall and say, ‘I’ll be back in nine hours,’ ” said Ed Peisner, founder of the Organization for Social Media Safety.
Once on the chat apps, users, especially women and girls, often encounter at least some sexual behavior, according to experts. One of the most common interactions, according to interviews with experts and users of the apps, is men surprising girls by masturbating on screen. Those interviewed also said it’s common for men to try to persuade women to expose themselves during the chat. Chat for Strangers and ChatLive allow users to pay for special access to women, for example, or for digital currency they can use to unlock special features. Apple collects a percentage of that revenue.
“I haven’t ever had a situation where it feels innocent and appropriate for kids,” said one of the experts, Christine Elgersma, senior editor of parent education for Common Sense Media, a nonprofit focused on how children use media and technology. “These are set up really to have sexual encounters for the most part.”
Paul Irwin, the founder of Sheepdog Bloodhound, a watchdog group that polices apps for predatory behavior aimed at kids, said the problem with the random chat apps is that the interactions start as private. More public communication on other apps offers digital clues to investigate. On random chat apps, kids “are very quickly exposed to individuals who are unclothed, who are predatory,” he said.
Those chats don’t always stay private. The Washington Post found videos online showing young girls using Monkey and other apps being surprised by grown men performing lewd sex acts. Recordings of chat app interactions have appeared on pornography sites.
As recently as last month, Bark, a company that monitors the content on kids’ phones for parents and alerts them to things like sending nude photos, obtained video of a 15-year-old girl using Holla. The teenager was chatting with two adult men who began masturbating after the girl identified herself as a minor. With the permission of the girl’s family, Bark shared with The Post two videos, with the images partially blurred out, depicting the interactions.
In addition to unwanted sexual behavior, The Post’s investigation also turned up reviews complaining of racism and bullying. Users who are black often reported being met with racial epithets when they connected with random strangers. And many users on Monkey, Yubo and Holla complained that they were ridiculed by others for fun.
Apps that connect people to one another at random first appeared a decade ago. Facebook had just overtaken Myspace as the world’s dominant social network, and Internet users were eager for new ways to connect with one another online, long before the dark side of social media was on anyone’s mind.
Monkey, founded in 2017, was part of the wave of new social networking apps. It gained users by tricking them into thinking someone was talking about them on the app, according to dozens of reviews in the App Store. The users downloaded the app after receiving a text message, they said.
In fact, nobody had been talking about them, according to the reports. The text messages had played on their insecurities about people talking about them behind their backs on social media, a common occurrence for teens exposed to cyber bullying.
But the “growth hack” worked, earning Monkey’s founders Ben Pasternak and Isaiah Turner a profile in the New Yorker, in which they bragged that Apple chief executive Tim Cook had emailed them to congratulate them on their success. Pasternak and Turner declined to comment.
Monkey’s solution to Apple’s requirement that apps offer a way to flag inappropriate content has also created perverse effects. On Monkey, flagged users are cordoned off from other users and relegated to a sort of second class within the social network where they are still able to freely communicate with others who have also been banned, according to many reports in the App Store reviews.
While some people are banned for inappropriate content, other App Store reviewers complained that they had been banned simply because the person on the other end of a video chat didn’t want to talk to them anymore. Limited to being connected with others who have been flagged, those users reported being matched almost exclusively with men performing sexual acts. That has placed young users in an even riskier situation.
People who quit Monkey can simply move on to the next app. A search on Apple’s App Store for “Monkey” turns up more than a dozen similar apps, often depicting young girls blowing kisses or lying in bed holding their phones.
It can be dizzying for parents to keep up with all the apps that may put kids in harm’s way. “I can go talk about specific apps and their dangers, but by the time I’m done, there will be a new app,” said Angela Alvarado, a deputy district attorney specializing in online safety in the community prosecution unit in Santa Clara, Calif., part of Silicon Valley.
“Adults who want to start relationships with kids, they’ll have moved on to another one also,” she said.
Kyra, an 18-year-old from New Jersey, said she started going on random chat apps when she was 11 using an iPod Touch her parents bought for her. It started as something she did with friends to be goofy, but she eventually started using the apps, such as Chat for Strangers, while she was alone.
Kyra, who spoke on the condition that her last name not be used to protect her privacy, said that sexual advances from grown men were a constant part of the experience. She said she would tell people she was 15, adding four years to her age but still staying under the app’s 17-and-older age limit. “They’d still push for sexual things,” she said. “Even if I said my real age, like 12 or 13, they’d say that’s okay. It made me feel uncomfortable.”
Kyra eventually stopped using the apps, saying her time on them reflected bad judgment. “I wish I could slap myself,” she said. In an attempt to warn other young girls away from Chat for Strangers, she posted a review on the App Store more than five years ago describing the sexual behavior of grown men toward underage girls.
Recently, when Kyra logged onto the App Store, she was surprised to see that some of the random chat apps were still there, despite all the reviews. “I definitely think Apple should be held accountable,” she said. “Shouldn’t Apple be going through these reviews on the regular?”
The 137,608 App Store reviews behind The Post’s investigation are available for download.