“We’re releasing this feature because we want people to be able to take a break and have their time on Instagram be intentional and meaningful — irrespective of whether that means seeing less ads or not,” Meta spokeswoman Liza Crenshaw said.
The announcement comes after former Facebook employee Frances Haugen leaked internal company research suggesting that Instagram harms the mental health of young women and girls — and just a day before Instagram CEO Adam Mosseri is set to testify before Congress about the company’s impact on young people. Policymakers have criticized Facebook for failing to share its findings about teens, while Facebook shot back that the research was taken out of context.
The break prompt will ping people after 10, 20 or 30 minutes of scrolling and suggest they switch activities. These nudges will come with “expert-backed tips” for how to step away, according to a blog post from Mosseri. The feature isn’t on by default, but teens will get notifications encouraging them to set the reminders, the company says.
Other additions are still in the works, Instagram said. In January, all accounts will get a hub where users can mass-delete past posts, comments and likes to better manage their online presence. Another feature, which the company says is coming in early 2022, will limit unwanted contact from strangers by preventing them from tagging teens in comments or posts.
The company also said it will roll out parental controls in March that let guardians see how much time teens spend on the app and set limits. Teens will also get the option to notify their parents when they report someone for inappropriate behavior on the app.
In addition, Instagram said it’s experimenting with alerts for teens who spend too much time “dwelling” on posts about a single topic. This comes after the company was criticized for serving teens potentially harmful posts, including diet and weight-loss information and content promoting eating disorders. Meta’s Crenshaw did not say how the nudges will work or how much time will be considered dwelling.
And the company said it may also expand its “sensitive content control” for teens to places beyond the Explore tab — a personalized landing page with posts from accounts you don’t follow. Historically, there’s been little transparency around what content counts as sensitive and how much exposure to potentially harmful content the company considers too much. Crenshaw declined to provide further details.
Experts said they aren’t sure how the tool launched Tuesday squares with Instagram’s business model, which relies at least in part on keeping teens engaged on the social platform.
“If you can’t be certain that your algorithms aren’t promoting and recommending harmful content to teenagers, then you shouldn’t use your algorithms on teenagers,” said Josh Golin, executive director at Fairplay, a nonprofit organization that aims to end marketing targeted at children.
Whistleblower Haugen’s revelations about Instagram’s alleged harm to teen girls drew much attention and criticism, but researchers have been uncovering similar findings for years, according to Linda Charmaraman, director of the Youth, Media & Wellbeing Research Lab at the Wellesley Centers for Women. The visual nature of apps such as Instagram creates a focus on physical appearance and a tendency among teens to compare themselves to others, Charmaraman said. Couple that with a business model that depends on getting brands in front of ever-younger customers — and that often relies on “influencer marketers” to do so — and you’ve got a recipe for mental health struggles, experts say.
The new reminder about breaks does address concerns about the “doom scroll,” or the habit of passively looking at content with little active participation, said Vicki Harrison, program director for the Center for Youth Mental Health and Wellbeing at Stanford University.
But as long as Instagram, Facebook and Meta refuse to let outside researchers and auditors review their data and algorithms, it’s impossible to know the extent of the problem and how it’s best addressed, Charmaraman said. For instance, the company could have data showing that teens typically scroll in five-minute chunks many times throughout the day, Charmaraman said. That type of engagement wouldn’t set off the break reminder, so teens could remain a reliable audience for advertisers.
Crenshaw said the average length of a short session on the app is 10 minutes. She noted that Meta wants to be more transparent with both its own employees and outside researchers and that it is looking for partners willing to collaborate on independent studies, although the company is unsure how to share data with third-party researchers without violating people’s privacy.
Wednesday will be the first time that Mosseri goes before lawmakers on behalf of the company. Critics of Instagram — including one of the senators in charge of the hearing at which Mosseri will testify — were swift to question the company’s motives in rolling out safety features for teens the day before his appearance.
“We’d like to be hopeful that this shows Instagram’s commitment to making its products safer for kids and teens. But with their track record, it seems like their ‘big announcement’ in the dead of night is more likely to be a smokescreen to try to draw attention away from things they don’t want highlighted at Wednesday’s hearing,” a spokesman for Sen. Marsha Blackburn (R-Tenn.) said in a statement.
Some advocates say they hope the company is planning more than extra health and safety features in response to what many think is a threat to the mental health of children and teens.
“Facebook is not getting the message. It can’t engage in piecemeal efforts. It can’t promise to do better. It needs to publicly say what changes it’s going to make and turn those changes over to policymakers and independent experts around the world,” said Jeff Chester, executive director of the Center for Digital Democracy, which advocates for privacy, civil and human rights. He and others called for Meta to release its research and algorithms for an independent audit, as well as officially scrap its plans for an Instagram app for users younger than 13.
Crenshaw said Instagram for Kids is still “on pause.”
As it stands, the new features are a move in line with Facebook’s longtime strategy, according to Jim Steyer, CEO of family advocacy group Common Sense Media: Wait until things blow up, then promise to make things better.
“It’s a public relations stunt,” Steyer said. “And it isn’t going to work.”