When Elyse Stieby opens her Instagram app, among the first things she sees are weight loss tips on the “explore” page: the number of calories in eggs, a medium coffee and a potato.

Stieby says she tries to just look at her friends’ posts, rather than the recommended content Instagram serves her in her feed and through the explore tab — the app’s version of a personalized landing page and search bar accessed through the magnifying glass icon at the bottom of the app. But she says she knows the app’s algorithm chooses what it shows her based on what it thinks she wants to see — so the makeup, hair and body tips are tough to avoid.

“I don’t need to lose weight. I’m 102 pounds,” said the 18-year-old materials science major at Ohio State University.

Experiences like Stieby’s are at the center of a storm of criticism surrounding Instagram owner Facebook. In September, Facebook paused plans for an Instagram app designed especially for children after lawmakers voiced concerns about the app’s effects on young people’s mental health. Instagram is supposed to be for users 13 and older, but younger kids have been able to get on the platform. Facebook whistleblower Frances Haugen leaked internal documents to the Wall Street Journal and the Securities and Exchange Commission that suggested the company knew that the use of Instagram may hurt the mental health of young women and girls. She testified before a Senate committee, saying Facebook put growth and profit above anything else. Facebook has fought back, denying the claims.

Instagram has been steadily increasing the amount of recommended content it shows people. In July, the app started putting videos from people you don’t know right alongside your friends’ posts in the main feed. And the explore tab — a curated collection of algorithmically recommended content — is a Wild West of images the app thinks you will like based largely on other posts you’ve interacted with. Impressionable teens may ultimately pay the price as the explore tab spits out content including idealized images and dubious “self help” recommendations.

Social media apps Snapchat and TikTok have also been criticized for promoting content that could warp self image or encourage harmful behaviors.

Still, experts say there are some steps teens, parents and schools can take to help teens handle the challenges that come with social media use.

While some experts caution that the impact of social media on mental health isn’t fully understood, others have found demonstrable effects.

“The idea that Facebook just learned about this, as a problem for kids’ mental health, is complete baloney,” Jim Steyer, founder and CEO of family advocacy organization Common Sense Media, said.

Danielle Wagstaff, a lecturer in psychology at Federation University in Australia, co-authored a 2019 paper linking Instagram use with adverse mental health symptoms in women. Potential evidence that Facebook knowingly continued harmful practices shifts the conversation, Wagstaff said, leaving some parents wondering whether Instagram is a safe place for teens to spend time.

But teens are savvy media consumers, and they’re coming to their own conclusions about the apps that expand their worlds and prick at their brains. Teens say they understand how the algorithm works, and they’re doing their best to blunt its effects.

How the recommended content works

The photo feed we see on Instagram is typically filled mostly with posts from accounts we follow, and the same is true of stories, temporary posts that hover at the top of the feed and disappear after 24 hours.

But we can’t control what shows up in the explore tab or the slots for recommended posts inside our feeds. Instagram’s algorithm selects those based on a few factors. According to the company, it’s determined by the popularity of the post and the account, whether the user has interacted with posts from that account before, and the types of content the user has interacted with — even if they just tapped to read a caption or look closer.
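Instagram’s actual ranking system is proprietary, but the signals the company describes — a post’s popularity, prior interaction with the account and affinity for that type of content — can be sketched as a simple weighted score. Everything below (the weights, field names and example data) is invented for illustration only.

```python
# Toy sketch of engagement-based ranking, NOT Instagram's real system.
# It combines the three signals the company describes: popularity,
# prior interaction with the account, and content-type affinity.
# All weights and field names here are hypothetical.

def recommendation_score(post, viewer):
    score = 0.0
    # Popularity of the post and its account (normalized 0-1 here)
    score += 0.4 * post["popularity"]
    # Prior interaction with the account
    if post["account"] in viewer["engaged_accounts"]:
        score += 0.3
    # Affinity for this type of content -- even light touches like
    # tapping to read a caption can feed this kind of signal
    score += 0.3 * viewer["topic_affinity"].get(post["topic"], 0.0)
    return score

viewer = {
    "engaged_accounts": {"fitness_page"},
    "topic_affinity": {"fitness": 0.9, "cars": 0.1},
}
posts = [
    {"account": "fitness_page", "topic": "fitness", "popularity": 0.8},
    {"account": "car_page", "topic": "cars", "popularity": 0.9},
]

# Posts closest to the viewer's past behavior rank highest,
# which is why past taps keep shaping what the explore tab shows.
ranked = sorted(posts, key=lambda p: recommendation_score(p, viewer), reverse=True)
```

In this toy example, the fitness post outranks the more popular car post because the viewer’s history pulls the score up — the same dynamic that makes unwanted content hard to shake once the app has seen a few taps.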

Unlike your “Ads Interests,” which Instagram uses to target you with ads and which are listed under Settings -> Security -> Access Data -> Ads Interests, you can’t view what types of recommended content the app thinks you want to see. The only way to cut back on unwanted content is by clicking on the offending image in the explore tab, tapping the three dots in the corner and selecting “Not Interested.” Over time, the app should show fewer similar posts. You can also ask to see less sensitive content, which includes bare bodies, drugs and firearms, by going to Settings -> Account -> Sensitive Content Control and choosing “Limit Even More.” Instagram automatically limits sensitive content for people under 18.

Teens like Stieby may see content they don’t want to see, like calorie-counting infographics, because it’s related to other fitness or wellness content they’ve interacted with, according to Instagram spokeswoman Liza Crenshaw. Stieby responded that she doesn’t work out or look at fitness or wellness content.

Facebook whistleblower Haugen claimed in a Senate committee hearing on Oct. 5 that engagement-based ranking, or showing content that’s most likely to get a reaction, is good for Instagram owner Facebook, even if it’s bad for some kids and teens.

Teens are savvy, but ‘the algorithm’ is a mental burden

Gloria Wetherbee, 20, took a social media marketing class as part of her coursework at the University of Mary Hardin-Baylor in Belton, Texas, where students learned the best ways to compel audiences to interact with content. The class made her more aware of the ways content creators and social media companies drive engagement as she tries to avoid images of idealized bodies on Instagram, she said.

She’s careful not to tap on images of influencers, fashion tips or weight-loss content. Even sending them to a friend to make fun of the images means she’ll see more of them, she said. Instead, she carefully scrolls past them.

“I know part of the algorithm is sending new things and seeing what sticks, but I feel like I’ve honed my usage down so I don’t get it as much anymore,” Wetherbee said.

Stieby says her explore page on Instagram has some self-help infographics with messages like, “Don’t let technology blind and consume you.”

The boys she knows see different content, she said.

“A lot of stuff is about the way you look and feeling pretty, or how to get skinnier or more toned or, ‘This is how you do your makeup so that guys will like you. Wear this perfume so that guys like you,’ ” she said. “But a guy’s Instagram, it’s like, ‘Oh, look at this car, it makes a cool sound.’ ”

Discrepancies in the ways boys and girls use social media — and the content they’re served — ring true for many teens, Wagstaff said. As Stieby put it: Boys see cars, girls see beautification tips.

But that doesn’t mean boys don’t struggle with self image, according to Wagstaff. Researchers are uncovering more instances of disordered eating in men, she said. And body image issues aren’t the only social media trap boys can fall into: Some pockets of the Internet promote violent or bigoted ideologies, and teen boys are especially vulnerable, she added.

“We want Instagram to be a supportive place for people struggling with eating disorders and body image issues, especially young women and girls,” Vaishnavi J, head of safety and wellbeing at Instagram, said in a statement, noting that the company removes content that promotes or encourages eating disorders.

Ways to mitigate impact

Some parents may feel the itch to snatch the phone and ban the app. But pause a moment before launching your teen’s smartphone into the nearest body of water.

Kids whose phones get taken away will likely get their hands on a new one, Wagstaff cautioned, and deleted apps can still be accessed from any Internet-connected device. Public relations blowups like Facebook’s require acknowledging a hard truth: If kids weren’t encountering harmful images on Instagram, they’d be seeing them somewhere else.

Instead, parents, schools and companies must work together to educate kids not only about the risks of social media, but also about the mindset it takes to move through a tough world with confidence and self love, she said. Parents should connect teens with resources to practice mindfulness and self-compassion, both of which help build resilience in the face of constant comparison.

Talk (and listen) frequently with your children about Instagram, Common Sense Media’s Steyer said, teaching them to recognize the compulsion to compare themselves to others. Explain that Photoshop and other editing tools are responsible for the stylized images they see, and ask why people choose to change their faces and edit their surroundings.

Some meditation apps offer series tailored for teens. In one meditation on the Calm app, pop star Camila Cabello walks listeners through an exercise in which, instead of grabbing their phones every time they want to scroll, they mindfully observe that impulse.

Schools play a role as well, experts say. Common Sense offers school programs in media literacy and digital citizenship, both of which help students evaluate the messages they see online and engage constructively with others, Steyer said. Australia is experimenting with health programs in primary school that encourage kids to notice the natural differences among bodies and to comfortably talk about bodies, Wagstaff said.

Last, many think the apps themselves must change. Some legislators and advocates have pushed for new types of feeds on social media apps, rather than the kind that rank content with the goal of boosting engagement and time spent on the app. Others have supported bills expanding data privacy protections for children, which would make it harder for companies to track and target them.

Teens themselves are taking steps to manage their own social media use and put what they see into perspective.

Stieby, who also uses Snapchat and TikTok, said she uses her iPhone’s Screen Time feature to set a limit for social media apps: two hours a day, tops. She rarely hits it, she said, but when she does, she knows it’s time to log off.