An infographic explains how flagging works on YouTube. (YouTube)

YouTube says more than 90 million people have helped it flag potentially problematic videos since 2006 — more people than the entire population of Egypt.

That's just one statistic that the Google-owned video site revealed in a Thursday blog post about the importance of community moderation, as social media sites grapple with trying to balance free speech and potentially harmful content.

Google's streaming video service has the same problem as many other social media sites: Users submit so much content that companies say it's impossible to screen it all for postings that are illegal or offensive, or violate a site’s rules in other ways. In YouTube's case, it says users upload 400 hours of video per minute — the equivalent of 100 copies of “Gone With the Wind.”

That fire hose of content is why YouTube, like many other sites, asks users to help by flagging videos that violate its rules. Many have answered that call.

Users from 196 countries have flagged videos for YouTube, with one-third of those users reporting more than one video, YouTube head of public policy Juniper Downs wrote in the blog post. She also said “the number of flags per day is up over 25 percent year-on-year.”

Reporting a video, however, doesn't mean it gets taken down. Instead, it starts a more formal review process. Clicking the flag button, accessible under the “More” option below a YouTube video, pulls up a form that asks the viewer to tell YouTube why they are reporting it. The form offers reasons ranging from sexual content and child abuse to spam. A YouTube employee then reviews the flagged video.
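For readers curious how this works outside the website, the same flag form is exposed programmatically through the YouTube Data API v3, via its videoAbuseReportReasons.list and videos.reportAbuse methods. Below is a minimal sketch, assuming the google-api-python-client and google-auth-oauthlib packages are installed and that “client_secret.json” holds OAuth credentials for a Google API project; the video ID is a hypothetical placeholder.

```python
# A minimal sketch of the flag-and-report flow via the YouTube Data API v3.
# Assumptions: google-api-python-client and google-auth-oauthlib are installed,
# "client_secret.json" holds OAuth client credentials, and the video ID below
# is a hypothetical placeholder.
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/youtube.force-ssl"]

flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", SCOPES)
youtube = build("youtube", "v3", credentials=flow.run_local_server(port=0))

# videoAbuseReportReasons.list returns the valid reasons -- the API-side
# counterpart of the reporting form's menu (sexual content, spam, etc.).
reasons = youtube.videoAbuseReportReasons().list(part="snippet").execute()
for item in reasons["items"]:
    print(item["id"], item["snippet"]["label"])

# videos.reportAbuse files the flag itself, queuing the video for review.
youtube.videos().reportAbuse(
    body={
        "videoId": "VIDEO_ID",  # hypothetical placeholder
        "reasonId": reasons["items"][0]["id"],
        "comments": "Optional free-text detail, like the form's text box.",
    }
).execute()
```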

“We have trained teams, fluent in multiple languages, who carefully evaluate your flags 24 hours a day, seven days a week, 365 days a year in time-zones around the world,” Downs wrote. “They remove content that violates our terms, age-restrict content that may not be appropriate for all audiences, and are careful to leave content up if it hasn’t crossed the line.” YouTube declined to comment on how many people are part of those teams, exactly how many languages they speak or where the majority of the workers are based.

Social media sites often walk a "delicate line" with moderation, said University of Washington law professor Ryan Calo. "The key thing to understand is that a platform like YouTube makes their decisions against the backdrop of our free speech principles and culture, but they are not bound by those principles," he said. A site's own policies typically take precedence.

Those policies, however, can be subject to interpretation. For instance, Facebook recently apologized for removing posts of a Pulitzer Prize-winning photo from the Vietnam War that shows a naked girl screaming and running from a napalm attack, and reinstated them.

The photo, called “Terror of War,” was initially removed when Facebook's review process deemed it child porn. The action prompted protests in Norway and elsewhere. “Sometimes, though, the global and historical importance of a photo like 'Terror of War' outweighs the importance of keeping nudity off Facebook,” Facebook Chief Operating Officer Sheryl Sandberg wrote in a letter to Norway’s prime minister.

Online recruiting by groups like the Islamic State, including through YouTube videos, has also put security issues in the hands of social media moderators.

In an interview with The Washington Post last year, Assistant Attorney General for National Security John Carlin said tech companies were being “exploited” by terror groups and that they needed to do more to fight back. Representatives from tech companies including Facebook, Google and Twitter attended a White House summit on how to block the spread of ISIS propaganda online.

In the blog post Thursday, Downs said that YouTube is “vigilant and fast in removing terrorist content and hate speech,” but that those violations represented only 1 percent of the 92 million videos the site removed in 2015.

She also encouraged users to “continue flagging.”