Consumer groups and youth advocates have claimed YouTube fails to filter content by appropriate age levels. (Kiyoshi Ota/Bloomberg)

Most Americans believe social media companies have a responsibility to remove offensive content from their platforms. But how tech firms decide which posts should be taken down is where public confidence fades.

In a survey published Wednesday by Pew Research Center, 66 percent of Americans say social networks have a responsibility to delete offensive posts and videos. But determining the threshold for removal has been a tremendous challenge for such companies as Facebook, YouTube and Twitter — exposing them to criticism that they’ve been too slow and reactive to the relentless stream of abusive and objectionable content that populates their platforms.

The screening process also has drawn accusations of selective enforcement colored by political bias, though the social networks are adamant they are politically neutral.

When it comes to removing offensive material, 45 percent of those surveyed said they did not have much confidence in the companies to decide what content to take down. And nearly 1 in 4 said they have no confidence at all in the companies to make judgment calls about removing content.

Pew, which broke down the survey results by political affiliation, found that this particular lack of faith in social media companies was shared on both sides of the aisle.

Among Democrats, 37 percent said they have confidence in the companies to determine what content should be taken down, compared with 62 percent who said they did not have too much confidence or none at all, the study found.

Republicans were even more skeptical: 23 percent of GOP supporters said they trusted the decisions of the companies to remove offensive posts, compared with 76 percent who largely lacked confidence.

President Trump has long railed against what he contends is social media censorship of conservative views. Sometimes relying on inaccurate or misleading data, he has claimed that social media giants unfairly limit his reach or target his supporters. On Wednesday, Republican Sen. Josh Hawley of Missouri introduced legislation that would strip tech companies of the broad legal protections that shield them from liability for the content users post on their websites. The bill represents a major effort to address the charges from several Republicans that the tech giants wrongly silence conservative voices.

Earlier this month, YouTube, the Google-owned video site, announced it would take a more aggressive stance against hate speech, including removing videos that falsely deny the Holocaust and other major historical events took place. In response to criticism that social media companies are doing too little to combat hateful ideologies that rely on their platforms to gain an audience, YouTube and its peers have begun to take a broader view of what constitutes hate speech. Critics have zeroed in on the prevalence of posts that promote discrimination, peddle conspiracy theories about world events and harm children.

On Wednesday, The Washington Post reported that the Federal Trade Commission is in the late stages of an investigation into YouTube for allegedly violating children’s privacy. YouTube declined to comment on the FTC probe. Consumer groups and youth advocates have claimed that the video site fails to filter content by appropriate age levels, exposing children to a near-endless stream of troubling content.