A proposal to extend worldwide a system that uses self-ratings to protect children from Internet pornography has come under attack from civil liberties groups, who contend it could become the basis of mandatory government ratings.

Under the system, operators of Web sites would post ratings that they think match their content. "Filtering" software that parents can use to regulate which sites their children see reads those ratings automatically and can be set to block access to sites rated above a chosen level.
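As a rough illustration of the mechanics, and not a description of any particular product, the sketch below shows how a filter might read a self-declared rating embedded in a page and compare it against a parental limit. The meta-tag name, the numeric scale, and the threshold are all hypothetical.

```python
from html.parser import HTMLParser

# Hypothetical meta-tag name and rating scale, used for illustration only.
RATING_META_NAME = "content-rating"   # e.g. <meta name="content-rating" content="3">
PARENT_MAX_LEVEL = 1                  # block anything a site rates above this level

class RatingExtractor(HTMLParser):
    """Pull a self-declared rating out of a page's <meta> tags."""
    def __init__(self):
        super().__init__()
        self.rating = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name") == RATING_META_NAME:
                try:
                    self.rating = int(attrs.get("content", ""))
                except ValueError:
                    pass  # ignore malformed ratings

def should_block(html_page: str) -> bool:
    """Block the page if its self-declared rating exceeds the parental limit."""
    parser = RatingExtractor()
    parser.feed(html_page)
    if parser.rating is None:
        return False  # unrated pages pass through in this sketch
    return parser.rating > PARENT_MAX_LEVEL

page = '<html><head><meta name="content-rating" content="4"></head><body>...</body></html>'
print(should_block(page))  # True: a declared level of 4 exceeds the parental limit of 1
```

Note that in this sketch an unrated page is simply allowed; a filter could just as easily be configured to block unrated sites, which is precisely the outcome small publishers fear.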

The plan is laid out in a "Memorandum on Self-Regulation of the Internet" and was presented last week as the centerpiece of a major "Internet Content Summit" hosted in Munich by the Bertelsmann Foundation, funded by the German media giant Bertelsmann AG.

An earlier effort to implement such a system in the United States, led by a group now called the Internet Content Ratings Association, has resulted in 120,000 sites rating themselves, a small fraction of the millions on the World Wide Web. The Munich proposal attempts to breathe life into that system and extend it worldwide.

Already supporting the previous system are Bertelsmann, International Business Machines Corp., Microsoft Corp., AOL Europe, and telecommunications firms British Telecommunications PLC and Cable & Wireless PLC.

Most filtering software products are based on enormous lists of specific sites deemed objectionable by teams of human reviewers or by computer programs. Both approaches lead to products that can inadvertently block material that is not objectionable or let objectionable material slip through.
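A toy version of such a list-based filter, with placeholder domain names rather than any real classifications, makes both failure modes easy to see: a harmless site wrongly added to the list stays blocked, while a new objectionable site that reviewers have not yet catalogued passes through.

```python
# A toy deny-list filter of the kind the article describes: reviewers (or a
# crawler) compile a list of objectionable sites, and anything on it is blocked.
# The domains below are hypothetical placeholders, not real classifications.
BLOCKED_DOMAINS = {
    "example-adult-site.test",
    "example-misclassified-health-site.test",  # false positive: harmless but listed
}

def is_blocked(hostname: str) -> bool:
    """Return True if the hostname appears on the compiled deny list."""
    return hostname.lower() in BLOCKED_DOMAINS

print(is_blocked("example-adult-site.test"))    # True: listed, so blocked
print(is_blocked("brand-new-adult-site.test"))  # False: not yet reviewed, slips through
```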

Self-rating removes some of the guesswork, advocates say. Many people who run adult sites have been willing to declare that fact through ratings, to avoid accusations that they are peddling their wares to children.

"It is in the best interest of industry to commit to self-regulatory mechanisms," Mark Wossner, chairman of the Bertelsmann Foundation, said in a statement. "The Internet is the medium of free expression and has to remain just that, even if safeguards for your protection against illegal content need to be provided."

Opponents say a voluntary system could easily evolve into a regime of government-mandated filtering. Esther Dyson, chairwoman of EDventure Holdings and a longtime advocate of online civil liberties, said in an electronic-mail message that while she applauded the foundation's attempts "to deal with a tough issue," the plan "leaves me feeling distinctly queasy."

The proposal could end up creating "a worldwide bureaucracy always forced to take the 'safe' route, calling for the removal of questionable content," she said.

Other critics say the system could result in small Web site creators getting filtered out if they refuse to adopt the rating system. "The proposal is as much about making the Internet safe for large media companies as it is about making the Internet safe for children," said David Sobel of the Electronic Privacy Information Center, part of the Global Internet Liberty Campaign.

Donna Rice Hughes, vice president of Enough Is Enough, a group that favors Internet regulation, expressed frustration with civil liberties groups, many of which have called in the past for self-regulation as an alternative to government Internet restrictions. "If they're balking at the idea, ultimately they don't want to see these types of solutions in place," Rice Hughes said. "They want 'anything goes.' "

Yale Law School professor Jack M. Balkin, whose ideas were adopted in the Bertelsmann proposal, said he hopes to create a system of self-regulation as an alternative to government initiatives that he feels would do more damage to civil liberties.

"In this information age, filtering is inevitable," Balkin said, because "there's too much information chasing too few minds." Balkin said he designed his proposal to address civil libertarians' concerns about filtering, saying he is "completely and irrevocably opposed to governments" forcing citizens to use filters.