While Congress debates whether to require U.S. schools to install filters on their computers to block children’s access to inappropriate web sites, a movement is underway to devise a global system for rating and filtering internet content worldwide.

An international summit held in Munich, Germany, Sept. 9-11, drew more than 300 policymakers, corporate executives, and experts from the fields of law and technology to discuss the initiative. The three-day summit was sponsored by the Bertelsmann Foundation, a non-profit social policy organization based in Germany.

At the summit, the Bertelsmann Foundation presented its “Memorandum on Self-Regulation of Internet Content,” a controversial document containing practical recommendations for governments, industry, and users to work together in developing a new culture of responsibility to protect children on the internet.

Jack M. Balkin, Knight Professor of Constitutional Law and the First Amendment at Yale Law School and one of the chief architects of the proposal, told the New York Times he believes that internet filters ultimately are inevitable. “The question then becomes, what is the best design for a filter so that it preserves civil liberties?” he said.

The challenge facing the initiative, Balkin said, is to find a way to control what children have access to on the internet, without resorting to strict government regulations. Mark Wossner, chairman of the Bertelsmann Foundation, explained, “The internet is a medium of free expression and has to remain just that, even if safeguards for youth protection and against illegal content need to be provided.”

The summit comes in the wake of a survey conducted in June by the Allensbach Institute of Opinion Research. According to the survey, which included respondents from the United States, Australia, and Germany, a large majority (79 percent of Americans and 86 percent of Germans) believe the internet needs some form of policing.

The survey found significant cultural differences on the subject of internet regulation, though. For example, while 43 percent of Americans said they’d want nudity blocked on their computers, only 13 percent of Germans agreed. Considering Germany’s history with hate crimes and racial violence, it’s not surprising that 58 percent of Germans expressed a desire to block radical political messages online—but only 26 percent of Americans, who traditionally uphold freedom of speech, wanted these types of messages blocked.

Supporters of the international plan believe that these contrasting attitudes about what should be blocked show the need for a system that can be adjusted so parents and educators are the ones who create the restrictions—not software makers, governments, or interest groups.

Three-layer model

The model for self-regulation proposed by the Bertelsmann Foundation has three components, according to Balkin. First, web site operators worldwide would voluntarily describe their site using categories such as nudity or violence, at various levels of intensity.

Second, regulators and interest groups—including school districts—would create templates to filter sites according to their content ratings. For example, an internet user could select the Moral Majority template, which would filter out content that is not approved by members of that association.

Third, the system is fine-tuned as groups release so-called “white lists” of acceptable sites which may have been inadvertently filtered out in the second phase. For example, a legitimate news site may have been filtered out for “violent content.”

According to Balkin, this three-layer model would combine the use of technology by end users with a choice at the local level of what to filter, giving parents and educators control over the material viewed by children according to the templates they use.
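The three layers described above can be sketched in code. What follows is a minimal illustration only, not a system proposed at the summit; the site names, rating categories, intensity levels, and the choice to block unrated sites by default are all invented for the example.

```python
# Toy model of the three-layer self-regulation scheme:
# 1) sites self-rate, 2) a template sets limits, 3) a white list corrects errors.
# All data below is hypothetical.

# Layer 1: site operators voluntarily describe their content
# by category, at an intensity level from 0 (none) to 4 (extreme).
site_ratings = {
    "news.example.com":  {"violence": 3, "nudity": 0},   # war reporting
    "games.example.com": {"violence": 4, "nudity": 1},
}

# Layer 2: an interest group's template gives the maximum level it permits
# in each category; anything above the ceiling is filtered out.
template = {"violence": 2, "nudity": 0}

# Layer 3: a "white list" of acceptable sites restores legitimate pages
# that the template caught inadvertently, such as a news site
# rated for violent content.
white_list = {"news.example.com"}

def is_blocked(site: str) -> bool:
    """Return True if the site should be filtered under the chosen template."""
    if site in white_list:       # layer 3 overrides the template
        return False
    rating = site_ratings.get(site)
    if rating is None:           # unrated site: blocked by default in this sketch
        return True
    # Blocked if any category exceeds the template's ceiling.
    return any(level > template.get(cat, 0) for cat, level in rating.items())

for site in ("news.example.com", "games.example.com", "unrated.example.com"):
    print(site, "->", "blocked" if is_blocked(site) else "allowed")
```

Note that the default for unrated sites is the sketch's own assumption, not the plan's—as the article discusses below, how to treat non-participating sites is one of the open questions critics raise.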

For such a system to work, Balkin said, it must: (1) have the capacity for organic growth; (2) be transparent—that is, it should communicate where on the system, and why, a particular site was blocked; and (3) be compatible with different ratings systems. But critics of the plan point to several holes that will need to be filled before it ever is adopted.

One problem is who will do the rating. Leaving it up to site operators themselves would be irresponsible, but handing the job to a third-party organization would be expensive and would generate its own set of problems. Balkin believes the solution will combine both approaches, though how remains an open question.

Another problem is how to deal with web site operators who refuse to participate in the voluntary rating. Would their sites automatically be blocked by the technology—and, if so, would this have a stifling effect on free speech, contrary to the plan’s intent?

David Sobel, general counsel for the Electronic Privacy Information Center, worries that an international rating system will encourage abuse: “It’s likely that some governments will mandate the use of a rating system. We think it will have a detrimental effect on the internet, making it more like TV, where certain issues are emphasized and others are marginalized.”
