In its final report to Congress Oct. 20, the Child Online Protection Act (COPA) Commission recommended a combination of broader public education, heightened public awareness of existing technologies, better enforcement of existing laws, and industry self-regulation to help stop kids from accessing internet material that is “harmful” to minors.

The commission stopped short of recommending mandatory web filtering for schools, even as Congress considered a bill that would require just that.

“The commission has concluded that no single technology or method will completely protect children from harmful material online,” said Don Telage, the group’s chairman.

Congress asked the 18-member commission to evaluate potential solutions for keeping kids from accessing inappropriate material on the internet when it enacted COPA in 1998. That act also made it a crime for web sites to give minors access to harmful material, a provision the courts have struck down as unconstitutional.

After hearing from numerous witnesses and reading pages of testimony, the commission identified 17 technologies and methods for protecting children. Commission members then rated each item according to effectiveness, accessibility, cost, First Amendment issues, privacy, and law enforcement.

“This report should serve as a blueprint for future action and is a first step in what we hope will be a continuing dialogue among Congress, the federal government, law enforcement, and the internet community,” Telage said.

Public awareness

The commission recommends that government and the private sector launch a major family education campaign to promote public awareness of technologies and methods available to protect children online. Many people are knowledgeable about the internet, but parents generally are not, Telage said.

The campaign should stress parental involvement in their children’s online activities, access to child-friendly sites, and technology to protect children online—perhaps through a web site. Government, schools, PTA groups, public libraries, and community centers should be essential components of this campaign, the report said.

Computer retailers and manufacturers should make filters and other parental controls readily available, it said. In addition, both parents and public institutions—such as schools and libraries—that offer internet access should adopt acceptable-use policies, and government and industry should promote their use.

“Just as we provide children with firm rules for crossing the street, we need to provide them with rules and guidelines to facilitate their online learning experience as well as their safety,” the report said.

Consumer empowerment

Child protection technologies need to be evaluated—and the public needs to be informed of these evaluations—so consumers know how well they work and what they block, the report said.

“One of the things we learned about in the testimony was the variation in effectiveness of these tools,” including filters, browsers, rating systems, and biometrics, Telage said. He described filtering as a “fledgling industry” whose product claims carry little meaning for the public. “We got such conflicting testimony that we could not get a clear statement about this,” he said.

A nongovernmental testing facility could give the public objective, well-researched information about the features, effectiveness, price, and search criteria of the various protection technologies on the market, the report said.

The commission also recommends that the industry develop better filtering and monitoring tools that are easier to use and more accessible. Browsers and web portals should prominently display links to parental control devices.

The report also said national industry standards need to be developed for labeling, rating, and identifying web sites.

“We found that if only sites [would] rate and label [themselves in] some uniform manner, then the filters work better,” Telage said. Both new media, like the internet, and old media, like television, print, movies, and games, need to create uniform ratings, the report said.

Law enforcement

The report recommends that all government levels increase law enforcement funding to pay for more aggressive investigations and prosecutions to deter online pornography and sexual exploitation of children.

“The testimony … showed the resources law enforcement had [were] so inadequate, they [could] only focus on child-stalking cases,” Telage said. “It’s a question of manpower.”

State and federal law enforcement should provide internet service providers with a list of internet newsgroups and web sites that contain child pornography or that have led to convictions involving obscene material, the report said.

The commission called for stricter laws and better enforcement to discourage deceptive or unfair practices that entice children to look at obscene materials. This would help deter producers and distributors of obscene material from marketing to children, the report said.

Self-regulation

The commission also challenged the “adult industry” to take responsibility and to police itself. Representatives from the industry told the commission they are willing to take steps to restrict child access to adult content.

“The adult industry needs to self-regulate itself, so its material doesn’t come into the hands of children,” Telage said. This means the front pages of adult sites wouldn’t contain explicit graphics or text and would be labeled “adult only.”

In addition, ISPs should take voluntary steps to protect minors by regulating themselves, the report said. They should remove child pornography hosted on their servers when they are notified of its presence, and they should voluntarily cooperate with authorities during investigations involving their services.

“Some of the smaller ISPs are not that good at it,” Telage said, adding that larger ISPs generally do self-regulate.

Items not recommended

The commission’s report does not recommend the creation of special top-level domains such as “.xxx” or “.kids,” although the Internet Corporation for Assigned Names and Numbers (ICANN) was to consider approving them in November. The COPA Commission has not advocated or been involved with ICANN’s decision-making process, Telage said.

New top-level domains come with First Amendment and privacy issues, Telage said. “They sound good on the surface, but once you get into them, you realize they have problems.”

Also, the commission doesn’t recommend mandatory filtering. The report said, “No particular technology or method provides a perfect solution but, when used in conjunction with education, acceptable-use policies, and adult supervision, many technologies can provide improved safety.”

The education spending bill still being considered by Congress at press time contained language that would force schools and libraries to use filtering technologies as a condition of receiving federal eRate funding.

Telage said the commission thought it was best to give the commercial adult industry a chance to police itself first.

“The most significant recommendation in my mind is the charge to the adult industry to self-regulate,” Telage said. “It puts the burden where it belongs.”

Most services self-regulate, and the adult industry should do the same, he said. “It would single-handedly have the largest impact on the problem,” Telage said. “It [also] would have minimal impact on First Amendment issues.”

COPA Commission

http://www.copacommission.org

ICANN

http://www.icann.org