A new report from an anti-censorship alliance of 50 nonprofit organizations claims that internet filtering software is “hopelessly flawed.” The report comes as schools seek to comply with the Children’s Internet Protection Act, which requires schools to filter students’ access to the web if they wish to receive federal eRate discounts on their internet connectivity.

Makers of filtering software dismissed the report as inaccurate and based on outdated information.

“Internet Filters: A Public Policy Report,” released by the National Coalition Against Censorship (NCAC), summarizes existing information about products designed to filter out internet sites that are deemed offensive or inappropriate for adolescents or children.

In the spring and summer of 2001, NCAC’s Free Expression Policy Project surveyed all of the studies it was able to locate describing the actual operation of 19 products commonly used to filter web sites, including those widely sold to schools: N2H2’s Bess, SurfControl’s Cyber Patrol, Symantec’s iGear, 8e6 Technologies’ X-Stop, and WebSense, from the company of the same name.

Written by Christina Cho and directed by Marjorie Heins, a former First Amendment litigator for the American Civil Liberties Union, the report’s purpose is to “present this information in one place and in readily accessible form, so that the ongoing policy debate will be better informed about what internet filters actually do.”

The report compiles existing studies published from 1997 to the present into an extensive listing of blocked web sites, organized by category. Filtering programs are not delivering on the services they promise, the researchers conclude.

In hundreds of cases, the report declares, the filters block valid sites in error. Researchers say one reason is that the technology relies largely on detecting key words or phrases such as “over 18,” “breast,” or “sex.”

In one study cited by the report, Bess blocked House Majority Leader Richard “Dick” Armey’s official web site. (Ironically, Armey, a Republican, supported the new filtering law.) In other studies, Symantec’s iGear blocked a United Nations report titled “HIV/AIDS: The Global Epidemic,” while SmartFilter blocked an online brochure called “Marijuana: Facts for Teens,” published by the National Institute on Drug Abuse.

“The problem stems from the very nature of filtering, which must, because of the sheer number of internet sites, rely to a large extent on mindless, mechanical blocking through identification of key words and phrases,” wrote Cho.
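That criticism is easiest to see in a toy example. The sketch below is purely hypothetical and illustrative, not drawn from any product named in the report; it shows how naive keyword matching of the sort the report describes can sweep up legitimate pages alongside explicit ones.

# A purely hypothetical sketch of naive keyword filtering -- not any vendor's
# actual implementation. The word list and function name are invented.

BLOCKED_KEYWORDS = {"over 18", "breast", "sex"}  # example terms cited in the report

def is_blocked(page_text: str) -> bool:
    """Block a page if any keyword appears anywhere in its text."""
    text = page_text.lower()
    return any(keyword in text for keyword in BLOCKED_KEYWORDS)

# Overblocking in action: legitimate pages trip the same rule as explicit content.
print(is_blocked("Early screening improves breast cancer outcomes."))  # True
print(is_blocked("Middlesex County school board meeting minutes."))    # True ("sex" inside "Middlesex")
print(is_blocked("Local weather forecast for Tuesday."))               # False

A filter built this way cannot tell a breast-cancer resource from explicit material, which is the overblocking pattern the cited studies document.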

Even when a company’s human reviewers are able to review a site manually, there remain “massive problems with subjectivity,” the report states. Critics of filtering insist that the political attitudes of different manufacturers are reflected in their decisions to block certain sites on topics such as homosexuality, human rights, and even criticism of filtering software.

For instance, the report alleges that Cyber Patrol’s blocking decisions reflected a political slant: the filter blocked the Flag Burning Page, which examines the issues behind flag burning from a constitutional perspective, yet did not block sites promoting firearms, such as the National Rifle Association’s web page.

“Ultimately, less censorial approaches such as media literacy, sexuality education, and internet acceptable-use training may be better policy choices than internet filters in addressing concerns about young people’s access to ‘inappropriate’ content or disturbing ideas,” wrote Cho.

Filtering companies say NCAC has its facts wrong.

“We at SurfControl take very seriously the matter of protecting First Amendment rights,” said Susan Getgood, the company’s vice president of marketing. “In fact, one of the reasons there is not a broader censorship law in the United States is because the Supreme Court in 1998 thought filtering software was so good that it made censorship laws unnecessary.”

As for the specific incidents mentioned in the report, Getgood said the information is very dated and does not reflect the state of filtering technology today.

“The authors’ definition of how filtering works is erroneous and does not correctly describe how most filtering products work or how well they work,” she said. “Products have not relied on key word filtering for years.”

According to Getgood, it is impossible to comment on the specific sites the report says were blocked erroneously without knowing more.

But in any event, SurfControl does not “block” web sites, she said. Instead, the company “place[s] sites into categories and allow[s] … schools, parents, and individual users to make their own choices about what they want to filter, depending on their own definition of acceptable use.”

Javier Garriz, senior product manager of enterprise solutions for Symantec, echoed Getgood’s sentiments.

“The software can be configured to allow as much or as little web content to be accessed, as is specified by the user,” he said. Furthermore, administrators can always override any content classifications contained in the URL database supplied with the system.
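Taken together, Getgood and Garriz are describing a category model rather than a hard-coded blocklist. The sketch below is a hypothetical simplification of that idea; the category names, sample database, and override table are invented for illustration and do not reflect any vendor’s actual product.

# A hypothetical simplification of category-based filtering with local overrides,
# as the vendors describe it in general terms. Categories, the sample database,
# and the override table are invented for illustration.

URL_CATEGORIES = {                       # vendor-supplied classifications
    "example-news.org": "news",
    "example-health.org": "adult",       # suppose this site was miscategorized
    "example-adult.com": "adult",
}

BLOCKED_CATEGORIES = {"adult"}           # chosen locally by the school or parent

LOCAL_OVERRIDES = {                      # administrator reclassifications
    "example-health.org": "health",      # restores access to the miscategorized site
}

def is_allowed(url: str) -> bool:
    """Allow a URL unless its (possibly overridden) category is locally blocked."""
    category = LOCAL_OVERRIDES.get(url, URL_CATEGORIES.get(url, "uncategorized"))
    return category not in BLOCKED_CATEGORIES

print(is_allowed("example-news.org"))    # True  -- "news" is not a blocked category here
print(is_allowed("example-adult.com"))   # False -- "adult" is blocked by local policy
print(is_allowed("example-health.org"))  # True  -- the local override takes precedence

Under this model, the vendor’s database only proposes a classification; which categories to block, and whether to reclassify a given site, is local policy.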

The new report contains nothing that hasn’t been said before by critics of filtering software, but NCAC said it hopes the report will intensify the debate over filtering in schools.

Most filtering companies seem unperturbed by the report’s findings.

“I think it actually speaks very well for N2H2,” said spokesman David Burt. “These guys aggregated 10 pieces of research about N2H2 from 1997 to 2001 and found only 90 sites wrongly blocked out of more than 4 million. That’s an accuracy rate of 99.99 percent.”
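Taking Burt’s figures at face value (90 erroneous blocks against a base of roughly 4 million sites), the arithmetic is consistent with the percentage he quotes; the numbers below are his, not independently verified.

# Back-of-the-envelope check, taking the quoted numbers at face value:
# 90 erroneous blocks out of "more than 4 million" sites.
erroneous_blocks = 90
total_sites = 4_000_000

accuracy = 1 - erroneous_blocks / total_sites
print(f"{accuracy:.5%}")  # 99.99775% -- consistent with the "99.99 percent" claim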

Links:

Internet Filters: A Public Policy Report

http://www.ncac.org/issues/internetfilters.html

SurfControl’s Cyber Patrol

http://www.cyberpatrol.com

Symantec Corp.

http://www.symantec.com