A new report from an anti-censorship alliance of 50 nonprofit organizations claims that internet filtering software is “hopelessly flawed.” The report comes as schools seek to comply with the Children’s Internet Protection Act, which requires schools to filter students’ access to the web if they wish to receive federal eRate discounts on their internet connectivity.

Makers of filtering software dismissed the report as inaccurate and based on outdated information.

“Internet Filters: A Public Policy Report,” released by the National Coalition Against Censorship (NCAC), summarizes existing information about products designed to filter out internet sites that are deemed controversial, offensive, or inappropriate for adolescents or children.

In the spring and summer of 2001, NCAC’s Free Expression Policy Project surveyed all of the studies it was able to locate describing the actual operation of 19 products commonly used to filter web sites, including those most widely sold to schools: N2H2’s Bess, SurfControl’s Cyber Patrol, Symantec’s iGear, 8e6 Technologies’ X-Stop, and WebSense, from the company of the same name.

Written by Christina Cho and directed by Marjorie Heins, a former First Amendment litigator for the American Civil Liberties Union, the report’s purpose is to “present this information in one place and in readily accessible form, so that the ongoing policy debate will be better informed about what internet filters actually do.”

The report summarizes studies published from 1997 to the present to create an extensive listing of blocked web sites by category. Categories include artistic and literary sites, sexuality education, gay and lesbian information, political topics and human rights, and web sites about censorship.

Filtering programs are not delivering on the services they promise, NCAC researchers conclude.

In hundreds of cases, the report declares, the filters block valid sites in error. Researchers say one reason is that the technology relies largely on detecting key words or phrases such as “over 18,” “breast,” or “sex.”

In one study cited by the report, Bess blocked House Majority Leader Richard “Dick” Armey’s official web site. (Ironically, Armey—a Republican—supported the new filtering law.) In other studies, Symantec’s iGear blocked a United Nations report titled “HIV/AIDS: The Global Epidemic,” while SmartFilter blocked an online brochure called “Marijuana: Facts for Teens,” published by the National Institute on Drug Abuse.

“The problem stems from the very nature of filtering, which must, because of the sheer number of internet sites, rely to a large extent on mindless, mechanical blocking through identification of key words and phrases,” wrote Cho.
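To make the report’s criticism concrete, here is a minimal sketch of the kind of keyword matching described above. The word list and page text are invented for illustration and do not represent any vendor’s actual rules; the point is simply that a bare substring test sweeps up legitimate pages along with the content it targets.

```python
# A minimal sketch of keyword-based blocking as the report describes it.
# The keyword list and example pages are illustrative only.

BLOCKED_KEYWORDS = ["over 18", "breast", "sex"]

def is_blocked(page_text: str) -> bool:
    """Return True if any blocked keyword appears anywhere in the page text."""
    text = page_text.lower()
    return any(keyword in text for keyword in BLOCKED_KEYWORDS)

# Legitimate pages trip the same rule as the content it targets:
print(is_blocked("Breast cancer screening guidelines"))          # True
print(is_blocked("Sussex County school board meeting minutes"))  # True ("sex" appears inside "Sussex")
print(is_blocked("Local weather forecast for the weekend"))      # False
```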

Even when a company employs human beings to review sites manually, there remain “massive problems with subjectivity,” the report states. Critics of filtering insist that the political attitudes of different manufacturers are reflected in their decisions to block certain sites on topics such as homosexuality, human rights, and even criticism of filtering software.

For instance, the report alleges that Cyber Patrol unjustly blocked sites with a political slant, such as the Flag Burning Page, which examines the issues behind flag burning from a constitutional perspective. Cyber Patrol did not, however, block sites promoting firearms, such as the National Rifle Association’s web page.

“Ultimately, less censorial approaches such as media literacy, sexuality education, and internet acceptable-use training may be better policy choices than internet filters in addressing concerns about young people’s access to ‘inappropriate’ content or disturbing ideas,” wrote Cho.

Filtering companies say NCAC has its facts wrong.

“We at SurfControl take very seriously the matter of protecting First Amendment rights,” said Susan Getgood, the company’s vice president of marketing. “In fact, one of the reasons there is not a broader censorship law in the United States is because the Supreme Court in 1998 thought filtering software was so good that it made censorship laws unnecessary.”

As for the specific incidents mentioned in the report, Getgood said the information is very dated and does not reflect the state of filtering technology today.

“The authors’ definition of how filtering works is erroneous and does not correctly describe how most filtering products work or how well they work,” she said. “Products have not relied on key word filtering for years.”

According to Getgood, it is impossible to comment on the specific sites the report says are blocked erroneously without knowing more.

But in any event, SurfControl does not “block” web sites, she said. Instead, the company “place[s] sites into categories and allow[s] companies, schools, parents, and individual users to make their own choices about what they want to filter, depending on their own definition of acceptable use.”
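Getgood is describing a category-and-policy model: the vendor assigns each site to a category, and each school, company, or family decides which categories to filter. The sketch below illustrates that division of labor with hypothetical categories, URLs, and policy settings; it does not reflect SurfControl’s actual database or defaults.

```python
# A sketch of category-based filtering: the vendor supplies categories,
# the local deployment supplies the policy. All data here is hypothetical.

URL_CATEGORIES = {
    "example-casino.com": "gambling",
    "example-news.org": "news",
    "example-health.org": "health education",
}

# Each school or organization chooses its own set of filtered categories.
SCHOOL_POLICY = {"gambling"}

def is_filtered(url, policy):
    """A URL is filtered only if its category is on the local policy list."""
    category = URL_CATEGORIES.get(url, "uncategorized")
    return category in policy

print(is_filtered("example-casino.com", SCHOOL_POLICY))  # True
print(is_filtered("example-health.org", SCHOOL_POLICY))  # False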

Javier Garriz, senior product manager of enterprise solutions for Symantec, echoed Getgood’s sentiments.

“The software can be configured to allow as much or as little web content to be accessed, as specified by the user, consistent with whatever policy that user’s organization might have regarding access to web-delivered information,” he said.

For example, one of the content categories available for filtering through iGear is “Advanced Sex-ed,” which includes sites that provide medical discussions of sexually transmitted diseases such as HIV and AIDS.

“If the administrator of the software chooses to place that particular category in the ‘deny’ state—i.e., if the administrator’s intention is to block access to sites containing that type of content—then the software would, indeed, deny access to those sites,” said Garriz.

Furthermore, administrators can always override any content classifications contained in the URL database supplied with the system.

“The system was designed to be simple enough to allow changes in policy settings to be made by the people who are in the best position to make these determinations—the [K-12] teachers themselves,” Garriz added.
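Garriz is describing per-category “deny” settings combined with local overrides of the vendor’s URL classifications. The sketch below shows how such a scheme might behave; the URLs, data structures, and category names other than “Advanced Sex-ed” are hypothetical and do not represent iGear’s actual design.

```python
# A sketch of deny-by-category filtering with administrator overrides.
# Hypothetical data; only the "Advanced Sex-ed" category name comes from the article.

VENDOR_DATABASE = {
    "example-aids-info.org": "Advanced Sex-ed",
    "example-sports.com": "Sports",
}

DENIED_CATEGORIES = {"Advanced Sex-ed"}   # categories the administrator set to "deny"

LOCAL_OVERRIDES = {
    "example-aids-info.org": "allow",     # administrator permits a site despite its category
}

def access_decision(url):
    """Local overrides win; otherwise fall back to the vendor's category and the deny list."""
    if url in LOCAL_OVERRIDES:
        return LOCAL_OVERRIDES[url]
    category = VENDOR_DATABASE.get(url, "Uncategorized")
    return "deny" if category in DENIED_CATEGORIES else "allow"

print(access_decision("example-aids-info.org"))  # "allow" -- the override beats the denied category
print(access_decision("example-sports.com"))     # "allow"
```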

The new report contains nothing that hasn’t been said before by critics of filtering software, but NCAC said it hopes the report will intensify the debate over filtering in schools.

Most filtering companies seem unperturbed by the report’s findings.

“I think [the report] actually speaks very well for N2H2,” said David Burt of N2H2’s public relations department. “These guys aggregated 10 pieces of research about N2H2 from 1997 to 2001 and found only 90 sites wrongly blocked out of more than 4 million. That’s an accuracy rate of 99.99 percent.”
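Taking Burt’s quoted figures at face value, the arithmetic holds up, as a quick check shows:

```python
# A quick check of Burt's claim, using the figures as quoted
# (90 wrongly blocked sites out of more than 4 million).
wrongly_blocked = 90
total_sites = 4_000_000
accuracy = 1 - wrongly_blocked / total_sites
print(f"{accuracy:.5%}")   # 99.99775% -- consistent with the "99.99 percent" figure
```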

Links:

Internet Filters: A Public Policy Report
http://www.ncac.org/issues/internetfilters.html

SurfControl’s Cyber Patrol
http://www.cyberpatrol.com

Symantec Corp.
http://www.symantec.com

N2H2 Inc.
http://www.n2h2.com

WebSense
http://www.websense.com