Although most of the internet is available for free, some software companies (and possibly many lawyers) stand to make megabucks from the use of filtering software, at least if Congress has its way.

Programs designed to selectively block users from accessing sites deemed potentially harmful or inappropriate have been around for several years, and many school districts have opted to install them as one way of reducing potential liability for allowing students to wander into the internet’s darker corners. The pols on Capitol Hill have felt the public pulse and now are pushing for mandatory use of filtering software by public schools and libraries.

The narrow approach, contained in virtually identical bills now working their way through the House and Senate, requires schools and libraries that receive eRate funds to purchase and use filtering software to screen sites “deemed inappropriate for minors” by local authorities.

The shotgun approach, contained in a last-minute rider glued onto the huge education, health, and labor spending bill, would require schools and libraries to purchase filtering software to block “obscene” material if the institutions accepted any federal funding for computers and internet access.

In practice, the latter approach might present more problems for school districts. Legally, the “obscene” standard is vastly narrower than the “inappropriate” standard. Both obviously would require software to block porno sites. But other sites pose trickier problems.

A site offering online instructions for homemade pipe bombs would indeed be “inappropriate” for impressionable minds, but it would hardly meet the classic definition of “obscene.”

Ironically, if the constitutionality of that approach is upheld, it could mean even more litigation (read: legal fees) for your schools. As you sought to apply the law in your schools, lawsuits would be brought challenging your interpretation of “obscenity.”

The shotgun approach also would force virtually every school with an internet connection to buy filtering software, whereas the alternative approach would allow your schools to opt out if you were willing to forgo eRate funding. But subtle traps await the unwary in both approaches.

Behind both congressional approaches is the assumption that filtering software will solve the problem. A Senate report (S.Rpt 105-226) argues that “although the best protection for children from harmful online content is close supervision by their parents, this ... is not possible in schools and libraries.”

Whatever happened to “in loco parentis”? And it’s the potential liability arising from faulty supervision that should have educators worried. Congress specifically admits (as does nearly everyone else, including even software manufacturers) that filtering doesn’t work all the time on every site.

Take note: None of this legislation includes language limiting school liability, even if filters are used to their best advantage. Even worse, that official congressional report now states that filtering is no substitute for supervision.

Let’s pretend we’re talking about something more traditional. Imagine a super floor mat for school gyms that claims to “prevent 99 percent of injuries.” Would the principal allow the gym teacher to pop out for a cup of coffee while the mats “protected” the students?

In short, it’s fine if your schools want to use filtering software as part of a program that also includes close supervision and monitoring of student online activities (as well as swift loss of privileges for students violating the rules).

But if anybody on your school leadership team views filtering, especially filtering mandated by Congress, as a panacea for potential liability for what students (and staffers) do on the internet, be prepared to spend a lot more on lawyers than you paid for the filters.