Opponents of a law ordering public libraries to block Web sites considered inappropriate for children say perfect filters are impossible. But supporters say the law should stand, because future technology will meet its requirements.
“From a technical standpoint, the filtering technology available today just can’t do what is expected [by the law],” says Jim Tyre of the Censorware Project, a group critical of online censorship.
Federally mandated Internet filtering in libraries was dealt a major blow last week when a federal court in Philadelphia declared the 2000 Children’s Internet Protection Act (CIPA) unconstitutional. Focusing primarily on cases of “overblocking,” the court said the law forced so much erroneous filtering that it violated the First Amendment.
“It is currently impossible, given the Internet’s size, rate of growth, rate of change and architecture … to develop a filter that neither underblocks nor overblocks a substantial amount of speech,” the judges wrote in their ruling.
Battle Continues
The law’s supporters will appeal the case directly to the Supreme Court under a special provision, says Bruce Taylor, president and chief counsel of the National Law Center for Families and Children.
“Libraries are allowing kids to look at [inappropriate materials],” Taylor says. “That’s what Congress wanted to stop — they wanted to protect the rights of children.”
Taylor also argues that overturning the CIPA “disenfranchises the most needy kids from working families … who don’t have educated, tech-savvy parents or nannies to watch over them at the libraries.”
The American Library Association, which filed one of two suits that resulted in the U.S. District Court ruling, applauds the decision. Forcing public libraries to spend money on expensive, imperfect software penalizes all libraries, but especially those in poorer communities, says Larra Clark, an ALA spokesperson.
Jonathan Wallace, another member of the Censorware Project, says he is “pleased to see the ruling come down. More than anything, it’s an affirmation of common sense,” he says.
Wallace, who is an attorney and a software executive, says filtering software available today cannot make decisions about appropriate content. If software blocks both a Henry Miller novel and pornography because it can’t tell the difference, Congress cannot realistically expect the software to police the content at public terminals, he says.
Moving Target
CIPA advocate Taylor, however, argues that the court could find only “a few thousand sites out of two billion online” that were mistakenly blocked. The error margin is small, and the problem easy to fix, he says.
But Wallace says Taylor’s “numbers game” is misleading. That the court knew of only several thousand wrongly blocked sites doesn’t mean more weren’t being blocked. Because some ISPs and public sites invite users to post home pages, filtering software could block hundreds of thousands of pages under a single blocklist entry.
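Wallace’s point can be illustrated with a minimal sketch of host-level blocklist matching. The domain name and URLs below are hypothetical, and this is not any vendor’s actual code; it simply shows how one entry covering a shared hosting domain blocks every member page hosted under it, regardless of each page’s content.

```python
# Hypothetical sketch of host-level blocklist filtering: a single entry
# for a shared home-page host blocks every user's page under that host.
from urllib.parse import urlparse

# One hypothetical blocklist entry for a free home-page hosting domain.
BLOCKLIST = {"members.example-homepages.com"}

def is_blocked(url: str) -> bool:
    """Return True if the URL's host matches (or is under) a blocklist entry."""
    host = urlparse(url).hostname or ""
    return any(host == entry or host.endswith("." + entry) for entry in BLOCKLIST)

# The single entry blocks every member page, whatever its actual content:
print(is_blocked("http://members.example-homepages.com/~alice/poetry.html"))  # True
print(is_blocked("http://members.example-homepages.com/~bob/recipes.html"))   # True
print(is_blocked("http://example-library.org/catalog.html"))                  # False
```

Because the match is made on the host rather than on individual pages, a list with only a few thousand entries can sweep in vastly more pages than the entry count suggests.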
The technology simply doesn’t exist, says the Censorware Project’s Tyre. “If you want to create sufficiently accurate filtering software, you need a leap in artificial intelligence technology,” he says — one in which every blocked Web page would be rechecked daily to ensure no sites were blocked by mistake.
But Gordon Ross, head of filtering software maker Net Nanny International, says today’s technological shortfalls shouldn’t obscure the likelihood that future technical advances will resolve the impasse identified in last week’s ruling.