The Legal Cost of Improper Internet Censorship
The U.S. Supreme Court ruled 30 years ago that public schools cannot engage in viewpoint-based censorship of library books. Schools can keep books off the shelves if they are poorly written or inappropriate for a particular age group, but they cannot limit access to Harry Potter books out of a concern they glorify witchcraft, or remove Kurt Vonnegut novels because they perceive the books to be anti-American, the court held in Board of Education, Island Trees Union Free School District v. Pico.
That also means that if a library includes novels about star-crossed teenagers in love, it can't selectively remove similar novels about gay and lesbian teenage romance. If an elementary school library includes children's books designed to teach kids about family relationships, it can't remove similar books discussing families with same-sex parents.
Technology may have changed since that ruling, but the law has not. Schools cannot block access to information on the Internet any more than they can engage in viewpoint-based discrimination toward the books on the shelves.
We at the American Civil Liberties Union launched the "Don't Filter Me" campaign last year after receiving a disturbing number of reports from students who were blocked from accessing websites about college scholarships for lesbian, gay, bisexual, and transgender teenagers; anti-bullying resources; and activities for student-led gay-straight alliances. In response to the campaign, reports flooded in from students across the country whose schools were using filtering software configured to block LGBT-supportive websites. When activated, the filter would block websites that expressed support for LGBT people and their legal rights, but allow access to websites that condemn homosexuality as immoral or oppose laws protecting LGBT people from bullying and discrimination.
For students, the stigmatizing effect of these filters is not just an abstract First Amendment issue. At a time when bullying and suicide among LGBT youths are all too prevalent, blocking certain websites deprives students of access to anti-bullying resources, suicide hotlines, and religious organizations that help students and families. This is especially relevant to students in crisis who may not feel safe accessing such sites from their home computers. By blocking these sites, or requiring students to ask for special permission for access, schools send a message that being gay, bisexual, or transgender is dirty or shameful.
Even when a school allows students to ask for a censored site to be unblocked on a case-by-case basis, discriminatory filters still impede the free flow of information by placing special burdens on particular viewpoints, and they violate students' privacy by requiring them to disclose the nature of their searches to school officials. If students suspect that a staff person disapproves of their research, or if students wish to seek support online without outing themselves, policies like these may discourage them from accessing the information they need.
Over the course of the Don't Filter Me campaign, we found that many schools mistakenly believed their filters were anti-pornography tools that happened to sweep in nonsexual LGBT websites as an overinclusive side effect. These schools were surprised to discover that their LGBT filter categories were designed specifically to identify nonsexual content, and that actual pornographic or sexual content was covered by completely different filtering categories. Once they realized the LGBT filters played no role in blocking actual pornography, the vast majority of schools agreed the filters served no educational purpose and promptly disabled them.
Based on these experiences, we reached out to the software companies that designed the filters and pushed them to take responsibility for viewpoint-based censorship caused by their products. Several companies agreed to eliminate their LGBT categories and placed nonsexual LGBT websites in neutral categories such as social science, history, or other appropriate areas. Unfortunately, some businesses left the filter parameters unchanged, but agreed to issue public statements and enhanced customer guidance making clear that websites identified in LGBT filter categories are nonsexual and should not be blocked by schools.
But even though the software companies have helped address problems, school districts bear the ultimate responsibility for school-based Web filtering. One district—Camdenton R-III in central Missouri—learned that lesson the hard way after arguing that disabling its software program's filter for "sexuality" would allow students access to sexually explicit sites. We suggested alternative filtering systems that would block adult content while allowing students to access nonsexual sites, but district officials were unresponsive, so the American Civil Liberties Union sued.
In February, the U.S. District Court for the Eastern District of Missouri granted a preliminary injunction blocking the use of Camdenton's Internet filter and finding in PFLAG v. Camdenton R-III School District that students don't surrender their First Amendment rights when they log onto a school library computer. The federal court made clear that when a school district intentionally uses a discriminatory filter, it is engaging in viewpoint discrimination.
The federal district court's groundbreaking decision in the Camdenton case should be a warning to school districts. As technology changes, schools must be aware that all outposts in the marketplace of ideas should be open to students, whether on the bookshelves or the Internet. They must adhere to the same standards of viewpoint neutrality that apply anywhere else in a school library.
Vol. 31, Issue 31, Pages 24-25. Published in Print: May 16, 2012, as The Legal Cost of Improper Internet Censorship.