A back-to-school open-records project that every public high school news organization ought to try: Get a list of the websites, or terms, that your school’s filtering software blocks.
There are plenty of news stories about school policies guaranteed to trigger a collective campuswide yawn — but website blocking is not one of them. Almost anyone who’s ever tried to do research on — or teach a class on — a school computer has smacked into those frustrating words: “Access denied.” And they’ve all wanted to know why.
Federal regulations (the Children's Internet Protection Act) do require that schools filter certain "harmful" web content as a condition of qualifying for discounted rates on Internet access and technology through the federal E-Rate program. But the regulations are widely misunderstood and misapplied in overprotective ways that can impede learning.
For instance, it’s widely — but falsely — believed that the federal government requires schools to block access to Facebook, YouTube and Twitter on school computers. In fact, the opposite is true. As the Federal Communications Commission told schools last September:
"Although it is possible that certain individual Facebook or MySpace pages could potentially contain material harmful to minors, we do not find that these Web sites are per se 'harmful to minors' or fall into one of the categories that schools and libraries must block."
In an unscientific query posed by the New York Times Learning Network at the start of last school year, multiple students responded that — in addition to social-networking sites and those obviously containing sexually explicit material — their schools also banned access to the Yahoo! search engine, one of the Web’s most useful tools.
Increasingly, these restrictions are following students home. Many districts now apply to take-home laptops the same restrictions they enforce on campus, even though there already are website access controllers in every home: They're called parents.
It’s now well-documented that website filtering has been misused to promote political or religious agendas, by screening out only certain viewpoints on disputed social issues — for instance, blocking access to websites that offer support for students with gender-identity issues while allowing access to those that condemn homosexuality.
Student journalists should obtain, and publish, the list of websites and/or keywords that the school's filters are set to intercept, both on- and off-campus (being careful, of course, not to print a string of swear words for shock value). If the list comes from a private software vendor and isn't determined by school officials, that's part of the story. And as long as the list is in the school's possession, it's still a public record even if an outside company created it. (If the school doesn't know what websites and terms are being blocked, that may be an even better story.)
It's also important, as part of the story, to explore how a school decides what's "harmful" content. The FCC's notion of what must be screened as "harmful" is actually quite limited, applying only to explicit sexual content that is both "patently offensive" and lacking in any artistic or literary merit — in other words, not Michelangelo's David. (Test: If an image search for the David won't open on a school computer, your school is doing it wrong.)