Written by guest blogger, Christopher Harris
Capping off a year-long research project that began with a symposium marking the 10th anniversary of the U.S. Supreme Court decision upholding the Children’s Internet Protection Act (CIPA), the ALA today released a new report looking at the long-term impact filtering has had on schools and libraries. “Fencing Out Knowledge: Impacts of the Children’s Internet Protection Act 10 Years Later” addresses continuing concerns about the lack of transparency in filtering software, the prevalence of over-filtering, and the disproportionate impact CIPA has on some populations.
Reading this report from my dual perspective as both a librarian and a school administrator helped me understand how and why we have ended up where we are. I also had additional insight as a participant in both the symposium that spawned the report and the original OITP discussions. While the librarian side of me pushes for open access and user privacy, the school administrator in me is worried. As an administrator, I am charged by New York State law to act in loco parentis—in the place of a parent—protecting students from harm. Even though the requirements for filtering within CIPA are narrow, they are poorly defined.
“Harmful to minors” and “obscene” encompass huge swaths of content when viewed through a lens of fear and doubt. School administrators worry about angry calls from parents—or even worse, the local news—asking why a child was exposed to content that someone might find objectionable. Then they hear about a solution: just install this filter and everything will be okay. Though this is certainly not an excuse, a lack of understanding about the technology involved and a desire to do right by students are sometimes the driving forces behind filtering issues in schools.
In other cases, though, the overreach of filtering seems to be more deliberate. “Fencing Out Knowledge” identifies a number of examples, including the entire state of Rhode Island, where students in school are prevented from accessing 89 categories of content, including websites from the American Civil Liberties Union, People for the Ethical Treatment of Animals, and Planned Parenthood. Lest one think this is a partisan move, the National Organization for Marriage is also blocked; clearly the intention here is to restrict access to information of all types.
Those restrictions, as the report shows through evidence from various surveys, have an impact on students’ ability to successfully complete schoolwork. There is a deeper problem, though, for students with limited or no access at home. For them, living in a school district that over-filters means they could grow up with no ability to participate in the digital world most of us take for granted. With YouTube and social networking still blocked in many schools, some students are locked out of opportunities to create content, pursue other forms of online learning, and engage with their peers.
In 2000, when CIPA was signed into law, the web was a very different place. In the age of GeoCities, content was limited mostly to text and images. AOL and other dialup services provided most of the access; always-on broadband technology was almost unheard of in homes. Google was an up-and-coming search engine challenging AltaVista and Yahoo!. Slashdot, a community news site focused on technology, exemplified the level of sharing and interactivity of the web. Media sharing was mostly limited to RealNetworks and their consistently buggy and advertising-filled clients. I experienced all of this as the technology coordinator for New Hope Elementary School in Chapel Hill, N.C. My views of filtering, however, were also heavily influenced by my wife, who in 2000 was finishing her MLS at the University of North Carolina-Chapel Hill and writing her master’s paper on filtering in schools.
In those early days of filtering, the technology was amazingly unsubtle in its approach. My wife’s research refers to a situation in which Jamie McKenzie (an educational technology consultant) had an article about adult learning on his website blocked because the filename was adult.html and the filter was blocking access by filename. This was an age when keyword filters blocking “game” could be evaded by searching for “games” instead. Today, things are quite different. After almost 15 years of a constantly escalating arms race between students and faculty trying to access content and filtering companies trying to stop them, things have gotten a bit out of control.
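To see why those early filters were so easy to evade, consider a minimal sketch of exact-keyword matching against URLs and filenames. This is purely illustrative of the logic described above, not any vendor’s actual implementation; the blocklist and URLs are hypothetical stand-ins.

```python
import re

# Hypothetical keyword blocklist of the sort early filters used.
BLOCKED_KEYWORDS = {"game", "adult"}

def is_blocked(url: str) -> bool:
    """Block a URL if any path or filename token exactly matches a keyword."""
    # Split the URL on common separators and compare whole tokens only,
    # mimicking crude filename/keyword matching.
    tokens = re.split(r"[/?.&=_-]+", url.lower())
    return any(tok in BLOCKED_KEYWORDS for tok in tokens)

# An article on adult *learning* is blocked purely because of its filename...
print(is_blocked("http://example.org/adult.html"))  # True
# ...while a trivial spelling change slips right past the keyword list.
print(is_blocked("http://example.org/games"))       # False
```

The false positive and the trivial evasion come from the same root cause: the filter inspects surface strings, not content.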
Arms race is a very apt term for what we are seeing today, given the militaristic way that filtering companies advertise their products. Companies like iboss, a leader in the K-12 filtering market, refer to “threats” and tout “forensic intelligence” in managing “enforcement” to protect against “high-risk” users. iboss goes further, offering the ability to decrypt HTTPS/SSL traffic as well as to automatically record video from end-user desktops when potential violations are detected. From a privacy perspective, the librarian side of me finds this quite concerning. The school administrator in me, however, sees potential. The reality is that there is no expectation of privacy on school networks. Video documentation of a potential violation can, as the iboss website suggests, be a valuable tool in determining intent to avoid blanket punishments for accidental actions.
Both the librarian and the school administrator sides can appreciate the potential for granular control offered by modern filters like iboss, an example I continue to use because I recently had a demonstration of the product ahead of an upcoming rollout in my region. For example, the filter can be configured to allow access to individual Twitter accounts by username while blocking the base domain. That is not the fully open door that has been hugely successful in some schools, but it is better than our current blocking of the entire Twitter-verse. As our media service starts cataloging YouTube videos, we will be able to ensure access to those resources even in districts that want to keep the rest of the site blocked.
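The per-account exception described above can be sketched as an ordered policy check: explicit path-level exceptions are evaluated before the domain-level block. This is an illustrative sketch of that general idea, assuming a made-up account name; it is not iboss’s actual configuration format or decision logic.

```python
from urllib.parse import urlparse

# Hypothetical policy: block the whole domain...
BLOCKED_DOMAINS = {"twitter.com"}
# ...but carve out explicit (domain, path) exceptions for approved accounts.
ALLOWED_PATHS = {("twitter.com", "/schoollibrary")}  # example account name

def allowed(url: str) -> bool:
    """Return True if the URL passes the filter policy."""
    parts = urlparse(url)
    host = parts.netloc.lower()
    path = parts.path.rstrip("/") or "/"
    # Exceptions are checked first, so one account can be opened
    # without unblocking the rest of the site.
    if (host, path) in ALLOWED_PATHS:
        return True
    return host not in BLOCKED_DOMAINS

print(allowed("https://twitter.com/schoollibrary"))  # True
print(allowed("https://twitter.com/someoneelse"))    # False
print(allowed("https://example.edu/library"))        # True
```

The key design point is precedence: the narrow allow rule must win over the broad block rule, which is what makes granular control possible at all.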
In these ways, my view of filtering has grown and changed along with the capabilities of the technology. I remain concerned about the access and privacy implications, and will continue to advocate as strongly as I can for openness. However, as a school administrator I also must acknowledge the concerns brought on by in loco parentis; were I making the decision, I would likely want some type of very basic filter to block the worst of the worst content to prevent unintentional access.
The challenge is being confident enough in students’ ethics and digital literacy to stop with the most basic level of filtering. As “Fencing Out Knowledge” clearly shows, a distressing number of schools push the envelope and engage in severe over-filtering well beyond the requirements of CIPA. I look forward to assisting with the implementation of the recommendations identified in the report that seek to increase awareness about the issue and identify new solutions.
Christopher Harris is chair of the Advisory Committee of ALA’s Office for Information Technology Policy and Director, School Library System of the Genesee Valley (N.Y.) Educational Partnership.