The latest wave of moral panic around protecting children has inspired David Cameron to announce that, from the latter part of this year, all ISPs will be required to use an opt-in system under which users only get access to adult content if they register first. Aside from the general unworkability of this, his evident cluelessness about the technical issues involved and the massive concerns about state control of content that it raises, there's another important point that needs to be addressed: in some cases, it will do the very opposite of protecting children - it will leave them cut off from resources they desperately need.
It was in the early 'nineties that I first became aware of what filtering out 'adult' content actually means. At that point I was helping to maintain a safer sex advice website and I was shocked when we were told it was being blacklisted by several search engines which deemed it pornographic. There were no images on the site apart from logos and everything was written in a very plain style, with nothing intended to titillate (rather than getting people excited we wanted them to stop and think). But there were those words, 'sex' and 'sexual', and there was some discussion of taboo body parts, and that was enough.
When I got angry about this, a colleague suggested that I look into the status of another site I wrote for, which was aimed at young LGBT people and had no sexual content at all. Sure enough, I discovered this was blocked too. And it wasn't just because certain terms ended in '-sexual' - further research showed that some search engines blocked on the basis of words like 'gay' and 'lesbian', as a matter of course.
Two decades on, much has improved. It's probable that most ISPs will have the sense to keep general LGB sites accessible, at least once the problem is pointed out to them. These sites can be a vital resource to young people facing daily hostility at home or at school. But what about trans people in the same situation? It is, sadly, still the case that the most heavily promoted sites using the terms 'transsexual' and 'transgender' are pornographic. Search engines have got savvy about this but a lazy ISP will find it much easier simply to deem everywhere with those terms 'adult', leaving vulnerable young people without anywhere they can find support, community or access to vital health information.
If the proposed filters come into operation, it is probable that every site on safer sex will have to fight to remain accessible, and the evidence shows us that it is young people who are most in need of the information these sites carry. Furthermore, it will be harder for young people facing sexual abuse to access support online. I can remember how I felt when Childline started, how I wished it had been there for me. There are now many places that offer children help, and many are needed, because not every child will be comfortable with the same set of options or look for help using the same kind of search terms. How many of those places would effectively disappear because of the language they use?
Even when it comes to images, there are entirely appropriate reasons why, sometimes, children should have access to images of genitals. In some cases they're an appropriate part of conversations on safer sex; in others, they're important in helping young people feel comfortable about their bodies and recognise the diversity out there. They are particularly important for young intersex people trying to come to terms with bodily differences that doctors and family members may simply refuse to discuss. It is much better for young people to be able to access educational and support resources online than to rely on peer gossip or take a chance on trusting an adult to give advice when they are in a very vulnerable situation. School advisors and so on are not an adequate substitute because they often lack training on trans and intersex issues.
By restricting access to legitimate resources, internet filters put already marginalised young people at even greater risk. Yes, online predation is a danger, but so are in-person abuse, bullying, isolation, unplanned pregnancy and lack of access to appropriate medical support. Among the few things research tells us about child molesters in general is that they seek out children who are poorly informed about sexual and bodily issues, and who lack confidence. Internet filters could easily make children more vulnerable in this way, too.
Of course it's possible that any successfully introduced filter will take account of these problems and invest the resources to make sure support and information sites are not affected. Given the history of this area, however, I'm not holding out much hope. Whilst I doubt Cameron's proposals will easily find solid form at all, it's important that we have this conversation, because current discussions are ignoring the complexity of child protection issues and ignoring the fact that the internet provides services that the state itself is failing to provide to certain groups. Any curtailment of access to information must be approached with extreme caution and this matters at least as much when it applies to children as when it applies to adults.