Tuesday, January 24, 2006

Big Brother is Watching...For the Children

Everyone, it seems, is abuzz about the Department of Justice subpoenaing Google's records of searches performed by all users over the course of some presumably arbitrary week. The International Herald Tribune has a short but pointed story about this that addresses a number of significant issues.

What bothers me about this story, however, is the following quote:

Protecting minors from the nastier material on the Internet is a valid goal; the courts have asked the government to test whether technologies for filtering out the bad stuff are effective.

The first problem I have with this is that when someone advocates some action or law "for the children," it is almost always an emotional appeal that has more to do with advancing a personal agenda than with benefiting any actual children.

That's not my main complaint, however. I'm more concerned with the casual assertion that government intervention, and specifically technological intervention, is good.

Technological solutions are, generally speaking, the worst solutions, especially when the technology must make a yes-or-no decision on any particular item. This is because technology lacks nuance. The classic example of this was early web filtering programs that prevented children from reading information on breast cancer because the web pages contained the word "breast."
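
To make the lack of nuance concrete, here is a minimal sketch of the kind of blunt keyword filter described above. The blocked-word list and sample pages are invented for illustration; real filtering products are (somewhat) more elaborate, but the basic failure mode is the same.

```python
# A toy keyword filter of the sort early web-filtering programs used.
# The blocked-word list and sample pages are invented for illustration.

BLOCKED_WORDS = {"breast", "sex"}

def is_blocked(page_text: str) -> bool:
    """Block the page if any listed word appears anywhere in it."""
    words = (w.strip(".,!?") for w in page_text.lower().split())
    return any(w in BLOCKED_WORDS for w in words)

# A page about breast cancer screening is blocked right alongside genuinely
# objectionable content, because the filter matches strings, not meaning.
print(is_blocked("Early detection of breast cancer saves lives."))  # True
print(is_blocked("Welcome to the Sussex County school district."))  # False here,
# though a substring-based variant would block "Sussex" as well.
```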

More recent filters are considerably more sophisticated, but there will still be grey areas. Does the use of the word "fuck" make this page unacceptable for children? Does it make this entire site unacceptable? And what about these fine individuals? (Links provided by http://blogs.herald.com/dave_barrys_blog/.)

If you have an email address, you probably know how quickly spammers adapt to the latest filtering techniques, leading to an arms race between spammers and spam-filter authors. The only way to eliminate all the spam is to filter out everything, but of course that isn't particularly useful. Anything less will filter potentially desirable (and innocuous) content while still allowing some objectionable content through.
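
The tradeoff is easy to see in a toy score-based filter. The phrase weights and messages below are invented purely for illustration, but the pattern holds for real filters: a threshold low enough to catch all the spam catches legitimate mail too, and a threshold high enough to spare legitimate mail lets spam through.

```python
# A toy score-based spam filter; the phrase weights and messages are
# made up, just to illustrate the threshold tradeoff.

SPAMMY_PHRASES = {"free money": 3, "act now": 2, "click here": 1}

def spam_score(message: str) -> int:
    text = message.lower()
    return sum(w for phrase, w in SPAMMY_PHRASES.items() if phrase in text)

messages = [
    "FREE MONEY!!! Act now, click here.",               # spam, score 6
    "Fr3e m0ney, no obligat1on.",                       # spam that evades the rules, score 0
    "Click here to RSVP; act now, seats are limited.",  # legitimate, score 3
    "Lunch tomorrow?",                                   # legitimate, score 0
]

for threshold in (0, 2, 10):
    blocked = sum(1 for m in messages if spam_score(m) >= threshold)
    print(f"threshold {threshold}: blocks {blocked} of {len(messages)} messages")

# threshold 0 blocks everything, spam and lunch plans alike; threshold 10
# blocks nothing; threshold 2 still blocks the legitimate RSVP notice while
# letting the obfuscated spam through.
```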

Slightly better, in the sense of not being as bad, are legislative measures. The law, at least, allows for shades of grey. Laws are difficult to enforce online, however, and fairly easy to circumvent. They also tend to focus on punishing behavior rather than preventing it, though the threat of punishment can serve as a deterrent.

The real problem, though, is not that the web is a big, open, and sometimes dangerous place for children (and adults). The problem is that parents are too quick to provide their children with the tools to access the web without also providing supervision. A computer connected to the Internet is a powerful piece of technology. Just as a parent shouldn't give a child a gun or the keys to the family sedan, a parent shouldn't give a child unlimited and unsupervised access to a worldwide computer network.

The bottom line is that the best way for parents to ensure their children's safety online is to know what their children are doing and when, and not to treat computers as just another type of toy. Not only will that produce real improvements in online safety for kids, it will give the government less of an excuse to invade our privacy or restrict our activities.