We don't try to enforce the speed limit (yet) by limiting the ability of cars to drive fast. We don't limit the ability to own a vehicle with bad emissions or without safety equipment. (We will punish you for operating that vehicle on publicly owned roads.)
If you use your computer to commit a crime, you should be punished. If you use your car to commit a crime, you should be punished. (Assuming the crime should actually be a crime.)
What we shouldn't (generally) do is preemptively prevent you from having the ability to commit that crime, especially when that prevents you from performing activities that are not crimes.
No, you seem to be talking about whether certain behaviour involving cars should be allowed. That's not the issue in the article, or being discussed here. The issue is whether it's a good idea to design computers to allow certain actions in the first place. Since fundamentally we can't design computers to detect 'bad' search queries, any attempt would likely be like other forms of DRM: futile against the sort of people you want to stop, and debilitating for everyone else.
> The issue is whether it's a good idea to design computers to allow certain actions in the first place. Since fundamentally we can't design computers to detect 'bad' search queries
Where is the design of computers coming into this? If you host a forum, and someone posts something that you don't like, you have the option to delete that content.
Google hosts millions of online videos, and decides that they will delete child pornography when they find it (whether they use an automatic filter or a system of moderation is irrelevant).
They also decide that they won't delete ISIS videos when they discover them. That decision is not a technical one, but a social one. It is worth talking about, even if, as the article says:
> Certainly the fact that there are 3000 ISIS videos on YouTube and 10,000 ISIS accounts on Twitter should give you pause. Clearly this is a tricky area, and I don’t believe this is necessarily a matter for government regulation. I do, however, think that Google might alter its “Don’t be evil” motto to “Don’t enable evil.”
I have a sneaking suspicion that we don't actually disagree on all that much, and that most people who are arguing with me here are doing so simply because I am using words that sound slightly different than the ones Cory Doctorow uses.
So far the most convincing evidence of this is that the original article didn't actually mention anything about legislating against general purpose computers.
The original article talks about legislating against certain capabilities (software) of general use computers that allow the commission of illegal acts.
The chain of metaphors has gotten pretty stretched. As far as I can tell it goes about like this:
Author supports SOPA and criticizes Google for allowing people to find copyrighted content to steal.
Commenter doesn't like this so quotes Doctorow talking about wheels to explain why we shouldn't make running websites or search engines illegal or restrict their capabilities.
You point out that we do restrict the use of cars.
(Thus by metaphorical extension seemingly supporting the idea we should restrict using search engines or websites for illegal acts)
I point out that you are talking about restricting the use of cars, not restricting their capabilities.
(Trying to point out by metaphor the distinction between restricting legal use of search engines and legal capability of search engines)
You say we agree, but quote what I considered to be the obvious and not important part of my comment. (I thought the important part was: "What we shouldn't (generally) do is preemptively prevent you from having the ability to commit that crime, especially when that prevents you from performing activities that are not crimes.")
Assuming that we don't agree on the crucial point, I go further off topic and bring up another unrelated example of ways people are restricting the capability of computers (by making it illegal to export intrusion software.)
So let me see if we do agree explicitly on the original topic:
Do you support SOPA? Do you think that we should make it illegal to index and share links to websites of people who break the law (e.g. distribute copyrighted material or share intrusion software)? Do you think we should make it illegal to provide others the capability to break the law even if we are not breaking any other law ourselves?
...Or did we just get really lost in our metaphors?
I don't support SOPA. I do think that a law similar to the DMCA should continue to exist. However, I would like to see it amended to protect users' rights to inspect and modify the software/hardware that they purchase. I would be interested to hear how the law could be structured to allow remixes (releasing a patched version of Skype, mixing a Taylor Swift video, etc.), but I think that gets more complicated.
I do think that it is reasonable to ask that people building automated tools also build automated tools to comply with the law. For example, Grooveshark ought to enforce not just individual DMCA requests against a particular entry in their database, but should be expected to use a content hash of some sort to avoid trivial circumvention of the law. This is technically trivial, and is not an onerous requirement. Google could easily apply similar tools, and to some extent they already do.
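To illustrate why this is "technically trivial": the comment's suggestion amounts to keeping a set of fingerprints for content already taken down, and refusing byte-identical re-uploads. A minimal sketch (the function names here are hypothetical, not any real service's API; a plain SHA-256 of the file bytes only catches exact duplicates, so a real system would need perceptual or audio fingerprinting to resist re-encoding):

```python
import hashlib

# Hashes of content already removed under a DMCA notice.
# Hypothetical in-memory store; a real service would persist this.
takedown_hashes: set[str] = set()

def content_hash(data: bytes) -> str:
    """Fingerprint the raw file bytes. Exact-match only: trimming or
    re-encoding the file produces a different hash."""
    return hashlib.sha256(data).hexdigest()

def process_takedown(data: bytes) -> None:
    """On a valid DMCA notice, remember the file's hash, not just
    the single database entry named in the notice."""
    takedown_hashes.add(content_hash(data))

def upload_allowed(data: bytes) -> bool:
    """Reject uploads whose bytes match previously removed content."""
    return content_hash(data) not in takedown_hashes

process_takedown(b"infringing-track-bytes")
print(upload_allowed(b"infringing-track-bytes"))  # identical re-upload blocked
print(upload_allowed(b"different-track-bytes"))   # unrelated upload passes
```

The point is the policy shape, not the hash function: one takedown request suppresses all identical copies, which blocks the trivial delete-and-reupload circumvention without requiring the host to judge content it has never seen a notice for.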
When those automated tools are not possible (for example, when running a Tor relay), the organization running the infrastructure should be absolved of all responsibilities related to data that passes through, but is not observable to them. This is a very logical outcome of the DMCA, because if traffic is not observable, no take down request can be reasonably made.
Lastly, I think that while the Googles and Reddits of the world ought to have no legal responsibility to police truly harmful content -- ISIS videos, or /r/coontown -- they should, as good citizens of the web, do so anyway.