Re: The law of Unintended Consequences applies....
The section is simply far too broad. In some cases the protection is not only the most logical option but prevents a lot of problems; in others it enables clear abuse with a get-out-of-consequences-free card. Some examples:
Without this section, every provider is responsible for anything and everything on their platform. That makes sense for a blog with a few readers, but far less sense for, say, a cloud services provider. Without protections like this, someone could find illegal content on a site and charge the company providing compute or network services for that site with a crime, even though the provider knew nothing about the site. At first that doesn't sound like a problem: make it illegal to provide services to criminals, and companies will have to screen their customers for criminal activity. The trouble is that screening a customer for criminal activity is pretty hard to do without also completely destroying that customer's right to privacy. For instance, I have a virtual server online that could theoretically be used to commit crimes (it isn't). To verify that I'm not committing any, my server provider would probably have to scan every file on the machine and analyze all network traffic passing through it. And even then, they could be charged if their automated system fails to detect whatever crime I've managed to come up with. A good-faith effort is not sufficient.
However, the protection is also frequently used to shield any kind of content, no matter how obviously illegal. The article already has some good examples of this, which I'm sure we agree should be stopped. Under the current law, our only way to try to stop it is to argue about the definition of "publisher", which heavily favors companies with many lawyers. That's not very useful. As obvious as it seems to people that a site running ads is publishing at least that content, the argument hasn't yet been accepted in court because the law isn't clear enough.
I think we're likely to see lots of people clustered around the "protect it at all costs" and "scrap it entirely" ends of this spectrum. As usual when that happens, the right answer is probably somewhere in the middle.