IBM torches Big Tech's get-out-of-jail-free card, says websites should be held responsible for netizen-posted content

IBM has broken ranks with the tech industry – and advocated for changes to a US law that shields websites from legal headaches regarding the stuff their users post online. In an essay shared this week, Big Blue's head of government and regulatory affairs Ryan Hagemann adopted the arguments of those who want to see an end to …

  1. Ken Hagan Gold badge

    Seems reasonable

    Social media's business model is "selling ad space on a conduit that people are attracted to precisely because they can post whatever they like". There are obviously legitimate uses of such a facility. Equally obviously, there are illegitimate ones. If you can't filter out the latter, then your business model is unimplementable within the law.

    Sorry, but society doesn't owe you a business model. Find a new one that you can keep legal.

    I note, for example, that most B2B scenarios would have fairly good authentication of who is posting, so action could be taken against those who abuse the facility; in a business setting that is probably enough of a deterrent. Amongst the general public, however, authentication is almost nil (how easy is it to create a new and basically anonymous account?), and even if you can identify an abuser, the likelihood that either you or the platform can take action against them in any legally useful sense is almost zero. IBM's defence of B2B use-cases is therefore more than just defending their own turf -- it actually makes sense.

    1. a_yank_lurker

      Re: Seems reasonable

      The idea behind Section 230 is to prevent the site, which probably has much deeper pockets, from being sued over content it did not author or have real editorial control over. Fundamentally this is a good idea, as it prevents shysters from shaking down a site because Bubba the Yahoo posted something nasty.

      Much of the criticism of sites not policing content comes from people who never really define precisely what should be banned, beyond an "if it is from someone I disagree with, it must be evil" standard. That is not exactly a standard that is transparent or consistent, as it is often based on who screams the loudest.

      My concern with the Itsy Bitsy Morons weighing in is that they do not operate websites where people routinely post content that attracts much political attention. So what is their game here?

      1. ma1010
        Holmes

        Re: Seems reasonable

        What is their game? It's simple. They want the government to threaten massive fines against FB, Google and such for not moderating what users post. Then IBM will come along and say "Oh, you need to buy our AI technology to help police what gets posted on your service so you don't break the law we lobbied hard for. Get out your checkbook!"

        For IBM, this is NOT a case of "think of the children." It's just another money grab.

        While most of us would like to see FB and Google get one in the eye, I worry about the effect of sweeping laws on smaller businesses, web sites with forums and such. Will they just disappear if they can't afford to hire IBM?

        1. NetBlackOps

          Re: Seems reasonable

          The other part is the "my ox won't get gored by this..." type of thing. As you observe, you can kiss user content on smaller sites goodbye. IBM couldn't care less about that.

          1. GruntyMcPugh Silver badge

            Re: Seems reasonable

            @NetBlackOps: "you can kiss user content on smaller sites goodbye"

            That is my fear here: in making the large social media sites responsible for content, we would actually make them the _only_ places user content can be posted, and thus give them even more power.

      2. Ken Hagan Gold badge

        Re: Seems reasonable

        "Fundamentally this is a good idea as prevents shysters from shaking down a site because Bubba the Yahoo posted something nasty."

        No, it's really not. It allows the site to use Bubba the Yahoo to take the legal flak whilst the site owners get to sell ad space on either side of "Bubba's" page. Everyone knows this, just as everyone also knows that if Bubba gets thrown off the site he will just re-appear under a new free email address the following week.

        Society has to figure out where the line is drawn between public statements (subject to legal action) and private ones (typically not, as in the case of the average pub conversation). At the moment, social media sites put themselves forward as having the intimacy of private chats, but with the reach of public announcements. It isn't working. IBM can see that and don't want the baby thrown out with the bathwater when the blunt instruments of legislation are finally applied.

    2. Nick Kew

      Re: Seems reasonable

      "action could be taken"

      By whom?

      Do you want Facebook et al to become police, judge and jury all-in-one in determining what you're allowed to say? I don't, and neither does Facebook.

      I might just be persuaded to support some proposal that weakens Section 230, but that's certainly not going to happen before someone presents a realistic plan telling us who will police content, and to whom they'll be accountable.

      The worst of all worlds is that Facebook et al self-police, but with the threat hanging over them that any time they make a value judgement in a grey area, they could be punished for 'getting it wrong'. Forcing them to always err on the side of extra censorship is surely a totalitarian's wet dream. We have an element of that today, and the big risk with eroding Section 230 is that it annexes even the lightest of grey areas into a big no-go zone.

      1. DJO Silver badge

        Re: Seems reasonable

        The laws on what's legal and what isn't are quite clear.

        YouTube, Twitter et al make billions of dollars from their customers' material.

        In fact the bigger they are the more money they have per item due to economies of scale.

        They could easily afford to monitor all content as it comes in, although it might cut their profits a bit.

        Laws should be designed to protect the public, not corporate profits.

        1. Nick Kew

          Re: Seems reasonable

          "The laws on what's legal and what isn't are quite clear."

          Whose private fantasy world is that?

          Recommended viewing for anyone who believes in clear answers in the real world.

  2. NetBlackOps

    Some things never change.

    IBM providing hardware, and now software, to one of the world's worst surveillance states. Back on my personal shit list, IBM.

    1. a_yank_lurker

      Re: Some things never change.

      They got on my shit list years ago with their original bungling of the PC; there are some real horror stories from back in the Dark Ages. They never really got off of it, and the only reason I routinely ignore them is that they don't have any products that are useful to me. The fact that they have consistently proven they have not really changed from the Dark Ages means they are still on the list.

    2. 's water music
      Trollface

      Re: Some things never change.

      "IBM providing hardware, and now software, to one of the world's worst surveillance states. Back on my personal shit list, IBM."

      wait wut? The Nazis weren't enough? :-)

      (yes, I know, you said back on)

  3. doublelayer Silver badge

    Far too many facets

    On one hand, allowing sites nearly complete freedom to let anything through means they do nothing to protect against their service being used for very illegal activities. Facebook, for example, hosted (and probably still does) many groups dedicated to the sale of stolen credit cards. They also allow advertisers to post ads that violate laws, without verifying who the advertiser is or whether there are any problems. In those respects, there is a pretty good case for altering the law.

    On the other hand, we also need to avoid making places responsible for things that are not really their fault. As much as I despise Facebook for all their violations of privacy, they really aren't at fault the moment someone uploads something illegal. They should remove it, but they didn't know it was coming. This applies perhaps more strongly to small sites, which don't have the kind of resources it would take to monitor all posts and accounts thoroughly.

    So there is a case for changing the law, and there is a case for clinging to it. Why do I have the feeling the politicians will take both cases and manage to find that spot in the middle that extracts the worst elements of both?

    1. sabroni Silver badge

      Re: Facebook

      Fuck stolen credit cards, they live stream massacres.

      How anyone stays on that fucking site since then baffles me. Well, it doesn't, because thinking it through takes a couple of seconds, but ffs....

      (and for the pedants going "no, just one massacre", an extra fuck you.)

    2. KitD

      Re: Far too many facets

      > They should remove it, but they didn't know it was coming.

      What's wrong with content moderation? Post your videos and comments, but they only go live once we've checked them out.

      Yes, it takes time and resources and cuts into FB's/Google's profits, but the world doesn't owe them an exorbitant revenue stream. They'll have to bloody well earn it by showing they're responsible hosts (which, long term, may well be a good business model anyway).
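
      For what it's worth, the mechanics of pre-moderation are simple; the cost is in staffing the reviews. Here is a minimal sketch in Python of a hold-until-approved queue (the names — Post, ModerationQueue, and so on — are hypothetical, not any real platform's API):

      ```python
      from dataclasses import dataclass
      from enum import Enum, auto
      from typing import Callable, List

      class Status(Enum):
          PENDING = auto()   # submitted, visible to no one yet
          LIVE = auto()      # reviewed and published
          REJECTED = auto()  # reviewed and refused

      @dataclass
      class Post:
          author: str
          body: str
          status: Status = Status.PENDING

      class ModerationQueue:
          """Nothing goes live on submission; posts only join the review queue."""

          def __init__(self) -> None:
              self._pending: List[Post] = []

          def submit(self, post: Post) -> None:
              self._pending.append(post)

          def review(self, approve: Callable[[Post], bool]) -> List[Post]:
              # 'approve' stands in for the human moderator or automated filter.
              published = []
              for post in self._pending:
                  post.status = Status.LIVE if approve(post) else Status.REJECTED
                  if post.status is Status.LIVE:
                      published.append(post)
              self._pending.clear()
              return published
      ```

      The code is the easy bit; the time and resources are in whoever sits behind approve().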

  4. The IT Ghost

    Slippery slope

    This is an indirect undermining of freedom of speech. By using third parties to censor what people can post, it creates a precedent. Once challenged in court (and the courts are likely to go along with it, since it's "private companies" doing the censoring, at the behest of government), it's a very small step to create "efficiency" by having the social media companies pool resources in a government-moderated panel to uphold the restrictions. And a smaller step still to push the companies out of the panel by having them simply rubber-stamp whatever the government panel decides, and then not bother with the approvals at all. The First Amendment is effectively bypassed for online posts.

    Newsies won't cover any in-person protests for fear that any quotes they get from protesters would be a liability if published on their website. First Amendment rights for actual verbal speech... curtailed. All through short-sighted profit-chasing and lack of big-picture oversight. Opioid crisis indeed... what a crock. Not that there isn't a problem, but this isn't the way to fix it.

    And let's not forget the easy abuse.... Persons A and B get together and collude. A posts a bunch of bad stuff on, say, Twitter. Instantly, B captures it and uses A's postings as the basis for a lawsuit. B gets a bunch of money and shares some of it with A, so they both come out ahead dollar-wise. Repeat on Instagram. Repeat on Facebook. Et al. Until all social media imposes a multi-day wait before anything posted is actually made available, so it can be scoured and examined by filters and human moderators for any possible liability issue.

    This is a very bad idea.

    1. Falmari Silver badge

      Re: Slippery slope

      It is not a slippery slope. Facebook, Google, etc. should be treated in the same way as printed material is.

      A newspaper or magazine is responsible for what appears in their print. Even when they have a letters page, they can't just wash their hands of what appears there, even though the letters come from readers.

      Of course, in print it is easier to moderate what comes in from readers. But the fact that it is more difficult in the digital world does not mean companies should be allowed a business model which avoids moderation simply because moderation is hard.

    2. Anonymous Coward
      Anonymous Coward

      Re: Slippery slope

      > A posts a bunch of bad stuff on, say, Twitter. Instantly, B captures it and uses A's postings as a basis for a lawsuit,

      It seems appropriate for us to describe this with the English verb "prenda", as in your example "they will prenda Twitter." I prenda, you prenda, she prendas; we prenda, you prenda, they prenda; we are prending, we have/had prended, we will prenda.

      Anon in case one of those guys gets his law license back...

  5. JimC

    Content creators bear primary responsibility for their speech and actions.

    The elephant in the room is that they don't. Anonymity and fake accounts see to that. So the end result is that no-one is responsible.

    The problem is easy to define; the solution, of course, is a little trickier.

  6. DontFeedTheTrolls
    Boffin

    Insurance

    So if IBM (or anyone else) is selling tools to "ensure compliance", are they backed by insurance for when the tools fail to flag inappropriate content?

    Because they will fail, and website owners will be prosecuted.
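
    To put a number on "they will fail", here is a back-of-envelope calculation in Python. The volumes and rates are illustrative assumptions, not measured figures for any real platform:

    ```python
    # Even a very good filter misses content at platform scale.
    posts_per_day = 500_000_000   # assumption: daily posts on a large platform
    bad_fraction = 0.001          # assumption: 0.1% of posts are actionable
    recall = 0.99                 # assumption: filter catches 99% of bad posts

    missed_per_day = posts_per_day * bad_fraction * (1 - recall)
    print(f"~{missed_per_day:,.0f} unflagged bad posts per day")  # ~5,000
    ```

    Under those assumptions, any insurance product would have to price in thousands of misses a day, every day.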

  7. Lawrie-aj

    Not the Children!

    I'm not sure, but weren't WW1 and the rematch some years later a bit more detrimental to the physical and mental health of the young?

    To say nothing of other stuff..... I wonder how much of this concern is a meme being spread for Chinese-style control.

  8. Ima Ballsy
    Mushroom

    Hmmmmm

    My US government probably will go for it.

    "Hey, we've not suspended your first amendment rights. We've just made sure PRIVATE enterprise can do that !!!"

  9. Daniel von Asmuth
    Headmaster

    Disclaimer

    The Register's management is not responsible for this comment!

  10. Mike 16

    Easy solution

    Just mandate compliance with RFC 3514 for all internet traffic.

    How this gets tacked on to the various IPv6<->IPv4 hacks is left as an exercise for the reader.
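
    For anyone who missed the joke: RFC 3514 is the April 1st "evil bit" RFC, which designates the reserved high-order bit of the IPv4 flags/fragment-offset field as a security flag. A tongue-in-cheek compliance check in Python (the helper names are mine):

    ```python
    import struct

    EVIL_BIT = 0x8000  # RFC 3514: high bit of IPv4 bytes 6-7 (flags/frag offset)

    def is_evil(ipv4_header: bytes) -> bool:
        # Benign packets MUST clear the bit; malicious ones MUST set it.
        (flags_frag,) = struct.unpack_from("!H", ipv4_header, 6)
        return bool(flags_frag & EVIL_BIT)

    def rfc3514_filter(ipv4_header: bytes) -> bool:
        # Full compliance: simply drop anything that declares itself evil.
        return not is_evil(ipv4_header)
    ```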

  11. Anonymous Coward
    Anonymous Coward

    How about sticking to the original intent of 230

    If you curate, edit, or editorially control user content, you get no safe haven under 230.

    Therefore, either you allow all speech and have no liability, or you control speech and have liability.

    Either YouTube allows BOTH kitten videos AND Alex Jones, or they don't.

    But if they don't, they are curating content and are therefore not mere conduits.

    The 1st Amendment is 1st for a reason. It doesn't just protect safe speech; it protects offensive speech.

    Hard to believe that the ACLU once went to court on behalf of Nazis in Skokie, IL, to allow them to obtain a permit to march publicly. They were spat upon and generally derided, but they exercised their free speech rights.

    Now the social media cabal place a walled garden around content, but blithely claim that they are merely information conduits.

    They are editorializing and need to lose their protection from libel etc. under Section 230.

    1. jfm

      Re: How about sticking to the original intent of 230

      As I replied to someone else who claimed this a few months ago:

      This is nonsense. Stratton Oakmont v Prodigy, decided in May 1995, was the court case that found that editorial control by a service provider turned it from a distributor of information, without liability, into a publisher, with liability. Within months (Feb '96), Section 230 of the Communications Decency Act was passed to /prevent/ service providers from becoming liable if they screened or moderated content, and indeed to encourage them to do so.

      In other words, you've got section 230 exactly backwards.

      1. Carpet Deal 'em

        Re: How about sticking to the original intent of 230

        Section 230 allows basic moderation, but it only goes so far:

        (1) Treatment of publisher or speaker

        No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

        (2) Civil liability

        No provider or user of an interactive computer service shall be held liable on account of—

        (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or

        (B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).

        In other words, editorialization isn't protected under section 230. Given that Google, Twitter and Facebook have been doing just that in the form of selective enforcement, all that has to happen is for Uncle Sam to come down on them qua publishers with the fury of a thousand suns. At the very least, we can get some experimental data to consider when crafting any revisions.
