Twitter, Mozilla, Vimeo slam Europe’s one-size-fits-all internet content policing plan

Twitter, Mozilla, Automattic, and Vimeo have signed an open letter to EU tech chief Margrethe Vestager, taking issue with new digital rules due to be announced later this month and asking for a more flexible approach to internet content. The EU's Digital Services Act and the Democracy Action Plan aim to …

  1. Snake Silver badge

    Indeed

    The problem with a one-size-fits-all, stay-or-delete approach lies in the decision making: it will lead to one person's, or one group's, idea of freedom of speech. Sooner or later, sooner most likely, some moderator will decide to go harder rather than lighter, and delete in the gray areas rather than taking a more delicate approach. And then that more heavy-handed application of the rules becomes the status quo.

    1. bombastic bob Silver badge
      Childcatcher

      Re: Indeed

      The problem with a one-size-fits-all, stay-or-delete approach lies in the decision making

      Another way of putting it: many people will agree to censorship (aka 'moderation'), "for the children" or for any other reason, until it censors *THEM*.

      (I'm amazed I agree with Tw[a,i]tter on anything, even if only slightly)

  2. Howard Sway Silver badge

    But then maybe it’s just the wrong algorithms

    No. The problem is the techno-utopian belief that you can find an algorithmic solution to what are really human problems. Life on the web is just as complex and difficult as life in society. So, just as a society's choice of policing can range from police state through community-based policing to free-for-all anarchy, so too will the choices for how societies regulate the web be made. The web started like a free-living anarchist commune, but since most such ventures collapse in real life, existing as they do within a corporate system to at least some extent, it's a bit naive to have expected it to remain that way forever. The danger is that by stubbornly insisting on an anything-goes approach, the backlash will be big enough to swing the pendulum too far towards the police-state end.

    Nobody knows all the answers, but we're certainly in a weird place at the moment where folk who can do little more than constantly shout about "freedom" also back scarily authoritarian politicians who misuse the massive power of state government to take freedom away from other people.

    1. NATTtrash

      Re: But then maybe it’s just the wrong algorithms

      The web started like a free-living anarchist commune...

      ...and great times they were.

      But more to the point: one of the remnants of that time is a discussion that continues to this day: the fact that many platforms can still label themselves "just a service provider", which benefits them greatly because they can never be touched while making money. Thus FB will never be liable for what some morons do on its platform. Amazon can sell lead pencils for kids and claim it's just facilitating an external seller. Uber and AirBnB have similar business models, and we are all familiar with the stories of how they claim to just "facilitate their clients/customers/independent contractors".

      Reading between the lines of this piece, it would not surprise me if some time (soon?) this "facilitating" thing changes. And no worries, if that happens we will still have freedom of speech. But I wonder how much these freedom-loving platforms (businesses) will still allow/facilitate it if they are the ones who get the court cases and corresponding fines each time their users breach the rules. That's what they are afraid of. That's why there are open letters and sudden "we care and therefore self-regulate" gestures. Because if the situation changes to something where more normal, real-world rules are applied (to them), then making money becomes very scary and less lucrative...

      1. Chris Fox

        Lead pencils

        "Amazon can sell lead pencils for kids and claim it's just facilitating an external seller."

        I don't follow the significance of this example. So-called "pencil lead" (and, for that matter, "black lead") is actually just graphite. All things considered, traditional graphite pencils are probably among the least harmful of drawing and writing implements. But otherwise I would agree that there are some complex issues here.

        1. bombastic bob Silver badge
          Boffin

          Re: Lead pencils

          Can you actually buy a pencil that has actual Pb in it? Does anyone even make such a thing?

          According to THIS article (from the Washington {Bleep}), some time in the 1500s a large natural graphite deposit was discovered, and writing instruments were made from the material. People mistakenly thought it was a type of lead.

          So I guess pencils have always been made with graphite as the material that makes the marks on paper.

          I guess that deflates the earlier comment about selling lead pencils to children...

    2. bombastic bob Silver badge
      Childcatcher

      Re: But then maybe it’s just the wrong algorithms

      The problem is the techno-utopian belief that you can find an algorithmic solution to what are really human problems

      I was just thinking (dangerous, I know): if it were an option to have "content warnings" rather than content removal, or perhaps no content warnings at all for that matter, how many people would choose "wild west" or "anarchy" over "nanny mode"?
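
      (A minimal, purely illustrative Python sketch of what such a per-user setting could look like. The ModerationMode and Post names are invented for this example and don't correspond to any real platform's API.)

          from dataclasses import dataclass
          from enum import Enum
          from typing import Optional

          class ModerationMode(Enum):
              WILD_WEST = "wild_west"  # show everything, no warnings
              WARNINGS = "warnings"    # show everything, but flag risky content
              NANNY = "nanny"          # hide flagged content entirely

          @dataclass
          class Post:
              body: str
              flagged: bool  # set upstream by whatever classifier the site runs

          def render(post: Post, mode: ModerationMode) -> Optional[str]:
              """Return the post as the user chose to see it, or None to hide it."""
              if not post.flagged or mode is ModerationMode.WILD_WEST:
                  return post.body
              if mode is ModerationMode.WARNINGS:
                  return "[content warning] " + post.body
              return None  # nanny mode: suppress entirely

          if __name__ == "__main__":
              p = Post(body="Spicy take here", flagged=True)
              for mode in ModerationMode:
                  print(mode.value, "->", render(p, mode))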

  3. Anonymous Coward
    Anonymous Coward

    Who decides the definition of "Harmful"?

    The whole issue of removing "harmful" content is a train wreck just waiting to happen.

    Who decides what constitutes "harmful" content? The term "harmful" is so loose that just about anything could be viewed as harmful. Harmful to whom? Harmful to a politician's career? The fact is that "harmful" is very subjective. In today's SJW world, where even technical terms such as master/slave are being censored by virtue signallers, all speech is doubtless harmful to someone.

    Handing these kinds of decisions to algorithms will only result in perfectly legal speech being arbitrarily censored. How do you fight such a removal? Who do you contact? What remedies are there?

    The loose definitions and lack of accountability show these laws up for what they are: a convenient method for the powers that be to remove anything they don't agree with. This is a censorship regime and nothing else.

    1. DCFusor

      Re: Who decides the definition of "Harmful"?

      Those who want more control want us to forget:

      The truth doesn't need to be protected from other ideas.

      Those other ideas collapse when revealed as untruths.

      It's messy and takes a while, but in the end the truth wins. Those afraid of it shouldn't be protected; they are in fact the enemies of the rest of us, and undeserving.

      Sometimes it's better to keep quiet and let people think you're a fool than to open your mouth and prove you are. Via this truism, false ideas are shown to be the work of fools. They are their own best counter-arguments. Why would someone want to prevent that? /rhetorical, see politician above

      1. Michael Wojcik Silver badge

        Re: Who decides the definition of "Harmful"?

        Those other ideas collapse when revealed as untruths.

        I am firmly in favor of strong protections for freedom of expression, and automatically hostile toward any censorship system. But this comment is simply incorrect, as a vast number of methodologically-sound psychological experiments and the vast sweep of human history both attest.

        Even testable false hypotheses don't show any sign of being overwhelmed by truth. Take, oh, the Flat Earthers. Or the homeopaths. And of course untestable hypotheses (religion, conspiracy theories, solipsism, etc) cannot logically be refuted.

        Education helps. Some economic pressures can help - though it's difficult to institute most of those without unacceptable constraints on expression. For the most part, though, we have to bear the costs of significant numbers of people believing false ideas and acting accordingly, as the price of freedom of expression. That's a trade-off inherent in the human condition.

        It's a wind-eye. A society can choose to be blind but sheltered from the cold, or confront the gale and see.

  4. Criminny Rickets

    Moderators can overcompensate

    User posts on online blog: Blue is the new Green

    Moderator: Hey, that user is using Blue as a racist term... click click, there, this little program will get rid of all articles with the word blue in them. What do you mean, what colour is the sky?

    1. Hollerithevo

      Re: Moderators can overcompensate

      So you think future badness will be the fault of social justice warriors? That's a big jump from Google to this straw man.

      1. Anonymous Coward
        Anonymous Coward

        Re: Moderators can overcompensate

        > So you think future badness will be the fault of social justice warriors

        You should interview Jordan Peterson.

        1. Khaptain Silver badge

          Re: Moderators can overcompensate

          > So you think future badness will be the fault of social justice warriors

          They are not the cause, they are the result....

  5. lglethal Silver badge
    Go

    All the things they are proposing are good: community moderation, transparent algorithms, etc. These are great ideas, but they have all been rejected out of hand by the dominant players, because a) moderators are expensive, and b) if you make algorithms transparent then others can see how much you skew the results to favour your own interests and not the community's.

    We can't be hurting the bottom line or the company's dominance now, can we?

    1. Danny Boyd

      What strikes me as rather illogical is that these companies talk about (and implement) various forms of censorship (via community moderation, algorithms, etc.) and still claim they are "pure platforms" that cannot be held responsible for the content their users post. To me, it's either one or the other. If you censor users' postings, then you can (and should) be held responsible for the quality of your censorship. If you are a "pure platform", you don't even look at what your users post, let alone censor it.

      1. Graham Cobb Silver badge

        Don't be ridiculous. There is no conflict between doing some moderation and being an open platform. "Either one thing or the other" is the most ridiculous thing to say regarding societal norms.

        The truth is, moderation is hard, expensive and different people prefer different amounts of it (I, and presumably you, prefer it to be light - others want it to err on the side of safety). This is exactly the same as "real life": some societies have relatively light-touch policing which emphasises freedom (without being completely free to do anything you want), others have heavy policing which means everything is illegal unless there is a law allowing it. Both exist in the world, and proponents of both exist in every society today.

        The only issue here is the way to resolve this. Traditionally, once a society has made its choice on the freedom-safety axis, it creates laws and institutions that apply that decision to the people and businesses within its geographic control. On the internet, of course, it is (currently at least) corporations which are making those decisions, not countries - and whatever they decide, it will be different from most countries.

      2. walterp

        None of the "platforms" are "pure platforms", they are all publishers. They have always been publishers. They still have Section 230 coverage as publishers. Platform doesn't exist as a legal term in this context.

        The point of Section 230 was to allow publishers to look at user postings and edit the content without the threat of "you edited the content, now you are liable for everything on the site" hanging over each company.

        Publishers can be held liable for changes they make to posts that then create libel. What they can't be sued for is removing posts. Private companies removing posts from computers they own is not censorship.

  6. Hollerithevo

    A friend in need is a friend indeed

    I loved this bit:

    "However, Twitter and friends are concerned about the direction the EU is going when it comes to harmful content, and warn it may only increase the power of tech giants."

    Because Twitter is just the little fellow, the pal who's on our side?

    1. tiggity Silver badge

      Re: A friend in need is a friend indeed

      Ironically, Twitter's reporting tools get maliciously used (by various organised groups) to silence accounts that the complaining group disagrees with.

      Twitter should put its own house in order before moaning about censorship.

  7. jonathan keith

    If algorithms are the answer, what is the real question?

    Surely, if algorithms are able to identify particular types of material to the degree that they can enable

    • the "limiting [of the] number of people who encounter harmful content"
    • the "placing [of] a technological emphasis on visibility over prevalence” and
    • "[limiting] the discoverability of harmful content"

    then they could simply remove the flagged material in the first place, in line with the specific site's published, detailed content policy?

    My point being that a site's content policy is what will allow people to make a meaningful, informed choice.

    1. Graham Cobb Silver badge

      Re: If algorithms are the answer, what is the real question?

      Algorithms may be the answer but will never be perfect. So you don't want to pretend they can make a binary choice (ban it or spread it everywhere). You want the algorithm's output to be one factor that makes limited changes to visibility.

      Having an imperfect algorithm neither remove content nor promote it is a very reasonable answer: it slows the spread of the material, allowing time for human moderators (or even legal-type processes like complaints and appeals) to make decisions, and for public opinion to join in (if the platform does not promote the content but many people still do, then that indicates that at least many people think it is acceptable).
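
      (A hedged Python sketch of that idea: ranking weight decays with an upstream classifier's harm score instead of a binary keep-or-ban. The Item fields, threshold, and decay curve are all assumptions invented for illustration, not anything any platform has published.)

          from dataclasses import dataclass

          @dataclass
          class Item:
              title: str
              engagement: float  # baseline ranking signal, e.g. recent interactions
              harm_score: float  # imperfect classifier output in [0, 1]

          def visibility(item: Item, threshold: float = 0.5) -> float:
              """Scale ranking down as harm_score rises, rather than banning outright.

              Below the threshold the item ranks normally; above it, visibility
              decays linearly to 10% of baseline, slowing the spread while human
              moderators (or appeal processes) catch up.
              """
              if item.harm_score <= threshold:
                  return item.engagement
              excess = (item.harm_score - threshold) / (1.0 - threshold)
              return item.engagement * (1.0 - 0.9 * excess)

          if __name__ == "__main__":
              feed = [
                  Item("cat video", engagement=120.0, harm_score=0.05),
                  Item("dubious claim", engagement=300.0, harm_score=0.8),
              ]
              # The dubious item is still visible, but its weight drops from 300 to 138
              for item in sorted(feed, key=visibility, reverse=True):
                  print(item.title, round(visibility(item), 1))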

      1. jonathan keith

        Re: If algorithms are the answer, what is the real question?

        if the platform does not promote the content but many people still do, then that indicates that at least many people think it is acceptable.

        That's the part I have a problem with though - just because many people think that anti-vax or QAnon conspiracy theories, or far-right and religious extremism, are acceptable, doesn't make them any less dangerous or damaging to our societies.

        One of the purposes of 'government'*, after all, is to protect the citizenry from idiots and, if possible, to protect idiots from themselves.

        * Terms and conditions apply. May exclude theocracies, kleptocracies, autocracies, oligarchies, and socialist republics.

        1. Graham Cobb Silver badge

          Re: If algorithms are the answer, what is the real question?

          That is what the human moderators and pseudo-legal processes are for. The algorithm buys time and reduces the volume of material those processes need to handle. It cannot replace them.

          The alternatives result in true catastrophes, like the famous Vietnam war "Napalm Girl" photo being banned as "child porn".

  8. DCFusor

    "This statement is disputed" could realistically be applied to any statement whatsoever.

    People dispute all sorts of things almost all of us believe are true: from perpetual motion to the shape of the earth (and remember epicycles?), and a really long list of other things scientists are basically tired of having to debunk over and over. The Gish gallopers just wear you down... and then they say "no one refutes my argument", because they only bother to look at the last week, as if having been blown up a century ago or more doesn't count.

    The fact is, which statements get that tag, and which do not, reveals quite clearly the agenda of those who apply it. Sadly, this is mostly obvious only to those who somehow learned critical thinking, despite the lack of teaching it in our schools.

    No one seems to want to make the effort to connect the dots: if something is true, there are implications, and ditto if it isn't true. Two utterly conflicting things can't both be true, for example.

    Yet people are lazy enough to allow serious cognitive dissonance to survive and even be celebrated, when it should be the butt of jokes.

    1. diodesign (Written by Reg staff) Silver badge

      "This statement is disputed"

      IMHO it's just a polite, non-confrontational way of saying "this statement is false". It's a way for organizations to say that a statement is incorrect when they don't have the time and energy to get into a massive argument over it.

      It comes down to this: we've got to start drawing a line again between reality and fiction. It's been blurred by the fact that anyone on the internet can say anything they like and demand it be treated with exactly the same weight as other statements -- even if what they are saying is completely untrue. And when their statements are disputed, they scream censorship. No, it's because you're talking bollocks.

      We're in this mess at the moment because the line between reality and fiction is being blurred by those unhappy with the reality of the situation they find themselves in. They need something to blame. They need an explanation why things aren't going their way. They come up with a theory and they assert it as fact. It used to be one-night arguments in the pub. Now it's posts being shared to 100,000s of people if not more.

      Fine, if you want to, let's get down to some definition of what truth is. But we have to get there and stick to it, or nothing matters any more. Nothing at all.

      C.
