Can YouTube be held liable for pushing terror vids? Asking for a Supreme Court...

The US Supreme Court on Tuesday heard arguments in Gonzalez et al. v. Google, a case likely to reshape the internet if it goes against the search ad giant. Spoiler alert: This looks unlikely, based on the oral arguments, according to several legal experts. But further legal challenges await and the case is far from over. The …

  1. Dinanziame Silver badge
    WTF?

    Quite apart from the discussion on section 230, I'm rather amazed that the family decided that the party which should be held responsible for a terrorist attack was Google. That's the mother of all Hail Mary liability lawsuits.

    1. jake Silver badge
      Pint

      Congratulations!

      "That's the mother of all Hail Mary liability lawsuits."

      You have won the Understatement of the Week award!

    2. Throatwarbler Mangrove Silver badge
      Unhappy

      Google is legally assailable and they have tons of money.

      1. Anonymous Coward
        Anonymous Coward

        What? Are you implying the family is looking for a quick buck? Like the sucker who poured hot coffee on his lap then sued Mickey D's?

        1. Michael Wojcik Silver badge

          That is not what happened in that case. Not even vaguely. It's one of the most-discussed liability suits in US history, and there's no excuse for repeating this canard.

          1. J. Cook Silver badge
            Joke

            The canard is getting a little tired of being beaten to death, only to be raised from its peaceful slumber and beaten to death all over again, too...

        2. Bruce Ordway

          who poured hot coffee on his lap then sued Mickey D's

          Quick buck?

          I think you might find there was more to that story.

          https://www.vox.com/policy-and-politics/2016/12/16/13971482/mcdonalds-coffee-lawsuit-stella-liebeck

    3. Anonymous Coward
      Anonymous Coward

      More to the point

      Section 230 only applies inside the US.

      So why didn't the Gonzalez family sue in France?

      And who knew, despite what those supporting 230 claim, that the Internet somehow manages to work in the rest of the world, where 230 protection does not exist?

    4. Anonymous Coward
      Anonymous Coward

      Was it algorithmically promoted? (I don't know)

      "Being completely responsible for" and "having some liability for promoting" are not the same thing. The latter is probably what they are claiming. In my personal judgement, I would draw the line at algorithmically promoting material that urged violent action, but I have no idea if UTube did that or not.

      Personally I am in favor of repealing 230 completely - society was better before it, and internet content would be better without it - section 230 favors the lowest-common-denominator content, and financially the biggest companies. Without 230, tasteful curation stands a chance of becoming a valuable skill.

      1. stiine Silver badge

        Re: Was it algorithmically promoted? (I don't know)

        "valuable skill"?? I think you mean elgally required along with a $1B insurance policy.

        As an aside, am I the only YouTube user who has auto-play on only when watching a playlist on a channel, so that the rest of the time, when a video ends, it doesn't just start playing whatever YouTube's misguided algorithm thinks I'd enjoy watching next?

        1. Dinanziame Silver badge

          Re: Was it algorithmically promoted? (I don't know)

          There are people who watch YouTube like they used to watch TV; just automatically playing whatever's next. No, I don't understand either.

    5. ecofeco Silver badge

      That's your takeaway?

      Not "YouTube enabled terrorism"?

      Seriously?

      1. Dinanziame Silver badge

        What does "enabled" mean? Does Internet enable terrorism? Do terrorists use email? Do weapons manufacturer enable terrorism? For that matter, terrorists use cars; do car companies enable terrorism?

        YouTube represents what's on the internet, and there's a tiny sliver of that about terrorism. I find it ridiculous to claim that imperfect filtering of terrorism apology videos means "enabling" terrorism.

  2. Gene Cash Silver badge

    YouTube doesn't deserve section 230

    Considering how much YouTube demonetizes videos for the slightest of infractions, they are most certainly putting editorial and value judgements on videos.

    They found it offensive that someone was sticking their tongue out in the thumbnail on a video about hot sauces. If that's not "exercising editorial control" I don't know what is.

    1. jake Silver badge

      Re: YouTube doesn't deserve section 230

      So don't pay any attention to YouTube in your day to day life. It's not as if there were anything vital on it.

      1. Gene Cash Silver badge

        Re: YouTube doesn't deserve section 230

        Actually, yes there is. There's quite a few channels I follow with good content, and these people don't need to be penalized for making stuff.

        It's not as if there's any interesting entertainment in this shithole little town. They don't even have good burgers, which is a pretty difficult thing to screw up.

        1. LybsterRoy Silver badge

          Re: YouTube doesn't deserve section 230

          You have an interesting definition of the word "vital", which seems to overlap heavily with "mildly entertaining".

          1. nintendoeats

            Re: YouTube doesn't deserve section 230

            I think the point is that not restricting speech is vital, not that any one person is providing "vital" content.

            1. jake Silver badge

              Re: YouTube doesn't deserve section 230

              I wasn't talking about restricting anyone's speech.

              I was talking about choosing not to pay attention to a WWW site when one objects to the ownership thereof.

              Why the OP chose to go off on a tangent about the town he chooses to live in, and his own inability to make a burger is beyond me. Those subjects are not germane to anything I was talking about.

        2. jake Silver badge

          Re: YouTube doesn't deserve section 230

          "Actually, yes there is. There's quite a few channels I follow with good content, and these people don't need to be penalized for making stuff."

          Are you suggesting that somehow you owe these people a living? That you are morally forced to view their output on a daily (weekly, whatever) basis? Because I sure the fuck don't subscribe to that concept, for the simple reason that it doesn't (and can't) scale.

          Your living situation and dinner are under the control of one person. I suggest that you have him deal with it appropriately.

      2. katrinab Silver badge
        Flame

        Re: YouTube doesn't deserve section 230

        If someone is motivated by YouTube to go on a bombing spree, you are equally likely to be a victim of this whether you watch YouTube yourself or not.

        So that is not a reasonable argument to make.

        1. jake Silver badge

          Re: YouTube doesn't deserve section 230

          I'll subscribe to your theory when packs of kids are suddenly seen committing random acts of comedy after watching old M*A*S*H reruns.

          1. katrinab Silver badge
            WTF?

            Re: YouTube doesn't deserve section 230

            Children are inspired to commit random acts of comedy after watching comedy acts.

            I don't know anything about M*A*S*H specifically, but the general point is true.

            1. jake Silver badge

              Re: YouTube doesn't deserve section 230

              Children might imitate some comedic routines, but they don't go out of their way, travel any kind of distance, and inflict them on complete strangers. Instead, they make their immediate friends and family suffer. Usually repetitively. And it's not a random event. It's always mimicking exactly what they have seen.

          2. Felonmarmer Silver badge

            Re: YouTube doesn't deserve section 230

            How about Jackass reruns? Because that has definitely happened.

    2. nintendoeats

      Re: YouTube doesn't deserve section 230

      There is specific wording about this, which is an important part of the lawsuit:

      (c) Protection for “Good Samaritan” blocking and screening of offensive material

      (1) Treatment of publisher or speaker

      No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

      (2) Civil liability

      No provider or user of an interactive computer service shall be held liable on account of—

      (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or

      (B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).[1]

      Content moderation is specifically exempted on the understanding that it would degrade the user experience on many websites if the service were not able to remove some content. The law specifically recognizes that such moderation does not make a website an active participant in the creation of the content that is not removed. The essence of this lawsuit is the claim that promoting content with an algorithm DOES qualify as participating in the creation of the work (in this case YouTube videos promoting terrorism). The plaintiffs are claiming that while taking down the video may not make the content provider a publisher of the work, doing anything to increase the likelihood of the video being seen DOES.

      Having seen what YouTube looks like without the algorithm, I'm hoping that this reasoning will not become case law, though it perhaps may not be as disastrous as some are making it out to be.

    3. doublelayer Silver badge

      Re: YouTube doesn't deserve section 230

      "They found it offensive that someone was sticking their tongue out in the thumbnail on a video about hot sauces. If that's not "exercising editorial control" I don't know what is."

      I don't know for sure what that is, but I have a guess. Maybe they used some kind of machine learning model to guess whether something was offensive and it messed it up. Annoying for the creator, I'm sure, but not the heavy-handed censorship you make out. If you're posting here, you should already know that these programs are unreliable and that they are necessary. Ever had an email accidentally routed to the spam folder or dropped altogether by your mailserver? Did you turn off your spam filters altogether and remove the verification systems on the server? Me too and me neither, respectively.
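
      As a toy illustration of that trade-off (nothing to do with YouTube's or any mail provider's actual systems; the word list, scoring and threshold below are invented): any filter that assigns a score and cuts at a threshold will throw up some false positives, and raising the threshold just trades them for false negatives.

        # Toy score-and-threshold filter, to illustrate the false-positive trade-off.
        # The blocklist, scoring and threshold are made up for this example only.
        OFFENSIVE_TERMS = {"attack", "bomb", "kill"}

        def offensiveness_score(text: str) -> float:
            """Crude score: the fraction of words that appear on the blocklist."""
            words = text.lower().split()
            if not words:
                return 0.0
            hits = sum(1 for w in words if w.strip(".,!?") in OFFENSIVE_TERMS)
            return hits / len(words)

        def should_flag(text: str, threshold: float = 0.15) -> bool:
            return offensiveness_score(text) >= threshold

        print(should_flag("How to build a bomb and attack a crowd"))  # True: a real hit
        print(should_flag("This hot sauce is the bomb!"))             # True: a false positive
        print(should_flag("How to build a bomb", threshold=0.6))      # False: raise the bar and real hits get missed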

      1. jake Silver badge

        Re: YouTube doesn't deserve section 230

        "but not the heavy-handed censorship you make out."

        Look up the effects that such a "strike" has on the folks who place "monetized" videos on YouTube.

        To save you the trouble, it has a very chilling effect on their perceived freedom of speech.

        "Ever had an email accidentally routed to the spam folder or dropped altogether by your mailserver? Did you turn off your spam filters altogether and remove the verification systems on the server? Me too and me neither, respectively."

        Completely separate issue.

        1. doublelayer Silver badge

          Re: YouTube doesn't deserve section 230

          It's not a completely separate issue. It's a direct comparison. The spam is unwanted and the systems that guess at it still produce false positives. There is no way to have a perfect spam filter, but few people turn it off entirely to avoid the false positives. Online moderation uses similarly unreliable filters because, to the people who build and pay for the system, the consequences of no moderation, or of moderation only when their small manual moderation team gets the time to look at it, outweigh the cost of some false positives.

          I'm sure the people who end up experiencing the false positives are having a bad day. This is not great for them, and if there was a good solution to it, I'd advocate for it, but there is not. I'm also sure that some of them complain about this automatic moderation being a freedom of speech issue, because it's popular to say that anything that happens that you don't like is a violation of that right when it isn't. For example, if a platform leaves your video up but doesn't pay you for it, clearly they haven't blocked you from saying what you wanted (even if they deleted it entirely, that's their right as well). If it happens too often, it might be worth them considering diversifying their income stream away from one unreliable video platform, maybe having multiple unreliable platforms or self-hosting their videos and finding alternative methods of advertising. Or complaining more often to YouTube and hoping that improves their services. When a business is unreliable, I tend to try to reduce my reliance on it.

          1. jake Silver badge

            Re: YouTube doesn't deserve section 230

            Of course it is a completely separate issue.

            Spam is about abuse of resources, not content.

            Youtube filtering is only about content, the system is otherwise being used as intended.

            1. doublelayer Silver badge

              Re: YouTube doesn't deserve section 230

              "Spam is about abuse of resources, not content."

              I don't know what your attitude towards spam is, but mine is all about the content. It's unsolicited email, but there is some unsolicited mail (my friends saying hi, for example) that I want to read and some that I don't, and the thing that determines which is which is what the messages contain. The resources occupied on my server are relatively unimportant to me. Yes, the messages will take up a few kilobytes on my disk, but that's cheap and they're going to take up that space anyway while they wait for me to check the junk folder and do a clean. It's all about rejecting content I don't want to have shown to me. Not to mention that, even if you want to argue that I'm doing this for technical resources, any system that hosts user-generated content, especially video, pays for the disks on which it is stored and even more for the bandwidth of delivering it, and would save by deleting stuff it doesn't like. I think they're equally about the content, but if you want to bring resource usage into it, the resource usage for a video host is a lot more than a mailserver's.

              1. jake Silver badge
                Pint

                Re: YouTube doesn't deserve section 230

                "I don't know what you're attitude towards spam is"

                I've been an email admin since the 1970s, starting on BSD at Berkeley and Stanford. Spam has been anathema to me for far longer than I thought it would be.

                Spam is not about abuse ON the 'net, it is about abuse OF the 'net.

                You sending me a personal email telling me to fuck off is not spam. You sending to every email address on file with ElReg, telling us all to fuck off, is definitely spam. Not because of the content, but because you abused ElReg's resources to do your dirty work. (Just an example, relax, I don't think you would do either.)

                Cost-shifted advertising can be part of it, but isn't necessarily a requirement. But abuse of bandwidth and disk space always is. Note I'm not talking about my own personal, private accounts, rather I'm talking about simultaneously abusing tens of thousands, and sometimes hundreds of thousands or millions, of accounts. In these cases, the cost of spam can add up to thousands of dollars per month, to say nothing of the possibility of crashing corporate email and the like. This kind of crap never sees your personal disk space, it gets nuked in transit. And no, the content of such email is never even noticed, much less cared about.

                Note that the disks that store the videos you mention are probably not even housed in the same state as YouTube's email servers. They are completely different services, set up and run and maintained by different teams of people, unlike your little server with its owner/operator. Or my personal little servers, for that matter. But yes, on your own system, you should feel free to carry on as you see fit. Just don't try to equate "what I don't like" with spam, they are two completely different things ... although they certainly overlap in places.

                Regardless, an email from your brother-in-law touting a politician you abhor is NOT spam, not even if he sends you several similar messages over a week or two. Your email admin isn't going to drop everything BIL sends through the servers as spam just because he sends you an email. But YOU are free to block it, if you like. Personally, I'd just call the ass on the phone and tell him to knock it off ...

                Make sense? Clear as mud?

                Have a beer:-)

                1. doublelayer Silver badge

                  Re: YouTube doesn't deserve section 230

                  I disagree, and I'm not sure administrating a mailserver for a long time necessarily means you have the perfect definition of spam. Spam can be sent to lots of people at once, or it can be sent to one person. The thing making it spam is whether it has been requested by the person who's going to read it. This might mean that something is spam to me and not to you, but just because there exists one person who wants to read the message doesn't mean it can't be spam to anyone else who receives it.

                  "This kind of crap never sees your personal disk space, it gets nuked in transit."

                  That depends on how it's being relayed to me. Since I'm running my own mailserver, at least for some accounts, I'm the one nuking a lot of it in transit, but it's still using my bandwidth even if I've elected not to keep a copy. This is theoretically costing me, and if I were flooded with it, it would be a problem for me. Even without doing it, it's spam because it's being generated by bots that have identified that there's a server that accepts connections and send whatever crap they want, and I'm not interested in reading whatever they have to send. Even if bandwidth and CPU time were free, it would still be spam because the content is not desirable to me. For example, if my brother-in-law were sending messages to the point that I no longer wanted to see them and he didn't stop when asked, he would have become a spammer to me and my spam filtering systems would be used to prevent me having to see the unwanted messages. Perhaps you would like to split the definitions and call some things "spam" and others something else, maybe "junk mail". If that's not your approach, then we must agree to differ.

                  1. jake Silver badge

                    Re: YouTube doesn't deserve section 230

                    You would be very hard pressed to find a bona fide Email Systems Administrator who would agree to any definition of "spam" that did not include the word "bulk" (or a reasonable facsimile thereof). Likewise, you'll be equally hard pressed to find one who agrees that spam has anything to do with content.

                    You are free to disagree ... but experience suggests that changing technical terms to suit yourself is a fool's errand.

                    1. doublelayer Silver badge

                      Re: YouTube doesn't deserve section 230

                      I can only refer to the number of pieces of software which call themselves "spam filters", including software that has existed for decades, which scan message content to determine whether to send it to the inbox or sort it into another folder, often called "spam" although "junk" is also common. Every mail account that has come with its own folders by default tends to have such a folder, and many things scan messages to put them in there. This has existed for as long as I've used email, which is less time than you've spent administering mailservers, but it's not as if I made up the term yesterday. If spam must never mean unwanted messages unless they're also sent in bulk, the alternate definition is at least thirty years old and in common usage. We don't need to argue terminology, though, as you can take my original comment, swap "spam" for whatever term you think best fits the thing I'm talking about which remains filtered, and it still stands.

            2. Anonymous Coward
              Anonymous Coward

              Re: YouTube doesn't deserve section 230

              Spam is about abuse of resources, not content

              These days, I find "spam" is used to mean "unwanted content". A video about terrorism is spam. On YouTube, a porn video is spam. A search result that goes to a shady website is spam. Any unwanted email is spam; it doesn't matter if it was targeted at one single person or millions.

              1. jake Silver badge

                Re: YouTube doesn't deserve section 230

                ’The question is,’ said Alice, ‘whether you can make words mean so many different things.’

    4. stiine Silver badge
      Facepalm

      Re: YouTube doesn't deserve section 230

      So, if I'm a publisher and I tell you I'm not going to pay for your article, nor any royalties either, but I still publish it, am I censoring you?

      1. jake Silver badge

        Re: YouTube doesn't deserve section 230

        "So, if I'm a publisher and I tell you I'm not going to pay for your article, nor any royalties, either, but I still publish it, am i censoring you?"

        No, of course not. You are not a government, you cannot censor me. There are other places I can publish.

        However, if you continue making money from my work without paying me, I'd consider you an asshole, pull my work, and look for that different publisher.

        Which is a completely separate issue entirely, of course.

  3. ChoHag Silver badge

    > Creating liability for platforms that use algorithms to rank and moderate content will ultimately force websites to over-moderate or take a hands-off approach to content

    Creating liability for platforms that use moderation algorithms will ultimately force websites to be liable for their moderation algorithms.

    1. Zippy´s Sausage Factory

      Creating liability for platforms that use moderation algorithms will ultimately force websites to implement only government-approved moderation algorithms.

      1. Michael Wojcik Silver badge

        And will make it economically infeasible for newcomers to challenge established players.

        Revoking or limiting §230 is one of the stupider ideas in US politics these days, and that's not an easy bar to hit.

    2. Graham Cobb

      Creating liability for platforms that use moderation algorithms will ultimately force websites to be liable for their moderation algorithms.

      Which would just cause sites to not moderate at all. Even I, who am pretty free-speech-hard-core, wouldn't want that.

      Perfect moderation (by algorithms, humans, or the combination of the two which social media companies use) is impossible. I would expect any ElReg reader to know that even if politicians don't generally have the technology (or just life) experience to realise it.

      1. katrinab Silver badge

        There is a difference between deleting some dodgy material, but not all of it; and actively recommending dodgy material unprompted when visiting the front page of the website.

        Right now, YouTube is recommending me a load of videos from people telling everyone that the earth is round and that people who believe it is flat are idiots.

        I mean, the earth is round, and people who believe it is flat are indeed idiots, but the topic is not of much interest to me.

        1. An_Old_Dog Silver badge

          Conflation of Issues

          @katrinab: I wish I could upvote you again.

          The industry mouthpiece, Mr. Kovacevich, talks about "moderation", while the issue is algorithmic recommendation.

        2. jake Silver badge

          "but the topic is not of much interest to me."

          And now you know why visiting that side of YouTube is a waste of time for more than the obvious reasons.

          Suggestion: Nuke your YouTube cookies to clear the idiot material.

          My eldest nephew takes it one step further when the idiotic recommendations get out of hand ... He doesn't have a YouTube account, but he does have a couple dozen channels that he keeps an eye on. He keeps bookmarks to all of these in a specific folder in his browser. After nuking the cookies, he tells his browser to open all those channels. This resets the cookies to what is (for him!) a somewhat sane recommendations state. Yes, it takes a while to open all those pages (he takes the dog for a walk), but for him it's worth it. Obviously YMMV.
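
          For anyone who'd rather script that routine than click through a bookmarks folder, here's a minimal sketch. The channel URLs are placeholders, and clearing the YouTube cookies beforehand is assumed to be done in the browser itself (or with the browser's own tooling); this just re-opens the channels to re-seed the recommendations.

            # Rough sketch of the "re-seed the recommendations" routine described above.
            # Channel URLs are placeholders; clear the YouTube cookies in the browser first.
            import time
            import webbrowser

            CHANNELS = [
                "https://www.youtube.com/@example_channel_one",  # hypothetical
                "https://www.youtube.com/@example_channel_two",  # hypothetical
            ]

            def reseed(channels: list[str], delay_seconds: float = 2.0) -> None:
                """Open each bookmarked channel in the default browser, pausing
                between tabs so the browser (and YouTube) can keep up."""
                for url in channels:
                    webbrowser.open_new_tab(url)
                    time.sleep(delay_seconds)

            if __name__ == "__main__":
                reseed(CHANNELS)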

  4. Anonymous Coward
    Anonymous Coward

    Disappointed with the author's description

    Nohemi Gonzalez was described as having been murdered in "...a November 2015 terrorist attack in Paris..." This was one of three linked attacks which together formed the worst terrorist attack in France since the War. Gonzalez was one of thirteen killed in an attack on a restaurant. A bomb attack on the Stade de France during a football international also happened (the President of France was in attendance), but the biggest loss of life (and probably the best-known to readers) was the Bataclan Massacre, where over 90 people were murdered.

    I would prefer the article to be modified to reflect this. The existing description seems, to me, inadequate.

    1. Graham Cobb

      Re: Disappointed with the author's description

      Why? The article, and the case, are about YouTube's moderation. Not about the terrorists' crime.

      I certainly do not believe that sites should be required to "nerd harder" because of some despicability rating for the terrorist crime.

    2. doublelayer Silver badge

      Re: Disappointed with the author's description

      What part of the provided information was wrong? Was it not a terrorist attack? Was it not in Paris?

      There were other attacks there as well, but they aren't relevant to this article. That doesn't mean that they were worth forgetting or that the victims of them shouldn't be remembered, but the legal case which is the subject of this article is not related to them. If the other attacks have to be listed because they involved more victims, why not include other attacks that killed even more? Many countries have faced attacks that killed even more people, and those attacks are important too. They're not mentioned because they're not related to the news the article is talking about, and the other 2015 attacks are not either. It is no disrespect to the victims there.

  5. martinusher Silver badge

    SCOTUS not competent to rule -- by its own admission

    In other recent rulings that effectively gut gun control even further, the court said that it was basing its judgements on historical precedent. In other words, what was usual and customary during the Revolutionary period. This Originalist approach to rulings suggests to me that they're incompetent, by their own admission, to rule on anything invented since about 1800. At a pinch you could regard Internet provision as similar to making and selling printing presses, ink and paper back then, but realistically SCOTUS is completely out of its depth.

    I daresay that there are plenty of people out there who think that the likes of Google need 'reining in'; they're too big, too rich, too powerful, so they've got to be brought down. But it really is a case of "Be careful what you wish for..." -- there are no alternatives out there apart from governmental, or quasi-governmental, censorship. This, I'll be told, won't happen in our society because we're free -- except that I'll just point at Florida and say "Oh, yeah?". There are plenty of groups out there who'd love to be our moral arbiters; it's not like it's a new phenomenon, and giving them a legal standing to oversee the Internet is just asking for trouble. (I'd rather put up with a bit of ISIS propaganda ..... if you fall for that then it's not your morals that need protecting, it's your sanity!)

    1. Anonymous Coward
      Anonymous Coward

      Re: SCOTUS not competent to rule -- by its own admission

      I have insufficient info about this Google case to judge it, and I certainly do not "hate" Google. However, I think you are projecting this case onto a censorship vs. free-speech issue, while I think 230 is really 3-(or more) way: censorship vs. free speech vs. algorithmic promotion of hateful, dopamine-producing crap. That's 3, but 230 has many other faces - e.g., it is also used to market shoddy, dangerous goods irresponsibly online while brick and mortar cannot do that. There is no reason for that to continue except campaign contributions.

      It is such an ugly sprawling mess that I think that repealing it completely without loopholes is the best alternative.

      Online newspapers, not covered by 230 (except the comments), manage to survive and some of them even thrive, despite lots of factual mistakes, occasional lies, and "news" articles loaded with bias. Our laws protect all that kind of speech without 230. So I feel your fear of being strangled by censorship is a bit overboard.

      1. jake Silver badge

        Re: SCOTUS not competent to rule -- by its own admission

        Except it's not about censorship, no matter how much certain people want it to be.

        The alphagoo ijits can't censor anyone because they are not a government[0]. All they can do is stop people from publishing using their resources. There are plenty of other places one can publish.

        Unless you consider me a censor because I refuse to allow you to use my printing press to print your broadsheet.

        [0] Don't say"yet", not even in jest!

      2. Graham Cobb

        Re: SCOTUS not competent to rule -- by its own admission

        Sorry, your understanding of 230 is completely wrong.

        It has nothing to do with marketing dangerous goods. If that is illegal (which I believe it is) then it is still illegal. 230 does not protect any violations of federal law. It just says the person doing the selling is liable, not the market square in which the marketing takes place. If you want to stop dangerous goods being sold, then make sure that law enforcement have the necessary resources and incentives to address this, and the crimes have the necessary punishment to make it not worthwhile. Just like all other law enforcement.

        I don't understand your comment on online newspapers. Newspapers create content, and have extremely strong free speech rights - in practice stronger than everyday people. And, as you point out, their comments sections benefit from 230 providing their users with strong rights to freely discuss the paper's views and articles. Social media sites do not create content - they let other entities create and publish their content and give views and make comments.

        Google, YouTube, etc are just the latest version of bulletin boards. I can pin anything I like up on our village noticeboard - I remain responsible for it. The parish council does not become responsible for it.

  6. JohnSheeran

    It seems, at the core, that this whole thing is a simple case of freedom of speech and liability. While I tend to agree with what the Supreme Court is saying regarding the number of frivolous lawsuits, I think the case should be considered under the idea that a provider (Google) should be held liable IF they were alerted to questionable content and did nothing about it, or continued to promote it. If the content in question was never reported for any reason and got a lot of hits, then it's tough to fault their algorithms for promoting it. It doesn't seem difficult to have an algorithm that also considers other criteria that would allow them to detect and review questionable content.

    Protecting freedom of speech is paramount, of course. This case isn't so much about that as it is about liability (duh), but many will treat these ideas as one and the same.

  7. mark l 2 Silver badge

    For you to be recommended ISIS propaganda you really have to have looked for it or similar content. I don't rate the YouTube algorithm particularly highly, as I'm often recommended videos I've already watched, and considering there is something like 300 hours of video uploaded every minute, I'm sure there must be thousands of new videos that would be of interest that it doesn't seem to recommend.

    But what it does recommend to me is based on what videos I've previously watched, who I'm subscribed to, my Google search history, etc. And not once have I been recommended ISIS terrorism content in the several years I've been using the app / website.

    So unless these ISIS videos were promoted on the YT home page for people then I don't see how people are stumbling across them accidentally.

    1. Anonymous Coward
      Anonymous Coward

      الجهاد بالسيف ("jihad by the sword")

      I have no idea whether it was promoted or not - but your experience is not proof. The test would be to subscribe to lots of Arabic-oriented content, Wahhabi Islamic content, and see what happens. Maybe throw in the names of some fiery preachers or other keywords.

  8. Reginald O.

    Let them eat words...

    The big corps want absolute iron-fisted control and ownership of all data, including speech, on the internet while demanding equally bullet proof protection from the government for consequences.

    The Supreme Court will likely kick this can down the road again, because they readily admit total incompetence and stupidity when it comes to things with wires, tubes, and transistors.

    However, the Court could step up and take control of the internet, granting the PEOPLE their formerly inalienable rights while at the same time providing controls, such as the current free speech standards in effect everywhere EXCEPT the internet.

    The corporations have too much power over the people simply because they own the wires and tubes. That's not right.

  9. YARR
    Boffin

    "Recommendation" misnomer?

    recommend = v. to suggest that someone or something would be good or suitable

    Recommend implies a value judgement, but should any content provider be making a value judgement about content they haven't personally reviewed?

    If "recommended" videos were renamed to "related" videos then no value judgement is implied, so the content provider is not promoting the videos, just stating that they have some similarity with the content you are currently viewing.

  10. ecofeco Silver badge

    Appalling and beyond the pale

    It is appalling to see so many comments that think enabling terrorism is somehow consequence free.
