TikTok isn't protected by Section 230 in 10-year-old’s ‘blackout challenge’ death

A US appeals court has issued an opinion that could have wide-ranging implications for social media platforms, finding that content selected for users by TikTok's algorithms doesn't qualify for Section 230 protection. In an opinion [PDF] published today, a three-judge panel from the Third Circuit Court of Appeals in …

  1. JessicaRabbit

    It seems like a politically safe opinion with regard to TikTok, given the US's hostility towards it and its Chinese owners at present, but I wouldn't be surprised if there's a sudden backtracking of this once it threatens to affect good old American social media companies.

    1. Strong as Taishan Mountains

      Yeah, this will be tiptoed around whenever it bumps up against Meta or X.

      1. DS999 Silver badge

        There isn't any way they can create a precedent for TikTok that doesn't also affect Facebook and Twitter. They might WANT to, depending on political motivations, but the law can only be twisted so far. They have to come up with legal reasoning. I don't see how TikTok's "for you" is any different from the stuff that comes up in your Facebook feed from accounts you don't follow, or similar on Twitter.

        Now if there was a law saying Section 230 applies differently to US-owned and foreign-owned social media companies, that would provide the out, but such a law would have to pass. I imagine it could easily pass Congress, provided it wasn't held up by those who want to make bigger changes to Section 230 or eliminate it entirely.

        I remember the good old days of Facebook long ago, when if you'd seen everything posted by your friends and pages you followed it would simply tell you that's it. Now you can doomscroll forever, and it'll keep coming up with stuff it thinks you'll like. Or rather stuff you might like or might hate, but stuff you'll engage with.

        1. John Brown (no body) Silver badge

          Based on the reasoning that algorithmically selected content is not covered, I wonder when some enterprising lawyer will try it with "AI" generated content being the responsibility of the "AI" creators?

          1. DS999 Silver badge

            That isn't an argument that Facebook or Twitter could try, since they both have their own in-house AI. Not sure about TikTok, but the same is likely the case.

            But I imagine someone using OpenAI or whatever for their business will try to excuse themselves when illegal things happen "because the AI did that, not us!"

          2. Doctor Syntax Silver badge

            The Canadian airline case made it the responsibility of whoever deployed the AI. I don't know whether that was at a court high enough to set a precedent, or whether Canadian precedents would be followed by a US court. In any event it seems a reasonable line to take.

  2. The Dogs Meevonks Silver badge

    Excuse me whilst I......

    Bwahahahahahahahahahahahahahahahahaha

    hahahahahahahahahahahahahahahahahahaha

    hahahahahahahahahahahahahahahahahahaha

    C'mon... start nailing these fucking anti-social platforms to the wall for their utter failure to deal with the hate, bile, misinformation and dangerous content they profit from, promote and push to others....

    I bet they get fined a tiny fraction of their income... but if this goes to a jury trial... the damages could result in hundreds of millions in compensation and punitive damages... then open the floodgates to more lawsuits.

    The only thing these bloodsucking leeches understand is money... let's make it unprofitable for them to survive without dealing with the problems they've created.

    1. Jellied Eel Silver badge

      Re: Excuse me whilst I......

      C'mon... start nailing these fucking anti-social platforms to the wall for their utter failure to deal with the hate, bile, misinformation and dangerous content they profit from, promote and push to others....

      I just hope it brings about the end of 'AI'-generated 'recommendations'. Earlier today I made the mistake of watching a few mins of a popular YT personality, ie one that has somehow ended up with 10m+ subscribers. Turned out to be the usual clickbaity vid with lots of style and virtually no content. Now half my recommended videos are from this mu.. person. I'm also learning to hate the YT trend of clickbait combined with 'surprised Pikachu face', or as Simon Whistler calls it, 'soyface'.

      AI may conceivably be useful if it could let me filter those out.

      If this case leads to a long-overdue feature, like allowing me to express my own interests, anti-social media companies could provide themselves with some protection, if I picked topics like 'how to win a Darwin Award'.

      1. 0laf Silver badge
        Flame

        Re: Excuse me whilst I......

        I no longer engage with these platforms unless there is something very specific I need to see. I won't create an account, and I'll use ad blockers if I have to go there.

        But now, I don't YT, I don't X, I don't TikTok, and I only visit FB when I have to engage with the local groups (school etc) and the local pages we refer to as the "[insert town] bigotry and hatred group".

        Using YT used to be actually interesting and even entertaining; now it's filled with such a staggering volume of junk, and monetised to such a grasping level, that it's nearly impossible to find anything of value, and when you do, the actual viewing experience is utterly awful.

        What I don't really understand is why such shit products continue to exist.

        But then the majority of the web is garbage now; you can't even search Autotrader without your results being polluted by Cinch listings, and of course there is no way to screen them out to only view local offers.

        1. MachDiamond Silver badge

          Re: Excuse me whilst I......

          "Using YT used to be actually interesting and even entertaining"

          I find YT to be incredibly useful, but I don't follow their recommendations as they are useless. Just because I want to see how to change the O2 sensor on my car doesn't mean I want to "pimp my ride". Before I deleted the rear seat and built a platform so I could do some car camping and transport things more easily, I was able to see what mistakes other people were making. The list goes ever on. I don't bother with "influencers" and entertainers.

      2. LybsterRoy Silver badge

        Re: Excuse me whilst I......

        I don't seem to get these recommendations - I suppose you have to be logged in for it?

        1. MachDiamond Silver badge

          Re: Excuse me whilst I......

          "I don't seem to get these recommendations - I suppose you have to be logged in for it?"

          When YT was bought by Google and The Big Evil put out their horrible EULA, I didn't sign up for anything further that Google had and closed my Gmail account a couple of years ago. I emphatically do not want to be a Google product and constantly surveilled by them while on-line. There's also no good reason to be signed in. The age restricted content can be unrestricted if you know how.

    2. BasicReality

      Re: Excuse me whilst I......

      I look forward to rules that allow social media to be fined for what the government calls "misinformation" or "hate speech." Perhaps under the next Trump administration we could go after those who claim climate change is caused by people. Or maybe fine companies for saying covid shots are safe and effective.

      If the government decides what is credible or allowed speech, be careful when you don't like the side that gets elected.

      Better off simply making a blanket requirement that all speech is allowed, except for extremely minimal limits such as actively calling for violence against someone or doxxing a person.

      1. Anonymous Coward
        Anonymous Coward

        Re: Excuse me whilst I......

        Netflix recommended Fast and Furious WTF and this influenced me to drive my Astra Diesel at 100MPH through a bus queue of nuns

        1. heyrick Silver badge
          Coat

          Re: Excuse me whilst I......

          Well, it's a shortcut straight to heaven, what are they whinging about?

        2. Anonymous Coward
          Anonymous Coward

          Re: Excuse me whilst I......

          Weird how you don't know the difference between fiction and lies reported to be facts.

          1. Doctor Syntax Silver badge

            Re: Excuse me whilst I......

            I take it you're from the US.

        3. MyffyW Silver badge

          Re: Excuse me whilst I......

          I watched The Nun and this influenced me to flagellate myself for buying a Vauxhall Astra, can I make a claim?

        4. David 132 Silver badge
          Happy

          Re: Excuse me whilst I......

          Now we know you’re a hallucinating AI model, because there’s no way an Astra diesel could get up to 100mph even if you pushed it out of a cruising Hercules.

          1. 0laf Silver badge

            Re: Excuse me whilst I......

            Unless it's a Vauxhall Astra van. Fastest car on earth

            If you were doing 150mph on the autobahn in your exec saloon, you'd still have a plumber in a manky Vauxhall Astra van flashing you to get past

      2. martinusher Silver badge

        Re: Excuse me whilst I......

        I have a solution to this problem that works for me.

        I just don't watch the stuff. You should try it -- just because some algorithm / person / Centauran / whatever puts stuff in a "For You" list doesn't mean that you actually have to watch the stuff.

        1. John Brown (no body) Silver badge

          Re: Excuse me whilst I......

          Yeahbut, sometimes the thumbnail and tagline is just sooooo enticing!!! And it only needs that one moment of weakness and down the rabbit hole you go :-)

          Seriously though, sometimes it really does sound interesting, right up to about 5 seconds after you've clicked the link to find a stream of Ken Burns-effect slideshow photos, with the standard (free? open source?) computer-generated US documentary narrator voice "reading" a badly edited script, mispronouncing names and abbreviations at every opportunity and inserting strange pauses in odd places due to poor or misused punctuation in the text.

        2. The Dogs Meevonks Silver badge

          Re: Excuse me whilst I......

          I've gone one further... I don't use social media at all. When someone tells me something... my first question is normally 'where did you get that information?' and if the answer is any social media site... I know to treat it with the distrust it deserves.

          I have used them in the past... I was asked to use facebook in 2007 by work colleagues, deleted it in 2008, asked to use it by friends in late 2010, deleted in early 2012. Twitter I used from around 2011 as an ex said it was her main platform of choice... but I only used it to catch up with her every few yrs and deleted it in 2018.

          The only one I enjoyed was G+ because whilst the data harvesting was still going on... there were no ads, no stream of shit pushed in your face from losers and fakes. It was all about the people you chose to follow and there was no forced follow back feature like with others.

          I now only use Mastodon, and in terms of the engagement and people I've found... it's great. No algorithm, real engagement from real people... some claim it's crap but these are the same people who don't actually engage with others. The types who shitpost links for likes and expect everyone else to do their work for them and spread anything they post. I actually talk to the people who comment on my posts and I comment on theirs... ya know... like normal people being 'social' should.

      3. Anonymous Coward
        Anonymous Coward

        Re: Excuse me whilst I......

        That would be all fine and dandy if morons didn't believe it... Of course, if that was the case, these propaganda outlets wouldn't exist.

        I find it odd that you seem unable to distinguish the difference between blatant lies and peer reviewed truth. Seems the propaganda is working on you.

      4. TheMeerkat Silver badge

        Re: Excuse me whilst I......

        Why are there so many people who think the government should decide what is “allowed speech” and what is not?

        I thought the majority would understand that it is bad, but it appears that the majority would happily live in a totalitarian state.

      5. The Dogs Meevonks Silver badge

        Re: Excuse me whilst I......

        I know that the US loves to bang on about 'freedom of speech'

        But that doesn't include 'freedom from the consequences of speech'

        Many countries have laws regarding hate speech and dangerous content already. The spread of misinformation is what led to the recent riots in the UK... So let's clarify that part a little and say 'lies designed to misinform and mislead'

        I'm no legal expert... I just want to see ALL of these platforms reined in and forced to become less toxic wastelands of hate and shit.

        1. Anonymous Coward
          Anonymous Coward

          Re: Excuse me whilst I......

          No, what led to the riots is decades of ignoring the concerns of the native working class in the face of mass importation of cheap labour, which suppresses wages, the slow ethnic cleansing of entire towns by that same imported labour, and the complete inaction of the state in the face of the growing violence, rape, abuse, and murder carried out by those same imported people. The sparking incident is largely irrelevant to those decades of resentment. It could just as easily have been any of the violent incidents that happened in the weeks before or after (the suspects in which are unlikely to go to trial before the new year, unlike the rioters, who were unjustly rushed through the courts within days or even hours of arrest). The fact it was that one is pure happenstance.

      6. MyffyW Silver badge

        Re: Excuse me whilst I......

        I think there is a spectrum on which "not serving up snuff advice to 10 year olds" can be outlawed without infringing anybody's right to believe and say what they like about established, proven science.

    3. John Brown (no body) Silver badge

      Re: Excuse me whilst I......

      "...start nailing these fucking anti-social platforms to the wall...fined a tiny fraction of their income... but if this goes to a jury trial... the damages could result in hundreds of millions in compensation and punitive damages... then open the floodgates to more lawsuits."

      This is why they "settle out of court" with strongly bound NDAs. The aggrieved party gets "quick justice", the accused gets to say "who, me? not guilty, no fault", and the merry-go-round continues. None of the incumbents ever want to go to court over these sorts of cases. The same applies to Ts&Cs. No one wants them tested in a court.

      They only ever seem to end up in court when Govt. wants a whipping boy or it's another $BigCorp with equally deep pockets and both see the advantage of "winning", or at least not losing.

      1. MachDiamond Silver badge

        Re: Excuse me whilst I......

        "This is why they "settle out of court" with strongly bound NDAs. "

        There's "common sense" and then there is precedent. Once a precedent is set, the lawyers will pounce on it, creating more precedents to cite. Companies understand that if they lose once, they'll never have a chance to win in future. Cheaper to pay the odd plaintiff off to make them go away and end the news cycle before it gets going. If the NDA is very clear that if they talk, they have to give the money back, some people are smart enough to keep quiet. Some aren't, and there have been some cases of that. If SuperMegaGiant, Inc takes you to court for violating an NDA, chances are you will go down in flames and they might get some of their money back. The person will also be reminded that the case doesn't void the NDA, so if they talk some more, they can wind up in court some more.

  3. Crypto Monad Silver badge

    At last, some common sense

    These platforms intentionally select content that they expect users will "engage with".

    So they select content about asphyxiation, and users actually read it and/or try what is suggested. Who'd have thought it?

    1. Anonymous Coward
      Anonymous Coward

      Re: At last, some common sense

      To be honest, 40 years ago we really would not have credited that people would asphyxiate themselves after watching a vid, nor that a major commercial industry would legally exist facilitating it.

      1. Jadith

        Re: At last, some common sense

        Sorry, not sure if this is supposed to be sarcasm or what. Forty years ago, almost exactly, was right in the middle of the 'Satanic Panic'; there's even an awful movie starring Tom Hanks about it.

        1. stiine Silver badge
          FAIL

          Re: At last, some common sense

          Do you know how much trouble you could get yourself into by spinning an Iron Maiden record backwards on dad's turntable?

          The problem back then was Tipper Gore, who I think should have been hanged on the white house lawn.

          1. Natalie Gritpants Jr

            Re: At last, some common sense

            Yes I do. The angle of the needle arm will cause any dirt particles to be driven deeper into the vinyl. Those clicks will never come out and your dad will know.

        2. Joe W Silver badge

          Re: At last, some common sense

          And asphyxiating "challenges" (though they were not called that) did exist in schools back then as well... nothing new under the sun?

          1. 0laf Silver badge

            Re: At last, some common sense

            But the numbers reached were much smaller, you'd be less likely to be alone while trying, and possibly your friends might try to save you. Dare I also say, due to exposure to actual real life, not so many of us would be so stupid as to try. Plus we were doing better things, like trying to light our own farts

      2. PRR Silver badge
        FAIL

        Re: At last, some common sense

        > 40 years ago we really would not have credited that people would asphyxiate themselves after watching a vid

        1965. A TV kids-show host told viewers to take money from parents' pants and pocketbooks, "....and mail them to me..." While his take was slim or none, "in 1965 plenty of adults were livid at the thought of a TV personality's crassly manipulating children for commercial gain".

        Green Mail -- Wiki

        His low take-up may have been due to not giving an address or where to find the postage stamps. I don't tic-toc, but I know today's YouTube would swiftly find detailed (if flawed) instructions on any stupid activity.

        And yes, in 1965 I was just about at the age of the child in this case. "Saved" by being in a different city with better New Year Morning shows than Soupy Sales.

        1. MachDiamond Silver badge

          Re: At last, some common sense

          "1965. A TV kids-show host told viewers to take money from parents' pants and pocketbooks, "....and mail them to me..." While his take was slim or none, "in 1965 plenty of adults were livid at the thought of a TV personality's crassly manipulating children for commercial gain"."

          When I was a small child, we had one TV in the house, dad controlled what was on and I was the remote control. I could watch Saturday morning cartoons and a few other things if the weather was bad, but the parental units monitored the content. Today, parents hand their kids their first unlimited service mobile at 5 yo when it's even a bad idea to have a TV in the kids room where what they watch isn't known.

          There's been some action in the US with requiring an absence of mobile phones at school during the school day. At least 5 states have that on the books. I think that's a very good idea. There's little reason for a child to have a phone at school during class. The push back from some parents is that they want their kids to be able to call them if there is a problem, but schools don't want kids to have phones for the same reason. Not that staff is trying to hide something, but that if there is a discipline issue, it's often due to phone use and the last thing they want is some hysterical parent on the phone screaming obscenities at them along with the kid's foul mouth. It's not always the case, but in-person conferences with parents can be more manageable. Many people have better manners when they are face to face with somebody. That might be fading, though.

          Getting back to the point, these challenges can cause damage when parents have done things to exclude themselves from the child's world to yet another degree. I can remember a few kids in my neighborhood whose parents interviewed the other kids and wouldn't let their little darlings play with any other kids that didn't meet the standard.

    2. mattaw2001

      Re: At last, some common sense

      We already have hundreds of years of law for this - "letters to the editor" pages. It's great to see the rules being consistently applied.

      Especially the two-faced "it's an algorithm, we can't control it" and yet at the same time "it's an algorithm, let's make giant piles of money from it".

      I've never understood how "it's just the algorithm, bro - waaay mysterious" in any way allowed them to escape responsibility for its choices.

      1. spacecadet66 Bronze badge

        Re: At last, some common sense

        Right? It's an algorithm...that was written by humans at the bidding of other, richer, humans. It wasn't beamed down from Arcturus, it didn't come off a mountain on stone tablets. Its provenance is well known and straightforward.

    3. Doctor Syntax Silver badge

      Re: At last, some common sense

      "So they select content about asphyxiation, and users actually read it and/or try what is suggested. Who'd have thought it?"

      Anyone who had experience of dealing with sudden deaths. Distinguishing between what's generally termed sexual asphyxia and suicide was a problem well before social media, the web or even the internet existed.

      1. Jellied Eel Silver badge

        Re: At last, some common sense

        Anyone who had experience of dealing with sudden deaths. Distinguishing between what's generally termed sexual asphyxia and suicide was a problem well before social media, the web or even the internet existed.

        Ah, good ol' autoerotic asphyxiation. One of those curious and high-risk activities that can be practised alone, or with friends. And it walks the fine line between la petite mort and meeting Mort. It's one of those things where sites like YT and other social media can be good, or bad. So anyone searching for that should ideally be directed to content showing why it's a really bad idea & can be extremely dangerous.

        But people are weird. This is one of my favorite YT channels, and probably NSF.. anyone who's just eaten.

        https://www.youtube.com/watch?v=XAXfqISRZT4

        A doctor describing what happened when a woman decided to do some extreme weight loss and bought tapeworm eggs online. Which is rather shocking, given people are recommending and selling this as a diet plan. But it also got me wondering about the legalities: given 'health' supplements are largely unregulated, is this legal? If yes, it probably shouldn't be. And then there's how to stop this, when victims are probably being hooked by dieting groups, then caught via direct messages and 'friends' probably flogging this diet plan via affiliate links.

        So it's a massive battle for health and law enforcement to try to catch & deal with dangerous stuff like this. Recently YT has started adding tags to qualified medical practitioners, so at least some of the content may be trusted to some degree.

        1. Doctor Syntax Silver badge

          Re: At last, some common sense

          "Ah, good'ol autoerotic asphyxiation."

          Nothing good about it when it eventually ends up with the family visiting the coroner's court. It's not a situation that I had to investigate myself but I've had colleagues who did.

          1. Jellied Eel Silver badge

            Re: At last, some common sense

            Nothing good about it when it eventually ends up with the family visiting the coroner's court. It's not a situation that I had to investigate myself but I've had colleagues who did.

            That was semi-sarcasm/dark humour. Meet Mort: "That was silly, wasn't it? And why the orange?". Pre-Internet, it's one of those things that people outside the BDSM community probably wouldn't have heard about, and if they did, they may have been told about the risks. Then more people might have heard about it after the sensational/salacious reporting around the orange case. Now, thanks to the Internet, more thrill seekers might discover it but not be aware of the risks. So the question is how 'social' media companies can either raise awareness of the dangers or prevent kids from trying stupid stuff. Which is also a parental responsibility, but kids are increasingly tech savvy and find their way around attempts to block content.

            It's also part of a worrying trend. People chasing monetisation, influence and free stuff are doing increasingly dangerous 'extreme' stuff for views. So, I used to cave, and dive. Putting the two together is a recipe for disaster without proper training & kit. But it's exciting, it gets views, people may copy it, and some will inevitably get trapped and die. And then endanger rescue workers doing body recoveries. It's one of those areas where I think the 'social' media companies need to do more to warn people about the dangers.

  4. BasicReality

    Not a bad ruling, now we need to get the other side. Allow all speech and keep protections, but when you decide to block certain topics, now you're a publisher and liable.

    1. Orv Silver badge

      Except Section 230 was specifically written to allow platforms to make moderation decisions without losing their liability protection.

      It was included in the Communications Decency Act, which required platforms to take down certain types of content. Platforms pointed out that, under existing precedents, this would open them up to liability. Up until that point the accepted doctrine was what you suggest -- that platforms could either moderate nothing, or take responsibility for everything. Since the CDA effectively *required* moderation, this risked making it completely infeasible for platforms to have user-generated content at all. Section 230 was meant to remedy that.

      TL;DR: remove the Section 230 shield and platforms will have no choice but to stop distributing any user-supplied content and only publish their own curated stuff.

      1. John Brown (no body) Silver badge

        Luckily, removing the shield isn't the intent of the court ruling. The intent is to rein in the feature creep that social media has incrementally gained over time by constantly pushing the boundaries.

        1. doublelayer Silver badge

          It is not the intent of the ruling, though it wouldn't be hard to extrapolate it into doing that anyway. However, it is exactly what the original post in this thread would get if their idea was implemented. From previous posts, I'm guessing they're one of the people who don't like how moderators removed or posted additional information around something they agree with, and they want that to be illegal, but they haven't considered that the law they're trying to gut is the main reason that any similar posts are available at all.

        2. Orv Silver badge

          The effect would be the same. There's no getting around using "algorithms" to decide what people see. "Sort by date" is an algorithm. So is "select a video randomly," for that matter.

          Techdirt's post on this gets at the problems pretty well:

          https://www.techdirt.com/2024/08/29/third-circuits-section-230-tiktok-ruling-deliberately-ignores-precedent-defies-logic/
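
          The point that "sort by date" and "pick at random" are themselves algorithms can be sketched in a few lines of Python. This is a toy illustration only; the post records and field names below are invented, not anything from TikTok or the ruling:

```python
import random
from datetime import datetime

# Hypothetical post records; the field names are made up for illustration.
posts = [
    {"title": "B", "posted": datetime(2024, 8, 1)},
    {"title": "A", "posted": datetime(2024, 8, 29)},
    {"title": "C", "posted": datetime(2024, 6, 15)},
]

# "Sort by date" is itself an algorithm: a deterministic rule that
# decides what a user sees first, with no engagement profiling at all.
reverse_chronological = sorted(posts, key=lambda p: p["posted"], reverse=True)

# "Select a video randomly" is an algorithm too.
random_pick = random.choice(posts)

print([p["title"] for p in reverse_chronological])  # → ['A', 'B', 'C']
```

          Even the most "neutral" feed is still the output of a rule someone coded; the legal question is whether that distinction matters under Section 230.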

          1. 96percentchimp

            Neither sorting by date nor random selection is an editorial choice. They might even be user choices. An algorithm designed to encourage engagement based on a user profile is, without any doubt, an editorial choice which meets the First Amendment tests. If it qualifies for free speech, then it qualifies for the responsibilities that come with those rights.

      2. MachDiamond Silver badge

        "TLDR; remove the Section 230 shield and platforms will have no choice but to stop distributing any user-supplied content and only publish their own curated stuff."

        In this case it's the platform curating a list of recommendations, using some sort of automation, to keep people from leaving the website. A human curator might down-check "challenges" where people would put themselves or others in harm's way, but that would cost money and give the appearance of advocating dangerous behaviour, whereas there's a possible out if a machine does the same thing.

        There's a choice of not making recommendations at all, or of setting up to only offer more of the same: if you click on a video of the Rolling Stones, all of the recommendations are also videos of the Rolling Stones (from an official channel).

        The influencer problem stems from the promotion algorithms. I've chided a person whose content I enjoy, on the business of professional commercial photography, for going on about how he has to craft titles and photo thumbnails to gain more viewers. I find the reasoning not well thought out. His content is not general-purpose T&A but something that's only going to appeal to a small number of people. Plenty of people who are truly interested in the sort of content he puts out will find his channel. It's the same as me not wanting to drive millions of people to my web site. That would be a negative, as it might use more bandwidth than I'm allocated and do me no good at all. If I only got 10 views a month and 2 people contacted me for work, that's far better than 100,000 visitors with 10,000 stupid questions and 2 jobs. I'm a quality-over-quantity sort of person. I also have no time to delete 10,000 stupid questions.

    2. doublelayer Silver badge

      "now we need to get the other side. Allow all speech and keep protections, but when you decide to block certain topics, now you're a publisher and liable."

      You do realize that, with something that stupid, the law would then say that everything at all would have to stay posted to keep the protections, i.e. unless you keep up the terrorist beheading videos, you're liable. You would effectively prevent all public posting, including these forums, except on those places so extreme that they don't mind hosting literally anything, no matter how illegal, that someone decided to upload. That goes for the places you like as well. Maybe you're into some conspiracy theories and you're tired of those being moderated. Sorry, but if the sites that are keeping those up ban anything, be it even more extreme theories that you don't believe in or people disagreeing, they can now be sued for anything they keep up, meaning they're much less likely to keep up the stuff you want to see, or to get away with it if they do.

  5. Anonymous Coward
    Anonymous Coward

    Social media companies are bars that open 24/7, have no bouncers, keep the fire escapes locked, and employ legally blind bar staff so they can use that as an excuse for serving drinks to kids in school uniform.

    Then they claim no responsibility for the fights, deaths and people pissing on streets for miles around them.

    It doesn't fly in meatspace, so it's long past time that they had adequate bouncers and fire escapes, and if that puts them out of business, well, that's what happens to bars and clubs too.

    1. heyrick Silver badge
      Happy

      When I was at school (so late 80s), a local pub was quite happy to serve drinks to kids in school uniform. The guy's logic was "they are here enjoying some watered down beer and not god only knows where getting up to mischief". The local rozzers turned a blind eye so long as we (mostly) behaved because, well, the guy had a point. Plus being inexperienced children we were dumb enough to think that was real beer, though I usually asked for a tea because, ugh, beer tastes awful, and I'm one of those people who believes the only correct response to anything is to put the kettle on...

      1. John Brown (no body) Silver badge

        And anyway, theoretically, one could be a school pupil, dressed in school uniform at the pub, and be over 18 if in upper 6th doing A levels :-)

      2. I ain't Spartacus Gold badge
        Happy

        My school had a similar pub next door. Although, we were able to drink there because the landlord wasn’t very nice, and nobody else went in.

        That pub is now a famous Michelin-starred restaurant. I tried to book a table a few years ago, and there was an 18-month waiting list. So it was easier to get in when I was 16, in school uniform, and it was illegal…

      3. collinsl Silver badge

        > "they are here enjoying some watered down beer and not god only knows where getting up to mischief"

        It's all for the greater good

      4. 0laf Silver badge
        Pint

        Also had a local pub which was tolerant to the underage as long as you were no trouble. It was common then to be taken to the pub aged 15/16 under the wing of older mates. School uniform might have been pushing it a bit but the landlord would buy us a drink on our 18th birthdays.

      5. spacecadet66 Bronze badge

        I remember the bar I hung out in when I was 14 and 15. Fun place, I don't know how they got away with serving crowds of blatantly underage drinkers for so long but, given where this was, I assume it was probably good old fashioned bribery.

      6. MrReynolds2U

        Same here

        I used to go into the pub next door to my school occasionally when I had a gap in the day. We'd play pool and have a pint.

        The unwritten rule was that you had to take your school tie off first. That way you were just someone wearing a shirt and trousers.

        It made the Maths lesson on return to school a lot more fun.

        1. spacecadet66 Bronze badge

          Re: Same here

          That's just good sense, going in with your bloodstream at the Ballmer Peak.

    2. Orv Silver badge

      I tend to agree, but this case isn't about whether TikTok should have had better age verification.

  6. prh99

    Yet another court inventing a section 230 exception.

    The Supreme Court has also said in various cases that it's free speech when a site or service picks what content it will allow... moderation, arguably a form of curation. So by this logic a site is liable if it moderates? That's exactly the situation Section 230 was enacted to prevent (undoing the Prodigy decision).

    The internet isn't safe for kids, and parents need to parent, not use TikTok etc. as a babysitter.

    1. Headley_Grange Silver badge

      "The internet isn't safe kids and parents need to parent..."

      So if a kid is unlucky enough to have bad parents then it's just tough shit and they deserve to be left to the mercy of every nasty nutter on the internet?

      1. prh99

        That's what CPS is for. A 10 year old was left to browse TikTok unrestricted and unsupervised. If TikTok is liable for promoting the video, the parents should be liable for negligence.

        1. Doctor Syntax Silver badge

          Alternative view: Tik Tok and the parents should be liable. The two are not mutually exclusive.

      2. heyrick Silver badge

        "then it's just tough shit and they deserve be left to mercy of every nasty nutter on the internet?"

        Not so different to real life if a child decides to wander off and try taking themselves to look around a city. The child might come back with a load of new experiences. Or they might vanish and never be seen again.

        In either case, as the above poster notes, there is surely an aspect of parental neglect in failing to, you know, parent.

        1. Jamie Jones Silver badge

          Yes, I agree there needs to be better parent responsibility, but to add to your analogy, if someone does kidnap a kid, and gets caught, they are in trouble. They can't get off scot-free by saying "it's the parents fault for not taking care of them".

          1. doublelayer Silver badge

            That's true of intentional abduction, but it isn't true of accidental harm. If a child wanders away and falls down some stairs, we don't blame the owner of the stairs for not having posted a guard to monitor for unaccompanied children. There is a limit to how much we need to modify public spaces, including the internet, to attempt to get safety that will not be achieved. No matter how much we do, there will be things on the internet that a young child should not see.

            1. Doctor Syntax Silver badge

              In this case it seems the intentional abduction is the closer analogy.

              1. collinsl Silver badge

                Indeed. To stretch the previous analogy, this is like having a flight of stairs with no handrails and a sign saying "try the stair surfing challenge, grease is on your right"

      3. stiine Silver badge
        Happy

        Yes. Sucks to be them, but then I DGAF.

    2. John Brown (no body) Silver badge

      "The supreme court also said in various cases that it's free speech when a site or service picks what content they will allow...moderation, arguably a form of curation. So by this logic a site is liable if they moderate? A situation 230 was enacted to prevent (undoing the Prodigy decision)."

      FWIW, it's the opposite. Moderation is limiting what you can see. This ruling is about the company choosing and selecting what you DO see. The difference may seem subtle, but it's vital. A little like the UK offence of TWOCcing compared to theft of a motor vehicle: Taking WithOut Consent isn't theft because there's no intent to permanently deprive, unlike theft, where the vehicle will be sold on or stripped for parts. (TWOC can also be applied to borrowing dad's car without asking, especially if you get in an accident, or get pulled for no insurance etc.)

      ie seemingly very similar on the surface, but different outcomes and treated differently in law.

    3. Number 39

      Wasn't it recommended? How is that not curation?

      1. MachDiamond Silver badge

        "Wasn't it recommended? How is that not curation?"

        I'd agree that a recommendation is curation, but those aren't moderation. Moderation is also not censorship.

    4. Falmari Silver badge

      @prh99 "The supreme court also said in various cases that it's free speech when a site or service picks what content they will allow...moderation, arguably a form of curation."

      Moderation is when a site or service removes content they do not allow. Curation is when a site or service selects its content. When a site or service recommends content to its users it is selecting content and that is curation.

      When a site or service curates content for its users then it is now acting as a publisher and therefore liable for that content.

  7. Seajay
    Childcatcher

    Age, Parents, Identity & Privacy

    So there are a number of things that are not easy in this. First, the age: a 10 year old child shouldn't be on TikTok at all - it has a minimum age of 13, and various restrictions for teenagers above that age. Anyone younger is supposed to go to "TikTok for younger users", a curated area...

    https://newsroom.tiktok.com/en-us/our-work-to-design-an-age-appropriate-experience-on-tiktok

    So had this child lied about their age - had they basically said they were an adult? Had the parents allowed it? Did the parent know or care what the child was doing?

    If parents don't care (as someone mentioned above), then should government force companies to be stricter about how you let people on in the first place? That then becomes a case of proving age, which leads to proving identity, which then comes with a whole load of privacy issues, which many adults balk at.

    So we currently have a position of "self-policed" age policies, which break down if you don't have adults prepared to stop children, or you introduce much stricter identity checks to prove who you are and how old you are... which probably needs an infrastructure of its own setting up in order to protect privacy... but perhaps that's the way we should be going? It would need government intervention to force that, though, as it would be a huge barrier to entry that companies just won't self-impose.

    Then there's the content in the first place. Should a "blackout challenge" be a thing anyway (even for adults)? How is it moderated, where does "freedom of speech" come into it? Is "Play stupid games, win stupid prizes" enough to say to adults as well? Where does the responsibility for more vulnerable people lie?

    1. MachDiamond Silver badge

      Re: Age, Parents, Identity & Privacy

      "If parents don't care (as someone mentioned above), then should government force companies to be stricter about how you let people on in the first place? "

      No, the parents should be on the hook for facilitating the child's access. There's only so much a company can do to "verify" somebody's age. The vast majority of 13yo's aren't going to have a credit card and as somebody well past 13, I'm not handing out my CC number for some sort of verification.

      https://www.youtube.com/watch?v=iaHDBL7dVgs

  8. Anonymous Coward
    Anonymous Coward

    degrading to humanity

    So many of the scams/ads/propaganda on social media push the worst of humanity, constantly lowering the common denominator for the value of human life.

    It's all for greed - marketing/manipulation, more sick people = more pill sales, more scary news = more ad views. Don't think for one second anyone on FB's board of directors cares that they cash in on children's suicides. All they care about is cashing in. TikTok (Chinese gov) staff likely celebrate every time a non-Chinese person makes a fool of themselves, or worse.

    All I can say is: oh well, you chose to watch and post. TikTok is an amplified reflection of the worst of society.
