It seems like a politically safe opinion with regard to TikTok, given the US's hostility towards it and its Chinese owners at present, but I wouldn't be surprised if there's a sudden backtracking once it threatens to affect good old American social media companies.
TikTok isn't protected by Section 230 in 10-year-old’s ‘blackout challenge’ death
A US appeals court has issued an opinion that could have wide-ranging implications for social media platforms, finding that content selected for users by TikTok's algorithms doesn't qualify for Section 230 protection. In an opinion [PDF] published today, a three-judge panel from the Third Circuit Court of Appeals in …
COMMENTS
-
Wednesday 28th August 2024 18:45 GMT DS999
There isn't any way they can create a precedent for TikTok that doesn't also affect Facebook and Twitter. They might WANT to, depending on political motivations, but the law can only be twisted so far. They have to come up with legal reasoning. I don't see how TikTok's "for you" is any different from the stuff that comes up in your Facebook feed from accounts you don't follow, or similar on Twitter.
Now if there were a law saying Section 230 applies differently to US-owned and foreign-owned social media companies, that would provide the out, but such a law would have to pass. I imagine it could easily pass Congress, provided it wasn't held up by those who want to make bigger changes to Section 230 or eliminate it entirely.
I remember the good old days of Facebook long ago, when if you'd seen everything posted by your friends and pages you followed it would simply tell you that's it. Now you can doomscroll forever, and it'll keep coming up with stuff it thinks you'll like. Or rather stuff you might like or might hate, but stuff you'll engage with.
-
Thursday 29th August 2024 00:30 GMT DS999
That isn't an argument that Facebook or Twitter could try, since they both have their own in-house AI. Not sure about TikTok, but the same is likely the case.
But I imagine someone using OpenAI or whatever for their business will try to excuse themselves when illegal things happen "because the AI did that, not us!"
-
Wednesday 28th August 2024 17:30 GMT The Dogs Meevonks
Excuse me whilst I......
Bwahahahahahahahahahahahahahahahahaha
hahahahahahahahahahahahahahahahahahaha
hahahahahahahahahahahahahahahahahahaha
C'mon... start nailing these fucking anti-social platforms to the wall for their utter failure to deal with the hate, bile, misinformation and dangerous content they profit from, promote and push to others....
I bet they get fined a tiny fraction of their income... but if this goes to a jury trial... the damages could result in hundreds of millions in compensation and punitive damages... then open the floodgates to more lawsuits.
The only thing these bloodsucking leeches understand is money... let's make it unprofitable for them to survive without dealing with the problems they've created.
-
Wednesday 28th August 2024 19:20 GMT Jellied Eel
Re: Excuse me whilst I......
C'mon... start nailing these fucking anti-social platforms to the wall for their utter failure to deal with the hate, bile, misinformation and dangerous content they profit from, promote and push to others....
I just hope it brings about the end of 'AI' generated 'recommendations'. Earlier today I made the mistake of watching a few minutes of a popular YT personality, i.e. one that has somehow ended up with 10m+ subscribers. Turned out to be the usual clickbaity vid with lots of style and virtually no content. Now half my recommended videos are from this mu.. person. I'm also learning to hate the YT trend of clickbait combined with 'surprised Pikachu face', or as Simon Whistler calls it, 'soyface'.
AI may conceivably be useful if it could let me filter those out.
If this case leads to a long-overdue feature, like allowing me to express my own interests, anti-social media companies could provide themselves with some protection. Unless, of course, I picked topics like 'how to win a Darwin Award'.
-
Thursday 29th August 2024 08:29 GMT 0laf
Re: Excuse me whilst I......
I no longer engage with these platforms unless there is something very specific I need to see. I won't create an account and I'll use ad blockers if I have to go there.
But now, I don't YT, I don't X, I don't TikTok, and I only visit FB when I have to engage with the local groups (school etc), the local pages we refer to as the "[insert town] bigotry and hatred group".
Using YT used to be actually interesting and even entertaining; now it's filled with such a staggering volume of junk, and monetised to such a grasping level, that it's nearly impossible to find anything of value, and when you do, the actual viewing experience is utterly awful.
What I don't really understand is why such shit products continue to exist.
But then the majority of the web is garbage now; you can't even search Autotrader without your searches being polluted by Cinch listings, and of course there is no way to screen them out to only view local offers.
-
Friday 30th August 2024 19:05 GMT MachDiamond
Re: Excuse me whilst I......
"Using YT used to be actually interesting and even entertaining"
I find YT to be incredibly useful, but I don't follow their recommendations as they are useless. Just because I want to see how to change the O2 sensor on my car doesn't mean I want to "pimp my ride". Before I deleted the rear seat and built a platform so I could do some car camping and transport things more easily, I was able to see what mistakes other people were making. The list goes ever on. I don't bother with "influencers" and entertainers.
-
Friday 30th August 2024 19:09 GMT MachDiamond
Re: Excuse me whilst I......
"I don't seem to get these recommendations - I suppose you have to be logged in for it?"
When YT was bought by Google and The Big Evil put out their horrible EULA, I didn't sign up for anything further that Google had and closed my Gmail account a couple of years ago. I emphatically do not want to be a Google product and constantly surveilled by them while on-line. There's also no good reason to be signed in. The age restricted content can be unrestricted if you know how.
-
Wednesday 28th August 2024 19:32 GMT BasicReality
Re: Excuse me whilst I......
I look forward to rules that allow social media to be fined for what the government calls "misinformation" or "hate speech." Perhaps under the next Trump administration we could go after those who claim climate change is caused by people. Or maybe fine companies for saying covid shots are safe and effective.
If the government decides what is credible or allowed speech, be careful when you don't like the side that gets elected.
Better off simply making a blanket requirement that all speech is allowed, except for extremely minimal limits such as actively calling for violence against someone or doxxing a person.
-
Wednesday 28th August 2024 23:00 GMT John Brown (no body)
Re: Excuse me whilst I......
Yeahbut, sometimes the thumbnail and tagline is just sooooo enticing!!! And it only needs that one moment of weakness and down the rabbit hole you go :-)
Seriously though, sometimes it really does sound interesting, right up to about 5 seconds after you've clicked the link, only to find a stream of Ken Burns effect slideshow photos with the standard (free? open source?) computer-generated US documentary narrator voice "reading" a badly edited script, mispronouncing names and abbreviations at every opportunity and inserting strange pauses in odd places due to poor or misused punctuation in the text.
-
Thursday 29th August 2024 07:47 GMT The Dogs Meevonks
Re: Excuse me whilst I......
I've gone one further... I don't use social media at all. When someone tells me something... my first question is normally 'where did you get that information?' and if the answer is any social media site... I know to treat it with the distrust it deserves.
I have used them in the past... I was asked to use facebook in 2007 by work colleagues, deleted it in 2008, asked to use it by friends in late 2010, deleted in early 2012. Twitter I used from around 2011 as an ex said it was her main platform of choice... but I only used it to catch up with her every few yrs and deleted it in 2018.
The only one I enjoyed was G+ because whilst the data harvesting was still going on... there were no ads, no stream of shit pushed in your face from losers and fakes. It was all about the people you chose to follow and there was no forced follow back feature like with others.
I now only use Mastodon, and in terms of the engagement and people I've found... it's great. No algorithm, real engagement from real people... some claim it's crap but these are the same people who don't actually engage with others. The types who shitpost links for likes and expect everyone else to do their work for them and spread anything they post. I actually talk to the people who comment on my posts and I comment on theirs... ya know... like normal people being 'social' should.
-
Wednesday 28th August 2024 23:24 GMT Anonymous Coward
Re: Excuse me whilst I......
That would be all fine and dandy if morons didn't believe it... Of course, if that was the case, these propaganda outlets wouldn't exist.
I find it odd that you seem unable to distinguish the difference between blatant lies and peer reviewed truth. Seems the propaganda is working on you.
-
Thursday 29th August 2024 07:39 GMT The Dogs Meevonks
Re: Excuse me whilst I......
I know that the US loves to bang on about 'freedom of speech'
But that doesn't include 'freedom from the consequences of speech'
Many countries have laws regarding hate speech and dangerous content already. The spread of misinformation is what led to the recent riots in the UK... So let's clarify that part a little and say 'lies designed to misinform and mislead'
I'm no legal expert... I just want to see ALL of these platforms reined in and forced to become less toxic wastelands of hate and shit.
-
Thursday 29th August 2024 13:01 GMT Anonymous Coward
Re: Excuse me whilst I......
No, what led to the riots is decades of ignoring the concerns of the native working class in the face of mass importation of cheap labour, which suppresses wages, the slow ethnic cleansing of entire towns by that same imported labour, and the complete inaction of the state in the face of the growing violence, rape, abuse, and murder carried by those same imported people. The sparking incident is largely irrelevant to those decades of resentment. It could just as easily have been any of the violent incidents that happened in the weeks before or after (the suspects in which are unlikely to go to trial before the new year, unlike the rioters, who were unjustly rushed through the courts within days or even hours of arrest). The fact it was that one is pure happenstance.
-
Wednesday 28th August 2024 22:53 GMT John Brown (no body)
Re: Excuse me whilst I......
"...start nailing these fucking anti-social platforms to the wall...fined a tiny fraction of their income... but if this goes to a jury trial... the damages could result in hundreds of millions in compensation and punitive damages... then open the floodgates to more lawsuits."
This is why they "settle out of court" with strongly bound NDAs. The aggrieved party gets "quick justice", the accused gets to say "who, me? not guilty, no fault", and the merry-go-round continues. None of the incumbents ever want to go to court over these sorts of cases. Same applies to Ts&Cs. No one wants them tested in a court.
They only ever seem to end up in court when Govt. wants a whipping boy or it's another $BigCorp with equally deep pockets and both see the advantage of "winning", or at least not losing.
-
Friday 30th August 2024 19:16 GMT MachDiamond
Re: Excuse me whilst I......
"This is why they "settle out of court" with strongly bound NDAs. "
There's "common sense" and then there is precedent. Once a precedent is set, the lawyers will pounce on it, creating more precedents to cite. Companies understand that if they lose once, they'll never have a chance to win in future. Cheaper to pay the odd plaintiff off to make them go away and end the news cycle before it gets going. If the NDA is very clear that if they talk, they have to give the money back, some people are smart enough to keep quiet. Some aren't, and there have been some cases of that. If SuperMegaGiant, Inc takes you to court for violating an NDA, chances are you will go down in flames and they might get some of their money back. The person will also be reminded that the case doesn't clear the NDA, so if they talk some more, they can wind up in court some more.
-
Thursday 29th August 2024 13:19 GMT 0laf
Re: At last, some common sense
But the numbers reached were much smaller, you'd be less likely to be alone while trying, and possibly your friends might try to save you. Dare I also say that, due to exposure to actual real life, not so many of us would have been so stupid as to try. Plus we were doing better things, like trying to light our own farts.
-
Thursday 29th August 2024 15:36 GMT PRR
Re: At last, some common sense
> 40 years ago we really would not have credited that people would asphyxiate themselves after watching a vid
1965. A TV kids-show host told viewers to take money from parents' pants and pocketbooks, "....and mail them to me..." While his take was slim or none, "in 1965 plenty of adults were livid at the thought of a TV personality's crassly manipulating children for commercial gain".
Green Mail -- Wiki
His low take-up may have been due to not giving an address or where to find the postage stamps. I don't tic-toc, but I know today's YouTube would swiftly find detailed (if flawed) instructions on any stupid activity.
And yes, in 1965 I was just about at the age of the child in this case. "Saved" by being in a different city with better New Year Morning shows than Soupy Sales.
-
Friday 30th August 2024 21:17 GMT MachDiamond
Re: At last, some common sense
"1965. A TV kids-show host told viewers to take money from parents' pants and pocketbooks, "....and mail them to me..." While his take was slim or none, "in 1965 plenty of adults were livid at the thought of a TV personality's crassly manipulating children for commercial gain"."
When I was a small child, we had one TV in the house, dad controlled what was on and I was the remote control. I could watch Saturday morning cartoons and a few other things if the weather was bad, but the parental units monitored the content. Today, parents hand their kids their first unlimited service mobile at 5 yo when it's even a bad idea to have a TV in the kids room where what they watch isn't known.
There's been some action in the US with requiring an absence of mobile phones at school during the school day. At least 5 states have that on the books. I think that's a very good idea. There's little reason for a child to have a phone at school during class. The push back from some parents is that they want their kids to be able to call them if there is a problem, but schools don't want kids to have phones for the same reason. Not that staff is trying to hide something, but that if there is a discipline issue, it's often due to phone use and the last thing they want is some hysterical parent on the phone screaming obscenities at them along with the kid's foul mouth. It's not always the case, but in-person conferences with parents can be more manageable. Many people have better manners when they are face to face with somebody. That might be fading, though.
Getting back to the point, these challenges can cause damage when parents have done things to exclude themselves from the child's world to yet another degree. I can remember a few kids in my neighborhood whose parents interviewed the other kids and wouldn't let their little darlings play with any other kids that didn't meet the standard.
-
Wednesday 28th August 2024 22:13 GMT mattaw2001
Re: At last, some common sense
We already have hundreds of years of law for this: "letters to the editor" pages. It's great to see the rules being consistently applied.
Especially the two-faced "it's an algorithm, we can't control it" and yet at the same time "it's an algorithm, let's make giant piles of money from it".
I've never understood how "it's just the algorithm, bro - waaay mysterious" in any way allowed them to escape responsibility for its choices.
-
Thursday 29th August 2024 08:09 GMT Doctor Syntax
Re: At last, some common sense
"So they select content about asphyxiation, and users actually read it and/or try what is suggested. Who'd have thought it?"
Anyone who had experience of dealing with sudden deaths. Distinguishing between what's generally termed sexual asphyxia and suicide was a problem well before social media, the web or even the internet existed.
-
Thursday 29th August 2024 09:41 GMT Jellied Eel
Re: At last, some common sense
Anyone who had experience of dealing with sudden deaths. Distinguishing between what's generally termed sexual asphyxia and suicide was a problem well before social media, the web or even the internet existed.
Ah, good ol' autoerotic asphyxiation. One of those curious and high-risk activities that can be practised alone, or with friends. And walks the fine line between la petite mort and meeting Mort. It's one of those things where sites like YT and other social media can be good, or bad. So anyone searching for that should ideally be directed to content showing why it's a really bad idea and can be extremely dangerous.
But people are weird. This is one of my favorite YT channels, and probably NSF.. anyone who's just eaten.
https://www.youtube.com/watch?v=XAXfqISRZT4
A doctor describing what happened when a woman decided to do some extreme weight loss and bought tapeworm eggs online. Which is rather shocking, given people are recommending and selling this as a diet plan. It also got me wondering about the legalities: given 'health' supplements are largely unregulated, is this even legal? If it is, it probably shouldn't be. And then there's the question of how to stop it, when victims are probably being hooked by dieting groups, then caught via direct messages and 'friends' flogging the diet plan via affiliate links.
So it's a massive battle for health and law enforcement to try to catch and deal with dangerous stuff like this. Recently YT has started adding tags to qualified medical practitioners, so at least some of the content may be trusted to some degree.
-
Thursday 29th August 2024 12:50 GMT Jellied Eel
Re: At last, some common sense
Nothing good about it when it eventually ends up with the family visiting the coroner's court. It's not a situation that I had to investigate myself but I've had colleagues who did.
That was semi-sarcasm/dark humour. Meet Mort: "That was silly, wasn't it? And why the orange?". Pre-Internet, it's one of those things that people outside the BDSM community probably wouldn't have heard about, and if they did, they may have been told about the risks. Then more people might have heard about it after the sensational/salacious reporting around the orange case. Now, thanks to the Internet, more thrill-seekers might discover it, but might not be aware of the risks. So the question is how 'social' media companies can either raise awareness of the dangers, or prevent kids from trying stupid stuff. Which is also a parental responsibility, but kids are increasingly tech-savvy and find their way around attempts to block content.
It's also part of a worrying trend. People chasing monetisation, influence and free stuff are doing increasingly dangerous 'extreme' stuff for views. I used to cave, and dive. Putting the two together is a recipe for disaster without proper training and kit. But it's exciting, it gets views, and people may copy it; some will inevitably get trapped and die, and then endanger rescue workers doing body recoveries. It's one of those areas where I think the 'social' media companies need to do more to warn people about the dangers.
-
Wednesday 28th August 2024 21:15 GMT Orv
Except Section 230 was specifically written to allow platforms to make moderation decisions without losing their liability protection.
It was included in the Communications Decency Act, which required platforms to take down certain types of content. Platforms pointed out that, under existing precedents, this would open them up to liability. Up until that point the accepted doctrine was what you suggest -- that platforms could either moderate nothing, or take responsibility for everything. Since the CDA effectively *required* moderation, this risked making it completely infeasible for platforms to have user-generated content at all. Section 230 was meant to remedy that.
TLDR; remove the Section 230 shield and platforms will have no choice but to stop distributing any user-supplied content and only publish their own curated stuff.
-
Thursday 29th August 2024 05:46 GMT doublelayer
It is not the intent of the ruling, though it wouldn't be hard to extrapolate it into doing that anyway. However, it is exactly what the original post in this thread would get if their idea was implemented. From previous posts, I'm guessing they're one of the people who don't like how moderators removed or posted additional information around something they agree with, and they want that to be illegal, but they haven't considered that the law they're trying to gut is the main reason that any similar posts are available at all.
-
Thursday 29th August 2024 19:16 GMT Orv
The effect would be the same. There's no getting around using "algorithms" to decide what people see. "Sort by date" is an algorithm. So is "select a video randomly," for that matter.
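For what it's worth, the point that everything here is "an algorithm" is easy to make concrete. A toy Python sketch (the data and function names are invented for illustration, not anyone's actual feed code):

```python
import random
from datetime import datetime

# Toy illustration: three "feed algorithms". All three are selection
# algorithms; the only difference is which signal they rank on.
videos = [
    {"title": "A", "posted": datetime(2024, 8, 1), "engagement": 0.9},
    {"title": "B", "posted": datetime(2024, 8, 15), "engagement": 0.2},
    {"title": "C", "posted": datetime(2024, 8, 28), "engagement": 0.5},
]

def sort_by_date(items):
    # "Reverse chronological" is itself an algorithm.
    return sorted(items, key=lambda v: v["posted"], reverse=True)

def pick_randomly(items):
    # So is uniform random selection.
    return random.choice(items)

def for_you(items):
    # An engagement-weighted ranking: the kind at issue in the case.
    return sorted(items, key=lambda v: v["engagement"], reverse=True)
```

The legal question is whether the third function is different in kind from the first two, not whether any of them is an "algorithm".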
Techdirt's post on this gets at the problems pretty well:
-
Friday 30th August 2024 14:34 GMT 96percentchimp
Neither sort-by-date nor random selection is an editorial choice. They might even be user choices. An algorithm designed to encourage engagement based on a user profile is, without any doubt, an editorial choice which meets the First Amendment tests. If it qualifies for free speech, then it qualifies for the responsibilities that come with those rights.
-
Friday 30th August 2024 21:38 GMT MachDiamond
"TLDR; remove the Section 230 shield and platforms will have no choice but to stop distributing any user-supplied content and only publish their own curated stuff."
In this case it's the platform curating a list of recommendations using some sort of automation to keep people from leaving the website. A human curator might down-check "challenges" where people would put themselves or others in harm's way, but that would cost money and give the appearance of advocating dangerous behaviour, whereas there's a possible out if a machine does the same thing.
There's a choice of not making recommendations at all, or setting up to only offer more of the same: if you click on a video of the Rolling Stones, all of the recommendations are also videos of the Rolling Stones (from an official channel).
The influencer problem stems from the promotion algorithms. I've chided a person whose content I enjoy, on the business of professional commercial photography, who goes on about how he has to craft titles and photo thumbnails to gain more viewers. I find the reasoning not well thought out. His content is not general-purpose T&A but something that's only going to appeal to a small number of people. Plenty of people who are truly interested in the sort of content he puts out will find his channel. It's the same as me not wanting to drive millions of people to my web site. That would be a negative, as it might use more bandwidth than I am allocated and do me no good at all. If I only got 10 views a month and 2 people contacted me for work, that's far better than 100,000 visitors with 10,000 stupid questions and 2 jobs. I'm a quality-over-quantity sort of person. I also have no time to delete 10,000 stupid questions.
-
Wednesday 28th August 2024 21:20 GMT doublelayer
"now we need to get the other side. Allow all speech and keep protections, but when you decide to block certain topics, now you're a publisher and liable."
You do realize that, with something that stupid, the law would then say that everything at all would have to be kept up to retain the protections, i.e. unless you keep up the terrorist beheading videos, you're liable. You would effectively prevent all public posting, including these forums, except for those places so extreme that they don't mind hosting literally anything, no matter how illegal, that someone decided to upload. That goes for the places you like as well. Maybe you're into some conspiracy theories and you're tired of those being moderated. Sorry, but if the sites that keep those up ban anything, be it even more extreme stuff that you don't believe in or people disagreeing with you, they can now be sued for anything they keep up, meaning they're much less likely to keep up the stuff you want to see, or to get away with it if they do.
-
Wednesday 28th August 2024 19:43 GMT Anonymous Coward
Social media companies, are bars that open 24/7, have no bouncers, the fire escapes are locked, and they employ legally blind bar staff so they can use that as an excuse for serving drinks to kids in school uniform.
Then they claim no responsibility for the fights, deaths and people pissing on streets for miles around them.
It doesn't fly in meatspace, so it's long past time that they have adequate bouncers and fire escapes, and if that puts them out of business, well, that's what happens to bars and clubs too.
-
Wednesday 28th August 2024 21:53 GMT heyrick
When I was at school (so late 80s), a local pub was quite happy to serve drinks to kids in school uniform. The guy's logic was "they are here enjoying some watered down beer and not god only knows where getting up to mischief". The local rozzers turned a blind eye so long as we (mostly) behaved because, well, the guy had a point. Plus being inexperienced children we were dumb enough to think that was real beer, though I usually asked for a tea because, ugh, beer tastes awful, and I'm one of those people who believes the only correct response to anything is to put the kettle on...
-
Thursday 29th August 2024 09:39 GMT I ain't Spartacus
My school had a similar pub next door. Although, we were able to drink there because the landlord wasn’t very nice, and nobody else went in.
That pub is now a famous Michelin-starred restaurant. I tried to book a table a few years ago, and there was an 18-month waiting list. So it was easier to get in when I was 16, in school uniform, and it was illegal…
-
Thursday 29th August 2024 19:15 GMT MrReynolds2U
Same here
I used to go into the pub next door to my school occasionally when I had a gap in the day. We'd play pool and have a pint.
The unwritten rule was that you had to take your school tie off first. That way you were just someone wearing a shirt and trousers.
It made the Maths lesson on return to school a lot more fun.
-
Wednesday 28th August 2024 20:20 GMT prh99
Yet another court inventing a section 230 exception.
The supreme court also said in various cases that it's free speech when a site or service picks what content they will allow...moderation, arguably a form of curation. So by this logic a site is liable if they moderate? A situation 230 was enacted to prevent (undoing the Prodigy decision).
The internet isn't safe; parents need to parent, not use TikTok etc. as a babysitter.
-
Wednesday 28th August 2024 21:57 GMT heyrick
"then it's just tough shit and they deserve be left to mercy of every nasty nutter on the internet?"
Not so different to real life if a child decides to wander off and try taking themselves to look around a city. The child might come back with a load of new experiences. Or they might vanish and never be seen again.
In either case, as the above poster notes, there is surely an aspect of parental neglect in failing to, you know, parent.
-
Thursday 29th August 2024 05:51 GMT doublelayer
That's true of intentional abduction, but it isn't true of accidental harm. If a child wanders away and falls down some stairs, we don't blame the owner of the stairs for not having posted a guard to monitor for unaccompanied children. There is a limit to how much we need to modify public spaces, including the internet, to attempt to get safety that will not be achieved. No matter how much we do, there will be things on the internet that a young child should not see.
-
Wednesday 28th August 2024 23:14 GMT John Brown (no body)
"The supreme court also said in various cases that it's free speech when a site or service picks what content they will allow...moderation, arguably a form of curation. So by this logic a site is liable if they moderate? A situation 230 was enacted to prevent (undoing the Prodigy decision)."
FWIW, it's the opposite. Moderation is limiting what you can see. This ruling is about the company choosing and selecting what you DO see. The difference may seem subtle, but it's a vital one. A little like the UK offence of TWOCcing compared to theft of a motor vehicle: Taking WithOut Consent isn't theft because they don't intend to permanently deprive, unlike theft where the vehicle will be sold on or stripped for parts. (TWOC can also be applied to borrowing dad's car without asking, especially if you get in an accident, or get pulled for no insurance etc.)
ie seemingly very similar on the surface, but different outcomes and treated differently in law.
-
Friday 30th August 2024 10:53 GMT Falmari
@prh99 "The supreme court also said in various cases that it's free speech when a site or service picks what content they will allow...moderation, arguably a form of curation."
Moderation is when a site or service removes content they do not allow. Curation is when a site or service selects its content. When a site or service recommends content to its users it is selecting content and that is curation.
When a site or service curates content for its users then it is now acting as a publisher and therefore liable for that content.
-
Thursday 29th August 2024 09:40 GMT Seajay
Age, Parents, Identity & Privacy
So there are a number of things that are not easy in this. First the age - a 10 year old child shouldn't be on tiktok - they have a minimum age of 13, and various restrictions for teenagers above that age. Anyone younger would go to "tiktok for younger users" a curated area...
https://newsroom.tiktok.com/en-us/our-work-to-design-an-age-appropriate-experience-on-tiktok
So had this child lied about their age - had they basically said they were an adult? Had the parents allowed it? Did the parent know or care what the child was doing?
If parents don't care (as someone mentioned above), then should government force companies to be stricter about how you let people on in the first place? That then becomes a case of proving age, which leads to proving identity, which then comes with a whole load of privacy issues, which many adults balk at.
So we currently have a position of "self-policed" age policies, which break down if you don't have adults prepared to stop children, unless you introduce much stricter identity checks to prove who you are and how old you are... which probably needs an infrastructure of its own setting up in order to protect privacy... but perhaps that's the way we should be going? It would need government intervention to force that, though, as it would be a huge barrier to entry that companies just won't self-impose.
Then there's the content in the first place. Should a "blackout challenge" be a thing anyway (even for adults)? How is it moderated, where does "freedom of speech" come into it? Is "Play stupid games, win stupid prizes" enough to say to adults as well? Where does the responsibility for more vulnerable people lie?
-
Friday 30th August 2024 22:02 GMT MachDiamond
Re: Age, Parents, Identity & Privacy
"If parents don't care (as someone mentioned above), then should government force companies to be stricter about how you let people on in the first place? "
No, the parents should be on the hook for facilitating the child's access. There's only so much a company can do to "verify" somebody's age. The vast majority of 13yo's aren't going to have a credit card and as somebody well past 13, I'm not handing out my CC number for some sort of verification.
https://www.youtube.com/watch?v=iaHDBL7dVgs
-
Thursday 29th August 2024 14:18 GMT Anonymous Coward
degrading to humanity
So many of the scams/ads/propaganda on social media push the worst of humanity, constantly lowering the common denominator for the value of human life.
It's all for greed: marketing/manipulation, more sick people = more pill sales, more scary news = more ad views. Don't think for one second that anyone on FB's board of directors cares that they cash in on children's suicides. All they care about is cashing in. TikTok (Chinese gov) staff likely celebrate every time a non-Chinese person makes a fool of themself, or worse.
All I can say is: oh well, you chose to watch and post. TikTok is an amplified reflection of the worst of society.