Worth noting that one of the questions that Twitter/X refused to answer was "how many people do you have working on eSafety".
I suspect the answer is suspiciously close to "Zero".
Australia's e-safety Commission – the education and regulatory agency devoted to keeping Australians safe online – has warned Google and fined X/Twitter for inadequate responses to inquiries on how the platforms detect, remove and prevent child sexual abuse material and grooming. The Commission oversees Australia's Basic …
And you know this how ?
I saw a recent interview where Elon Musk stated the percentage by which the "extremely naughty/bad" content had been reduced. Ok, it was Musk himself who stated this, but it was in a public interview and I am quite sure that he will have the figures to back it up; otherwise he had no reason to state them, as any falsification would obviously be fact checked...
I normally think that Musk has poor opinions about things, but I have to accept that he is close to a genius when you look at his history. Charging X over a regulatory weakness is one thing, but it seems to be the social media environment itself that lets child sexual abuse material get shared everywhere these days - DAMN THAT'S HORRIBLE.
We need to look at the social media environment that is creating this in the background. Musk is working on his stuff, with no evidence of any personal sexual or social stupidity, only the ability to be filthy rich.
> Because it's not in his interest if he doesn't have them..
That hasn't worried him about any of the other figures that you would think were in his interest to know.
Like vaguely accurate delivery dates for cybertrucks and semis, why the physics of Hyperloop are nonsensical, how much use second hand digging equipment actually is.
Oh, and how much one ought to spend buying Twitter. Knowing a sensible number for that (and how to calculate it using, what is it called now, discovery or due diligence or something) might have been in his best interest.
> I saw a recent interview where Elon Musk stated the percentage by which the "extremely naughty/bad" content had been reduced.
Well, yes - the number of such posts reported to Musk has been reduced.
Every such post moderated has been counted and that total reported back to Musk (as a percentage of the whole).
The problem is: who is left at X to actually do the moderating? Who is there to run the searches for naughty content (and check that they are running correctly)?
https://www.abc.net.au/news/2023-10-16/social-media-x-fined-over-gaps-in-child-abuse-prevention/102980590
Quoting directly from there:
The eSafety commissioner, Julie Inman Grant, can now require online service providers to report on how they are meeting any or all of the expectations as part of the eSafety Act.
"This was about the worst kind of harm, child sexual exploitation as well as extortion, and we need to make sure that companies have trust and safety teams, they're using people processes and technologies to tackle this kind of content," she told ABC News Channel.
"Frankly, X did not provide us with the answers to very basic questions we'd ask them like, 'How many trust and safety people do you have left?'"
True. But this could be replicated by other jurisdictions around the world. Repeat infringements tend to trigger an escalation in penalties. So unless Elon changes his tune, this could be the beginning of a significant cost of doing business his way. Higher by an order of magnitude or two?
That would hurt. Maybe cheaper to withdraw from those territories.