Our amazing industry-leading AI was too dumb to detect the New Zealand massacre live vid, Facebook shrugs

Facebook admitted, at best nonchalantly, on Thursday that its super-soaraway AI algorithms failed to automatically detect the live-streamed video of last week's Christchurch mass murders. The antisocial giant has repeatedly touted its fancy artificial intelligence and machine learning techniques as the way forward for tackling …

  1. macjules
    FAIL

    You mean make more money?

    it can “stop the spread of viral videos of this nature,” and react faster to live videos being flagged.

    If they can spot FB Live videos of massacres etc., couldn't they do something with their "AI" along the lines of "we can deduce that someone might be planning a massacre"? Just throwing that idea in the pot, so to speak, and not at all looking forward to Facebook's idea of The Minority Report.

    1. Dan 55 Silver badge

      Re: You mean make more money?

      You could do it from the audio stream alone and quite easily find and ban any terrorist or gangland video involving shooting, if it weren't for the yee-haa, fuck-yeah redneck gun demonstration videos which are wholesome family viewing and attract advertising dollars.
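
      A rough sketch of the audio-only heuristic being described: flag clips whose soundtrack contains repeated, sharp jumps far above the clip's typical loudness, which is roughly the acoustic signature of gunfire. This assumes the librosa library; the threshold and event-count values are illustrative guesses, not tuned numbers, and a production system would use a trained audio classifier rather than this crude energy test.

      ```python
      # Crude gunshot heuristic: count sharp loudness spikes in the soundtrack.
      import numpy as np
      import librosa

      def looks_like_gunfire(path, threshold_db=25.0, min_events=3):
          y, sr = librosa.load(path, sr=16000, mono=True)
          # Short-time loudness in ~32 ms frames.
          rms = librosa.feature.rms(y=y, frame_length=512, hop_length=256)[0]
          # Express each frame's level relative to the clip's median loudness.
          db = librosa.amplitude_to_db(rms, ref=float(np.median(rms)) + 1e-10)
          spikes = db > threshold_db
          # Count rising edges so one bang spanning several frames is one event.
          events = int(np.sum(spikes[1:] & ~spikes[:-1]))
          return events >= min_events
      ```

      And, exactly as noted above, this would flag the redneck gun demonstration videos just as readily as the massacre footage.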

      1. Anonymous Coward
        Anonymous Coward

        Re: You mean make more money?

        Those could be at least put in a queue to be published only after verification.

        1. Dan 55 Silver badge

          Re: You mean make more money?

          Muh freedumb of speech!

      2. Anonymous Coward
        Anonymous Coward

        Re: You mean make more money?

        If they detected the audio signature of gunshots and blocked those videos for further review, someone would edit the video to dub "pew pew pew" for the gunshots.

  2. Chris G

    Facebook virgin

    I have little to no clue how FB works, but I'm curious: did Facebook make any money out of the videos before they were taken down?

    1. Dan 55 Silver badge

      Re: Facebook virgin

      I think a livestream needs the streamer's co-operation to stop for ad breaks, but shared copies certainly have advertising, which is why they're currently facing a boycott.

    2. Anonymous Coward
      Anonymous Coward

      Re: Facebook virgin

      "I have little to no clue how FB works but curious; Did Facebook make any money out of the videos before they were taken down?"

      Yes. Consider it similar to television: FB displays ads around content, the main difference being that the ads are displayed concurrently with the content, as you find on websites such as El Reg if they aren't removed by adblockers.

      The distinction (trying to be completely fair to FB) is that FB would show the same or very similar ads whether you were viewing Mr A Public's cat videos or Mr B Troublemaker's acts of terrorism.

      A more PR-savvy organisation would have acknowledged the potential for profit from this video and almost immediately made a significant charitable donation to offset any arguments about profiting from it.

    3. Harry Haller
      Pirate

      Re: Facebook virgin

      ...Raises a point... does Co-op Funerals advertise on Faecesbook? They might be missing a trick... probably an opportunity for a few "over-50s funeral plans" as well.

  3. lglethal Silver badge
    Go

    I'm honestly curious about something here; if anyone knows more, please respond below.

    So someone starts uploading dodgy content (disgusting/distressing/illegal). It's flagged by the AI or even by a real person, and the upload is blocked. What happens then? From the sounds of it, nothing.

    What should happen? Well, in my humble opinion: a) the account of the uploader is blocked, b) the IP address of the uploader is recorded, and c) if the content is actually illegal (and I'm guessing most of the truly distressing stuff is, whether because it's video of actual criminal activity or because it falls foul of various obscenity laws), the IP address is passed over to the cops. The police can then obtain the necessary warrants and go and arrest these people. (A rough sketch of such a pipeline follows at the end of this comment.)

    This would result in a few improvements to the anti-social media environment: a) the workers who actually have to peruse all this distressing content would probably suffer less mental illness (if you know that the people uploading this are being brought to justice and that your actions are making the world a better place, you can withstand a lot more than knowing all you can do is play whack-a-mole with no consequences for the bastards), and b) people would be less willing to upload/share this stuff if they were expecting a knock on the door from PC Plod. Everyone wins.

    I half expect at least a few commentards to start banging on about "Freedom of Speech" yadda yadda, but two things: a) Facebook blocks this stuff already, and there is no requirement for Facebook to allow your bollocks on its platform, and b) if the best thing you can say about what you're saying is that it's not technically illegal to say it, then you really don't have much of an argument.

    Whilst IP address searches are not going to stop the hardcore wackos uploading stuff (those using VPNs, tunnels, or coming from countries with no laws on this stuff, etc.), the vast majority who reshare these videos do it from their own accounts and their own residences. Stopping those secondary sources stops these things spreading beyond their fringe element. And that has to be worthwhile...
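
    A minimal sketch of the flag -> block -> record -> refer pipeline proposed above. Every name here (ModerationEvent, block_account, refer_to_police, ILLEGAL_REASONS) is hypothetical; a real system would also need legal review, an audit trail, an appeals process, and per-jurisdiction rules on what actually counts as illegal.

    ```python
    # Sketch of the moderation pipeline: block, record evidence, refer to police.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    ILLEGAL_REASONS = {"criminal_activity", "obscenity"}  # illustrative only

    @dataclass
    class ModerationEvent:
        account_id: str
        ip_address: str     # (b) recorded for any later police referral
        content_id: str
        reason: str
        flagged_by: str     # "ai" or "human"
        timestamp: datetime

    def block_account(account_id):
        print(f"account {account_id} suspended")          # (a)

    def refer_to_police(event):
        print(f"referred {event.ip_address} to police")   # (c)

    def handle_flagged_upload(event, evidence_log):
        block_account(event.account_id)       # (a) uploader's account blocked
        evidence_log.append(event)            # (b) IP address + metadata recorded
        if event.reason in ILLEGAL_REASONS:   # (c) only actually-illegal content
            refer_to_police(event)            #     goes to law enforcement

    log = []
    handle_flagged_upload(
        ModerationEvent("acct42", "203.0.113.9", "vid1", "criminal_activity",
                        "human", datetime.now(timezone.utc)),
        log,
    )
    ```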

    1. cornetman Silver badge

      "I half expect at least a few commentards to start banging on about "Freedom of Speech" yadda yadda, but two things -.."

      It's a lot easier than that. We see bombings and shootings on the news all the time. Admittedly, it's "the bad people" getting shot and blown up, so presumably that's OK.

      When we're doing the shooting, then it's news. If <insert_current_bad_people> are doing it, then it's "disgusting propaganda".

      I'd rather take my news unfiltered and make my own decision on what I think, thank you very much.

      1. Anonymous Coward
        Anonymous Coward

        The news doesn't show people being shot or blown up. It shows people shooting, or explosions from a distance. Very different from a helmet-mounted camera showing people being shot only a few feet away.

        1. Zolko Silver badge

          @ DougS : "The news doesn't show people being shot or blown up"

          The "news" may-be, but films and video-games certainly do show that too. And that leaked video showing a US helicopter shooting unsuspecting people in the street showed that too. Therefore, we can conclude that: showing real people being blown up for spectacle: that's OK. Showing virtual people being blown up for fun, that's OK. Showing bombs blowing up real people for real, it's OK as long as you don't actually see the real people being dismembered by the real bombs. Showing real police shooting and beating real people during civil manifestations, that's OK.

          I don't recognise a pattern there, therefore it's no surprise that an AI wouldn't either.

        2. Anonymous Coward
          Anonymous Coward

          @Doug S

          re: "The news doesn't show people being shot or blown up."

          It should do. People would be rather less gung-ho about getting involved in wars if they actually saw death and dismemberment close up.

  4. Michael H.F. Wilkinson Silver badge
    Mushroom

    How do they intend to get more data to train their "AI"

    What they are really saying is that they need MANY more videos of white supremacist massacres shot from the point of view of the killer to train their "AI" (artificial idiot?) to detect them. Well, isn't that just dandy. Any volunteers to be the victims?

    </sarcasm>

    1. Headley_Grange Silver badge

      Re: How do they intend to get more data to train their "AI"

      Doom.

    2. paulf
      Unhappy

      The need for AI moderation [Re: How do they intend to get more data to train their "AI"]

      FTA: "The job can be extremely mentally distressing, so the ultimate goal is to eventually hand that work over to algorithms."

      Perhaps we should take a step backwards and wonder why the Facebook platform is attracting the kind of uploads that would cause such distress in the first place? We're told Facebook is a wonderful way for the kids to keep in touch with Granny and to watch cat videos, but this kind of content sounds like the kind of nastiness that would be more at home on the dark web. Whatever your view on moderation, [Human|AI/Machine Learning] or [proactive|reactive], maybe that's the problem we should be solving first?

  5. Anonymous Coward
    Anonymous Coward

    Storyville, The Internet’s Dirtiest Secrets: The Cleaners: www.bbc.co.uk/iplayer/episode/m0003f2f via @bbciplayer

    1. Alan Brown Silver badge

      "Storyville, The Internet’s Dirtiest Secrets:"

      I don't need to see that.

      Having spent time working with "cleaners" in the past, I'll say that one of the single greatest problems is that companies create such positions and then staff them with people they specifically want to fail at it - they systematically _prevent_ them from taking action and deny them the resources to get the job done, then toss them under a bus when they burn out.

      The ONLY way to make companies change their ways is to hold their feet to the fire and hold C-level management individually accountable for their actions. Otherwise what you get is hollow assurances and meaningless box ticking exercises so they can _say_ they're taking action whilst spending as little money as possible and possibly getting rid of "troublesome employees" whose "sin" has been wanting to do the right thing.

      1. bobblestiltskin

        The ONLY way to make companies change their ways is to hold their feet to the fire and hold C-level management individually accountable for their actions.

        Sometimes I think that Burgess had it right in 'A Clockwork Orange'. Perhaps these executives should be made to watch the live streams of self-harm for a day a week, to see the effects of their products upon society.

  6. tiggity Silver badge

    No surprise

    Current "AI" will always struggle, even when trained to pattern match something, a few edits to a video and its soundtrack can allow it to slip through, even though it's essentially the same offensive content as its now fooled the algorithms.

    .. The alternative is a blunderbuss algorithm, where false positives reign supreme and just about anything with any potentially offensive content needs a human OK.

    Too many politicians / legislators have an idea of all sorts of magic algorithms that can be applied (be it for copyright tests or detecting offensive material). The big tech companies need to emphasise that there is no intelligence in the AI; it's just pattern matching, which is very flawed, and we are decades away from technology that can analyse content the way a human does.*

    * Even then there are issues: someone adept at using Unreal Engine (or another game creation tool of choice) could, albeit with lots of time, create "game" footage sufficiently similar to the actual NZ video to trigger an erroneous pattern match.

  7. Anonymous Coward
    Anonymous Coward

    Our Hearts go out, but we will keep the ad revenue

    Pity about those lives, but hey, we still get to bank the ad revenue from the page views those vids brought us.

    1. Jason Bloomberg Silver badge
      Stop

      Re: Our Hearts go out, but we will keep the ad revenue

      While I have little truck with Facebook or social media in general - I called out Zuckerberg as a "cunt" just a few days ago - I don't believe it's fair or reasonable to suggest Facebook deliberately set about making money from this tragedy, or that any availability of such footage came about through a desire to profit from that. That to me is just opportunist demonisation.

      There are plenty of legitimate criticisms which can be levelled at Facebook, on this and other issues, but it seems to me that the greater problem here is the sick fuckers who made a determined and concerted effort to get the footage disseminated 1.5 million times, rather than Facebook's failure to prevent that.

      It's a wake-up call that Facebook isn't as good at its game as it might like to think. Their reporting process is flawed, and was compounded by only one person reporting it; their response times are less than desirable; their faith in AI is misplaced, or at least over-optimistic.

      But Facebook is also a victim here, of those who wanted to glorify and publicise this terrorism, who set about doing that, who would be delighted if Facebook were destroyed, and our condemnation is directed at Facebook and not them.

      We must not aid the terrorists and their supporters by allowing them to misdirect our outrage and anger at Facebook.

      1. Paul Crawford Silver badge

        Re: Our Hearts go out, but we will keep the ad revenue

        But Facebook is also a victim here

        Not in my book. They allow instant-streaming and easy sharing to maximise the addictive nature of anti-social media for revenue-generating reasons.

        They could easily change things: enforce a delay of 30 minutes on uploaded material before anyone other than the uploader can see it (something like the sketch below), have moderators check shared videos that get a sudden spate of interest, etc., and they have $22bn in profits to easily cover the cost. So I have no sympathy for FB in this case, as this is an inevitable consequence of their operating model and the inevitable few sickos out there.
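
        As a sketch of how simple the suggested hold could be: uploads sit in a queue, visible only to the uploader, until the delay expires or a moderator pulls them. The 30-minute figure comes from the comment above; the class and method names are hypothetical.

        ```python
        # Hold uploaded videos for 30 minutes before they become publicly visible.
        import heapq
        import time

        HOLD_SECONDS = 30 * 60  # the 30-minute delay suggested above

        class DelayedPublishQueue:
            def __init__(self):
                self._heap = []  # (release_time, video_id), earliest first

            def submit(self, video_id):
                """Queue a fresh upload; only the uploader sees it until release."""
                heapq.heappush(self._heap, (time.time() + HOLD_SECONDS, video_id))

            def pull(self, video_id):
                """A moderator removes a video before it ever goes public."""
                self._heap = [(t, v) for t, v in self._heap if v != video_id]
                heapq.heapify(self._heap)

            def release_due(self):
                """Return videos whose hold expired without moderator action."""
                due, now = [], time.time()
                while self._heap and self._heap[0][0] <= now:
                    due.append(heapq.heappop(self._heap)[1])
                return due
        ```

        The obvious objection, raised in the reply below, is that a blanket hold breaks genuinely live content; a real deployment would presumably have to exempt or fast-track trusted streamers.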

        1. catprog

          Re: Our Hearts go out, but we will keep the ad revenue

          So if someone wants to do a livestream (for instance an artist drawing a commission, a gaming session, or a SpaceX launch), are they no longer allowed to do so?

    2. Dan 55 Silver badge

      Re: Our Hearts go out, but we will keep the ad revenue

      Thoughts and prayers, but no donating of the ad revenue from these videos to charity, nor even an acknowledgement that ad revenue was made from them.

      Meanwhile, anything goes, moderate as little as possible, pretend Facebook has nothing to do with society being polarised, and carry on raking it in.

  8. SVV

    AI video detection

    If people have saved the livestream and then re-uploaded it, wouldn't it be easy to spot the files, as their data would be identical to a greater or lesser degree?

    And simpler and faster than some shoddy AI.

    1. Anonymous Coward
      Anonymous Coward

      Re: AI video detection

      "If people have saved the livestream and then re-uploaded it, wouldn't it be easy to spot the files, as their data would be identical to a lesser or greater degree?"

      Yes, which is why FB have blocked 1.4m+ uploads.

      Not if the content is uploaded to other services or modified in ways that bypass the AI (e.g. mirroring the video, altering colour schemes in easily reversible ways, altering the length by trimming content, adding or removing watermarks, etc.). Check out movie/TV sharing forums for ways to avoid the AI.
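
      For the curious, here's roughly why trivial re-uploads are caught while the edits listed above get through. A minimal frame-level perceptual hash (an "average hash") is sketched below, assuming the Pillow library; "frame.png" is a stand-in for an extracted video frame. A re-compressed copy lands within a few bits of the original hash, while a mirrored copy lands tens of bits away and sails straight past the matcher.

      ```python
      # Average-hash a frame, then compare hashes by Hamming distance.
      from PIL import Image, ImageOps

      def average_hash(img, size=8):
          # Shrink to size x size greyscale; set one bit per pixel above the mean.
          g = img.convert("L").resize((size, size), Image.LANCZOS)
          pixels = list(g.getdata())
          avg = sum(pixels) / len(pixels)
          return sum(1 << i for i, p in enumerate(pixels) if p > avg)

      def hamming(a, b):
          return bin(a ^ b).count("1")

      frame = Image.open("frame.png")            # one extracted video frame
      original = average_hash(frame)
      mirrored = average_hash(ImageOps.mirror(frame))
      # Re-encoding barely moves the hash; mirroring scrambles the bit order.
      print(hamming(original, mirrored))
      ```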

    2. SonOfDilbert

      Re: AI video detection

      Isn't this explained in the article?

  9. Pete4000uk

    Not ready

    This kind of tech is not ready yet. Just look at the stories of AI thinking a dog is a musical instrument. I expect things like compression artifacts would trip up AI.

  10. Alistair
    Windows

    Are we all missing something here?

    Is there something horribly wrong with society as a whole?

    Or are there 1.5 Million truly f0cked up humans out there who just plain don't understand how f0cked up they are?

    It staggers my imagination NOT that FB was unable to stem the tide, but that we have that many people who think that posting that video on FaceBook of all places MAKES SENSE!!!

    /mutters ger orf me lawn and considers quitting the human race.

    1. Anonymous Coward
      Anonymous Coward

      RE: Are we all missing something here?

      "Or are there 1.5 Million truly f0cked up humans out there who just plain don't understand how f0cked up they are?"

      A friend working in China was horrified to watch work colleagues at a school gather round and watch the video as though it was a TV show. I would hazard a guess that there is a dissociation between what they are watching and the individual's reality, i.e. they can't relate to a different culture being affected in a distant country that they are unlikely ever to visit.

      I've mentioned LiveLeak and worse before - this type of material (graphic violence resulting in severe injury/death) has been available for some time and seems to attract a fairly consistent audience, based on publicly available stats.

      "Is there something horribly wrong with society as a whole?"

      I believe 99% of society is okay. Amongst 7.7 billion people, around a third have Internet access decent enough for video streaming; 1% of that third works out at around 25 million people globally who lack the social skills to empathise in these types of situations and who search for this type of content.

      Feel free to alter that number by your own perception of society. And remember to smile....it's Friday.

  11. Aristotles slow and dimwitted horse

    The reason...

    The reason Facebook AI didn't work is because it's not AI. In fact, it's nowhere near the 'I' in AI at all. But I think we all agree that it is artificial.

  12. the Jim bloke
    Pirate

    If Facebook had actual competitors

    You can bet that Facebook would be able to screen and find any objectionable content that a rival put up almost immediately.

    Don't let companies self-regulate; they will do the bare minimum - or less.

    Get someone who HATES them to watch for misbehaviour, and give the regulatory body real teeth... big sharp pointy teeth, and performance bonuses for using them...

    1. Anonymous Coward
      Anonymous Coward

      Re: If Facebook had actual competitors

      Good idea - get some anti FB politicians to do it.

      The sooner some of them experience first hand the PTSD inducing filtering they demand of FB employees, the better.

      Maybe FB should just automatically forward some of it to our elected representatives who see themselves as judge, jury and executioner, for them to decide.

  13. Anonymous Coward
    Anonymous Coward

    And FB are still claiming not to be a publisher?

    Censoring content suggests otherwise.
