Re: Oh, now sites are responsible for what's posted on them?
"If you don't like the way Facebook curate your feed, stop using Facebook. "
Why, thank you. In fact I stopped using FB years ago, but that's not the point. FB aren't curating the feeds in a way the users prefer (since users can't express any preference among all available posts); they're curating them to maximise user engagement and dopamine micro-hits. Most users have no way of knowing whether the feed is curated in their best interest, because they have no point of comparison (i.e. they don't know what FB is filtering out). FB might be filtering out a bunch of posts that the user would find extremely interesting (but which the user is less likely to like or reshare), and the user would never know.
Either way, what Facebook is doing, algorithmic or not, is editorial. They should therefore be liable, at the very least, for removing illegal content within a certain time limit of its being flagged, and they should be jointly liable with the poster should they decide to ignore or permit illegal content.
Regarding "And as for your comments on moderation... just go and do some real research before repeating them. You obviously have completely missed the scale of the problem."
I don't need to research the scale; I know it's massive. But if FB's algorithm can curate a gazillion posts and feeds in real time, it can just as efficiently scan and flag a lot of problems in real time. It needs human eyeballs only on posts that either score very highly on their AI warning scale or are actively reported (i.e. they use their own users as content moderators to cut costs). Even then, you're probably right that, given the scale of the issue, many illegal posts might slip through, but at least it would be an improvement.
Your solution seems to be to shrug your shoulders, absolve FB of all responsibility, and give up even trying.