From experience, the content that moderators are exposed to is truly horrific. The average Facebook user has no real understanding of the depths of depravity and disturbing imagery some people post on a daily basis.
Some do it purely because they find beheadings funny.
Some do it to prove a point (PETA), which most often has the opposite effect.
Some do it as revenge against others who have 'wronged' them.
Whatever the reason, content moderators are given no real counselling, and are certainly not paid enough for the terror they are exposed to every day, so you don't have to see it.
And, typically, the only time you really hear about moderators is when they decide to censor something that some people think shouldn't be censored. Regardless of whether the rules, or an interpretation of them, were actually broken, too many people forget that the use of social media is at the behest of the service provider, not a right that everyone has. Those who complain that taking down an image, one that breaks specific rules, is somehow a 'breach of free speech', or whatever nonsense they come out with, forget that it is entirely up to the platform whether it stays up or not.
And yes, while it is true that people more or less know what they're signing up for when they choose to moderate user-generated content on social media, that doesn't make it any less horrifying when they come across content beyond normal human tolerance for gore, suffering or deviance. So it is unsurprising that moderators feel they can sue such firms when the time and effort they spend making these companies money, let's face it, is undervalued by the failure to offer proper support and training for handling such unpleasant content.
Generally speaking, the human race is a group of despicable beings who delight in the pain and suffering of others.