Do Facebook's algorithms drive political polarization? Meta says no, but researchers say it's complicated

Four research papers published this week concluded that users' political beliefs and behavior don't seem to be much affected by information amplified by Facebook's algorithms. But does that mean the social media platform doesn't contribute to political divisiveness? Meta believes so. "Although questions about social media's impact …

  1. pc-fluesterer.info

    Mock fight

    The main menace to democracy is not the algorithms but the individually targeted advertising based on "big data" user profiles. That can be quite manipulative, as seen in the election campaign for the MAGA POTUS. In Germany we have a recent example as well: https://targetleaks.de/

  2. Anonymous Coward

    Of course algorithms drive political polarization (though indirectly)

    That's because online marketing is the business model of Facebook et al., and probably more profitable than selling drugs.

    If marketing did not work, social media would not be so hugely profitable. There is a fundamental reason for the existence of a lucrative ecosystem of SEOs and social media agencies around the networks: just check the job offers and above-average salaries in social media marketing.

    Intentionally or not, those companies become ministries of truth, because what users see, read or buy is heavily manipulated by third parties, even if the algorithms were not designed for it. Examples: fake reviews, paid back-links in search, fake views, likes, bot accounts and comments. It is no coincidence that some countries employ whole armies of online trolls; they would not do it if it did not work.

    Then there is a whole army of influencers who have discovered that stirring up polarization is lucrative, because people love scandal: online tabloid journalism.

    Counter-measures should focus on tracking authority propagation rather than the information itself: who said what and where, how authoritative the source was, and then who back-linked, liked or reposted the message. Unfortunately this information is typically not public.
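The idea in the comment above can be illustrated with a small, purely hypothetical sketch (not taken from the article or the studies): build a who-amplified-whom graph from reposts and back-links, seed a few sources with hand-assigned trust, and let every account inherit a damped, PageRank-style average of the authority of what it amplifies. The account names, seed values and damping factor below are invented for illustration only.

```python
# Hypothetical sketch of "authority propagation" over a repost/back-link graph.
# All accounts, seed scores and parameters are made up for illustration.
from collections import defaultdict

# Directed edges: (amplifier, original_author) -- "amplifier reposted/linked original_author".
reposts = [
    ("bot_account_1", "tabloid_influencer"),
    ("bot_account_2", "tabloid_influencer"),
    ("tabloid_influencer", "anonymous_source"),
    ("newspaper_editor", "field_reporter"),
    ("casual_user", "newspaper_editor"),
]

# Hand-assigned trust for a few known sources (an assumption of this sketch).
seed = {"field_reporter": 1.0, "anonymous_source": 0.1}

def propagate_authority(edges, seed, damping=0.85, iterations=20):
    """Each account inherits a damped average of the authority of the sources it amplified."""
    nodes = {n for edge in edges for n in edge}
    amplified_by = defaultdict(list)          # account -> sources that account amplified
    for amplifier, author in edges:
        amplified_by[amplifier].append(author)

    score = {n: seed.get(n, 0.0) for n in nodes}
    for _ in range(iterations):
        new_score = {}
        for n in nodes:
            sources = amplified_by[n]
            if sources:
                inherited = sum(score[s] for s in sources) / len(sources)
                new_score[n] = (1 - damping) * seed.get(n, 0.0) + damping * inherited
            else:
                new_score[n] = seed.get(n, 0.0)   # pure sources keep their seed trust
        score = new_score
    return score

if __name__ == "__main__":
    ranked = sorted(propagate_authority(reposts, seed).items(), key=lambda kv: -kv[1])
    for account, value in ranked:
        print(f"{account:20s} {value:.3f}")
```

In this toy example the accounts that only amplify the low-trust anonymous_source end up with low scores, while the chain that reposts the trusted field_reporter inherits high ones; real counter-measures would of course need the (non-public) repost and back-link data the commenter mentions.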

  3. I am David Jones

    "The papers, published in Nature and Science (here are the links to the two other studies), show that people's political opinions aren't generally swayed by political news being posted and shared on Facebook"

    I guess what is relevant is whether, for those whose political opinions *are* swayed by FB, there is a tendency to move towards or away from the centre.

    1. pc-fluesterer.info

      No question here

      Of course the tendency is away from the centre. The algorithms foster divisiveness, polarisation and radicalisation, because that is what catches people's attention.
