Facebook’s algorithms promote posts featuring “outrage” and “sensationalism”, leading to more outrage and divisiveness

Facebook curates your “news feed”: it promotes posts that have likes, comments, or shares, or that come from people you have previously interacted with. Posts from friends with whom you have interacted little gradually vanish.

Thus, FB’s curated news feed hides much from you while promoting items that, as a side effect of the algorithm, tend to be sensational or outrage-inducing, since those generate more interaction.
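To see how this feedback loop arises, here is a toy sketch of engagement-weighted ranking. This is purely illustrative: the weights, names, and scoring formula are made up for this example, and Facebook's actual ranking system is far more complex and not public.

```python
# Toy sketch of engagement-weighted feed ranking (illustrative only;
# all weights and names here are invented for the example).

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post, interaction_history: dict) -> float:
    """Score a post by weighted engagement signals plus author affinity.
    Comments and shares are weighted more heavily than likes, so
    provocative posts that draw reactions climb the feed."""
    score = post.likes * 1.0 + post.comments * 3.0 + post.shares * 5.0
    # Boost posts from authors the viewer has interacted with before.
    score *= 1.0 + interaction_history.get(post.author, 0) * 0.1
    return score

def rank_feed(posts, interaction_history):
    """Most-engaging posts first; low-scoring posts sink out of sight."""
    return sorted(posts,
                  key=lambda p: engagement_score(p, interaction_history),
                  reverse=True)

posts = [
    Post("quiet_friend", likes=2, comments=0, shares=0),
    Post("outrage_page", likes=50, comments=120, shares=80),
]
ranked = rank_feed(posts, {"quiet_friend": 1})
print([p.author for p in ranked])  # the outrage-bait post outranks the quiet friend
```

Even in this crude sketch, the calm post from a friend scores orders of magnitude below the outrage-bait, and anything not shown gets no chance to earn future interactions, so it sinks further on the next pass.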

Company researchers discovered that publishers and political parties were reorienting their posts toward outrage and sensationalism. That tactic produced high levels of comments and reactions that translated into success on Facebook.

They concluded that the new algorithm’s heavy weighting of reshared material in its News Feed made the angry voices louder. “Misinformation, toxicity, and violent content are inordinately prevalent among reshares,” researchers noted in internal memos.

Source: Facebook Tried to Make Its Platform a Healthier Place. It Got Angrier Instead. – WSJ

Many groups – including political parties and those operating as propaganda arms to persuade others to adopt their positions – figured this out and intentionally posted more provocative content to get their posts seen more widely.

The effect of Facebook and other social media “curation” is to promote a culture of outrage that feeds upon itself, creating still more outrage and divisiveness. This in turn leads to more and more angry people, and eventually spills over into real-life confrontations and protests. Facebook has publicly denied this is their fault, while their internal documents show they’ve known about it for a long time.

Facebook has long blamed users for “misinformation” being spread on their platform; yet their own algorithms dramatically encouraged the spread of misinformation and inflammatory rhetoric.

I have an entire blog on the ills of social media – see SocialPanic.org.

I saw the effects of their algorithm changes first hand. In addition to the Social Panic blog, I also run a blog on computer programming. Years ago, both had their posts cross-published to their own group pages on Facebook. Eventually I saw that viewership of the group pages had plummeted to the point where maybe just 1 in 10 members of each group ever saw the posts. Facebook’s algorithm had largely hidden the blog posts from view. I eventually stopped using Facebook groups because their value had become nil.

In effect, FB’s algorithm changes, meant to improve interaction, had the opposite effect for these non-controversial topics, to the point that their content effectively vanished from Facebook.