Bug at Facebook Gives Harmful Posts Much More Visibility


Facebook posts containing violence or fake news, which are normally automatically demoted, were instead shown more often than usual due to an error in the ranking algorithm. Facebook maintains that the impact was limited.


According to an internal report obtained by The Verge, Facebook itself first noticed the problem in October, and it was fixed on March 11. Last fall, the company found that up to half of all newsfeeds had been shown more harmful content over the previous six months.

It concerns posts that are not strictly prohibited by the company but that sit on the edge of what is allowed: certain forms of nudity, for example, posts depicting violence, or disinformation that external fact-checkers have already debunked. Such posts may still be shared, but Facebook ensures they appear less often in other people’s newsfeeds, making them less likely to go viral.

But that’s where things went wrong. Instead of being demoted, these posts were shown more often, increasing their views by as much as thirty percent. Initially, the company could not pinpoint a cause, because the surge would subside and then return; the problem was only resolved at the beginning of this month.

Facebook confirmed the incident to The Verge but downplayed it as a small and temporary increase. “It has had no meaningful impact on our statistics.”

That statement deserves some context, however: Facebook has repeatedly lied about, or kept silent on, figures concerning the impact of its own social network. Last year, for example, the company tried to frame its own security problems as an industry-wide issue; it also knew internally how harmful Instagram is for young girls but said nothing publicly. In the past, it has likewise misled advertisers about the success of video on its platform.

This week, it also emerged that parent company Meta had hired a PR agency to spread negative stories about TikTok and frame that platform as a danger to children, often citing stories that in fact originated on Facebook. In short, scandals about Facebook, and about how the company communicates about them, have been surfacing for more than a decade, to the point that it is impossible to know whether Facebook is telling the truth or flatly lying about its internal workings.
