Facebook Notifies Users Who Read Misinformation About Corona

Facebook users who like, share, or comment on posts containing erroneous information about COVID-19 will soon be notified if that information turns out to be false.

If a post on Facebook is flagged as false by Facebook’s independent fact-checkers, which consist of professional news organizations, everyone who has responded to it or clicked through it in the past will be told it was wrong information. They will also be referred to reliable sources such as the WHO or the FPS Public Health.

Facebook has already been using that approach for fake news in general. It also says it has since removed hundreds of thousands of posts that violated the social network’s guidelines. Statements by politicians are the exception.

Facebook’s announcement comes just after campaign platform Avaaz published its own investigation of 104 posts and videos on Facebook that had been labelled false by fact-checkers.

On average, 22 days passed between a post appearing and Facebook labelling it as misinformation. A large portion of the debunked stories continued to circulate without a label.

Facebook is the only platform that informs users about misinformation, and it puts a lot of effort into preventing fake news from circulating unchecked.

The company recently announced that it would also extend that approach in Belgium in partnership with Knack, a sister publication of Data News, and DPA.

To be clear, however, the approach does not mean that those who do not receive a notification have not read fake news. Most fact checks are based on English-language content, for example, and not everything is checked. But in this way, the social network tries to contain the problem as much as possible.
