Facebook is Testing Ability to Warn Users about Extremist Messages

Facebook is testing an alert feature that detects extremist content and offers people information about where to seek help. The test, which became public on Friday, caused concern among members of the American Republican Party.

Facebook spokesperson Andy Stone indicated on Twitter that it is an initiative to combat violent extremism. The Redirect Initiative aims to direct people who type search terms linked to hatred or violence to specific resources, information and support groups.

For example, according to Facebook, searches linked to white supremacy in the United States are being diverted to the Life After Hate group, which helps people leave far-right groups.

Screenshots of warning messages shared on Twitter show Facebook also asking users whether they are concerned that a friend is becoming an extremist, or whether they themselves have been exposed to extremist ideas. Users can then click a link to get help or dismiss the pop-up message.

Facebook and other online platforms are under pressure to stop the spread of fake news and other messages that could fuel violence.

The social network has recently strengthened its automated tools that help moderators keep exchanges on its pages and in its groups civil. The tools check whether messages respect Facebook's rules about acceptable content.

Facebook also removes profiles that break the rules. For example, former President Donald Trump's profile was removed after he encouraged his supporters to storm the Capitol in Washington in early January.
