More than 200 Facebook moderators say the social media company is putting their lives at risk because they are required to work in the office, even when the workplace is located in a coronavirus hotspot. The complainants set out their concerns in a letter to Facebook's leadership.
Moderators are employed worldwide to remove harmful content from platforms such as Facebook, including violent or abusive images. In Facebook's case, moderators often work for the company through outsourcing firms.
According to the letter writers, several employees in the office have now become infected with covid-19.
At the start of the crisis, Facebook asked its employees to work from home as much as possible. However, some content, such as explicit images involving children, must be reviewed in a secure environment.
In some cases, Facebook tries to keep such images off its platforms using artificial intelligence (AI). According to the complainants, however, Facebook's AI is not capable enough to replace human judgment, which is why moderators must be physically present.
In addition to calling for a safer workplace, the moderators want to be offered permanent contracts with Facebook itself, and they want hazard pay to be included in any agreement.