Shareholders Want More Control Over Risk and Safety at Meta

Some shareholders and a nonprofit are asking Meta, the parent company of Facebook, for more independent scrutiny of its efforts to keep its platforms safe. Similar calls in the past, however, have often fallen on deaf ears.

The letter, which Axios was able to view, calls for an independent review by the audit and risk oversight committee, Facebook/Meta's internal control body, specifically focused on risks to public safety and the public interest.

Harrington Associates and the Park Foundation, two Facebook shareholders, together with the nonprofit Campaign for Accountability and others, are asking for such a review to be put to a vote at Meta's shareholders' meeting.

The chance that such a proposal will be approved is small. Although Facebook (now Meta) is a publicly traded company, its shareholder structure gives founder and CEO Mark Zuckerberg 58 percent of the voting shares. In other words, even though he does not own half of the company, he continues to make all the decisions.

This is not the first time external calls have been made for more control and better governance at Facebook. A similar demand arose in 2018, following the Russian disinformation campaigns that contributed to Trump's election. The role of Facebook's audit committee was then expanded, but not to the extent that had been requested.

Perhaps something similar will happen again, because Meta is once more under enormous political and social fire. The company failed to see that a coup attempt in the US was being planned on its platform early this year. It also allegedly overpaid its fine to the American regulator FTC in order to shield Zuckerberg himself from prosecution.

But the biggest uproar came after the documents leaked by whistleblower Frances Haugen. They showed, among other things, that Facebook knows Instagram is harmful to young people and can contribute to suicide, and that the company maintains VIP lists of users who are allowed to break its rules.

Perhaps the most serious finding, though, is that Facebook consistently did nothing, or acted only after the fact, against disinformation and hate campaigns in Myanmar targeting the Rohingya. This Muslim population was hunted down and massacred: hate and lies spread massively on the platform helped force hundreds of thousands to flee, and about 25,000 people were killed while Facebook stood by.

Events like these show that Facebook, which makes about $3 billion in profit a month, is still failing to keep its platform safe for its users, even as it wants to invest heavily in its metaverse, a new way of socializing and interacting, presumably with the same pain points it currently leaves unaddressed or tackles too late.
