Facebook Updates Controversial Content Moderation Process for VIPs


New York (CNN) —

Facebook parent Meta on Friday announced updates to its “cross-check” moderation system after facing criticism for giving VIPs’ posts a different review process than those of regular users.

But Meta declined to adopt all of the changes previously recommended by its own Oversight Board, including a proposal to publicly identify which high-profile accounts are eligible for the program.

The cross-check program came under scrutiny in 2021 after Wall Street Journal reporting indicated that the system shielded certain VIP users, including politicians, celebrities, journalists, and Meta business partners such as advertisers, from the company’s normal content moderation process, in some cases allowing them to post rule-violating content without consequences.

The program had grown to cover 5.8 million users as of 2020, according to the Journal. Meta’s Oversight Board said, following the report, that Facebook had failed to provide it with crucial details about the system. At the time, Meta said criticism of the system was fair, but that cross-check was created to improve the accuracy of moderation on content that “may require more understanding.”

In its December policy advisory, Meta’s Oversight Board called the program one designed to “serve business interests” and said it risked harming everyday users. The board, a Meta-funded body that says it operates independently, urged the company to “drastically increase transparency” around the cross-check system and how it works.

On Friday, Meta said it would fully or partially implement many of the Oversight Board’s more than two dozen recommendations for improving the program.

Among the changes it has committed to making, Meta says it will work to distinguish between accounts included in the enhanced review program for business reasons and those included for human rights reasons, and to detail those distinctions for the Oversight Board and in the company’s transparency center. Meta also plans to refine its process for temporarily removing or hiding potentially harmful content while it awaits additional review. And the company said it will work to ensure that cross-check reviewers have the appropriate language and regional expertise “whenever possible.”

The company, however, declined to implement other recommendations, such as publicly labeling the accounts of government officials, political candidates, business partners, members of the media, and other public figures included in the cross-check program. Meta said such public labels could make those accounts “potential targets for attackers.”

“We are committed to maintaining transparency with the board and the public as we continue to deliver on our commitments,” Meta said in a statement about the cross-check program.

The Oversight Board said in a tweet Friday that the company’s proposed changes to the cross-check program “could make Meta’s approach to error prevention fairer, more credible and legitimate, given key criticisms” in its December policy advisory.