The oversight board claims that Facebook’s handling of high-profile users is flawed.

The board wants Meta, the parent company of Facebook, to alter its cross-checking system.

Meta, Facebook’s parent company, says everyone must abide by its rules governing what content is and is not allowed on its platforms, including bans on hate speech and harassment.

On Tuesday, the board tasked with reviewing some of Meta’s most difficult content moderation decisions called that claim “misleading.”

In 2021, Meta asked the Oversight Board to look into a program called cross-check, which gives high-profile users of Facebook and Instagram an additional layer of review if their content is reported as violating the platforms’ policies. The Wall Street Journal made more information about the program public last year, reporting that the system shields millions of prominent users from the enforcement actions Facebook typically applies under its terms of service. For instance, Brazilian soccer star Neymar was able to share with tens of millions of his fans nude pictures of a woman who had accused him of rape before Facebook removed the content.

In a 57-page policy advisory opinion about the program, the Oversight Board pointed out various issues with Meta’s cross-check program, including that it gives some high-profile users extra protection. The opinion also calls the program’s effectiveness into doubt.

The Oversight Board Administration’s Thomas Hughes said in a statement that the opinion “details how Meta’s cross-check program prioritizes influential and powerful users of commercial value to Meta and as structured does not meet Meta’s human rights responsibilities and company values, with profound implications for users and global civil society.”

Here’s what you should know about Meta’s cross-check program:

Why did Meta create this program?

According to Meta, the cross-check program is meant to prevent the company from mistakenly taking action against content that doesn’t break its rules, particularly in cases where making a mistake carries a higher risk.

The company says it has used this program for posts from news organizations, celebrities, and governments. In a blog post published in 2018, Meta stated, “For example, we have Cross Checked an American civil rights activist’s account to prevent accidentally deleting instances of him raising awareness of hate speech he was encountering.”

In its transparency center, the company also provides more information about the program’s operation.

What problems did the board find with cross-check?

The board concluded that the program “treats users unequally” because content flagged for additional human review stays on the site longer. According to Meta, it can take more than five days to reach a decision on content from cross-checked users.

This means that, because of cross-check, content identified as breaking Meta’s rules is left up on Facebook and Instagram when it is most viral and could cause harm, according to the opinion.

The opinion also states that the program appears to serve Meta’s business interests more than its commitment to human rights. The board drew attention to transparency problems as well: Meta doesn’t disclose who is on its cross-check list, and the company doesn’t track data that would show whether the program leads to more accurate content moderation decisions.

The board asked Meta 74 questions about the program. Meta answered 58 of them fully and 11 partially, and left five unanswered.

What changes did the board recommend Meta make to cross-check?

The board made 32 recommendations to Meta, among them prioritizing content that is important for human rights and reviewing those users in a separate workflow from its business partners. A user’s number of followers or celebrity status shouldn’t be the only criterion for receiving additional protection.

Additionally, Meta should remove or hide highly inflammatory content that is reported as violating its guidelines during the initial review, while moderators re-examine the post.

“Such content should not be permitted to remain on the platform accumulating views simply because the person who posted it is a business partner or celebrity,” the opinion stated.

The board also wants Meta to be more transparent about the program by publicly marking some accounts protected by cross-check, such as those of state actors, political candidates, and business partners, so that the public can hold them accountable for upholding the platform’s rules. Users should also be able to appeal cross-checked content to the board.

How did Meta respond to the board’s opinion?

The company said it will review the board’s recommendations and respond within 90 days.

Meta said it has improved the program over the past year, including by expanding cross-check reviews to content from all 3 billion users. The company said it uses an algorithm to determine whether content faces a higher risk of being mistakenly taken down. Meta also said it has established annual reviews to examine who receives the extra level of review.