Strike four: Facebook misses election misinfo in Brazil ads

Facebook failed to detect blatant election-related disinformation in ads ahead of Brazil’s 2022 elections, a new report from Global Witness has found, continuing what the group describes as an “alarming” pattern of failing to catch material that violates its policies.

The ads contained false information about the country’s upcoming elections, including a wrong election date and incorrect voting methods, and cast doubt on the integrity of the election.

This is the fourth time the London-based nonprofit has tested Meta’s ability to detect flagrant violations of the rules of its most popular social media platform, and the fourth such test Facebook has failed. In the three previous cases, Global Witness submitted ads containing violent hate speech to see whether Facebook’s controls, whether human reviewers or artificial intelligence, would catch them. They didn’t.

“Facebook has identified Brazil as one of its priority countries where it is investing special resources specifically to tackle election-related disinformation,” said Jon Lloyd, senior adviser at Global Witness. “So we wanted to really test their systems with enough time for them to perform. And with the US midterm elections right around the corner, Meta simply has to get it right, and right now.”

Brazil’s national elections will be held on October 2 amid high tensions and misinformation that threaten to discredit the electoral process. Facebook is the most popular social media platform in the country. In a statement, Meta said it has “widely prepared for the 2022 elections in Brazil.”

“We’ve launched tools that promote reliable information and flag election-related posts, we’ve established a direct channel for the Superior Electoral Court to send us potentially harmful content for review, and we continue to work closely with Brazilian authorities and investigators,” the company said.

In 2020, Facebook began requiring advertisers who want to run election or political ads to complete an authorization process and include “Paid for” disclaimers in those ads, similar to what it does in the US. The increased safeguards followed the 2016 US presidential election, when Russia used rubles to pay for political ads designed to stoke division and unrest among Americans.

Global Witness said it broke those rules when it sent the test ads (which were approved for publication but never ran). The group placed the ads from outside Brazil, from Nairobi and London, which should have raised red flags.

It also wasn’t required to put a “paid for” disclaimer on the ads and didn’t use a Brazilian payment method, all safeguards Facebook says it has put in place to prevent misuse of its platform by malicious actors trying to interfere in elections around the world.

“What’s pretty clear from the results of this investigation and others is that their content moderation capabilities and the integrity systems they deploy to mitigate some of the risk during election periods simply don’t work,” Lloyd said.

The group is using ads as a test and not regular posts because Meta claims to hold ads to an “even stricter” standard than regular, unpaid posts, according to its help center page for paid ads.

But judging by the four investigations, Lloyd said that’s not really clear.

“We constantly have to take Facebook at its word. And without a verified independent third-party audit, we simply cannot hold Meta or any other tech company accountable for what they say they’re doing,” he said.

Global Witness submitted ten ads to Meta that clearly violated its policies on election-related advertising. They included false information about when and where to vote, for example, and cast doubt on the integrity of Brazil’s voting machines, echoing disinformation used by malicious actors to destabilize democracies around the world.

It will be the first Brazilian election since far-right President Jair Bolsonaro, who is seeking re-election, came to power. Bolsonaro has repeatedly attacked the integrity of the country’s electoral systems.

“Disinformation played a major role in the 2018 election, and this year’s election is already clouded by reports of widespread disinformation, spread from above: Bolsonaro is already sowing doubt about the legitimacy of the election result, raising fears of a coup attempt inspired by the United States’ Jan. 6 ‘stop the steal’ insurrection,” Global Witness said.

In its previous research, the group found that Facebook failed to catch hate speech in Myanmar, where ads used a slur to refer to people of Muslim or East Indian origin and called for their death; in Ethiopia, where ads used dehumanizing hate speech to call for the killing of people belonging to each of Ethiopia’s three main ethnic groups; and in Kenya, where ads talked about beheadings, rapes and bloodshed.
