Facebook told Reuters that it is taking a more aggressive approach to shutting down coordinated groups of real user accounts that engage in certain harmful activities on its platform, using the same strategy its security teams apply against campaigns that rely on fake accounts.

The new approach, reported here for the first time, uses the tactics Facebook's security teams have typically reserved for mass takedowns of networks engaged in influence operations that use fake accounts to manipulate public debate, such as Russian troll farms.

The change could have a major impact on how the social media giant handles political and other coordinated campaigns that violate its rules, at a time when Facebook's handling of abuse on its platform is under scrutiny from global lawmakers and civil society groups.

Facebook said it now plans to take the same network-level approach to groups of coordinated real accounts that systematically break its rules, for example through mass reporting, in which many users falsely report a target's content or account to get it shut down, or brigading, a form of online harassment in which users coordinate to target an individual with a flood of posts or comments.

In a related change, Facebook said on Thursday that it would take the same type of approach to campaigns of real users that cause "coordinated social harm" on and off its platform, as it announced the takedown of the German Querdenken movement, which opposes COVID-19 restrictions.

These expansions, which a spokeswoman said are still in their early stages, mean Facebook's security teams can identify the core movements driving such behavior and take more sweeping action than simply removing posts or individual accounts.


In April, BuzzFeed News published a leaked internal Facebook report about the company's role in the January 6 riot at the U.S. Capitol and its challenges in curbing the fast-growing "Stop the Steal" movement. One of the findings was that Facebook had "little policy around coordinated authentic harm."

Facebook's security experts, who are separate from the company's content moderators and handle threats from adversaries trying to evade its rules, began cracking down on influence operations that use fake accounts in 2017, after the 2016 U.S. election in which U.S. intelligence officials concluded Russia had used social media platforms as part of an online influence campaign, a claim Moscow denies.

Facebook dubbed this banned activity by groups of fake accounts "coordinated inauthentic behavior" (CIB), and its security teams began announcing sweeping takedowns in monthly reports. The teams also handle some threats that may not rely on fake accounts, such as fraud or cyber-espionage networks, as well as overt influence operations like some state media campaigns.

Sources said teams at the company had long debated how it should intervene at the network level against large movements of real user accounts that systematically break its rules.

In July, Reuters reported on the Vietnamese army's online information warfare unit, which engaged in actions including mass reporting of accounts to Facebook, while often using real names. Facebook removed some accounts over those mass-reporting attempts.

Facebook is under increasing pressure from global regulators, lawmakers and employees to crack down on widespread abuses of its services. Others have criticized the company over allegations of censorship, anti-conservative bias or inconsistent enforcement.


Expanding Facebook's network-disruption model to authentic accounts raises further questions about how the change might affect public debate, online movements and campaign tactics across the political spectrum.

"A lot of the time, problematic behavior will look very close to social movements," said Evelyn Douek, a Harvard Law lecturer who studies platform governance. "It's going to hinge on this definition of harm… but obviously people's definitions of harm can be quite subjective and nebulous."

High-profile examples of coordinated activity around last year's U.S. election, from teenagers and K-pop fans who claimed credit for using TikTok to sabotage former President Donald Trump's rally in Tulsa, Oklahoma, to political campaigns paying online meme creators, have also sparked debate over how platforms should define and handle coordinated campaigns.

© Thomson Reuters 2021