In a transparency report published today, Meta announced that it had taken down thousands of accounts tied to what it describes as the largest known cross-platform covert influence operation to date. The operation spanned more than 50 platforms, including Facebook, Instagram, Twitter (now X), YouTube, TikTok, and Reddit.
The campaign aimed to promote “positive commentary” about China and its Xinjiang province, where the Chinese government has been accused of human rights abuses and forced labor against the Uyghur ethnic minority. The network also posted criticism of the US, Western foreign policies, and journalists and researchers seen as critics of the Chinese government.
Although the influence campaign was based in China, it targeted several regions, including the US, Taiwan, Australia, the UK, and Japan. Meta was able to trace the operation back to individuals associated with Chinese law enforcement. This campaign was part of a series of similar campaigns that Meta has been combating since 2019, referred to collectively as “Spamouflage”.
During its investigation, Meta uncovered clusters of fake accounts that posted on a regular schedule, suggesting that the people behind them worked together in a shared office with scheduled meal breaks. These accounts spread similar content across multiple platforms, often using misleading headlines to attract attention.
Overall, Meta took down 7,704 Facebook accounts, 954 Pages, 15 Groups, and 15 Instagram accounts. Despite the breadth of the operation, the network failed to gain many authentic followers. Many of the followers it did acquire were fake accounts from countries such as Vietnam, Bangladesh, and Brazil, which were not the intended targets of the disinformation campaign.
According to the report, the Spamouflage network has increasingly targeted smaller platforms since Meta began cracking down on it in 2019. These include the Chinese financial forum nanyangmoney[.]com and the Australian local forum Melbournechinese[.]net.
While this disinformation campaign is the largest one flagged by Meta, it is not the only one mentioned in the report. Meta also removed dozens of accounts linked to other covert influence campaigns that targeted Turkey in violation of Meta’s policies. Moreover, after taking action last year against a Russia-based operation spreading disinformation about the war in Ukraine, Meta found that the campaign had since expanded its targets to include countries in Europe, the US, and Israel. Impersonating news outlets was a common tactic across many of these campaigns.
Meta’s efforts to combat misinformation and disinformation are ongoing, and the company is continually working to identify and remove such content from its platforms. These campaigns underscore the need for continued vigilance and robust measures against the spread of false information, and for protecting user trust and safety online.