
Meta takes down thousands of fake Facebook accounts targeting 2024 voters

18:39 30.11.2023

In a recent announcement, Meta revealed that thousands of fake social media accounts originating from China were created to pose as American users and spread polarizing political content. The network, which consisted of nearly 4,800 fake accounts, aimed to build an audience on Meta-owned platforms like Facebook and Instagram. However, the tech company was able to identify and eliminate the network before it gained significant traction.

The fake accounts used false photos, names, and locations to pass as ordinary American Facebook users actively engaged in political discussions. Unlike other networks that spread fabricated content, these accounts focused on resharing posts from X, formerly known as Twitter, written by politicians, news outlets, and other sources. The reshared material was drawn from both liberal and conservative sources, indicating that the network's goal was not to promote a particular side but to exacerbate partisan divisions and inflame polarization.

This newly discovered network highlights how foreign adversaries exploit U.S.-based tech platforms to sow discord and distrust. It also sheds light on the potential threats posed by online disinformation in the upcoming national elections not only in the United States but also in countries such as India, Mexico, Ukraine, Pakistan, and Taiwan.

While Meta did not publicly attribute the network to the Chinese government, it confirmed that the operation originated in China. The content the accounts disseminated aligns with other Chinese government propaganda and disinformation campaigns that seek to magnify partisan and ideological divisions within the U.S. Some of the accounts later swapped their American-sounding names and profile pictures for ones suggesting they were from India and began spreading pro-Chinese content about Tibet and India, showing how fake networks can be redirected toward new targets.

Meta has frequently emphasized its efforts to shut down fake social media networks as evidence of its commitment to protecting election integrity and democracy. However, critics argue that the company's focus on fake accounts deflects attention from its failure to address the existing misinformation on its platform that has contributed to polarization and distrust. For instance, Meta allows paid advertisements on its site that propagate baseless claims about the 2020 U.S. election being rigged or stolen, amplifying the lies of former President Donald Trump and other Republicans. Multiple federal and state election officials, as well as Trump's own attorney general, have debunked these claims.

When questioned about its ad policy, Meta stated that it is primarily focused on future elections and will reject ads that cast unfounded doubt on upcoming contests. However, the company has been inconsistent in its approach. While it recently announced a new artificial intelligence policy requiring political ads to carry a disclaimer if they contain AI-generated content, it has allowed other manipulated videos, such as a digitally edited video of President Joe Biden falsely claiming he is a pedophile, to remain on the platform.

Critics argue that Meta cannot be trusted and accuse the company of failing to take its role in the public sphere seriously, urging people to judge Meta by its actions rather than its promises. Meta executives discussed the network's activities during a conference call with reporters held the day after the company announced its election policies for the coming year. Experts who study the link between social media and disinformation, however, expect 2024 to bring new challenges, particularly given the emergence of sophisticated AI programs that can produce lifelike audio and video to mislead voters.

Jennifer Stromer-Galley, a professor at Syracuse University who studies digital media, described Meta's election plans as modest compared with the largely unregulated environment on X. Since acquiring the platform, Elon Musk has eliminated content moderation teams, reinstated previously banned users, and used the site to spread conspiracy theories. Both Democrats and Republicans have called for legislation addressing algorithmic recommendations, misinformation, deepfakes, and hate speech, but the likelihood of significant regulation passing before the 2024 election is slim, making self-regulation by the platforms crucial.

Kyle Morse, the deputy executive director of the Tech Oversight Project, a nonprofit advocating for federal regulations on social media, believes that Meta's current efforts to protect the election serve as a concerning preview of what can be expected in 2024. As major countries prepare to hold national elections and disinformation tactics become increasingly sophisticated, the responsibility falls on social media platforms to proactively address these challenges and ensure the integrity of the democratic process.

/ Thursday, 30 November 2023 /

themes: Meta, China, AI (Artificial intelligence), X (Twitter), USA, Elon Musk, Facebook


