Washington — When the user, who went by the name Harlan, first appeared on social media, he claimed to be an Army veteran living in New York and supporting Donald Trump for president. Harlan said he was 29 years old, and his profile picture showed a handsome, smiling young man.
A few months later, Harlan had undergone a transformation: He now claimed to be 31 and from Florida.
A new investigation into a Chinese disinformation network targeting American voters found that Harlan’s claims were as false as his profile photo, which analysts believe was created using artificial intelligence.
As voters prepare to go to the polls this fall, China has been hatching its own plan, building a network of fake social media users to mimic Americans. Regardless of who Harlan is or where he is, he is just one part of a larger effort by America’s adversaries to use social media to influence and subvert American political debate.
The account was traced by analysts at Graphika, a New York company that tracks online networks, to the Chinese disinformation group Spamouflage, which has been known to online researchers for several years and earned its nickname for its tendency to spread large amounts of seemingly unrelated spam content alongside misinformation.
“One of the world’s largest covert online influence operations, run by Chinese state actors, has become increasingly aggressive in its efforts to infiltrate and sway US political debate ahead of the election,” Jack Stubbs, Graphika’s chief information officer, told The Associated Press.
Intelligence and national security officials say Russia, China and Iran are conducting online influence operations aimed at American voters ahead of the November election. Iran has become more aggressive in recent months, including covertly supporting U.S. protests against the Gaza war and attempting to hack into the email systems of two presidential candidates, but Russia remains the biggest threat, intelligence agencies say.
But China is taking a more cautious and nuanced approach. Beijing doesn’t see much advantage in endorsing one presidential candidate or the other, intelligence analysts say. Instead, Chinese disinformation campaigns are focusing on election issues that are of particular importance to Beijing, such as U.S. policy toward Taiwan, in an effort to undermine confidence in the election, the vote, and the United States overall.
Officials say this is a long-term effort that will continue beyond Election Day as China and other authoritarian countries seek to use the internet to undermine support for democracy.
Chinese embassy spokesman Liu Pengyu rejected Graphika’s findings as full of “bias and malicious speculation” and said “China has no intention and will not interfere in elections.”
The platform X, formerly known as Twitter, suspended some of the accounts linked to the Spamouflage network after questions were raised about their authenticity. The company did not respond to questions about why it suspended the accounts or whether the move was related to Graphika’s report.
TikTok also removed accounts associated with Spamouflage, including Harlan’s.
“Throughout the US presidential election, we will continue to remove false accounts and harmful misinformation to protect the integrity of our platform,” a TikTok spokesperson said in an emailed statement on Tuesday.
Compared with armed conflict or economic sanctions, online influence operations can be a low-cost, low-risk means of exerting geopolitical power. Given our increasing reliance on digital communications, the use of online disinformation networks is only likely to increase, said Max Lesser, senior emerging threats analyst at the Foundation for Defense of Democracies, a national security think tank in Washington.
“The influence operations are going to broaden the playing field beyond just Russia, China and Iran to also include smaller actors,” Lesser said.
Lesser said the list could include not only nation-states but also criminal organizations, domestic extremist groups and terrorist organizations.
When analysts first noticed Spamouflage five years ago, the network tended to post generally pro-China, anti-American content. In recent years, Spamouflage has expanded and grown more strident in tone, focusing on divisive political topics such as gun control, crime, race relations and support for Israel during the Gaza war. The network also began mass-producing fake accounts designed to mimic U.S. users.
The Spamouflage accounts rarely post original content, instead using platforms like X and TikTok to recycle and repost content from far-right and far-left users. Some of the accounts are designed to appeal to Republicans, while others are geared toward Democrats.
While Harlan’s account was successful in garnering attention (a video mocking President Joe Biden received 1.5 million views), many of the accounts created by the Spamouflage network failed to gain traction, a reminder that online influence operations are often a numbers game: The more accounts there are, the more content there is and the greater the chance that any given post will go viral.
Many of the newly linked Spamouflage accounts took pains to appear American, sometimes in blatant ways. “I’m American,” one account proclaimed. Some betrayed their false American personas with awkward English or odd word choices, some clumsier than others. One account’s biography read, “My English is bad, but I’m smart and I love Trump.”
Graphika researchers believe Harlan’s profile photo was created using AI, noting it was identical to one used by an account linked to an earlier Spamouflage campaign. Messages sent to the person running Harlan’s account were not returned.