Indian influencer ER Yamini has never tweeted in her life; she prefers to cultivate her huge fan base on Instagram and YouTube.
But in early March, a Twitter account using her photo tweeted “#IStandWithPutin. True Friendship”, accompanied by a video showing two men embracing, one representing India and the other Russia.
Yamini says she does not support either country in the Russia-Ukraine war and worries about her supporters.
“If they see that tweet, what will they think of me?” she wonders. “I wish they didn’t use my picture on that profile.”
The fake account is part of a network promoting Russian President Vladimir Putin on Twitter, which used the hashtags #IStandWithPutin and #IStandWithRussia on March 2 and 3.
That pushed the hashtags into trending topics in different regions, particularly in the global south, seemingly showing support for the war in countries like India, Pakistan, South Africa and Nigeria.
Some of the activity tracked was organic, in other words produced by real people, reflecting genuine support in some countries for Putin and Russia.
But many other profiles appear to have been inauthentic. They retweeted posts in large numbers, produced few original posts, and were created very recently.
“They were likely produced by bots, fake profiles, or compromised accounts, artificially amplifying support for Putin in these countries,” says Carl Miller, co-founder of CASM Technology, a company that investigates online harm and disinformation.
CASM tracked 9,907 profiles promoting support for Russia on March 2 and 3, in several different languages, and found that over 1,000 of those accounts had spam-like characteristics.
The BBC investigated hundreds of these apparently fake profiles. Our investigation confirms Miller’s suspicions: they’re trying to pass as genuine, but they’re actually fake.
Through reverse image search, we discovered that the images used by these profiles were copied from celebrities, influencers, and ordinary users, who had no idea that their faces were being used to support Russia in its war against Ukraine.
We have not been able to determine who created the accounts or if they have any connection to the Russian government.
An account called Preety Sharma, for example, claims in its bio to belong to a “model and businesswoman” originally from India, now in Miami.
It was created on February 26, two days after Russia’s invasion of Ukraine. “Putin is a good person,” says one of its retweets.
But the woman depicted in the account’s profile photo is on the other side of the world.
Nicole Thorne is an Australian social media influencer who has 1.5 million followers on Instagram and only occasionally uses her genuine Twitter profile.
Another account tries to impersonate the Indian singer Raja Gujjar. Its first tweet was posted on February 24, the first day of the invasion. And all 178 of the account’s posts are retweets, a strong indicator of automation.
The BBC contacted Thorne and Gujjar, and both confirmed that these accounts were not theirs.
However, while some of the investigated accounts were very bot-like, not all of them were fake.
Doing a reverse image search on one of the profiles, which was created in February 2022, had no followers, and had only tweeted since March 2, the BBC found the LinkedIn account of a young Indian man.
The profile turned out to be authentic, created by Senthil Kumar, an aeronautical engineer. We asked him why he had made an account solely to retweet pro-Russian messages.
“Usually I open Twitter and see the trends. So I saw these posts and just retweeted them,” he said.
He thinks that Russia has supported India in the past and that Indians should now support Russia. And his profile is new, he said, because he forgot the password to his old account.
The accounts tweet a mix of criticism of Western countries, expressions of solidarity among the so-called Brics countries (Brazil, Russia, India, China, South Africa), and direct support for Putin.
“We default to the idea that information campaigns will target the West. However, none of the accounts were targeting the West or claiming to be from the West,” says Miller.
To identify what could be a group of inauthentic accounts, he adds, researchers analyze account creation dates, an “inhuman” tweeting pattern (such as an account that tweets 24 hours a day), and the variety of topics tweeted.
“None of these things is a smoking gun, but they all add up to allow us to see if a certain community of accounts looks suspicious,” says Miller.
The lack of a genuine profile image can also be a telltale sign.
Of a sample of 100 accounts tracked by CASM, the BBC found that 41 had no profile photos. Another 30 had illustrations or photographs of personalities such as Putin or the CEO of Facebook, Mark Zuckerberg. Only a quarter had photographs of people, and some of them were stolen.
Twitter prohibits impersonation of “individuals, groups, or organizations to mislead, confuse, or deceive others.”
The company told us that since the war began, it has deleted more than 100,000 accounts for violations of its spam and platform manipulation policies, including suspending dozens of accounts connected to the hashtags #IStandWithRussia and #IStandWithPutin.
Twitter says it investigated and suspended hundreds of the accounts flagged by the CASM investigation and sent to the platform by the BBC, including 11 of the 12 accounts specifically flagged by us for using other people’s profile pictures.
But it made clear that it found no evidence of widespread coordination to artificially amplify sentiment around the Ukraine war.