New Delhi: Highlighting the rise in manipulated content targeting India, two major tech firms, Meta and OpenAI, said in separate reports this week that they have taken action against accounts posting content aimed at influencing public debate on issues including the general elections in India.
OpenAI, the US-based artificial intelligence (AI) company that created ChatGPT, said in its report released this week that it had disrupted a covert influence campaign by an Israeli firm that used its AI models to create fake social media personas and generate content related to the Indian elections, including anti-Bharatiya Janata Party (BJP) content, which was posted across multiple social media platforms.
Meta, which owns Facebook, Instagram and WhatsApp, said that it removed a number of Facebook accounts, pages and groups for violating its policy against “coordinated inauthentic behaviour”. These accounts, groups and pages originated in China, and targeted the Sikh community not just in India but in several other countries as well.
In its report, OpenAI said that in May it disrupted some activity focused on the Indian elections less than 24 hours after it began. Notably, this is the first time the company has released such a report, titled ‘AI and Covert Influence Operations: Latest Trends’.
The firm said that an Israeli political campaign management firm, STOIC, was generating content about the Gaza conflict and, to a lesser extent, about the Histadrut trade union organisation in Israel and the Indian elections.
“We banned a cluster of accounts operated from Israel that were being used to generate and edit content for an influence operation that spanned X, Facebook, Instagram, websites, and YouTube,” the firm said, adding that this network was operated by STOIC.
The operation, it said, used OpenAI models to create fictional personas and bios for social media based on variables such as age, gender and location, and to conduct research into people in Israel who commented publicly on the Histadrut trade union. “Our models refused to supply personal information in response to these prompts,” it said.
The company added that content generated by this network was posted across multiple social media platforms, including Facebook, Instagram and X. “In some cases, we identified this operation’s fake accounts commenting on social-media posts made by the operation itself, likely in an attempt to create the impression of audience engagement,” it said. However, according to OpenAI, this campaign attracted low levels of engagement.
Many of the social-media accounts that posted this network’s content used profile pictures that appear to have been created using an earlier type of artificial intelligence: generative adversarial networks (GANs). Such images can be readily downloaded from the internet, it said.
“This operation was divided into a number of topical campaigns, most of which were loosely associated with the Gaza conflict and the broader question of relations between individuals of Jewish and Muslim faith. In May, we disrupted some activity focused on the Indian elections less than 24 hours after it began,” OpenAI said.
It added that in May, the network began generating comments focused on India that criticised the ruling BJP and praised the opposition Congress party.
Reacting to the report, Minister of State for Electronics and IT Rajeev Chandrasekhar said on X, “It is absolutely clear and obvious that @BJP4India was and is the target of influence operations, misinformation and foreign interference, being done by and/or on behalf of some Indian political parties. This is a very dangerous threat to our democracy. It is clear vested interests in India and outside are clearly driving this and needs to be deeply scrutinized/investigated and exposed. My view at this point is that these platforms could have released this much earlier, and not so late when elections are ending.”
OpenAI said it has shared threat indicators with industry peers, and that so far these campaigns do not appear to have meaningfully increased their audience engagement or reach as a result of using its services.
‘Created fictitious activist movement called Operation K’
Meanwhile, Meta said that it had removed 37 Facebook accounts, 13 Pages, five Groups, and nine accounts on Instagram for violating its policy against coordinated inauthentic behaviour. This network originated in China and targeted the global Sikh community, including in Australia, Canada, India, New Zealand, Pakistan, the UK, and Nigeria.
It said this activity targeted multiple social media platforms and included several clusters of fake accounts, one of them linked to an unattributed CIB (coordinated inauthentic behaviour) network from China, disrupted by Meta in early 2023, that targeted India and the Tibet region. CIB refers to coordinated efforts to manipulate public debate for a strategic goal, in which fake accounts are central to the operation.
“This operation used compromised and fake accounts — some of which were detected and disabled by our automated systems prior to our investigation — to pose as Sikhs, post content and manage Pages and Groups. They appeared to have created a fictitious activist movement called Operation K which called for pro-Sikh protests, including in New Zealand and Australia. We found and removed this activity early, before it was able to build an audience among authentic communities,” it said.
The accounts posted primarily in English and Hindi about news and current events, including images likely manipulated by photo-editing tools or generated by AI, “in addition to posts about floods in the Punjab region, the Sikh community worldwide, the Khalistan independence movement, the assassination of Hardeep Singh Nijjar, a pro-Khalistan independence activist in Canada, and criticism of the Indian government”, it added.
(Edited by Zinnia Ray Chaudhari)