Why the Manoj Tiwari deepfakes should have India deeply worried

Still in nascent stages, deepfakes involve manipulating videos to show someone saying or doing something they never said or did. 

New Delhi: In early February, two videos showed Delhi BJP chief Manoj Tiwari making a bilingual appeal for votes ahead of the 2020 assembly elections. The premise, content and setting of the two videos — one in Haryanvi, the other in English — were identical. And so was another facet. Both were fakes. Deepfakes, to be precise.

Still in nascent stages, deepfakes are among the latest innovations of modern technology. They involve manipulating videos to show someone saying or doing something they never said or did.

For example, an awareness video released by BuzzFeed in April 2018 showed former US President Barack Obama referring to his successor Donald Trump as a “dipshit”. Two other videos, released by the UK-based research organisation Future Advocacy and artist Bill Posters in November 2019 to highlight the dangers of deepfakes ahead of the British general election, showed PM Boris Johnson and his prime rival, Labour’s Jeremy Corbyn, endorsing each other.

During research for this report, ThePrint also came across a deepfake showing Delhi Chief Minister Arvind Kejriwal speaking in Punjabi. 

Deepfakes are not yet a perfect art. Inconsistencies exist, but they are minor enough that a viewer might miss them unless looking for them specifically.

The two Tiwari deepfakes were adapted from a video he released in support of the Citizenship Amendment Act following its passage in Parliament last December. Carrying no disclaimer about their nature, the videos appear quite convincing.

They are the first known instance of deepfakes being employed in an Indian election campaign. In a country where dubious WhatsApp forwards have been enough to drive mobs to murder, and in an age where fake news goes viral in minutes, the advent of deepfakes is a prospect that has experts worried.


Also Read: 31 deaths later, WhatsApp is yet to be serious about fighting fake news in India


What are deepfakes?

Deepfakes, a product of artificial intelligence, can loosely be described as Photoshop for videos. They are known thus because the first-ever recorded use of artificial intelligence (AI) to manipulate a video came in 2017 from a user on social media network Reddit who went by the name “deepfakes”.

Given their potential for misuse amid the disinformation epidemic confronting the world, they were banned by Facebook earlier this year.

Making a deepfake involves zeroing in on a video of the targeted person and training an AI programme to mimic their movements. An impressionist can then be made to record a statement, and this audio superimposed on the original video with AI-manipulated lip movements.

The Obama video, for example, was made with the help of original footage of the former US president and a recording by American comedian and Oscar winner Jordan Peele.

These two clips, particularly the portions showing Obama’s face and Peele’s mouth, were processed by machine-learning software for 56 hours. By the end of that window, the program had trained itself to generate a final deepfake video in which Obama appeared to be speaking words actually uttered by Peele.
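For readers curious about the underlying mechanics, the face-swap technique popularised by the original “deepfakes” Reddit posts typically trains a single shared encoder together with one decoder per person, then swaps decoders at generation time. The sketch below is a minimal, simplified illustration of that idea in PyTorch; the network sizes, placeholder data and training loop are illustrative assumptions, not the actual software or settings behind any of the videos described in this report.

# Minimal sketch of the shared-encoder / per-person-decoder idea behind
# most face-swap "deepfake" tools. All sizes and data here are placeholders.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a small latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Rebuilds a face crop from the latent vector; one decoder per person."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder = Encoder()
decoder_a = Decoder()  # learns to reconstruct person A (the face being faked)
decoder_b = Decoder()  # learns to reconstruct person B (the performer)

params = list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters())
opt = torch.optim.Adam(params, lr=1e-4)
loss_fn = nn.L1Loss()

# Random tensors standing in for aligned face crops of each person.
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)

for step in range(200):  # a real run takes many hours on real footage
    recon_a = decoder_a(encoder(faces_a))
    recon_b = decoder_b(encoder(faces_b))
    loss = loss_fn(recon_a, faces_a) + loss_fn(recon_b, faces_b)
    opt.zero_grad()
    loss.backward()
    opt.step()

# The "swap": encode person B's expression, decode with person A's decoder,
# yielding person A's face mimicking B's expression and mouth movements.
with torch.no_grad():
    swapped = decoder_a(encoder(faces_b))

In a full pipeline, the face crops would first be detected and aligned frame by frame from the source video, and each swapped crop would be blended back into its frame before the result is re-encoded as video; training runs like the 56-hour one described above account for most of the effort.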

Apart from their potential use in political campaigns, deepfakes have also been used to insert unwitting people into pornographic videos, with ‘Wonder Woman’ actor Gal Gadot a well-known example.

According to a September 2019 report by Amsterdam-based cybersecurity firm Deeptrace, 96 per cent of deepfake videos online continue to be non-consensual porn.

BJP’s shifting stance

The fact that the Tiwari videos were deepfakes was first reported by media portal Vice on 18 February.

The deepfakes, each clocking in at 44 seconds, were allegedly approved by the Delhi BJP, but the party is now seeking to distance itself from the technology.

In the Vice report, Neelkant Bakshi, co-incharge of the Delhi BJP’s IT cell and social media, was quoted as saying that the technology “has helped us scale campaign efforts like never before”. A day later, he said in a statement that “…if used positively, the technology definitely sounds good”.

However, he also claimed in the statement that the deepfake videos were made on a pilot basis and not used in any actual campaign.

“We have not tied up with any agency for creating such videos using deepfake technology. One of our team members had come across a video of Tiwari ji (speaking) in Haryanvi so we circulated that in our internal groups and to known people for feedback,” he added.

On 6 February, according to Bakshi, the Delhi BJP circulated the video in 5,800 WhatsApp groups known to the party. 

“It was felt that if the video had been in English, it would have been better, so one of our team members asked for the English video, which those guys gave us.”

“Those guys” refers to a Chandigarh-based political communications firm, The Ideaz Factory, whose chief strategist for new media, Sagar Vishnoi, refused to comment for this report.

Speaking to ThePrint on 21 February, Bakshi described the Delhi BJP as a “victim of this technology”.

“Someone used a Facebook video of our Delhi BJP president… Manoj Tiwari ‘Mridul’ Ji and sent us his video with changed content in Haryanvi dialect,” he added. “It was shocking for us as it may have been used by the opposition in bad taste, especially the Aam Aadmi Party (AAP)… We strongly condemn the use of this technology, which is available in open arena and has been used without our consent.”


Also Read: That rumour you read on WhatsApp can be deadly


‘A threat to democracy’

Experts say deepfake videos can be used to deceive the public and damage democratic processes like elections.

“Deepfakes are a technology that can totally undermine societies and the democratic process as the population will find it difficult to discern reality from fiction…” said information warfare expert and retired military officer Pavithran Rajan. “Ideally, in this scenario, there is a need to communicate by some method that the speech is artificially created.”

Donara Barojan, an expert formerly associated with DFRLab, the digital forensics arm of the Washington-based think-tank Atlantic Council, said the Tiwari deepfakes were “deeply troubling for several reasons”.

“First, it was highly believable, despite the fact that it was generated using publicly available software that doesn’t require advanced IT skills to use. Second, no one noticed it was a deepfake video until its creators came out and said it was a forgery…” she added. 

“We can only hope the technology platforms will take this threat seriously and develop software capable of identifying these types of forgeries.”

Is there a legal provision to punish misuse? 

Advocate Pavan Duggal, who specialises in cyber law, said Indian law was silent on deepfakes.

“In the absence of direct legal provisions addressing deepfakes, Section 66D of the Information Technology Act may be applied,” he added.

The section deals with punishment for cheating by personation through the use of computer resources.

Duggal, however, added that this was a “bailable offence”, which offers limited deterrence since it is not treated as a serious crime. He said it was “imperative” for India to frame direct legal provisions to deal with deepfakes.

What Indian political parties say about deepfakes

As things stand, major political parties don’t seem too excited about deepfakes. 

“Third-party agencies have made a mess this time,” said BJP national IT and social media campaign committee member Khemchand Sharma, when asked to comment on the Tiwari deepfakes. Sharma said the communications strategy for the Delhi campaign could have been handled in a way that produced a better outcome for the party.

Sudhir Yadav, the AAP spokesperson and IT-social media in-charge for Haryana, said the party “has not used… and does not plan to use deepfake videos”. 

“The party may only consider using deepfake videos if there’s an absolute need for it and there is some essential-use case,” he added, saying deepfakes “can be greatly misused in politics to make false messages go viral”.

Congress social media head Rohan Gupta, meanwhile, questioned the potential of deepfakes in political campaigns. “Deepfakes, like fake news, carry the connotation of being fake. Deepfake video messages from a candidate won’t succeed in truly connecting with voters because it looks and feels fabricated.”

The next big assembly election will be in Bihar towards the end of this year, but the technology doesn’t elicit excitement among local players either. 

Janata Dal (United) leader K.C. Tyagi said the party does not need to use social media tricks to win elections and is against any disinformation campaigns on social media. He added that Nitish Kumar, the current Bihar Chief Minister and JD(U) chief, is different from other leaders and “doesn’t need to resort to techniques like deepfakes”.

Sanjay Yadav, political adviser to Rashtriya Janata Dal (RJD) leader and former Bihar Deputy Chief Minister Tejashwi Yadav, said he would take a look at news reports on the Tiwari deepfakes before responding but did not get back to ThePrint. 

Political strategist Prashant Kishor, who was national vice-president of the JD(U) until they parted ways over the CAA, did not respond to requests for comment for this report. However, he had told ThePrint in January that he was not familiar with deepfakes.


Also Read: Will we be okay if lynchings were not based on false WhatsApp rumours?

