New Delhi: Highlighting the limitations of existing laws in addressing emerging threats from content produced by modern technologies, such as deepfakes, a parliamentary standing committee recommended that the Ministry of Electronics and Information Technology (MeitY) develop a framework to ensure the watermarking of all social media content.
The committee noted that this would help users differentiate between user-generated content and content edited or manipulated using Artificial Intelligence (AI), and recommended that MeitY set technical standards for such watermarking. It further envisaged a coordinating role for the Indian Computer Emergency Response Team (CERT-In) in monitoring and issuing detection alerts.
These recommendations were made in the 254th report of the Parliamentary Standing Committee on Home Affairs, titled ‘Cyber Crime—Ramifications, Protection and Prevention’. The report of the committee, headed by Bharatiya Janata Party (BJP) MP Radha Mohan Das Agrawal, was tabled in both houses of parliament Wednesday.
“The Committee further recommends that to address issues of deepfake or obscene content being uploaded in the social media, MeitY should consider developing an innovative technological framework mandating all photos, videos and similar content shared on digital platforms to have a watermark as it would help to prove the origin of the content and make it more difficult to fake or edit/manipulate,” it said in its report.
‘Amendments to IT Act, panel for OTTs’
The committee further emphasised that social media platforms need to be held more accountable for inaction or delayed action on unlawful content, including morphed videos, fake profiles, misinformation and content promoting violence based on religion or caste.
While the committee acknowledged the new rules introduced by MeitY to address laxity by social media intermediaries, it recommended a periodic review of the immunity granted to these firms under the Information Technology Act, 2000.
Social media intermediaries are exempted from liability for hosting or transmitting third-party content on their platforms under Section 79 of the IT Act.
“Recognising the challenges posed by technologies such as the metaverse, blockchain and generative AI, the Committee recommends the development of forward-looking, flexible regulatory guidelines. The Committee is of the view that the safe harbour protections available to intermediaries under the IT Act, 2000, should be reviewed periodically to strike a balance between their liability immunity and the need for greater accountability, especially where platforms fail to act on unlawful content. The Committee also recommends a periodic review of enforcement mechanisms and penalty provisions under the IT Act, 2000, to ensure their deterrent effect remains robust,” the committee further recommended.
The committee also advocated amendments to the Act itself to introduce “specific provisions” making these intermediaries liable to legal action when they fail to comply with orders from authorities to remove content within prescribed limits.
“Such amendments should include graded penalties, including monetary fines and potential suspension of operations for persistent non-compliance, while ensuring that due process and appeal mechanisms are preserved to maintain the balance between content regulation and freedom of expression,” the committee further suggested.
It also reviewed the operations and regulation of OTT platforms, observing that the absence of pre-release checks, unlike films that require mandatory certification, and weak age-verification systems leave minors at risk of exposure to age-inappropriate content.
To address the issue, the committee, noting that OTT platforms have emerged as “a primary source of entertainment,” said the government may consider constituting a recognised panel to monitor newly released content on these platforms.
Such panels should comprise child development specialists, educators, legal experts, social scientists and community representatives, it further recommended.
“The Committee also recommends that OTT platforms must be required to adopt more strong technology enabled age verification systems and effective parental control mechanisms, going beyond simple self-declaration, to better restrict minors’ access to age-inappropriate material,” read the report.
(Edited by Amrtansh Arora)