New Delhi: The Ministry of Electronics and Information Technology (MeitY) has proposed that labels identifying AI-generated content must remain visible for the content's entire duration, tightening a requirement that currently only mandates “prominent visibility” without specifying how long the label must stay on screen.
The change comes as a late addition to a broader package of draft amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. MeitY had first published the draft amendments on 30 March for public consultation, with a comment deadline of 29 April. On Tuesday, the ministry extended that deadline to 7 May, citing the need to give stakeholders time to examine the new proposal before responding.
The amendment targets Rule 3(3)(a)(ii), which governs how platforms must label synthetically generated content. The current language requires a label “that ensures prominent visibility in the visual display”. The proposed substitution reads: “…that ensures continuous and clearly visible display of such label throughout the duration of the content”.
A watermark or disclosure appearing only in the opening seconds of a video would no longer satisfy the rule. Under the revised standard, the label would have to remain on screen for as long as the AI-generated content runs—a change with direct implications for short-form video platforms, news broadcasters using AI-generated visuals, and political advertisers.
MeitY’s notice, dated 21 April, states the extension was granted so that “stakeholders have an opportunity to examine and submit feedback on the aforesaid additional changes along with the earlier draft amendments”.
The broader package
The labelling change arrives alongside several other proposed amendments that have been in the public domain since 30 March and remain unchanged.
The most consequential of these is the insertion of a new sub-rule (4) under Rule 3, which makes compliance with any “clarification, advisory, order, direction, standard operating procedure, code of practice or guideline” issued by MeitY a part of an intermediary’s due diligence obligations under Section 79 of the IT Act—the provision that grants platforms safe harbour from liability for third-party content.
The draft lays down procedural conditions for such directions: they must be issued in writing, specify the statutory basis under which they are issued, define their scope and applicability, and remain consistent with the Act and the rules. Critics have argued, however, that folding ministerial directions into the due diligence framework gives the government a mechanism to hold platforms liable for non-compliance with instructions that may not have gone through legislative or judicial scrutiny.
Two other amendments insert a “without prejudice” clause into the data deletion timelines under Rule 3(1)(g) and 3(1)(h), clarifying that the deletion obligations do not override data retention requirements under other laws—including those arising from law enforcement requests. The amendment addresses an ambiguity that intermediaries had flagged since the original rules came into force in 2021.
Rule 8 has also been amended to clarify that the digital media oversight framework applies to news and current affairs content hosted by ordinary users on intermediary platforms, not just to registered publishers. The Inter-Departmental Committee under Rule 14 has been given expanded jurisdiction to take up matters “referred to it by the ministry” in addition to its existing role of hearing grievance appeals from publishers.
The IT Rules have been amended four times since their original notification in February 2021, with changes in October 2022, April 2023, October 2025, and February 2026. The current draft would be the fifth set of amendments.
(Edited by Viny Mishra)

