Minister of Electronics and Information Technology Ravi Shankar Prasad announced in February this year the Narendra Modi government’s intention to legislate a new Information Technology Act. He said that since technology was now at the centre of issues like privacy, cybercrime, delivery of services and digital payments, the IT Act needs to reflect that too. In April, his ministry initiated a consultative process with stakeholder ministries and industry associations, seeking inputs for the new law.
The IT Act originally came into effect in 2000 to give legal recognition to electronic records and digital signatures. It was amended in 2008 to include provisions on intermediary liability, data protection, and cyber offences. In the last 20 years, the landscape it regulates has far outgrown its legislative remit. New players have evolved and the internet ecosystem faces complex challenges today. India is home to more than 687 million internet subscribers who rely on this digital public sphere to shop, communicate, find entertainment, commute and access healthcare. Any legislation that governs the internet needs to be reflective of this hyperconnected reality.
Intermediaries are entities that, among other functions, facilitate information flow or host information in cyberspace. They include internet service providers, social media companies and messaging platforms, to name a few. One question a new IT Act must answer is: what is the liability of such entities? In their early years, “safe harbour” provisions in law guaranteed intermediaries conditional immunity against third-party content. This served twin purposes: it spurred economic innovation as well as communication at an unprecedented scale. Consequently, these platforms have been woven into the fabric of people’s digital lives today.
This progress came with a set of major challenges — most notably in relation to social media, whose rapid rise gave birth to phenomena like fake news and misinformation, which often have violent consequences. In India, between January 2017 and June 2018, rumours on WhatsApp led to 69 incidents of mob violence, in which 33 people lost their lives. Revenge pornography, which involves sharing explicit sexual imagery without the depicted person’s consent, also became a dark reality. The current state of affairs necessitates closer regulation of these platforms than ever before. Mark Zuckerberg, the co-founder of Facebook, too, accepted this need for oversight at the Munich Security Conference in February this year.
But will it be possible to regulate intermediaries in a manner that eliminates these challenges but doesn’t hamper their ability to foster innovation?
Section 79 of the IT Act provides safe harbour to intermediaries, provided that they observe the stipulations in the Information Technology (Intermediaries Guidelines) Rules, 2011. However, these guidelines are replete with broad terms that run the risk of wide misinterpretation. For example, Rule 3(2)(b) requires intermediaries to inform users not to upload or share any information that is “disparaging”, “blasphemous”, “obscene” or “libellous”. But what counts as obscene or disparaging is defined neither under the Rules nor under the Act.
In its 31st report, the Parliamentary Committee on Subordinate Legislation highlighted this gap and recommended steps to remove such ambiguities. Unfortunately, the recommendations were not acted upon.
The continued presence of offensive content on social media platforms has often forced users to knock on the court’s doors. Users have sought legal intervention on issues that have ranged from a ban on pornography to the removal of defamatory content. The adjudication of these cases by different courts has been fragmented, and has added to intermediaries’ liabilities, which remain unclear in law.
Take, for example, the Prajwala case, which highlighted the circulation of sexually violent videos on WhatsApp. In 2017, the Supreme Court suggested remedies which included blocking search queries based on identified keywords and preventing users from uploading such videos. Implementing such measures would require proactive monitoring of content. This, however, runs contrary to the Supreme Court’s 2015 judgment in Shreya Singhal v. Union of India, under which intermediaries were not required to exercise their own judgment to ascertain the legitimacy of content. This only further accentuates the need for an overarching law that provides certainty to intermediaries and users alike.
Unlawful content is a global issue for which a holistic solution has still not been found. Most countries have, until now, relied on solutions that are either onerous or overbearing. For example, Singapore’s Protection from Online Falsehoods and Manipulation Act, 2019 (POFMA) grants the government sweeping censorship powers, allowing it to order the correction, redaction or blocking of information that it deems prejudicial to public interest. In January this year, within three months of taking effect, POFMA was invoked to order Facebook to block the page of a news website.
Some countries have also factored in the power of social networks to influence public opinion. For example, Germany’s Network Enforcement Act, 2018 (NetzDG), uses a threshold-based gradation approach towards regulation of social media platforms by limiting its application to platforms with more than two million registered users in the country. This makes the operating conditions for smaller platforms, especially startups, less onerous.
Synthesise new realities
A new IT Act presents an opportunity to harmonise the existing legal framework governing e-commerce. Despite being enacted to facilitate electronic commercial activities, the IT Act never defined what commercial activities entail. Instead, the IT Act envisaged a liability model, applicable horizontally to intermediaries.
As new business models evolved, the digital ecosystem became more heterogeneous and gave rise to distinct categories of actors. These include aggregation platforms like Uber and Zomato, and digital payment services like Paytm. Within certain categories, there are functional sub-classifications too. For example, the video-on-demand segment includes user-generated videos and curated content.
More importantly, the Act’s amorphous approach towards platform governance led to a fragmentation that involved multiple regulators. Today, e-commerce is defined across a spate of laws, including the Consumer Protection Act, 2019; the Central Goods and Services Tax Act, 2017; and the extant Foreign Direct Investment (FDI) policy. As a result, rule-making vis-à-vis e-commerce has become cumbersome.
While the Allocation of Business Rules assign e-commerce matters to the Department for Promotion of Industry and Internal Trade (DPIIT), the IT Act, the parent legislation that governs it, remains with the Ministry of Electronics and Information Technology (MeitY). Further, under the Consumer Protection Act, 2019, the Department of Consumer Affairs has the power to frame rules governing e-commerce. Thus, there is confusion over which ministry exercises jurisdiction on e-commerce and to what extent.
Stable but adaptive regulation
A stable regulatory environment, which ensures the predictability of regulatory action, is necessary for India to cultivate a $5 trillion digital economy by 2024. Thus, the challenge for the new law will be to bridge current regulatory gaps and remain future-proof.
Traditional models rely on the ‘regulate and forget’ approach, but this is unsuitable to technology-oriented businesses. Regulators across other jurisdictions have revised their approaches and adopted models that are agile, to keep pace with technological evolution. Adaptive regulation, an iterative approach based on faster feedback loops, is one such model. Risk-weighted regulations, a risk-based and segmented approach informed by data analysis, is another example.
Perhaps the most prominent approach is a principles-based one. It delineates outcomes that stakeholders must achieve, rather than prescribing procedures. For example, Australia’s Federal Privacy Act, which defines 13 privacy principles governing the processing of personal data, is a manifestation of this approach.
A new IT Act presents the Modi government with an opportunity to address India’s distinct digital challenges. It is also a chance to remove ambiguities around technology in law, re-examine the responsibilities of actors in cyberspace, and ultimately create a regulatory environment that helps India become a digital economic powerhouse.
The authors work at Koan Advisory Group, a technology policy consulting firm. Views are personal.
This article is part of the ThePrint-Koan Advisory series that analyses emerging policies, laws and regulations in India’s technology sector.