New Delhi: While the much-awaited draft Digital Personal Data Protection (DPDP) Rules—released over 16 months after the DPDP Act was passed—pave the way for industry to start compliance preparations by providing a broad direction, some experts point out that these “vague” rules come too late and offer too little.
Meanwhile, the government has stressed that the proposed rules have been released after extensive consultation with the industry and create a balance between regulation and innovation while also protecting the rights of the citizens.
The draft DPDP rules, which are needed to enforce the DPDP Act, were released for public consultation this month. Stakeholders can submit their feedback and comments to the government till 18 February. The draft rules lay out a framework for various issues, including obtaining the consent of individuals for collecting and processing data, the rights of data principals (to whom the data belongs), data localisation and protecting the privacy of children’s data. The rules also provide for the registration of consent managers, who will act as a single point of contact for data principals to give, manage, review and withdraw their consent.
“These rules were highly anticipated, with the expectation that they would address implementation challenges, procedural gaps, and areas where the Act required further clarity. While the draft does attempt to cover some of these aspects, there is still significant ground to cover,” said Shreya Suri, Partner at law advisory firm IndusLaw.
The Internet Freedom Foundation (IFF), a digital rights organisation, in a statement, said that the DPDP Rules are “too little, too vague and too late”. For instance, it noted that the proposed rules allow overbroad data processing powers to the government in the context of the provision or issue of a subsidy, benefit, service, certificate, licence, or permit. “Further, Rule 6 on reasonable security safeguards for preventing personal data breaches is vague and requires more specifics,” it said.
Ujval Mohan, Manager-Public Policy at public policy think tank The Quantum Hub (TQH) Consulting, told ThePrint that in a law as complex as the DPDP Act, building in some ‘flexibility’ is unavoidable.
“Implementing a general scope law like the DPDP Act is complex because of the different ways it interacts with our diverse economy and polity. It is not always possible to predict these interactions or how they evolve, and hard code them into the Act or Rules. Therefore, building flexibility to determine how the law applies in different contexts is unavoidable,” Mohan said. For example, what is considered ‘reasonable’ safeguards for the education sector may not be appropriate for health services.
Experts had also raised concerns over the proposed rules suggesting that significant data fiduciaries may be subject to data localisation requirements. This, they said, was an overreach by the rules and inconsistent with the provisions of the Act.
Kamesh Shekar, Senior Programme Manager at public policy think tank The Dialogue, said that the proposed rules open a “backdoor” for potential data localisation in India, and do not “entirely align” with the provisions of the DPDP Act, 2023, which discuss the processing of personal data outside India.
“Section 16(1) (of the DPDP Act) only discusses restrictions on data transfer for processing outside India, while Rule 14 (of the draft rules) also discusses data processed within India’s territory,” said Shekar.
“These restrictions on cross-border data transfer would be disproportionate, as data security is agnostic to location. Therefore, as we move forward, the central government must carefully deliberate and engage with stakeholders to chart an optimal path forward for cross-border data transfer,” he said.
Speaking to reporters Tuesday, Union Minister for Electronics and Information Technology Ashwini Vaishnaw defended the proposed provisions. “The Act had a clear provision for restriction on processing of personal data outside India…there is no backdoor, this was in the law…. Countries, keeping their interest in mind, keeping the interest of their citizens in mind, will put certain restrictions.”
He added that data localisation will be guided by sectoral requirements and that restrictions will be imposed only where necessary. “That is why we have created the mechanism of a committee. This committee will evaluate any localisation needs raised…and will consult industry stakeholders before making any recommendations.”
Protecting data of minors
One of the most significant provisions in the rules is the introduction of parental consent for processing the data of minors.
The draft rules state that a data fiduciary should adopt appropriate technical and organisational measures to ensure that verifiable consent of the parent is obtained before the processing of any personal data of a child. They also need to ensure that the person providing consent for a child’s data processing is the child’s parent or legal guardian, and that the parent or guardian is identifiable.
“Protecting children from online harms should be a key priority, and there are different approaches and stakeholders that can come together to achieve this. All of these interventions may not be well suited to be included in a data protection law,” TQH’s Ujval Mohan said.
Mohan said that while parental consent is one layer of defence, it comes with significant limitations and disadvantages. “Parents are not always aware of specific dynamics of children’s online behaviour, thus over-relying on parental consent does not necessarily keep children safe,” he said, adding that in the Indian context, implementing strict parental consent requirements is likely to disadvantage a large segment of children whose parents may not be digitally aware or have the bandwidth to provide meaningful consent.
“We should be conscious not to create barriers for young Indians from all backgrounds interacting with the online world. It is also important to consider how online spaces are designed and operated to keep children safe, regardless of whether their parents have consented to their usage,” Mohan said.
The rules state that in case of a child, the Data Fiduciary must verify that the parent is an adult by using reliable identity details or a virtual token “mapped” to such details.
Noting that the draft rules allow data fiduciaries to generate an electronic token for identifying individuals who provide consent, The Dialogue’s Shekar said there is less procedural clarity on verifying consent from parents or lawful guardians.
“Similarly, there is less clarity on identifying and segregating everyone who uses digital services based on age, i.e., below and above 18. This would be difficult to implement in the digital setup, as it is difficult to ascertain whether a user is a minor. This omission enables minors, such as a 16-year-old, to misrepresent their age (e.g., claiming to be 21 years old) and circumvent the requirements intended to protect them,” he said.
Moreover, the rules fail to prescribe specific verification methods besides digital locker systems, he said, adding that whether phone calls, video meetings, signed forms, financial information, or other approaches will suffice to establish a parent or guardian’s identity remains unclear. “This ambiguity raises concerns about the adequacy and reliability of the verification process. Also, children without parents or lawfully appointed guardians are, in effect, excluded from the process of obtaining verifiable consent,” Shekar noted.
Talking about protection of children’s data, Ashwini Vaishnaw said that the entire world is looking for solutions. “…Because of the digital India programme, we have a very good digital architecture in our country… our digital architecture is better than some of the rich countries also because in our system there is no monopoly. That is why we are able to utilise the power of digital architecture in a much better way. That is why we have come up with this construct of virtual tokens, by which parents as well as children can very clearly identify themselves in a verifiable way.”
He added that this suggestion came from the industry during consultation, which considered it the most practical approach.
“In today’s technology it is possible to leverage available digital data, without disruption, to ascertain whether a person is over 18 or not. The industry feels so,” Vaishnaw said.
He added that India now at least has a roadmap compared to many other countries to deal with this issue. “And I am sure that as we learn and evolve, it can be evolved into a much more perfect solution.”
Compliance burden
The proposed rules are also expected to increase the compliance burden for the industry and social media platforms. According to Mohan, implementing a new legislation or policy will entail some costs and friction as every player adapts to the new regulatory environment, and therefore reasonable transition costs and burdens are hard to avoid.
“Transitioning into a framework which empowers data principals and reshapes incentives towards more healthy data management practices is a net positive. Having said that, the ongoing consultations provide a ripe opportunity to minimise outsized negative impacts.”
Vikas Bansal, Partner, IT Risk Advisory and Assurance at BDO India, which provides advisory services, noted that there will be an increase in the compliance burden for social media platforms. He pointed out that social media platforms with more than 2 crore registered users fall under the category of significant data fiduciaries, and this classification imposes additional requirements, including a mandatory data protection officer, an annual data audit, data impact assessments and data localisation requirements.
“These compliances are over and above the normal requirements of consent, data retention, security safeguards, child consent and much more. All these come with an additional layer of privacy protocols and social media platforms should certainly start working on a data privacy office.”
Echoing similar views, Akshaya Suresh, Partner, JSA Advocates & Solicitors, said that the rules will have a “significant operational impact” on significant data fiduciaries (SDFs) and could also impact the roll-out of their AI products.
“The rules prescribe an annual data protection impact assessment and audit for SDFs, with the reports to be submitted to the board. SDFs will thus have to take accountability measures seriously,” she said.
“Importantly, the rules impose an obligation of due diligence on an SDF to verify that any algorithmic software it deploys to process personal data is not likely to pose a risk to the rights of data principals. This is likely to impact an SDF employing AI in its products or services to test the models against bias or threats and attacks.”
Suresh pointed out that, previously, the Ministry of Electronics and Information Technology (MeitY) had issued advisories to intermediaries and platforms to test their models and algorithms to ensure they do not permit discrimination, inform users that outputs from models under testing may be unreliable and prohibit users from using such models. While the validity and enforceability of these advisories was debated, the draft rules now codify this requirement of due diligence in the legislation. “This is an important step for AI governance,” she said.
(Edited by Zinnia Ray Chaudhuri)