
Delhi think tank suggests how to keep kids safe online. ‘One size fits all’ not the fix

Paper by The Quantum Hub, titled ‘Navigating Children’s Privacy and Parental Consent under the DPDP Act 2023. Fostering a safe & enabling ecosystem for India’s young digital nagriks’, published Thursday.


New Delhi: India should avoid a one-size-fits-all approach to keep children safe online, a paper by public policy think tank The Quantum Hub (TQH) has suggested. 

Instead of mandating some form of documentary evidence for age verification to access online platforms, the government should deploy different mechanisms — such as facial analysis — depending on the level of risk involved in a particular online activity, the paper, released Thursday, adds. 

The Digital Personal Data Protection (DPDP) Act, 2023, which was passed by Parliament and received presidential assent in August 2023, requires all data fiduciaries — entities that collect and process data, such as platforms, browsers, OS providers and search engines — to obtain “verifiable parental consent” to process data of users below 18 years of age, unless they have been deemed “verifiably safe”.

While the DPDP Act does not outline the framework for implementing the parental consent guidelines, the government is currently working on framing the rules that will detail the procedure and requirements for verifiable consent.

The paper by TQH is titled ‘Navigating Children’s Privacy and Parental Consent under the DPDP Act 2023. Fostering a safe and enabling ecosystem for India’s young digital nagriks’. It has been authored by TQH co-founder Aparajita Bharti, senior analyst (policy) Nikhil Iyer, analyst (public policy) Rhydhi Gupta, and manager (public policy) Sidharth Deb.

It notes that, globally, the conversation around age assurance and age verification has been going on for over 20 years. Countries across the world have experimented with varied age-assurance mechanisms and regulatory codes, it says, adding that what works best to protect children nevertheless remains an unsettled debate.

“A review of global discussion on age-assurance methods highlights the complexity of the issue,” the paper says. 

“The practical, moral and legal hurdles in mandating a hard identification requirement implies that a one-size-fits-all approach is likely to impede access to the internet for young digital nagriks,” the paper notes.

What paper suggests

The authors say they propose “an approach where India allows diverse age-assurance mechanisms to be used by platforms and families”. 

“The age-assurance mechanism in use should correspond to the nature of the data processed, purposes it is processed for, risks associated with it such that the chosen mechanism causes the least detrimental impact to the child in terms of access, equity and safety on the internet.”

The paper recommends that the Ministry of Electronics and IT (MeitY) work with industry players, parent associations and organisations working with children to publish rules in the form of a code of practice, prescribing principles for data fiduciaries to build age-appropriate products and services.

“This code could have guidance on age-assurance mechanisms that a data fiduciary should deploy, and privacy by design principles on data minimisation, transparency, default settings,” it says.

Secondly, the paper suggests that platforms be encouraged to conduct and publish a self-assessment covering the nature of risks to children emanating from their data-processing activities, and the measures taken to mitigate those risks.

This assessment, it adds, should describe what data is collected, for what purpose, and how it is processed.

This sort of assessment, it says, would increase public scrutiny of platforms’ design and data-processing practices, as well as enable the Data Protection Board — a regulatory body proposed under the Act — to hold platforms accountable for their own assessments and policies.

“Based on the self-assessment of risk, and MeitY’s guidance, platforms may decide the age assurance mechanism that is most appropriate for their product or service,” it adds.

‘Chilling effect’

Apart from the risk of theft or misuse of data collected in the process of age verification, insistence on certain hard verification methods such as identity documents or credit card information leads to concerns around inequitable access and exclusion of vast swathes of society, the paper says. 

It cites estimates from the National Statistics Office’s 78th round survey (2020-21), which revealed that less than 40 percent of Indians know how to copy or move files on a computer, with an even lower proportion having knowledge of internet use.

The survey also found that digital literacy was better among younger age groups, and worse in rural households. 

“Despite this context, the Bill (sic) relies on parental consent, assuming parents to be better placed to understand the potential risks of online data processing,” it says. 

Major jurisdictions — including the European Union, the UK, the US, and Australia — are experimenting with various regulations that protect the interests of children online, it adds, citing protections against exposure to inappropriate or illegal content, cyberbullying, hate speech, addiction, exploitation, and abuse. 

“At the same time, access to the internet is an important indicator of the opportunities an individual has in today’s world,” it says. 

“Research by international experts like Danielle Citron from the University of Virginia suggests that the tendency towards constant oversight and monitoring of children’s online activities can negatively impact their personal development and long-term socioeconomic prospects,” it says. 

The paper says research also suggests that such systems cause a chilling effect on children where they may refrain from using the internet to learn new skills and explore new activities.

“Thus, the impact any regulation has on the development, well-being and prospects of children should be at the forefront of policymakers’ thinking,” it adds. “Factors such as low digital literacy among parents, shared device usage, gender divide and social norms around usage of the internet by young women also need to be considered while evaluating technical solutions for operationalising parental consent.”

Courts in countries like the US, the paper says, have ruled against hard verification mandates, arguing that they will affect citizens’ ability to freely navigate the internet and express themselves. Similarly, regulators in Australia and France have concluded that none of the existing age-assurance technologies satisfies standards of sufficiently reliable verification, complete coverage of the population, and upholding the privacy of citizens.

“Irrespective of these trade-offs, no one technology solution has emerged that is universally accessible to people across different socioeconomic backgrounds and is not reasonably prone to circumvention,” it says. “Thus, the latest evidence would suggest a one-size-fits-all approach is unable to ensure the entire ecosystem is successfully covered under the framework.”

(Edited by Sunanda Ranjan)

