When Judith Duportail, a freelance journalist, broke up with her boyfriend, she downloaded Tinder, the dating app. She soon found herself hooked and, being a journalist, grew curious about how the app was finding her matches. With the help of privacy activist Paul-Olivier Dehaye – founder of PersonalData.IO, an organization that works to enforce data protection rights – she emailed Tinder asking it to disclose all the data it held on her.
To her surprise, she received 800 pages of information: her Facebook likes, links to her old Instagram pictures, rankings of the age groups of men she was interested in, the number of friends she had on Facebook, and where and when she had carried out each and every online conversation with her matches. She went on to write a book about exploring herself through Tinder, L’Amour sous algorithme (Love Under Algorithm). During this exploration, Duportail discovered that Tinder used a desirability rank known as the ‘Elo Score’. Put simply, the application ranks every profile to assess who is a better match for whom. Duportail’s book reveals that Tinder holds a patent for a ‘matching process system and method’ that encourages dates between users with similar profiles. This system is capable of classifying users by their intelligence, preferences, wealth, ethnicity and attractiveness. Tinder reportedly rejected Duportail’s claim that it uses this patent, stating that this part of its application was irrelevant to the operability of its platform.
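The mechanics of such a score can be illustrated with the classic Elo rating system from chess, on which Tinder’s score was reportedly modelled. The sketch below is purely illustrative – Tinder has never disclosed its actual formula, and the function names and the K-factor here are assumptions:

```python
# Illustrative sketch of a classic Elo rating update (hypothetical;
# Tinder has not published its real formula). When one profile 'wins'
# an interaction (e.g. is swiped right on), its score rises and the
# other's falls, weighted by how surprising the outcome was.

def expected_score(rating_a: float, rating_b: float) -> float:
    """Probability that A 'wins' against B under the Elo model."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400))

def elo_update(rating_a: float, rating_b: float, a_won: bool,
               k: float = 32.0) -> tuple[float, float]:
    """Return updated (rating_a, rating_b) after one interaction."""
    exp_a = expected_score(rating_a, rating_b)
    score_a = 1.0 if a_won else 0.0
    new_a = rating_a + k * (score_a - exp_a)
    new_b = rating_b + k * ((1.0 - score_a) - (1.0 - exp_a))
    return new_a, new_b
```

The key property is that an upset moves scores more than an expected result: a low-rated profile that ‘beats’ a high-rated one gains far more than it would against an equal.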
‘We don’t believe in stereotypes,’ said Tinder. According to Duportail, however, the story is quite different: Tinder collects far more information than we would ordinarily expect an app to. It knows how long we use the application, what our age filters are, what ages we typically match with, who we wish to match with, every message we send, where and when we sent it, and even our current job. Further, when we link Tinder to other applications such as Facebook, Instagram or Spotify, a lot more information about us becomes accessible – Tinder then even knows which posts we may have liked.
Alessandro Acquisti, a professor at Carnegie Mellon University, describes this as ‘secondary implicit disclosed information’ – information that becomes obvious from observing our behaviour on the application: what percentage of black, white or Asian men one may match with, what types of people may be interested in us, what words we use, and how much time we spend on someone else’s profile before swiping right or left. When asked why Tinder needs so much data about us, a spokesperson said that it aimed ‘…to personalize the experience for each of the users around the world’. With more data, according to the spokesperson, Tinder can personalize each user’s browsing experience. However, when asked what kind of profiles we would be shown based on this personalization (that is, what is the logic behind how our matches are profiled?), the spokesperson said, ‘Our matching tools are a core part of our technology and intellectual property, and we are ultimately unable to share information about [our] proprietary tools.’
According to Dehaye, the privacy activist who communicated with Tinder on Duportail’s behalf, this information is not used by Tinder merely to observe our behaviour and shape the choices we make in our quest for love. The data is shared onwards, and can affect the jobs we are offered on LinkedIn, how much we may be asked to pay to insure a car, which ads we are shown online, or whether we are eligible for a particular loan.
For example, Grindr, the dating app for gay, bi, trans and queer persons, shared information about its users’ sexual orientation and location with hundreds of advertising companies. These third parties reserved the right to collect the data and share it with a significantly larger number of other players in the market. According to SINTEF, a Norwegian research organization, Grindr was sharing its users’ HIV status with Apptimize and Localytics – marketing platforms that help optimize the performance of applications. This information was not being shared in the interest of public health, but to serve commercial interests such as profitability. In January 2021, Grindr was fined 100 million Norwegian crowns ($11.7 million) by the Norwegian Data Protection Authority (DPA) for a reportedly illegal disclosure of data to advertising companies.
In India, however, the understanding of consent requirements in privacy regulation remains quite simplistic, and at times even non-existent. At present (before the enactment of the Data Protection Bill), privacy safeguards for information collected by businesses stem primarily from the thin and insubstantial rules notified under the Information Technology Act, 2000, known as the ‘Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011’ (Privacy Rules). These Rules suffer from two significant deficiencies. First, they apply primarily to a few specific subcategories of personal information known as sensitive personal information (as defined and discussed below).
The Privacy Rules barely apply to categories of personal information that do not constitute sensitive personal information: before collecting such information, the Rules do not require businesses to obtain an individual’s consent. Personal information, however, is defined broadly to include information that is directly or indirectly capable of identifying an individual. The definition is broad enough to encompass phone numbers, addresses, political leanings or views, religion or religious beliefs, preferences, gender, caste, tribe, transgender status, information generally relating to a person’s sex life, or even inferences about our lives derived from our preferences through data analytics. These categories of information, which are typically sensitive and likely to make an individual vulnerable to harms such as discrimination, remain largely unprotected under the Privacy Rules, since they do not constitute ‘sensitive personal information’.
Sensitive personal information, to which the Privacy Rules primarily apply, is defined narrowly to include only specific categories of personal information: financial information such as credit or debit card or other payment instrument details, biometric information, passwords, health information, medical records and history, and sexual orientation.
The second significant challenge with the Privacy Rules is that, although consent is required under Indian law for sensitive personal information, such consent need not be specific or informed – unlike under advanced data protection regimes across the globe, such as the GDPR enforced by the Norwegian DPA (discussed earlier). Moreover, unlike in most other countries with sophisticated data protection legislation, crucial categories of information (mentioned above) are not even classified as sensitive personal information – such as one’s political or religious beliefs, genetic data, ethnic origin, caste or tribe, intersex or transgender status, or information generally relating to one’s sex life.
What does this mean for our privacy? Since the introduction of the Privacy Rules, despite an exponential and unparalleled increase in the number of online users in India, online platforms are not even legally required to obtain the permission or consent of Indian users before collecting their personal information.
To contrast this with international best practice on data protection: most developed countries require consent for the collection of personal information, while also creating a bundle of enforceable rights over that data. For instance, the GDPR in the European Union (discussed earlier, in the context of Grindr) includes the right to be forgotten, the right to erasure, the right to restrict processing and the right to data portability (each of these rights is discussed in detail in Chapter X). The Privacy Rules, however, create only a limited number of rights: the right to access and correct deficiencies or inaccuracies in one’s personal information, and the right to withdraw consent, albeit only in relation to sensitive personal information. Compared with the rest of the world, this puts us far behind in terms of sophistication in data protection law.
In fact, since the introduction of the Privacy Rules in 2011, there have hardly been any noticeable instances where online businesses have been held adequately accountable for violating user privacy.
Further, countries with sophisticated data protection laws prescribe penalties for failing to comply with security protocols that protect user privacy. For instance, under the EU GDPR, data protection authorities can issue fines of up to €20 million or 4 per cent of a business’s global turnover for the previous financial year, whichever is higher. To see how high the stakes of violating someone’s privacy are in the EU, consider the figures: Facebook’s annual turnover in 2018 was $55 billion, 4 per cent of which would mean a fine of $2.2 billion. In fact, the Federal Trade Commission (an independent agency of the United States government for consumer protection) recently imposed a fine of $5 billion on Facebook for allowing Cambridge Analytica to collect data from its platform – approximately 9 per cent of Facebook’s worldwide turnover.
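The ‘whichever is higher’ rule can be expressed as a simple calculation. This is only a sketch of the arithmetic: the function name is mine, and currency conversion between dollars and euros is ignored for illustration.

```python
# Sketch of the GDPR Article 83(5) maximum administrative fine:
# the higher of EUR 20 million or 4% of worldwide annual turnover.
# (Illustrative only; currency conversion is ignored.)

def gdpr_max_fine(annual_turnover: float) -> float:
    """Maximum fine for a business with the given annual turnover."""
    return max(20_000_000.0, 0.04 * annual_turnover)
```

For a small firm the €20 million floor dominates; for a company with Facebook-scale turnover of $55 billion, the 4 per cent branch yields the $2.2 billion figure cited above.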
In India, however, the maximum penalty under law for failing to comply with several obligations under the Privacy Rules relating to personal data and sensitive personal data – where no penalty is separately prescribed – is only ₹25,000 (approx. $350). For big technology companies with billions of dollars in revenue, complying with data security norms probably costs far more than simply paying the penalty; on balance, it is perhaps more profitable to disregard privacy in India altogether. Consequently, many big tech companies have thinner privacy policies in India than in Europe and the United States.
The absence of a framework that holds platforms adequately accountable for failing to respect privacy norms leaves businesses with no incentive to adopt privacy-respecting practices.