On 26 August, citing an infographic published by Forbes India, Delhi Chief Minister Arvind Kejriwal tweeted, “Feel proud to say that Delhi beats cities like Shanghai, NY n London with most CCTV cameras per sq mile.” The infographic was based on a report by Comparitech, which says that Delhi has 1,826.5 CCTV cameras per square mile. Chennai also figures in the top 10: with 609.92 cameras per square mile, it is placed third.
In the bizarre jubilation over having some of the most surveilled cities in the world, what our politicians fail to mention is the impact this might have on our right to privacy.
The rapidly increasing number of closed-circuit cameras in Indian cities becomes highly concerning when read along with the growing use of facial recognition technology (FRT) in the country. From the Ministry of Home Affairs to the Airports Authority of India, FRT is being used by several government agencies at both Union and state levels.
Western Railways, one of the busiest rail networks in India, recently announced the commissioning of 470 cameras with FRT. These cameras will be installed in railway stations across Maharashtra and Gujarat, including the Mumbai suburban rail network. The camera system, developed by Russia-based firm NtechLab, is intended to be used for counting passenger traffic and shaping strategy, as well as for identifying criminals and missing persons. Meanwhile, the National Crime Records Bureau (NCRB) is in the process of creating an ‘Automated Facial Recognition System’.
As per a database maintained by the independent research organisation AI Observatory, there are at least 23 different government-funded projects to install facial recognition systems in various parts of the country.
Critics of facial recognition and other similar surveillance technologies back their arguments with the example of China. The country, which is also India’s fiercest rival, has set up a ruthless and sophisticated surveillance network to monitor every move of its citizens. According to reports, China has been aggressively using such systems on Uighur Muslims in Xinjiang province. These tools are used by the Chinese authorities to identify possible or repeat ‘offenders’, individuals who have been deemed ‘suspicious’ for something as simple as having left the area their household is registered to.
In India, most justifications made by the Union government for surveillance technologies concern ‘national security’. State governments, meanwhile, have largely justified them as crime deterrents. But in the absence of a robust personal data protection regulation, the use of FRT and other similar surveillance technologies comes at the cost of citizens’ right to privacy — the erosion of which would, in turn, undermine other fundamental rights such as the freedom of speech and the freedom to lawfully protest.
During the protests against the Citizenship Amendment Act (CAA), Delhi and Uttar Pradesh police used FRT to screen and identify protesters. UP police arrested more than 1,100 protesters with the help of the technology. Facial recognition is now being used by Delhi Police to crack down on those protesting against the Union government’s agricultural reforms. Interestingly, when Delhi Police had initially acquired FRT, its intent was to identify missing children.
The widespread use of FRT by authorities becomes more sinister when one considers its abysmal accuracy rates. During trials in 2018, the software used by Delhi Police achieved an accuracy rate of just two per cent. In 2019, it reportedly fell to just one per cent, and the system could not even differentiate between boys and girls — although the software was originally procured to find missing children. This raises concerns about arrests being made due to ‘false positives’, as the police increasingly use FRT to identify suspects. In countries such as the United States, several such arrests have already been reported — leading to innocent people spending time in jail.
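The false-positive problem is worse than raw accuracy figures suggest, because of base rates: in mass screening, almost everyone scanned is innocent, so even a small error rate produces far more false alarms than true matches. The sketch below illustrates this with hypothetical numbers (the crowd size, watchlist size, and error rates are assumptions for illustration, not figures from any Indian deployment):

```python
# Hypothetical illustration of the base-rate problem in mass face screening.
# All numbers below are assumptions chosen for illustration.

def expected_matches(crowd_size, on_watchlist, tpr, fpr):
    """Expected (true matches, false alarms) when screening a crowd
    against a watchlist.

    crowd_size   -- total people scanned
    on_watchlist -- how many of them are actually on the watchlist
    tpr          -- true-positive rate of the matcher
    fpr          -- false-positive rate of the matcher
    """
    innocents = crowd_size - on_watchlist
    true_matches = on_watchlist * tpr
    false_alarms = innocents * fpr
    return true_matches, false_alarms

# A station scanning 100,000 people a day, 10 of whom are on a watchlist,
# using an optimistically "99% accurate" matcher (1% false-positive rate):
tp, fp = expected_matches(100_000, 10, tpr=0.99, fpr=0.01)
print(f"true matches: {tp:.0f}, false alarms: {fp:.0f}")
# → true matches: 10, false alarms: 1000
```

Even under these generous assumptions, the system flags roughly a hundred innocent people for every genuine match — and the one-to-two-per-cent accuracy reported for the Delhi Police trials would make the ratio far worse.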
As FRT is also increasingly used in public welfare schemes, the inaccuracy of such systems could lead to deserving people being left out. Accuracy rates are especially low for racial and sexual minorities. Since people from minority groups are often most in need of such welfare schemes, ‘false positives’ and ‘false negatives’ in facial recognition will further exacerbate inequality in society.
The Indian government needs to come up with robust regulations to control the use of FRT by both government agencies and private enterprises, in order to ensure the protection of our right to privacy. Civil society and advocacy groups need to do more to make people aware of the pitfalls of this technology. China’s authoritarian surveillance state is not something we should aspire to emulate, and we must resist any attempt to do so.
Sreedev Krishnakumar is a student at Asian College of Journalism, Chennai. Views are personal.