AFRS: Is this the rise of the surveillance state?

A piece like this typically starts with an Orwell quote or a tribute to him. We, however, are past the point where we can muse romantically about something this sinister; a grand plan like this needs no unnecessary romanticism. Case in point: the somersaults the NCRB has displayed in the RFP process for the National Automated Facial Recognition System, or AFRS.

Recap:

So what is the issue, actually? Facial Recognition Technology (FRT) is being imposed on India under the garb of security and a pre-emptive counter-threat mechanism, despite the proposal having severe flaws and loopholes. For such a system to reliably find a match, every individual must be classified and queried against it, and this is where the rabbit hole of errors starts.

First, the MHA's claim back in 2019 that an efficient FRT is merely the technological upgrade the police force badly needs is flawed logic. Its accompanying assurance that the system will not come in the way of citizens' privacy is equally flawed, for the reason explained above: a system that must query everyone cannot leave anyone's privacy untouched.

Second, we already know that an existing system used by the Delhi Police has an accuracy of just 2%, as per a submission made in 2018 by the Delhi Police counsel, Rahul Mehra, before a bench of Justices S Muralidhar and Vinod Goel hearing a habeas corpus plea. Quoting Mehra: "The software is not good. There is only two per cent match." This circles back to the point above: all of us become potential criminals until proven innocent, our faces serving as biometric IDs to be tracked and traced until we return a no-match from a criminal database such as the CCTNS.

Third, as a corollary to the second point, facial and image recognition can produce both false positives and false negatives, and either can prove fatal in a country like India. A false positive has an innocent person picked up and held until the match is flagged as a mistake: a student returning from a coaching class, say, gets caught up in an agitation unwillingly; the agitation turns violent, the police comb the CCTV footage for suspects, and the software picks out the student. A false negative has the system fail to recognize a person despite valid legal documents, as when a biometric system declares a legitimate citizen invalid. At the scale at which AFRS would operate, both kinds of error would carry enormous consequences.
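To see why even a seemingly accurate matcher drowns investigators in false positives at population scale, consider a back-of-the-envelope calculation. The numbers below, a 99% true-positive rate, a 1% false-positive rate, and 50 genuine suspects in a million scanned faces, are illustrative assumptions and not figures from the AFRS proposal:

```python
# Illustrative base-rate arithmetic for a face-matching system.
# All numbers are assumptions chosen for the example, not AFRS figures.

population = 1_000_000      # faces scanned (e.g. CCTV frames in a city)
actual_suspects = 50        # people genuinely on the watchlist
true_positive_rate = 0.99   # assumed: system flags 99% of real suspects
false_positive_rate = 0.01  # assumed: system wrongly flags 1% of innocents

innocents = population - actual_suspects
true_positives = actual_suspects * true_positive_rate    # ~50 correct hits
false_positives = innocents * false_positive_rate        # ~10,000 wrong hits

precision = true_positives / (true_positives + false_positives)
print(f"Flagged in total : {true_positives + false_positives:,.0f}")
print(f"Actually suspects: {true_positives:,.0f}")
print(f"Share of flags that are real suspects: {precision:.1%}")
# Roughly 0.5% -- about 200 innocents flagged for every real match.
```

Even under these generous assumptions, roughly 200 innocents get flagged for every genuine suspect; at anything like the 2% match rate reported for the Delhi Police system, the picture only gets darker.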

Fourth, and a crucial brick in this wall, is busting the myth that new technology is automatically a harbinger of hope in modernizing a force. Deploying a system like AFRS in an India without data protection laws is almost suicidal. Beyond that, the belief that this particular technology will rescue an understaffed Indian police force is flawed once more: when a well-equipped and highly motivated force like San Francisco's has banned the use of facial recognition, should we be pinning such high hopes on the technology?

Coming back to the point:

First up, it must be understood once and for all that mass surveillance is the only way to make this overarching, uber-ambitious AFRS work. Plucking a positive match from a sea of innocents is the mechanism for nabbing a criminal, and for that every negative, that is, every ordinary individual, has to be identified and tracked.
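Mechanically, this is a 1:N identification search: every face that passes a camera must be converted into a biometric template and scored against the entire watchlist before even a "no match" can be returned. The sketch below illustrates the idea; `embed_face` is a hypothetical stand-in for a real trained face-embedding model, not AFRS's actual pipeline:

```python
import numpy as np

def embed_face(face_pixels: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a trained face-embedding network.

    A real system would run a neural network here; we just normalize
    raw pixels so the matching logic below is runnable.
    """
    vec = face_pixels.astype(np.float32).flatten()
    return vec / (np.linalg.norm(vec) + 1e-9)

def search_watchlist(face_pixels, watchlist, threshold=0.8):
    """1:N search: the probe face is compared against EVERY entry.

    Returns (index, score) of the best match above the threshold,
    or None. Note the implication: innocent or not, every passer-by
    is embedded and scored against the whole database.
    """
    probe = embed_face(face_pixels)
    scores = watchlist @ probe          # cosine similarity per entry
    best = int(np.argmax(scores))
    return (best, float(scores[best])) if scores[best] >= threshold else None

# Toy usage: 3 watchlist entries, one probe face of the same shape.
rng = np.random.default_rng(0)
watchlist = np.stack([embed_face(rng.random((32, 32))) for _ in range(3)])
print(search_watchlist(rng.random((32, 32)), watchlist))
```

The point of the sketch is the loop structure, not the model: there is no way to run such a search without first enrolling every face the cameras see.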

Second, it is 2020 and we are still not adhering to the concept of consent. Those quirky CCTV surveillance notices in public spaces may look cool, but would we agree so readily if told that our faces would be kept on record for an extended period, or perhaps permanently? Should we nod our heads in unison even then? I believe we would vehemently oppose the move, because our images were never taken with consent. Herein lies the catch of this technology: it reduces privacy and consent to mere buzzwords.

Third, and most critical of all, a framework this powerful without a legal safety net is purely a shot in the foot. Without a strong data protection law there is no accountability, and with so many central, nodal, and regional agencies and units working on security and law and order, a complete absence of accountability will end in compromised civil rights and liberties. At the end of the day, we do not need another Aadhaar-like scenario that rubs salt into an existing wound.

While Orwellian terminology is cool to hear and even to use, in reality we do not need a Big Brother. The landmark case of Puttaswamy v. Union of India (2017 (10) SCALE 1) gave us a unanimous ruling that privacy is a constitutionally protected right in India, and we should not let go of that. An unchecked behemoth like the NCRB's AFRS invites arbitrariness, further shaking the social fabric of the nation. What we require before any such fancy step is: a) legal bindings and frameworks that legitimately authorize AFRS' existence; b) a clear account of how consent would be obtained and recorded; and c) strict safeguards that create accountability. Without these, accountability vanishes altogether, and that Orwellian ending will not be very romantic. Period.

With additional inputs from:

IFF – link 1 

IFF – link 2

IFF – link 3

Business Insider

The Print

The Hindu