The government’s pilot project to use Aadhaar-based facial authentication for its ongoing inoculation drive is just a “proof of concept” at this stage, and won’t be made mandatory even if it is rolled out across the country, National Health Authority chief RS Sharma said while speaking exclusively to Entrackr.
In an interview earlier this week, Sharma had revealed that the government was running a pilot project in Jharkhand to test the efficiency of the Aadhaar-based facial authentication system in ascertaining the identity of people who received COVID-19 vaccinations.
Sharma’s comments set alarm bells ringing among India’s civil society, with people questioning the need to deploy such a system, which rests on Aadhaar infrastructure that studies have found has excluded genuine beneficiaries, especially when the country is grappling with a public health crisis.
While speaking to Entrackr, Sharma clarified three important things. First, that the project isn’t a “pilot” but a “proof of concept”, essentially downplaying how strongly the government is pursuing its results. Second, that the system will not be made mandatory. And third, that the process involves face authentication, not facial recognition; there is a fundamental difference in how the two work.
In the former, a face is compared with a stored picture of the same face to see whether they match (a 1:1 check). In the latter, a face is compared against an entire database to find a match (a 1:N search).
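The distinction can be illustrated with a minimal sketch. This is purely hypothetical code, using toy embedding vectors and an arbitrary threshold in place of a real face-embedding model; it only shows the structural difference between a 1:1 check and a 1:N search:

```python
# Illustrative sketch (hypothetical data): 1:1 face *authentication*
# versus 1:N face *recognition*, using toy embedding vectors in place
# of a real face-embedding model.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

THRESHOLD = 0.9  # illustrative match threshold, not a real system's value

def authenticate(live_embedding, enrolled_embedding):
    """1:1 authentication: compare the live capture against the single
    record enrolled for the claimed identity."""
    return cosine_similarity(live_embedding, enrolled_embedding) >= THRESHOLD

def recognise(live_embedding, database):
    """1:N recognition: search the whole database for the best match."""
    best_id, best_score = None, -1.0
    for person_id, enrolled in database.items():
        score = cosine_similarity(live_embedding, enrolled)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= THRESHOLD else None

# Toy embeddings (a real system would derive these from face images)
db = {"alice": [0.9, 0.1, 0.4], "bob": [0.1, 0.8, 0.5]}
live = [0.88, 0.12, 0.41]

print(authenticate(live, db["alice"]))  # 1:1 check against one record
print(recognise(live, db))              # 1:N search across everyone
```

Note that authentication touches only one record tied to a claimed identity, while recognition scans every record, which is why the two carry very different privacy implications.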
Addressing issues of the facial authentication system’s accuracy and efficiency, Sharma, who is also the chairperson of the empowered committee for administration of the COVID-19 vaccine, said that it won’t be the last word in verifying the identity of a person.
“What’s the worst case scenario? That the system won’t recognise a person. Aadhaar has multiple other ways of authenticating a person’s identity. So in case the facial authenticator fails, people can establish their identity using their fingerprints, iris scan, or OTP,” Sharma told Entrackr.
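The fallback logic Sharma describes can be sketched as a simple ordered chain of checks. The function and field names below are hypothetical stand-ins, not any actual Aadhaar API; the sketch only shows that a face-authentication failure falls through to the alternative modes he lists:

```python
# Hedged sketch (hypothetical names): if face authentication fails,
# the person can still establish identity via fingerprint, iris, or OTP.

def verify_identity(person, modes):
    """Try each authentication mode in order; succeed on the first match.
    `modes` maps a mode name to a callable returning True or False."""
    for name, check in modes.items():
        if check(person):
            return name  # identity established via this mode
    return None  # every mode failed

# Toy checks standing in for real biometric/OTP verification
modes = {
    "face": lambda p: p.get("face_ok", False),
    "fingerprint": lambda p: p.get("fingerprint_ok", False),
    "iris": lambda p: p.get("iris_ok", False),
    "otp": lambda p: p.get("otp_ok", False),
}

# A person whose face authentication fails but whose fingerprint matches
person = {"face_ok": False, "fingerprint_ok": True}
print(verify_identity(person, modes))  # falls through to the next mode
```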
For those following Sharma’s outlook towards technology policy issues, his support of Aadhaar in this particular case comes as no surprise. He was after all the chief of the Unique Identification Authority of India or UIDAI, India’s Aadhaar agency. His confidence in Aadhaar is so profound that he had once made his Aadhaar number public and challenged hackers to show the harms that could befall him.
And Sharma’s trust in the Aadhaar system was apparent. “The picture that will be taken during the face authentication process won’t be saved anywhere. Only a log of it will be saved. Aadhaar doesn’t know why that particular picture was taken,” Sharma said.
In fact, the use of Aadhaar’s digital infrastructure for vaccine administration has been pitched several times now, including by the likes of technocrat Nandan Nilekani and Kiran Mazumdar Shaw, the chairperson and managing director of Biocon Limited.
Sharma had earlier said that the idea behind experimenting with the technology was to reduce the number of touchpoints and move towards a more “contactless” way of administering vaccinations.
Following the experiment, the National Health Authority will submit its results to the UIDAI which will then take a call on whether or not the face authentication process should be rolled out across the country, Sharma said.
When asked how many samples the authority would have to submit to the UIDAI, Sharma declined to comment, saying he could not prescribe an exact number.
Sharma’s clarification that the system, if rolled out across the country, won’t be made mandatory, and that it uses facial authentication instead of facial recognition, is a welcome one. Even so, experts said the system could still be plagued by issues and cautioned against depending on it at all.
“The issue with 1:1 authentication [face authentication] remains that inaccuracies in the technology will result in false negatives,” Anushka Jain, associate counsel for transparency and right to information at the Delhi-based digital rights group Internet Freedom Foundation, told Entrackr.
Experts also questioned the legal basis, or the lack thereof, under which these biometric systems are being deployed.
“This is a case of supposedly shiny, minority-report style tech being used to fuel promises of efficiency, scale and neutrality, even if the reality is starkly different,” Vidushi Marda, who leads Article 19’s research and engagement on AI and human rights, told Entrackr. Article 19 is a British human rights organisation.
Such systems are also currently being used in the absence of a valid legal basis, making their implementation even more precarious and fundamentally problematic, Marda said.
“Making it the gatekeeper for an urgent public health issue is thus logically, legally, technically flawed,” she added.
Even if the system is not mandatory on paper, it can still raise concerns, Smriti Parsheera, a fellow at the CyberBRICS Project, told Entrackr. The CyberBRICS Project works on developing policy suggestions in the areas of cybersecurity, internet access, and digitised public administration in the BRICS countries.
“Even in the best of times it is difficult for people to really understand the long term implications of facial authentication tech—for themselves and for the groups that they represent—so relying on consent as the basis for processing in a pandemic situation is certainly not adequate,” Parsheera said.
There’s also a significant question mark around the quality of images present in the Aadhaar database, which are often taken using poor quality webcams in poorly lit conditions, and the impact it might have on the results of the facial authentication.
“This is definitely a concern—as is the general issue of imperfect authentication, pictures of poor quality, etc. I am wary of simply arguing against this on an accuracy basis, though—even if we had perfectly accurate facial recognition systems, even in that hypothetical scenario, using Aadhaar based face recognition authentication doesn’t pass the tests of legality, necessity and proportionality that rights-infringing tech ought to,” Marda said.
“The photographs that will be used will be from, I am assuming, the Aadhaar database which has its own inaccuracy problems. If, for example, the technology fails to match me with my Aadhaar photograph due to some change in my appearance, like the Uber incident, I will lose out on access to the COVID-19 vaccine,” Jain said.
Parsheera also questioned the need to test the efficiency of a facial authentication system in the middle of a public health crisis. She said that this seems like a way of exploiting people’s vulnerabilities and desperation to get vaccinated in order to aid the general progression of facial recognition technology.
All this raises an important question — where does it stop?
In the absence of a wider data protection law and specific facial recognition legislation, the risk of function creep arises, Jain said.
Function creep is when a technology comes to be used for purposes it was not originally intended for. Aarogya Setu, originally developed as a contact-tracing app, becoming a crucial cog in the wheel of the Digital Health Stack is a case in point.