Twenty-two-year-old Neradi Srikanth, an Uber driver, says he really dislikes tech and haircuts.
Srikanth visited the Venkateshwara Temple at Tirupati on February 25 and, as the tradition goes, got his head shaved to offer his hair in return for a favour. He said he prayed for a life of happiness and prosperity.
But when he returned home to Hyderabad on February 27 and tried logging in to the Uber app for its driver partners, the facial recognition algorithm, which verifies that the person logging in matches the account holder, failed to recognise him with his tonsured head.
He panicked and attempted to log in multiple times, capturing pictures of his face from different angles in the hope that the algorithm would recognise him, but to no avail, Srikanth recounted while speaking to Entrackr.
He was ultimately banned from logging in to the app following “multiple login attempts,” Srikanth said.
Friday, April 2, marks the 35th day that Srikanth has been locked out of his account following what he called "a faulty face matching process".
In that time, he has made multiple visits to the Uber Partner Seva Kendra in Kondapur, Hyderabad. “No one at Uber has even tried to help me properly. No one seems to know exactly why the app failed to identify me. It has been more than a month since I took a ride,” Srikanth said.
“I’ve given so many hours of my life to Uber. But now that my account has been blocked for so long, I’m finding it hard to make ends meet. The person who has helped finance my car has been asking me for the month’s payment, but I don’t know how to pay him,” Srikanth, who joined Uber in 2019, said.
Over a video call, Entrackr saw Srikanth's problem first-hand. When he pressed the button in the Uber app that would normally activate the phone's camera and initiate the face matching process, the app returned an error message: "Your requests are paused. If you think there has been a mistake, reach out to our support team".
Uber has denied that its face recognition tool was at fault and said in a statement to Entrackr that Srikanth’s access to the app was removed due to “repeated violations of our community guidelines”.
However, the company’s statement did not clarify the provisions of its community guidelines that Srikanth had violated.
Uber also did not answer our questions on the accuracy rate of its face-matching algorithm and the kind of dataset that was used to train the system.
“Uber’s facial recognition tool is capable of detecting natural changes in a person’s appearance such as long or cropped hair. In case drivers face a problem logging in due to any technical issue with the selfie verification process, they have the option to visit the nearest Uber Partner Seva Kendra for a manual review of their profile,” an Uber spokesperson said.
The tech in question here is Uber's face verification algorithm, called 'Real-Time ID Check'. The system prompts drivers to share a selfie before going online and matches that image to a picture the company has on file. The tool is based on Microsoft's Face Detection API and was launched in India in March 2017, according to an Uber blog post.
Aside from matching a facial image with another image in Uber’s database, the tool can also detect human faces in an image and determine additional attributes such as whether or not a person is wearing glasses, Uber said in the post.
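Uber has not published Real-Time ID Check's internals, but the flow described above and in Srikanth's account (a selfie compared against a stored photo, then a lockout after repeated failed attempts) can be sketched roughly as follows. The threshold, attempt limit, and function names here are assumptions for illustration, not Uber's actual values:

```python
# Illustrative sketch only: the match threshold and attempt limit are
# assumed values, not Uber's. A real system would call a face-matching
# service (such as Microsoft's Face API) to obtain each similarity score.

MATCH_THRESHOLD = 0.5   # assumed minimum similarity confidence for a match
LOCKOUT_ATTEMPTS = 3    # assumed number of failed checks before a lockout

def verify_driver(attempt_confidences,
                  threshold=MATCH_THRESHOLD,
                  max_attempts=LOCKOUT_ATTEMPTS):
    """Simulate a selfie-verification gate.

    attempt_confidences: similarity scores (0.0-1.0) that a face-matching
    service would return when comparing each selfie to the photo on file.
    Returns 'online' on a match, 'locked_out' after too many failures,
    or 'retry' if attempts remain.
    """
    failures = 0
    for confidence in attempt_confidences:
        if confidence >= threshold:
            return "online"        # selfie matched the stored photo
        failures += 1
        if failures >= max_attempts:
            return "locked_out"    # repeated failures pause the account
    return "retry"                 # attempts remain

# A drastic change in appearance can push every score below the
# threshold, so repeated retries end in a lockout rather than a login:
print(verify_driver([0.41, 0.38, 0.35]))  # locked_out
print(verify_driver([0.42, 0.93]))        # online
```

The sketch shows why retrying from multiple angles, as Srikanth did, can make things worse under such a design: each low-confidence attempt counts toward the lockout limit.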
However, Srikanth's case is not the only instance where the system has seemingly failed to work as advertised. In the UK, taxi unions have called for Uber's facial verification tool to be suspended after multiple cases of misidentification were reported. The company has, however, denied the claims made by drivers in the UK.
In general, facial recognition systems deployed around the world suffer from a number of issues, including poor accuracy rates and biases against underrepresented communities, research has shown.
The technology also raises serious data privacy concerns.
“Facial recognition is a very invasive form of surveillance,” Divij Joshi, an independent lawyer, researcher and tech policy fellow at Mozilla told Entrackr.
“Employers have legitimate reasons to ensure different kinds of mechanisms for attendance but requiring someone to part with their biometrics for accessing their app is very privacy-invasive and has significant potential for misuse,” Joshi said.
Even sophisticated facial recognition systems are highly prone to errors in real-time environments, according to Joshi, and this can lead to instances like Srikanth’s, where someone is wrongly identified and deprived of their rights. However, it is often hard to decipher why such systems fail, since companies do not open them up to independent audits, he added.
Despite the clear issues with the technology, there is currently no law in India that prohibits the government or private companies like Uber from deploying facial recognition systems.
“There is no law explicitly prohibiting this kind of worker verification or attendance system, or even preventing facial recognition in general [in India],” Joshi said. “There is a need to incorporate strong protections for biometric data in any forthcoming privacy laws,” he added.
Entrackr’s queries to Uber about the safeguards it has put in place for the large amounts of facial data it has access to remained unanswered until publication.
India is currently working on a data protection law, a draft bill that is being deliberated upon by a Joint Parliamentary Committee. However, Joshi said that even if the bill were to be enacted in its current form, it could be of limited utility for cases like these.
“The Personal Data Protection Bill, 2019 considers facial images as biometric data and requires different kinds of structural protections as well as provides individual user rights, such as the right of access and correction. However, the present bill would be of limited utility because here the problem is with the algorithm that verifies the live photo with the database,” Joshi said.
In fact, the bill proposes to dilute privacy protections for employees, making it even more difficult to act against such technologies, Joshi added.