Most major railway stations in India will use facial recognition to fight crime by the end of 2020, a senior official said, in a move that digital rights campaigners on Tuesday warned could breach people’s privacy in the absence of stringent laws. The system is being trialled in the tech hub of Bengaluru, formerly known as Bangalore, where about half a million faces are scanned every day and - using artificial intelligence (AI) - matched against faces stored in a police database of criminals.
“The railways will become like a virtual fortress,” a senior railways official told the Thomson Reuters Foundation. “Without a physical, brick and mortar boundary wall, we will be able to make the whole system more secure,” said the official who declined to be named as he was not authorised to speak to the media.
Stretching from the foothills of the Himalayas to sandy southern beaches, India’s railway network is one of the biggest in the world, carrying about 23 million people - or the population of Taiwan - every day. But it is also used by traffickers to lure millions of women and children to cities with the promise of good jobs, only to sell them into sex slavery or trap them in bonded labour where they are forced to work to repay a debt.
The rise of cloud computing and AI technologies has popularised the use of facial recognition globally, from tracking criminals to counting truant students. While supporters of the software say it promises greater security and efficiency, some technology analysts say the benefits are unclear, and come at the cost of privacy losses and greater surveillance.
India is readying to install a nationwide facial recognition system, likely to be among the world’s largest, but its use since last year in some airports and cafes, in the absence of a data protection law, has triggered criticism from human rights groups. While the Supreme Court ruled in 2017 that individual privacy is a fundamental right, a data protection bill currently in parliament gives the government power to ask tech companies to hand over users’ data, a provision that human rights groups oppose.
The railway official said that images of people’s faces will be stored remotely for up to 30 days and accessible to the Railway Protection Force, which handles security, after approval from “authorised persons”. He did not elaborate further. Raman Jit Singh Chima, Asia policy director at digital rights group Access Now, called the railways’ plan “dangerous” and said it did not address concerns such as who could access passengers’ data or which third parties were involved.
“How do you know they are not leaking data? How do you know they are keeping it safe and not using it for other purposes?” said Chima on Data Privacy Day on Tuesday. “What the railway authorities seem to be doing right now is definitely intruding on privacy, not matching up to global standards on privacy or even facial recognition that democracies are adopting.”
Authorities say that in a country with 1.3 billion people, such technology is needed to bolster an under-resourced and under-staffed policing system. “There is no other alternative. We have to use technology. In a voluminous country like India, with such multifarious problems, human intervention can only do so much. Policing has to be tech-based,” the railway official said.
Plans are also afoot for the use of facial recognition on board trains, with surveillance cameras installed in 1,200 out of 58,000 carriages so far, he said, while authorities were also testing sensors to detect sounds, from arguments to screams.
As India ramps up facial recognition use, a backlash has grown elsewhere. San Francisco and Oakland in the United States have banned city personnel from using it, while the European Union is considering a ban on its use in public areas for up to five years.