As cases of violence against women and girls have surged in South Asia in recent years, authorities have introduced harsher penalties and expanded surveillance networks, including facial recognition systems, to prevent such crimes.
Police in the north Indian city of Lucknow earlier this year said they would install cameras with emotion recognition technology to spot women being harassed, while in Pakistan, police have launched a mobile safety app after a gang rape.
But the use of these technologies, without evidence that they help reduce crime and in the absence of data protection laws, has raised alarm among privacy experts and women’s rights activists, who say the increased surveillance can hurt women even more.
“The police does not even know if this technology works,” said Roop Rekha Verma, a women’s rights activist in Lucknow in Uttar Pradesh state, which had the highest number of reported crimes against women in India in 2019.
“Our experience with the police does not give us the confidence that they will use the technology in an effective and empathetic manner. If it is not deployed properly, it can lead to even more harassment, including from the police,” she said.
Lucknow is one of eight cities implementing a Safe City project that aims to create a “safe, secure and empowering environment” for women in public places, and curb crimes with “safer urban infrastructure and efficient access” to police.
But the project — alongside the 100 Smart Cities program that relies on technology to improve services — is being used to exponentially increase surveillance, said Anushka Jain, an associate counsel at the Internet Freedom Foundation in Delhi.
“Authorities have used crimes against women as a justification to step up surveillance, but the massive spends on CCTV and facial recognition technology do not correlate to a corresponding drop in crimes against women,” she said over the phone.
“By targeting women disproportionately (authorities) are creating new problems in a society where women are already constantly tracked in their homes and for whom anonymity in public places is so important,” she said.
Lucknow Police Commissioner D.K. Thakur declined to give details on how the technology will be deployed, and how the data will be monitored or used.
Watched constantly
Worldwide, the rise of cloud computing and artificial intelligence technologies has popularized the use of facial recognition for a range of applications from tracking criminals to admitting concertgoers.
In Pakistan and India, these systems are being touted as necessary to modernize understaffed police forces and aid their information gathering and criminal identification processes.
But technology and privacy experts say the benefits are unclear, that the systems could breach people’s privacy and that, without data protection laws, there is little clarity on how the data is stored, who can access it and for what purpose.
The technology is also plagued with issues of accuracy, particularly in identifying darker-skinned women, nonbinary people and those from ethnic minorities.
The Delhi Police, in 2018, reported that its trial facial recognition system had an accuracy rate of 2%. The Ministry of Women and Child Development later said the system could not accurately distinguish between boys and girls.
“We must question the efficacy of this solution and the dependence on digital infrastructure to solve socio-technical challenges,” said Ashali Bhandari, a senior urban planner at Tandem Research in Goa.
“It’s ironic that shielding women from unwanted attention involves watching them constantly through digital technology networks. It’s not empowering women, rather it promotes the idea that women need to be watched for their own safety,” she said.
At least 50 facial recognition systems are in place across India, and the government plans to roll out a nationwide network. Dozens of cities have also introduced mobile safety apps.
Meanwhile, a rape is reported every 15 minutes, according to government data, and crimes against women nearly doubled to more than 405,000 cases in 2019, compared to about 203,000 in 2009.
Trading privacy
There is a growing backlash in North America and Europe against the use of facial recognition technology. But in Asia, it is being widely deployed.
Under Pakistan’s Safe Cities project, thousands of CCTV cameras have been installed in Lahore, Islamabad, Karachi and Peshawar.
Images from Islamabad cameras showing couples traveling in vehicles were leaked in 2019, while women at Balochistan University said officials blackmailed and harassed them with images from campus CCTV cameras the same year.
Following a gang rape last year on a CCTV-monitored highway, the Punjab Police launched a mobile safety app that collects the user’s personal information when she sends an alert to the police during an emergency.
That includes access to phone contacts and media files, leaving women vulnerable to further harassment, privacy rights groups say.
“Technological interventions that seek to increase surveillance of women in order to ‘protect’ them often replicate familial and societal surveillance of women,” said Shmyla Khan, director of research and policy at the Digital Rights Foundation.
“Women cannot be expected to trade privacy for vague assurances of safety without proper mechanisms, and transparency on the part of the government,” she added.
The Punjab Police did not respond to a request for comment.
‘Surveillance-centric project’
The Indian cities of Chennai, Hyderabad and Delhi are among the top 10 cities with the most surveillance globally, according to virtual private network firm Surfshark.
Chennai, which topped the index with 657 CCTV cameras per square kilometer, compared with 278 in bottom-ranked Beijing, is implementing the Safe City project by mapping high-crime areas and tracking buses and taxis with CCTV networks and “smart” poles.
“The government did not want to just do more surveillance, but look at it more holistically to address challenges women face at home, on their commute, at work and in public places,†said Arun Moral, a director at consulting firm Deloitte, which is advising the city on the project.
“There is a tech intervention for every challenge.”
An audit of Delhi’s Safe City project last year noted that the efficacy of cameras to prevent crimes against women had not been studied. Only about 60% of CCTVs installed were functional, and fewer than half were being monitored.
The “heavily surveillance-centric project of Delhi Police needs to be reviewed,” the audit said.
Yet there has been little progress on tackling violence against women through measures such as education and increasing the number of female police officers, who make up less than 10% of the force, according to official data.
“Authorities think technology alone can solve problems, and there is little scrutiny of the so-called solutions because they are being sold on safety,” said Jain at the Internet Freedom Foundation.
“Authorities — even your own family — can cite safety as a justification for more surveillance because safety is a bigger concern than privacy,” she said.