EFF: Activists Worldwide Face Off Against Face Recognition: 2019 Year in Review
We’ve all heard the expression, “What happens in Vegas, stays in Vegas.” We might hope that what we do and where we go will be known only to those who were there in person. Yet maintaining that kind of anonymity and privacy in public spaces is becoming ever more difficult. In 2019, a growing digital rights network around the world pushed back against governments’ and companies’ use of face recognition technologies in public spaces. In an attempt to prevent people from having their movements and actions meticulously tracked, these activists took action against face recognition in countries all over the world.
Ban on Mass Use of Face Recognition
Digital rights activists have long argued that face recognition constitutes mass surveillance when it is used to track the movements of entire populations in public spaces by matching faces captured by CCTV cameras, drones, or other devices against existing databases. In October, more than 90 NGOs and hundreds of experts gathered in Albania at the International Conference of Data Protection and Privacy Commissioners and called for a global moratorium on mass surveillance by face recognition. The Public Voice coalition urged countries to review all face recognition systems “to determine whether personal data was obtained lawfully and to destroy data that was obtained unlawfully.” The Fundamental Rights Agency of the European Union (FRA) also published a paper recognizing that “given the novelty of the technology as well as the lack of experience and detailed studies on the impact of facial recognition technologies, multiple aspects are key to consider before deploying such a system in real-life applications.” It further said that “[f]orms of facial recognition that involve a very high degree of intrusion into fundamental rights, compromising the inviolable essential core of one or more fundamental rights, are unlawful.” And the United Nations Special Rapporteur on Freedom of Expression, David Kaye, called for an immediate moratorium on the sale, transfer, and use of surveillance technology, including face recognition, until legal frameworks that meet human rights standards are established.
In Russia, Roskomsvoboda launched a campaign calling for a moratorium on the government’s mass use of face recognition until the technology’s effects are studied and legal safeguards protecting sensitive data are adopted. In the United Kingdom, 25 NGOs, including Big Brother Watch, Article 19, Open Rights Group, and Privacy International, called on U.K. police and private companies to immediately stop using live face recognition for public surveillance. In 2016 and 2018, face recognition trials in London erroneously identified individuals as potential criminals in 96 percent of scans, a strikingly high rate of false-positive matches. Also this year, Big Brother Watch launched a legal challenge against the London Metropolitan Police and the Home Secretary to demand an immediate end to the police’s use of live face recognition. In France, La Quadrature du Net (LQDN) called for a ban on the mass use of face recognition to identify protesters. Over the last six years, the French government has adopted several decrees, without any public debate, that allow for the automatic identification of protesters. And in the United States, local activists took up the fight against face recognition by successfully passing bans at the city level: Oakland, San Francisco, Berkeley, and Somerville, Massachusetts all banned government use of face recognition technology. Earlier this year, California prohibited the use of face recognition on law enforcement body-worn cameras, prompting San Diego to end its long-running mobile face recognition program.
La Quadrature du Net and other French NGOs also filed an action to ban the use of face recognition at two high schools in Nice and Marseille. Those actions led CNIL, France’s data protection authority, to conclude that the use of face recognition at the entrances of the schools, which mostly targets minors, is not “necessary or proportionate,” and that the goals of the program could “be achieved by much less intrusive means in terms of privacy and individual freedoms.” In a similar case in Sweden, the Swedish Data Protection Authority (DPA) imposed a General Data Protection Regulation (GDPR) fine of approximately 20,000 euros on a municipality after a school ran a pilot using face recognition technology to track students’ attendance. The Swedish DPA rejected the municipality’s argument that the school had obtained consent to process sensitive biometric information, as required under the GDPR, finding that “consent was not a valid legal basis given the clear imbalance between the data subject and the controller.” Unfortunately, only a few months later, the same Swedish DPA issued another decision allowing police departments to use face recognition to compare face images from CCTV footage against criminal biometric databases. The decision clarified that police must set a retention period for biometric information collected from the cameras.
Ending the Culture of Secrecy
This year, Latin American NGOs have been fighting back against a deeply rooted culture of secrecy surrounding face recognition providers’ identities, data sources, data collection methods, applications, and customers. TEDIC, the main digital rights organization in Paraguay, filed a lawsuit challenging the constitutionality of a Ministry of the Interior resolution that denied TEDIC’s public records request for further details about the Ministry’s and the National Police’s use of face recognition technology. Face recognition has been used in Asunción’s downtown area, airport, and bus stations since 2018, and there are now plans to expand it throughout the city.
In Argentina, Asociación por los Derechos Civiles (ADC) filed a lawsuit against the Government of the City of Buenos Aires challenging the constitutionality of Resolution 398/19, which introduced a face recognition system linked to the city’s security camera infrastructure and monitoring centers. ADC filed the lawsuit after receiving responses to two information requests about the face recognition system.
Access Now, in collaboration with ADC and the Observatorio de Derecho Informático Argentino, sent an information request to the Argentine province of Córdoba about its October 2019 announcement of a test of a biometric recognition system linked to video cameras that use artificial intelligence. Later that month in Peru, Access Now and Hiperderecho sent similar information requests to La Victoria, San Martín de Porres, and Miraflores, municipal districts of Lima. The requests seek, among other things, the technology provider’s identity, the system’s technical specifications, and the procedures for identifying and apprehending suspicious persons.
Latin American NGOs also launched advocacy campaigns against face recognition. Derechos Digitales launched a campaign to shine a light on the various face and biometric recognition proposals being considered across Latin America. And in Brazil, artist Gu Da Cei, working from images received through an information request, carried out interventions at bus stations in Brasília to expose the photos taken by a facial biometrics system on buses and to reinforce public transport users’ right to their own image.
Role of Private Companies In The Use of Face Recognition for Surveillance
This year, reports have also come to light about the role of private companies in the public use of face recognition. A New York Times report revealed how Chinese companies such as C.E.I.E.C. have successfully commercialized versions of China’s mass face recognition system by exporting them to developing countries, in particular Ecuador, Bolivia, Angola, and Venezuela. The implementation of the system in Ecuador, ECU911, has been popular among Ecuadorians worried about street crime even as it has amplified fears that the system could be abused for political repression. Just before the end of the year, 78 CCTV cameras with face recognition linked to ECU911 were installed in the Historic Center of Quito, the site where hundreds of indigenous activists recently protested against the Ecuadorian government. Derechos Digitales published a report finding that C.E.I.E.C. is also involved in Bolivia’s security program and that the program is funded by a Chinese national bank. Internet Bolivia told Derechos Digitales that Bol-110, Bolivia’s ambitious project to acquire surveillance technologies, “will be in everything: in schools, taxis, hospitals.” And although Bol-110 was not approved by the Bolivian Congress, the face recognition system has already been purchased.
In Serbia, SHARE Foundation submitted a request for information about a new video surveillance system with face recognition and license plate reader technology. Huawei, a Chinese company, was revealed to be the Serbian government’s main partner in the endeavor. SHARE Foundation also unearthed a case study published on Huawei’s website about new-generation surveillance cameras already installed in Belgrade; Huawei removed the case study from its website soon after SHARE Foundation’s revelation was made public. In November, SHARE called for the immediate suspension of Serbia’s face recognition program. In a recent report, SHARE, along with the NGOs Partners Serbia and the Belgrade Center for Security Policy, concluded that the Ministry of Interior’s privacy impact assessment of the surveillance cameras does not meet the standards required by Serbian data protection law. Brazilian legislators, meanwhile, received an all-expenses-paid trip to China to learn about and view demonstrations of face surveillance technology that Chinese firms hoped Brazil would also acquire.
Security and Data Leakage
The Brazilian Institute of Consumer Defense (IDEC) sent a demand to Dataprev, the Brazilian public company responsible for securing Brazilian social security information, requesting that it halt its bid to acquire face recognition and fingerprint technology until existing leaks of beneficiaries’ data are resolved. IDEC explained that while the company aimed to integrate face recognition into an app that helps people with disabilities access their banking and social security information remotely, the technology’s high risk of breach could compromise the personal information of approximately 35 million Brazilians.
In the Netherlands, Bits of Freedom launched an activism campaign to demonstrate the insecurity of a Dutch face recognition pilot program in Amsterdam’s central square, the Dam, where a webcam live-streams to YouTube and the website webcam.nl. Bits of Freedom downloaded images of its members at the Dam and then ran them through Amazon’s face recognition software, Rekognition, which was able to identify the members. Bits of Freedom concluded that face recognition software, combined with mass surveillance of public spaces, threatens the privacy and security of vulnerable people, including victims of stalking and domestic violence.
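To give a sense of how little effort such a demonstration requires, here is a minimal sketch of the kind of comparison Bits of Freedom describes, using the publicly documented boto3 client for Amazon Rekognition’s CompareFaces operation. This is not Bits of Freedom’s actual code; the file names and similarity threshold are illustrative assumptions.

```python
# Minimal sketch: comparing a known reference photo against a single frame
# saved from a public webcam stream, using Amazon Rekognition's CompareFaces API.
# File names and the similarity threshold below are illustrative assumptions.
import boto3


def find_person_in_frame(reference_path: str, frame_path: str, threshold: float = 80.0):
    """Return (similarity, bounding box) pairs for faces in the frame that match the reference photo."""
    client = boto3.client("rekognition")

    with open(reference_path, "rb") as ref, open(frame_path, "rb") as frame:
        response = client.compare_faces(
            SourceImage={"Bytes": ref.read()},    # known photo of the person
            TargetImage={"Bytes": frame.read()},  # frame saved from the livestream
            SimilarityThreshold=threshold,
        )

    # Each match includes a similarity score and a bounding box within the frame.
    return [
        (match["Similarity"], match["Face"]["BoundingBox"])
        for match in response.get("FaceMatches", [])
    ]


if __name__ == "__main__":
    matches = find_person_in_frame("volunteer.jpg", "dam_square_frame.jpg")
    for similarity, box in matches:
        print(f"Match with {similarity:.1f}% similarity at {box}")
```

A handful of lines like these, pointed at publicly streamed footage, is all it takes to turn an ordinary webcam into a tool for identifying specific people in a crowd.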
Conclusion
This year, governments around the world have moved quickly to adopt face recognition technologies for use in public spaces. But activists have been quick to respond, demanding transparency and winning moratoria and bans on the use of this powerful technology. As we look toward 2020, the tension between governments’ use of this technology in the name of public safety and individuals’ right to privacy will only heighten. EFF will remain vigilant and continue the global fight against government adoption of face recognition technology.
This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2019.