
Workshop report: People, not experiments – why cities must end biometric surveillance


The NGI Policy Summit hosted a series of policy-in-practice workshops, and below is a report of the session held by European Digital Rights (EDRi), originally published on the EDRi website.

We debated the use of facial recognition in cities with the policymakers and law enforcement officials who actually use it. The discussion got to the heart of EDRi’s warnings that biometric surveillance puts limits on everyone’s rights and freedoms, amplifies discrimination, and treats all of us as experimental test subjects. This techno-driven democratic vacuum must be stopped.

From seriously flawed live trials of facial recognition by London’s Metropolitan Police, to unlawful biometric surveillance in French schools, to secretive roll-outs of facial recognition which have been used against protesters in Serbia: creepy mass surveillance by governments and private companies, using people’s sensitive face and body data, is on the rise across Europe. Yet according to a 2020 survey by the EU’s Fundamental Rights Agency, 80% of Europeans are against sharing their face data with authorities.

On 28 September, EDRi participated in a debate at the NGI Policy Summit on “Biometrics and facial recognition in cities” alongside policymakers and police officers who have authorised the use of the tech in their cities. EDRi explained that public facial recognition, and similar systems which use other parts of our bodies like our eyes or the way we walk, are so intrusive as to be inherently disproportionate under European human rights law. The ensuing discussion revealed many of the reasons why public biometric surveillance poses such a threat to our societies:

• Cities are not adequately considering risks of discrimination: according to research by WebRoots Democracy, black, brown and Muslim communities in the UK are disproportionately over-policed. With the introduction of facial recognition in multiple UK cities, minoritised communities are now having their biometric data surveilled at much higher rates. In one example from the research, the London Metropolitan Police failed to carry out an equality impact assessment before using facial recognition at the Notting Hill carnival – an event which famously celebrates black and Afro-Caribbean culture – despite knowing the sensitivity of the tech and the foreseeable risks of discrimination. The research also showed that whilst marginalised communities are the most likely to have police tech deployed against them, they are also the ones that are the least consulted about it.

• Legal checks and safeguards are being ignored: according to the Chief Technology Officer (CTO) of London, the London Metropolitan Police has been on “a journey” of learning, and understands that some of its past deployments of facial recognition did not have proper safeguards. Yet under data protection law, authorities must conduct an analysis of fundamental rights impacts before they deploy a technology. And it’s not just London that has treated fundamental rights safeguards as an afterthought when deploying biometric surveillance. Courts and data protection authorities have had to step in to stop unlawful deployments of biometric surveillance in Sweden, Poland, France, and Wales (UK) due to a lack of checks and safeguards.

• Failure to put fundamental rights first: the London CTO and the Dutch police explained that facial recognition in cities is necessary for catching serious criminals and keeping people safe. In London, the police have focused on ethics, transparency and “user voice”. In Amsterdam, the police have focused on “supporting the safety of people and the security of their goods” and have justified the use of facial recognition by the fact that it is already prevalent in society. Crime prevention and public safety are legitimate public policy goals: but the level of the threat to everyone’s fundamental rights posed by biometric mass surveillance in public spaces means that vague and general justifications are just not sufficient. Having fundamental rights means that those rights cannot be reduced unless there is a really strong justification for doing so.

• The public are being treated as experimental test subjects: across these examples, it is clear that members of the public are being used as subjects in high-stakes experiments which can have real-life impacts on their freedom, access to public services, and sense of security. Police forces and authorities are using biometric systems as a way to learn and to develop their capabilities. In doing so, they are not only failing their human rights obligations, but are also violating people’s dignity by treating them as learning opportunities rather than as individual humans deserving of respect.

The debate highlighted the worrying patterns of a lack of transparency and consideration for fundamental rights in current deployments of facial recognition, and other public biometric surveillance, happening all across Europe. The European Commission has recently started to consider how technology can reinforce structural racism, and to think about whether biometric mass surveillance is compatible with democratic societies. But at the same time, it is bankrolling projects like the horrifyingly dystopian iBorderCTRL. EDRi’s position is clear: if we care about fundamental rights, our only option is to stop the regulatory whack-a-mole and permanently ban biometric mass surveillance.


Workshop report: Data-sharing in cities

The NGI Policy Summit hosted a series of policy-in-practice workshops, and below is a report of the session held by the Coalition of Cities for Digital Rights, written by Beatriz Benitez and Malcolm Bain.


At the Next Generation Internet Policy Summit 2020, organised by Nesta and the City of Amsterdam on 28 September 2020, Daniel Sarasa (Zaragoza) and Malcolm Bain (Barcelona), on behalf of the Coalition of Cities for Digital Rights, hosted a round table for city and regional stakeholders that explored the application of digital rights in local data-sharing platforms, throughout the planning, implementation and evolution stages of digital data-sharing services.

Data-sharing platforms are playing an important role in cities by integrating data, collected throughout the city or relating to the city and its citizens, from a wide variety of sources (central administration, associated entities, utilities, the private sector). They enable local authorities, businesses and, occasionally, the public to access this data and use it for limited or unlimited purposes (open data).

Malcolm introduced the session, highlighting that while cities are keen to share data and to use shared data in city digital services, they are (or should be) also aware of the digital rights issues arising in these projects: citizens’ privacy, the transparency and openness of the data used, accessibility and inclusion of citizens, bias in the data sets used, and the privatisation of the use of city-related data. Luckily, cities are also in the best position to introduce the concept of ‘digital rights by design’ in these projects, and to correct issues such as bias, privacy intrusions, unfairness, profiling and data misuse. He briefly showcased the Coalition’s work in this area in the Data Sharing Working Group, which focuses on the ‘building blocks’ for rights-compliant data-sharing projects that extract value from urban big data while respecting residents’ and visitors’ rights, including policies, processes, infrastructures, and specific actions and technologies.

Daniel highlighted the work of Eurocities on their Citizens Data Principles, which aim to offer guidance to European local governments on more socially responsible use of data, and to recognise, protect and uphold citizens’ rights over the data they produce. The principles support the use of data-generated knowledge to improve urban life and preserve European values through scientific, civic, economic and democratic progress. Daniel presented one of his own city’s data-sharing projects, Periscopio, a framework for sharing information contained in urban data (public and private) in a way that allows social agents and citizens to be involved in creating social, scientific, economic and democratic value, as well as enabling the creation of better urban services.

Then, the cities of San Antonio, Long Beach, Portland, Toronto, Rennes, Helsinki, Amsterdam and Barcelona each presented some case studies from their cities, highlighting different issues raised by their data-sharing platforms and projects.

  • For the City of San Antonio, USA, Emily B. Royall addressed the issue of data bias and the need to listen to the community under the theme ‘Leveraging Data for Equity’.
  • Johanna Pasilkar of Helsinki shared with us the work of the ‘MyData’ operator initiative and how it eases residents’ daily lives by consolidating data collected by the city’s departments and organisations and enabling sharing across several municipalities (data portability).
  • On behalf of the City of Amsterdam, Ron Van der Lans told us how the city collaborates with navigation companies such as Google, Waze and BeMobile, sharing traffic data to improve mobility and the quality of life of citizens.
  • Hamish Goodwin from the City of Toronto, Canada, explained how they are attempting to integrate digital rights principles into the city’s digital infrastructure and the municipality’s decision-making, and how to put a policy framework into practice – the results of this work are just coming out.
  • From the city of Rennes, Ben Lister introduced us to RUDI, a local, multi-partner data-sharing platform which goes beyond open data and connects users and producers to create new and/or better services.
  • Héctor Domínguez from the city of Portland, USA, told us about the importance of ‘Racial Justice’ as a core value in regulating emergent technology, based on respect for privacy, trusted surveillance and digital inclusion.
  • Ryan Kurtzman, on behalf of the City of Long Beach, USA, spoke about positive and negative associations of smart cities, and how participatory design with citizens in digital services can leverage the positive aspects: personal convenience, engagement and solving social challenges.
  • To conclude the round, Marc Pérez-Battle from Barcelona presented several data sharing and open data projects led by the City Council.

The city participants highlighted the need to embed digital rights at design time (privacy, transparency, security, accessibility, etc.), to involve citizens, and to retain the flexibility to adapt and correct any issues that may arise – something that may be more difficult once the technologies are embedded in the city infrastructure, and thus all the more important to get right at design time. Common themes among the projects included the importance of citizen involvement, respect for privacy and security, the need for transparency, and avoiding data bias. In addition, listeners in the session’s online chat raised the issue of data ‘ownership’, and whether this is a useful concept or rather a misleading one – cities are stewards of data for the public, rather than owners of the data they gather and use.

The session concluded that much work remains to be done, but that simply raising cities’ awareness of digital rights issues in data-sharing projects is a big first step. The Coalition will shortly release its Data Sharing Concept Note, together with the associated case studies briefly presented during the round table.


Can Cities Be Guardians of Digital Rights?


Everybody who’s professionally involved in technology in cities and communities agrees that the debate on digital rights has moved beyond the implementation of smart technologies. The European General Data Protection Regulation (GDPR) turned ‘Privacy’ into a hot topic, and the Cambridge Analytica scandal catapulted the debate on ethical use of data high up the political agenda.

As a result of this global politicisation of digital affairs, local councils are increasingly becoming aware of their political power to decide on and shape the digital development of their cities. For example, 5G infrastructures are not a matter of ‘neutral smart city efficiency’; city councils across the world – the closest democratic representatives of citizens – have a choice about how and which data can be collected, and by whom.

Policy-making in the Digital Public Space

When local governments are seen as ‘caretakers’ for their citizens, some common approaches emerge among the represented cities, and local governments are taking action. Local governments exist to regulate the use of collective resources and public space. We see that our physical public space is digitising. The question that emerges for cities is: how should local governments make policies for the digital public space?

When physical and digital public space blend – bridges equipped with sensors, public squares offering free Wi-Fi access – local governments have a key role in setting the terms and conditions for their city to flourish digitally.

It is nearly impossible for citizens to opt out of digital tracking when using public spaces in cities. Therefore, it is crucial that they know what happens with their data after it has been collected, and within which framework commercial re-use, privacy and benefits are managed.

Privacy considerations might slow down the possibilities for digital industries to innovate, but privacy and innovation are not mutually exclusive. A common understanding and implementation of privacy and ethics could level the playing field. Cities welcome a strong Europe to develop a fair digital marketplace, based on equal opportunities for competitors and consumers/citizens.

To achieve a level playing field, there are four key actions for local governments to take:

1. Explaining Digital Rights

Citizens have to understand that they have digital rights. Often, digital rights are not clear, or expressed in language that’s difficult to grasp. Amsterdam and Barcelona took the initiative and have started a cities coalition to define clear digital rights for everyone.

2. Using Procurement to Enforce Digital Rights

Local governments can use their procurement frameworks to enforce data privacy. With its ‘data sovereignty’ programme, Barcelona has already demonstrated the effectiveness of procurement when it comes to guaranteeing data sovereignty. For example, data collected in public space on behalf of the local government will be made available to share in a ‘data commons’.

With an annual procurement budget of €2.1 billion, cities like Amsterdam can guide the market rather than follow it.

3. Regulating Digital Markets That Impact Public Space

In digital markets, the interaction between consumers, workers and platforms has generated new ways to organise domains like mobility in cities, and has set new challenges for city governments: What is the role of public transport when people who can afford it are using car services? How can data and insights collected by platforms become available to policymakers, citizens and interest groups?

In order to guarantee a fair marketplace and equal society, cities need to regulate digital markets when they are impacting public space and the lives of their residents. Collaboration with national and international authorities is needed to create a digital single market. Cities are also looking to counteract the information asymmetry between (local) governments and global digital platforms. This asymmetry influences how local governments can implement and enforce policies.

4. Being Transparent

Citizens are demanding solutions and clarity from their local government. Cities have to be transparent about how they are using data collected in public spaces. There are several ways to achieve transparency. The City of Porto, for example, provides an application where citizens can check where IoT devices or cameras are installed, for what purposes, when their installation was decided and who approved it. The application also allows citizens to ask questions about a device or report new devices to the municipality.
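
To make the idea of such a device register concrete, here is a minimal sketch of how one publicly visible record for a deployed sensor or camera could be modelled. This is a hypothetical illustration in Python, not Porto’s actual application or data model; all field names and example values are assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SensorRecord:
    """One publicly visible entry in a hypothetical city device register."""
    device_id: str          # stable public identifier
    device_type: str        # e.g. "camera" or "air-quality sensor"
    location: str           # human-readable location in public space
    purpose: str            # why the data is collected
    approved_by: str        # body that authorised the deployment
    decision_date: str      # when installation was decided (ISO date)
    citizen_questions: List[str] = field(default_factory=list)

    def ask(self, question: str) -> None:
        """Record a citizen question about this device for the municipality."""
        self.citizen_questions.append(question)

# Example: a register with one entry that a citizen then queries.
register = [
    SensorRecord(
        device_id="cam-042",
        device_type="camera",
        location="Praça da Liberdade",
        purpose="traffic counting",
        approved_by="Municipal Council",
        decision_date="2019-06-12",
    )
]
register[0].ask("Is footage stored, and for how long?")
```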

These four actions make clear that municipalities have to involve citizens in order to manage concerns, demands and technical possibilities. To define next steps, cities need a deeper understanding of citizens’ privacy concerns and of the assumptions and expectations of technical partners.

Citizens Demand Clarity About Their Data

There is no such thing as a digital invisibility cloak. But are there alternatives to digital business models based on the collection of personal data?

There are concepts being developed to give citizens, users or visitors more control over their data. One of them is Solid, a project promoted by Tim Berners-Lee, inventor of the World Wide Web. Additionally, several EU-sponsored projects like DECODE have aimed to create scalable open-source solutions that respect the digital rights of citizens.

One of the more tangible efforts is the recently launched Cities Coalition for Digital Rights. New York, Amsterdam and Barcelona founded the coalition, which is supported by, among others, UN Habitat and Open & Agile Smart Cities. More than 50 cities have already joined this alliance to create a framework in which policies, best practices and technical solutions can be developed, implemented and shared. The goal is to set a common baseline where the basic securities we can expect in the street find their equivalent in the digital public sphere.

This post was written by Bart Rosseau, Chief Data Officer, City of Ghent, and Tamas Erkelens, Programme Manager Data Innovation, City of Amsterdam, who co-lead the working group on Digital Rights within Open & Agile Smart Cities (OASC).

This blog was originally published on the Open & Agile Smart Cities website.
