Police bodycams as equiveillance tools? Reflections on the debate in the Netherlands

Lotte Houwing and Gerard Jan Ritsema van Eck ‘Police bodycams as equiveillance tools? Reflections on the debate in the Netherlands’ (In press) Surveillance & Society.

Abstract

In the United States of America, police body-worn cameras (bodycams) were introduced to protect civilians against violence by law enforcement authorities. In the Netherlands, in contrast, the same technology has been introduced to record and discipline the behavior of the growing number of citizens who use their smartphone cameras to film the (mis)conduct of the police. In response to these citizens sousveilling the police and publishing their images on social media, the bodycam was introduced as an objective referee that also includes the perspective of the police officer. On this view, the bodycam is a tool of equiveillance: a situation with a diversity of perspectives in which surveillance and sousveillance are in balance (Mann 2005).

Various factors, however, hamper the equiveillant usage of bodycams in the Netherlands. Firstly, the attachment of the bodycam to the officer’s uniform leads to an imbalanced representation of perspectives. The police perspective is emphasized because the footage is literally taken from their point of view, in which others are filmed slightly from below, making them look bigger and more overwhelming. The officer’s movements also create shaky footage with a deceptive intensity that invokes the image of a hectic situation calling for police action. Secondly, it is the officer who decides when to wear a camera and when to start and stop recording, which leaves open the possibility that misconduct simply goes unrecorded. Thirdly, access to the recorded images, whilst in theory open to police and citizens alike, is in practice reserved for the police.

Within the current regulatory framework, bodycams are thus not neutral reporters of interactions between civilians and the police. We will end our contribution to this Surveillance & Society Dialogue section with suggestions for improving those rules, and reflect on the question of whether bodycams can ever be objective referees.

Capturing licence plates: police-citizen interaction apps from an EU data protection perspective

Jonida Milaj and Gerard Jan Ritsema van Eck ‘Capturing licence plates: police-citizen interaction apps from an EU data protection perspective’ (2019) International Review of Law, Computers & Technology, doi: 10.1080/13600869.2019.1600335

Abstract

A Pokémon Go-like smartphone app called ‘Automon’ was unveiled in October 2017 as one of several new initiatives to increase the public’s contribution and engagement in police investigations in the Netherlands. Automon is designed in the form of a game that encourages participants to photograph license plates to find out if a vehicle is stolen. Participants score points for each license plate photographed and may also qualify for a financial reward if a vehicle turns out to be stolen. In addition, when someone reports that a vehicle has recently been stolen, game participants who are in the vicinity receive a push notification and are tasked with searching for that particular vehicle and license plate. This paper studies the example of the Automon app and contributes to the existing debate on crowdsourced surveillance and the involvement of individuals in law enforcement activities from an EU law perspective. It analyses the lawfulness of initiatives that proactively require individuals to be involved in law enforcement activities and confronts them for the first time with European Union (EU) data protection standards. It is concluded that the Automon app design does not meet the new legal standards.

Full text

Available open access here.

Algorithmic Mapmaking in ‘Smart Cities’: Data Protection Impact Assessments as a means of protection for groups

Gerard Jan Ritsema van Eck, ‘Algorithmic Mapmaking in ‘Smart Cities’: Data Protection Impact Assessments as a means of protection for groups’ in Angela Daly, S. Kate Devitt and Monique Mann (eds), Good Data (Institute of Network Cultures 2019).

Abstract

Maps are powerful communication tools, and mapmaking used to be a privileged affair. In recent times this has changed as “smart cities” have been outfitted with video, audio, and other kinds of “Internet of Things” sensing devices. The data-streams they generate can be combined with volunteered data to create a vast multitude of interactive maps on which individuals are constantly (re)grouped on the basis of abnormality, deviation, and desirability. Many have argued that under these circumstances personal data protection rights should be extended to groups.

However, group rights are an awkward fit for the current European data protection framework, which is heavily focused on individuals. One possible opening for better protection is offered by Data Protection Impact Assessments (DPIAs), which are mandatory to carry out when the ‘systematic monitoring of a publicly accessible area on a large scale’ necessary for mapmaking takes place. They offer an opportunity to recognize risks, such as discrimination, at an early stage. Furthermore, if representatives of local (disadvantaged) groups are included, the strong performative qualities of maps can offer occasions for groups of citizens in smart cities to proactively shape the environments in which they live.

There are, however, serious limitations. Although DPIAs are mandatory, the inclusion of affected data subjects and their representatives is not, which undermines many of the possible advantages. Finally, the high costs associated with the process might mean that many companies engage with it only superficially and temporarily. Establishing effective data protection for groups negatively impacted by mapmaking software through DPIAs thus seems nigh on impossible without substantial legislative change.

Full text

You can download the full book here at the Institute of Network Cultures, or just my chapter here.

Mobile devices as stigmatizing security sensors: The GDPR and a future of crowd-sourced ‘broken windows’

Oskar Josef Gstrein and Gerard Ritsema van Eck ‘Mobile devices as stigmatizing security sensors: The GDPR and a future of crowd-sourced “broken windows”’ (2018) 8(1) International Data Privacy Law 69-85, doi: 10.1093/idpl/ipx024.

Abstract

Various smartphone apps and services are available which encourage users to report where and when they feel they are in an unsafe or threatening environment. This user-generated content may be used to build datasets, which can show areas that are considered ‘bad’ and map out ‘safe’ routes through such neighbourhoods. Despite certain advantages, this data inherently carries the danger that streets or neighbourhoods become stigmatized and that existing prejudices are reinforced. Such stigmas might also result in negative consequences for property values and businesses, causing irreversible damage to certain parts of a municipality. Overcoming such an ‘evidence-based stigma’, even if it is based on biased, unreviewed, outdated, or inaccurate data, becomes nearly impossible and raises the question of how such data should be managed.

Full text

Published version freely available at Oxford Academic.
Accepted manuscript (‘post-print’) freely available here.