Algorithmic Mapmaking in ‘Smart Cities’: Data Protection Impact Assessments as a means of protection for groups

Gerard Jan Ritsema van Eck, ‘Algorithmic Mapmaking in ‘Smart Cities’: Data Protection Impact Assessments as a means of protection for groups’ in Angela Daly, S. Kate Devitt and Monique Mann (eds), Good Data (Institute of Network Cultures 2019).

Abstract
Maps are powerful communication tools, and mapmaking used to be a privileged affair. In recent times this has changed as “smart cities” have been outfitted with video, audio, and other kinds of “Internet of Things” sensing devices. The data-streams they generate can be combined with volunteered data to create a vast multitude of interactive maps on which individuals are constantly (re)grouped on the basis of abnormality, deviation, and desirability. Many have argued that under these circumstances personal data protection rights should be extended to groups.
However, group rights are an awkward fit for the current European data protection framework, which is heavily focused on individuals. One possible opening for better protection is offered by Data Protection Impact Assessments (DPIAs), which are mandatory to carry out when the ‘systematic monitoring of a publicly accessible area on a large scale’ necessary for mapmaking takes place. They offer an opportunity to recognize risks such as discrimination at an early stage. Furthermore, if representatives of local (disadvantaged) groups are included, the strong performative qualities of maps can offer occasions for groups of citizens in smart cities to proactively shape the environments in which they live.
There are serious limitations, however. Although DPIAs are mandatory, the inclusion of affected data subjects and their representatives is not, which undermines many of the possible advantages. Finally, the high costs associated with the process might mean many companies engage with it only superficially and temporarily. Establishing effective data protection for groups negatively impacted by mapmaking software through DPIAs thus seems nigh on impossible absent substantial legislative change.

Full text
You can download the full book here at the Institute of Network Cultures, or just my chapter here.


Emergency calls with a photo attached: The effects of urging citizens to use their smartphones for surveillance

Gerard Jan Ritsema van Eck, ‘Emergency Calls with a Photo Attached: The Effects of Urging Citizens to Use Their Smartphones for Surveillance’ in Bruce Clayton Newell, Tjerk Timan and Bert-Jaap Koops (eds), Surveillance, Privacy, and Public Space (Routledge 2018).

Abstract
Various kinds of media and metadata, such as pictures, videos, and geo-location, can be attached to emergency reports to the police using dedicated platforms, social networking sites, or general communication apps such as WhatsApp. Although these are potentially a very useful source of information for law enforcement agencies, the practice also raises considerable concerns regarding surveillance and privacy in public spaces: it exhorts citizens to establish a supervisory gaze over anyone, at any time, and anywhere.
This chapter analyses these concerns using theories from surveillance studies. It considers the (surprisingly high) applicability of panoptical theories by Foucault and others to the effects of increased visibility of citizens in public spaces. Importantly, this analysis reveals how discriminatory tendencies might be introduced and exacerbated. Attention is then paid to Deleuze’s ‘societies of control’ and related notions such as database surveillance, surveillance assemblages, and predictive policing. This analysis shows that the enrichment of emergency reports with media and metadata from smartphones can pressurize people into conformity, erode the presumption of innocence, and diminish societal trust. Furthermore, this process will disproportionately affect already disadvantaged groups and individuals. Policy makers are advised to implement enriched emergency reports carefully.

Full text
Get the hardcopy book at Routledge, get access to the digital edition, or ask your favourite librarian and/or local bookshop. Alternatively, download the accepted manuscript here.

Mobile devices as stigmatizing security sensors: The GDPR and a future of crowd-sourced ‘broken windows’

Oskar Josef Gstrein and Gerard Ritsema van Eck, ‘Mobile Devices as Stigmatizing Security Sensors: The GDPR and a Future of Crowd-Sourced “Broken Windows”’ (2018) 8(1) International Data Privacy Law 69-85, doi: 10.1093/idpl/ipx024.

Abstract
Various smartphone apps and services are available which encourage users to report where and when they feel they are in an unsafe or threatening environment. This user-generated content may be used to build datasets which can show areas that are considered ‘bad’, and to map out ‘safe’ routes through such neighbourhoods. Despite certain advantages, this data inherently carries the danger that streets or neighbourhoods become stigmatized and already existing prejudices are reinforced. Such stigmas might also result in negative consequences for property values and businesses, causing irreversible damage to certain parts of a municipality. Overcoming such an ‘evidence-based stigma’ – even if based on biased, unreviewed, outdated, or inaccurate data – becomes nearly impossible and raises the question of how such data should be managed.

Full text
Published version freely available at Oxford Academic.
Accepted manuscript (‘post-print’) freely available here.