Open North is pleased to publish a timely report on the intersectional privacy risks related to data sharing between police services and local government. Funded by the Office of the Privacy Commissioner, this report responds to the lack of empirical research and critical analysis of this rapidly growing data ecosystem, and provides an analysis of risks and governance measures. Below is a summary of the report and an invitation to join us in the next phase of our research and advocacy.
The last few years have seen concurrent yet unnecessarily disconnected developments in the Canadian discourse around the governance of public use of data and digital technologies in cities. There is a growing realization that while cities can benefit immensely from more granular data collection, algorithmic analytics, platform-based tools, and networked systems, a far greater emphasis on data governance is necessary. At the same time, a series of public scandals has revealed the degree to which law enforcement has also embraced data technologies, but frequently in secrecy and in contravention of the norms of privacy, if not the law itself. The most recent such scandal, around the use of the facial recognition technology Clearview AI, prompted the development of an artificial intelligence governance policy by the Toronto Police Service (Brandusescu et al., 2021); however, no other police services are publicly engaging with this growing problem. Cities, on the other hand, are doing so, albeit in a limited, internally focussed fashion. The ‘smart city’ project of Sidewalk Labs collapsed in 2020, in part under the weight of sustained criticism of its data governance policies (Artyushina, 2020), while the federal Smart Cities Challenge emphasized the values and principles of open and responsible governance (Valverde & Flynn, 2020). Subsequently, the cities of Toronto and Montreal have released frameworks outlining their guiding principles for the ethical and responsible use of data and emerging technology, and are currently working on operationalizing these frameworks.
This disconnection between cities and law enforcement is problematic. In Canada there is a long-standing dispute over the relationship between police services and municipalities. Many services, like the Toronto Police Service, are funded by the municipality but are not governed by it, in theory answering instead to a separate entity, the nominally civilian police service board. This has led to tensions around questions of funding and a lack of democratic oversight over, and insight into, what that funding supports. In addition, there are ongoing disagreements over what the police service boards have governance power over, and what remains the “operational” purview of the police services themselves (Roach, 2022). These debates over responsibility and governance make it clear that, at least in terms of society’s rapidly growing data and digital technology ecosystem, cities and law enforcement are deeply intertwined (Artyushina & Wernick, 2021; Lorinc, 2021). Not only are they facing similar regulatory issues, but their data and technology systems are becoming rapidly integrated: common databases, closed-circuit television (CCTV) access, data-sharing projects, and similar analytics tools, all operating in and adding to cities’ digital infrastructure (Linder, 2021). The risks these technologies can pose, particularly to marginalized communities, have also become unavoidably apparent: biased data, privacy violations, secretive procurement and use of new and invasive technologies, unethical sharing of data, illegitimate use cases, and discriminatory algorithms all pose significant risks to people and communities (Brayne, 2020; Linder, 2021). In addition, the widespread public realization of these ungoverned practices and their harms has contributed to a decline in public trust in the use of data and technology by government services (Bannerman & Orasch, 2019).
Through 27 interviews with practitioners in law enforcement and local government as well as two expert workshops, we shed empirical light on the state of data sharing and governance between law enforcement and municipal authorities. The report documents what is known about this kind of data sharing, the state of data sharing’s governance, and the kinds of privacy risk assessment frameworks that are in place. Our goal is to catalyze deeper conversation about how to openly, democratically, and responsibly govern this intersection and to protect residents facing intersectional risks from deeply embedded systemic biases.
Findings
What this research has brought to light is that there is growing concern across the board about the intersectional privacy risks of law enforcement data sharing and usage, yet this concern is very unevenly distributed, rarely present in existing governance tools, and almost entirely undiscussed in policy development processes. Our interviews uncovered more confusion than clarity, and more ambiguity, contradiction, and heterogeneity than a landscape of clearly differentiated, let alone coherent, policy. However, as we considered these findings we realized that this muddle is itself an important insight that requires timely and intentional action. The fact that only some interviewees in positions of significant seniority and responsibility for information and communications technology and data policy knew exactly how data sharing was being governed, while many did not or provided contradictory or unclear responses, is an important research outcome. This led us to our first two conclusions:
- The actual amount of ongoing data sharing is neither well known in terms of metrics nor well understood conceptually, yet
  - many saw it as very important to the functioning of government services, and
  - most agreed that it is set to rise precipitously in the near future.
- Despite this, at an institution-wide level, data-sharing governance is frequently not considered important enough to warrant standardization or strong oversight.
Further, our research shows that important collaboration between government services is still deeply siloed and fragmented, and that there exists a wide, and quite ambiguous, range of approaches to governing data sharing – one that practitioners themselves clearly state needs comprehensive reform to meet the needs of growing digital interconnection. The current state of both existing policy and the processes by which policy innovation occurs needs improvement to fulfill the needs of responsible governance in these times of rapid technological change. Interrelated issues around personally identifiable information (PII), risk assessment, and the data governance of sharing processes need more careful attention, particularly given the inherently more intersectionally dangerous practice of policing.
In addition, there are significant privacy and intersectional risk issues with these data-driven and digital technologies, and the scale of their usage is growing even as awareness of these issues climbs both within and beyond policing. The lack of clarity and of coherent, open public discussion is driving tensions between criticism, secrecy, and the need for responsible policy reform – including potential reforms in data and technology governance in which technological capacities are better circumscribed and apportioned to the most appropriate governmental agencies. From this line of inquiry we drew further conclusions:
- What governance does exist is frequently ad hoc, established on a project-by-project basis. There may be a template for a memorandum of understanding (MOU) or a legally binding data-sharing agreement, but rarely an overarching policy.
- In terms of privacy risk assessment, there is little awareness of, or consideration for, measures beyond those prescribed by the Freedom of Information and Protection of Privacy Act (FIPPA) around PII and the successful completion of a privacy impact assessment (PIA).
- The minority of respondents who did recognize the salience of comprehensive governance, or of risks posed by data sharing beyond those covered by individualistic conceptions of privacy, also appreciated the need for improved data governance and risk assessment policy.
- However, very few were able to articulate suggestions for what that policy might contain.
What this report shows is that data sharing is far too risky, particularly for intersectionally impacted communities, and far too ubiquitous, incentivized, and complex in the social, economic, and technological web of a digitized society, to be left (un)governed in silos. The solutions to this challenge have only just begun to be debated (Ada Lovelace Institute, 2022; “Disrupting Data Governance,” 2023; Linder, 2023), and tools exist or are in development that could help conceptualize and operationalize these considerations, but without better governance that gives regulatory power to these tools, their use remains sporadic and incomplete.
Conclusion and Call To Action
A much more comprehensive, transparent, and societally and democratically inclusive conversation is necessary to ensure that this situation does not deteriorate, resulting in a further erosion of public trust in good governance. Amid the current intersecting crises in the legitimacy of law enforcement, there is an opportunity to develop significant new approaches to data, privacy, and the risks they entail, particularly for the most vulnerable.
It is into this regulatory space, with its rapidly growing need for significant innovation, that we offer this report, so that it may serve as a springboard for discussion and new development. The goal of our ongoing work is to begin mapping out what better governance of law enforcement data sharing would encompass, how to take the growing issue of intersectional risk into account, and how to start developing strategies for implementing reforms. This is an open call for individuals and organizations who would like to work with us on this urgent subject! If interested, contact us at thomas@opennorth.ca.