Safe for whom?: On NSW Police’s automated policing software

In 2021, NSW Police signed two contracts with software companies to develop their Integrated Policing Operating System (IPOS), a system of automated policing that makes impossible, contradictory promises to keep everybody safe behind the facade of ‘neutral’ automation. The first contract is with Microsoft Azure to develop the cloud-based infrastructure; the second is with Mark43, a US-based policing platform and software company, to provide Computer-Aided Dispatch, Records Management, and Data Analytics services. Mark43, built on the Azure cloud, will provide the primary software that makes up the IPOS, and as such has been called the ‘final piece of the puzzle’ in the replacement of the state’s previous Core Operational Policing System (COPS).

The recent IPOS rollout has involved few public statements, and most of the press coverage has come from Microsoft or from technology industry publications. The absence of coverage not only speaks to the state of the media and the diminished capacity for the public good to be debated; it can also be read as a tactic that capitalises on the ‘new normal’ of algorithmic technology in order to introduce these new systems without too much critical attention. COMPAS, Geolitica, PredPol, HunchLab, and ShotSpotter, to name a few, are all part of the growing suite of data-driven technologies being adopted by law enforcement globally.

Mark43 asserts that it will bring its ‘tried and true approach’ to the NSW context, and boasts that it has become the first US-based ‘public safety’ software company to sign a contract of this size in the international market. Mark43 is already used by more than 120 law enforcement agencies, mostly in the United States, including the Boston and Washington DC Metropolitan Police Departments. NSW Police will be using Mark43 for everything from officer dispatch to data management, intelligence gathering, evidence analysis, and pressing charges. However, policing in the settler-state of Australia, whether digitised or not, will always be entangled with the logics and preservation of propertied-whiteness and with state efforts to secure land. Since 1788, white possession of Australia and the production of a national narrative have been tethered to carcerality and policing: the premise of the colony itself was to deal with Britain’s excess criminal population, and colonies have long been crucibles for testing techniques and technologies of social control.

The Mark43 IPOS that NSW Police is adopting has three main components: a Records Management System (RMS) database, a Computer-Aided Dispatch (CAD) system, and Analytics. Marketed on Mark43’s website as slick, easy software that will ‘save time’, the RMS works algorithmically to collect, correct, and tabulate data. Advertised as streamlining workflows, the CAD system algorithmically aggregates and displays data, using ranking algorithms to surface information and offer dispatch recommendations to officers. Mark43 Analytics is described as making ‘sense of the mountains of data you generate everyday’ through automated graph generation and reports.

Scrolling down Mark43’s website, viewers are confronted with titles from recent Mark43 press mentions and with testimonials from police departments, rendered in short bold quotes about ‘innovation, ease of use’ and the ‘future of policing’. Any explanations of the software, presented in short bullet-pointed text, are shrouded in appeals to safety, speed, and data-driven effectiveness. The aesthetic of computational smoothness conveyed by the visual design and copy of companies like Mark43 works both to sell their software and grow its market share, and to create the impression that techno-policing is a safe and streamlined process.

Microsoft wrote that its partnership with NSW Police aims to ‘digitally enable and keep police safer, to prevent and disrupt crime and to speed up justice outcomes for victims of crime and the community’. This raises the question: safer for whom? Data-driven policing technologies in Australia have already proven to perpetuate racially biased harm. Take, for example, the Suspect Target Management Plan (STMP), which NSW Police has used since 2005 to predict the likelihood of individuals committing crime on the basis of police data. Those deemed risky by the STMP have disproportionately been Aboriginal and Torres Strait Islander people, who are almost nineteen times more likely to be singled out by the STMP; an Aboriginal child under the age of fifteen is almost thirty-one times more likely. Once someone is placed on the STMP they are subject to increased surveillance and can be randomly and repeatedly detained.

Across Mark43’s website and the press related to these partnerships, the discursive invocation of ‘safety’ is slippery: the rhetoric shifts interchangeably between keeping the public safe, keeping first responders safe, keeping police safe and keeping the community safe. Rhetorical appeals to (classed, white) safety have historically sanctioned the expansion of the police and policing, from Jim Crow to the global war on terror. Mark43’s website routinely insists on the importance of data and safety: ‘leverage the data your agency already collects to keep first responders safer and better informed’. NSW Police’s Chief IT Officer, Gordon Dunsford, claims the force’s amassing of new technologies ‘ultimately means a safer NSW’.

Reinforcing this logic, Mal Lanyon, NSW Police Deputy Commissioner for Corporate Services, repeats that NSW Police’s top priority is ‘ensuring the safety of the community of New South Wales’. According to Lanyon, Mark43 will equip police with ‘the best digital technology and capabilities available in the world’ and guarantee they can ‘prevent, disrupt and respond to crime and serve the needs of our community’. Lee Hickin, a representative from Microsoft Australia, echoes this sentiment, assuring us that the integration of these new systems will be ‘a clear example of AI and ML being used for the good of the entire community’. If the discourse from Microsoft, Mark43, and NSW Police reads as repetitive, that is precisely the point: repetition is central to making something appear natural or commonplace, and here it works to obscure the brutal realities of policing.

Enhancing community safety through policing could be understood as paradoxical, since policing always involves the criminalisation, elimination and dispossession of people defined as disrupting and threatening property and social cohesion. The shameful toll of Indigenous deaths in custody (500 since the 1991 Royal Commission into Aboriginal Deaths in Custody) and all the families and communities left without justice attest to this truth.

Confronting the question of bias, NSW Police’s lead ICT architect, Raj Bhaskaran, remarked to journalists that Azure had been tested to ensure ‘biases’ were avoided. He elaborated that NSW Police have engaged with key ‘stakeholders in this space’ to ensure they’re deploying ‘this leading-edge technology right’. Bhaskaran’s vague declaration, made to address public concern, gives little information on who those stakeholders are or on the nature of the engagement. However, the idea of bias-free policing technology, as Bhaskaran suggests, is not only inherently and technically contradictory but dangerous. As legal scholar and Indigenous activist Irene Watson has made clear, the state, even while ‘styling itself liberal and multicultural’, cannot provide a platform to recognise Indigenous autonomy, because that would be a challenge to its sovereignty and political economy.

Furthermore, the recent trend of Big Tech companies making sustained attempts to control narratives around values and ethics does not so much reflect a desire for ethical technologies as it speaks to the ongoing, lucrative neoliberal drive to expand market profits. Tech start-ups such as Mark43 are fundamentally invested in the criminalisation and incarceration of people so that their companies can continue to profit, and their success relies on the procurement of these tools by police departments. While the vision of streamlining police work might look like an austerity measure, or a reform of the police, the reality of these systems is that massive amounts of money are in fact funnelled into both tech companies and police forces. In other words, the neoliberal logics of ‘smaller, smarter government’ pioneered by Margaret Thatcher and Ronald Reagan have produced what Ruth Wilson Gilmore calls the ‘anti-state state’, which advanced a rhetorical but illusory shrinkage of state budgets. The anti-state state involves slashing welfare, public education and public housing (which only ever amplifies social problems), along with taxes on the rich and on corporations, while increasing spending on the military, police and prisons. Techno-policing can be seen as an extension of this process, as neoliberal logics of reform and appeals to resource savings, efficiency, neutrality, and smart governance expand, entrench, and occlude the harm of law enforcement programs.

As abolitionists such as Wilson Gilmore and Debbie Kilroy argue, redirecting state funding away from policing and towards crucial infrastructure such as housing, health and crisis support services would be a more generative way of addressing existing inequalities and crime. Police, whether algorithmic or not, are still fining people, isolating people, locking people up, and upholding the settler-state. The problem with algorithmic policing is not that it can become biased, but that it continues, and technologically enhances, a project of law enforcement that is foundationally biased.


About the author

Audrey Pfister

Audrey Pfister (sometimes) writes, edits, DJs, and works in the arts, mostly on Gadigal and Tharawal lands. They have a Bachelor of Art Theory and First Class Honours in Arts (Media, Culture, Technology). Audrey’s thesis discursively analysed the re/production and obfuscation of structural inequities produced by Big Tech and techno-policing companies in the US and Australia.

