Algorithms increasingly govern interactions between the state and the citizen, and as the 'digital by default' model of government-citizen interaction spreads, their role will only grow. This, combined with the growing value of data science to government, means we need to consider how algorithms mediate real-world relationships between the state and individuals.

Artificial intelligence (AI) and machine learning (ML) are often embraced as a way to achieve operational efficiency and as a means of carrying out public policy. Without confidence in the legitimacy and credibility of these algorithms, however, trust between government and citizens will degrade dramatically. People Powered Algorithms (PPA) seeks to renew our understanding of individual-state algorithmic interactions, and emphasises the need for productive and trusted relationships between those who design and deploy algorithms and the communities affected by the decisions made using them.

This website documents the work-in-progress of the cross-disciplinary People Powered Algorithms (PPA) research project. PPA works across three key public policy areas of significant public spending where algorithmic decision-making is being, and will increasingly be, used in policy deployment: refugee resettlement, welfare and healthcare provision. These services are at the forefront of Digital by Default, where algorithms often determine whether access to and use of a service is legitimate, and are thus also used to evaluate spending on these services. Decisions about legitimacy and fairness rest on the underlying public and political policies that these algorithms implement. Algorithmic decision-making has consequences for the security and well-being of the recipients of digital services, and PPA studies how we might conceptualise and design algorithmic decision-making so that adverse impacts are minimised.

As our selected case studies illustrate, re-designing system interactions and the communication of their political and economic logic has the potential to enhance the security and well-being of individuals, while protecting the security of the state and increasing people's confidence in digital service design.

This project has been funded by EPSRC project grant EP/R033382/1, People Powered Algorithms for Desirable Social Outcomes.

Lead Research Organisation 

Cranfield University. Centre for Electronic Warfare, Information and Cyber, Information Operations Group, Defence Academy of the United Kingdom, Shrivenham. Dr Duncan Hodges 

Project Partners

University of East Anglia. Dr Oliver Buckley 

University of Portsmouth. School of Computing. Professor Debi Ashenden 

Royal Holloway University of London. Information Security Group; Politics and International Relations: Dr Will Jones, Professor Lizzie Coles-Kemp, Dr Amelia Morris

Collaborators

Boston University 

Norwich University of the Arts

University of Gothenburg

Voices of Tomorrow

Parallel Systems

Territory Studios

StoryFutures