Algorithmic Accountability for the Public Sector – Report
Aug 17, 2021
The Ada Lovelace Institute (Ada), AI Now Institute (AI Now), and Open Government Partnership (OGP) have partnered to launch the first global study to analyse the initial wave of algorithmic accountability policy for the public sector.
As governments increasingly turn to algorithms to support decision-making for public services, there is growing evidence that these systems can cause harm and frequently lack transparency in their implementation. Reformers in and outside of government are turning to regulatory and policy tools, hoping to ensure algorithmic accountability across countries and contexts. These responses are emergent and shifting rapidly, and they vary widely in form and substance – from legally binding commitments to high-level principles and voluntary guidelines.
This report presents evidence on the use of algorithmic accountability policies in different contexts from the perspective of those implementing these tools, and explores the limits of legal and policy mechanisms in ensuring safe and accountable algorithmic systems.
Read the executive summary for key findings, and the full report below for further detail on those findings along with practical case studies of implemented policies.