Impact Assessments: Supporting AI Accountability

Artificial intelligence’s applications span everything from healthcare and education to manufacturing and energy.

Automation is reshaping strategies across industries, but the rapid pace of deployment raises as many issues as these technologies claim to solve.

Law enforcement, mortgage lending, and video-based hiring are just some of the areas where the limits of current systems have been exposed, with significant consequences. Tools like deepfake software threaten to accelerate the spread of disinformation even further, inflaming geopolitical tensions and jeopardising national security.

The world stands at a regulatory crossroads. For all that algorithms have promised to augment our decision-making, judgements on how to mitigate the negative impacts of AI while preserving its value rest in human hands alone. Where do we go from here?

Access Partnership’s latest report, ‘Impact Assessments: Supporting AI Accountability & Trust’, helps to guide this debate by providing an overview of existing processes, as well as recommendations on how to proceed. Developed in partnership with Workday, the report outlines three main accountability frameworks – algorithmic impact assessments (AIAs), third-party auditing, and conformity assessments – to explain why AIAs should become the global standard.

To date, a patchwork of regulatory responses has emerged. New York and California have proposed mandatory third-party auditing of AI systems. Companies in Virginia, Colorado, and Connecticut must conduct ‘data protection assessments’ for activities that pose heightened risks to consumers. The EU’s landmark Artificial Intelligence Act will require conformity assessments for ‘high-risk’ systems.

This report analyses existing and emerging frameworks in detail, drawing on our industry-leading expertise to identify the limits of each in terms of technical standards and adaptability.

Policymakers hold the key to harmonising AI’s future. Balancing transparency with the need to harness automated technologies’ benefits for economic development is essential to societal progress. To learn how to do both, download our report.

