AI Governance

AI Audit

An independent evaluation of an AI system's compliance, performance, fairness, and governance practices. Audits can be internal (conducted by the organization's own team) or external (conducted by independent third parties), and may be required by regulation for high-risk systems.

Why It Matters

Self-assessment has limits. Independent audits provide the credibility that regulators, customers, and the public need to trust that AI systems actually work as claimed and don't cause hidden harm.

Example

A fintech company hires an external auditor to evaluate its AI credit scoring model, testing for disparate impact across racial groups, verifying that the model's documentation matches its actual behavior, and assessing whether the governance processes described in policy are followed in practice.
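One common disparate-impact test an auditor might run is the "four-fifths rule": compare approval rates across groups and flag the model if the lowest group's rate falls below 80% of the highest. The sketch below illustrates the idea; the group labels, decision data, and 0.8 threshold are illustrative assumptions, not details from any specific audit.

```python
# Illustrative sketch: disparate impact ratio for approval decisions.
# The 0.8 cutoff reflects the "four-fifths rule" heuristic; data is made up.

def disparate_impact_ratio(outcomes):
    """outcomes: dict mapping group name -> list of 0/1 approval decisions.

    Returns the lowest group approval rate divided by the highest.
    A ratio near 1.0 means similar approval rates across groups.
    """
    rates = {group: sum(decisions) / len(decisions)
             for group, decisions in outcomes.items()}
    return min(rates.values()) / max(rates.values())

# Hypothetical audit sample
decisions = {
    "group_a": [1, 1, 0, 1, 1, 0, 1, 1],  # 6/8 = 75% approved
    "group_b": [1, 0, 0, 1, 0, 1, 0, 0],  # 3/8 = 37.5% approved
}

ratio = disparate_impact_ratio(decisions)
print(f"Disparate impact ratio: {ratio:.2f}")  # 0.375 / 0.75 = 0.50
if ratio < 0.8:
    print("Below the four-fifths threshold -- flag for audit review")
```

In practice an auditor would run this against the model's real decision history, with statistically meaningful sample sizes and legally defined protected groups, rather than a toy dataset like this one.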

Think of it like...

An AI audit is like a financial audit — the company keeps its own books, but an independent auditor verifies that the numbers are real and the controls actually work.

Related Terms