Systemic Risk (AI)
Under the EU AI Act, a classification for general-purpose AI models whose capabilities or reach are significant enough to pose risks to public health, safety, security, or fundamental rights across the EU. Systemic risk is currently presumed when a model's cumulative training compute exceeds 10^25 FLOPs; the European Commission can also designate a model based on other criteria.
Why It Matters
Systemic risk classification triggers the EU AI Act's most stringent GPAI obligations: model evaluation, adversarial testing, incident tracking and reporting, cybersecurity protections, and energy consumption reporting. Only a handful of models currently meet the threshold, but it is a moving target: the Commission can revise the compute criterion as the state of the art advances, so the set of covered models is likely to grow.
Example
A frontier AI lab whose latest model was trained using more than 10^25 FLOPs must conduct and document adversarial red-teaming exercises, track and report serious incidents, implement cybersecurity protections, and report the model's energy consumption during training.
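To make the threshold concrete, here is a minimal sketch of how a lab might estimate whether a planned training run crosses 10^25 FLOPs. It uses the widely cited heuristic that dense transformer training costs roughly 6 × parameters × training tokens in FLOPs; that approximation, and the example model sizes, are illustrative assumptions, not part of the Act.

```python
# The 10^25 figure is the EU AI Act's compute threshold for presuming
# systemic risk in a general-purpose AI model.
EU_AI_ACT_THRESHOLD_FLOPS = 1e25


def estimated_training_flops(params: float, tokens: float) -> float:
    """Rough training compute for a dense transformer.

    Uses the common heuristic FLOPs ~ 6 * parameters * tokens;
    this is an estimation convention, not a figure from the Act.
    """
    return 6 * params * tokens


def presumed_systemic_risk(params: float, tokens: float) -> bool:
    """True if estimated compute exceeds the Act's 10^25 FLOP threshold."""
    return estimated_training_flops(params, tokens) > EU_AI_ACT_THRESHOLD_FLOPS


# Hypothetical frontier model: 100B parameters trained on 20T tokens.
flops = estimated_training_flops(1e11, 2e13)
print(f"{flops:.2e}")                       # 1.20e+25
print(presumed_systemic_risk(1e11, 2e13))   # True

# Hypothetical smaller model: 1B parameters, 1T tokens -> well below.
print(presumed_systemic_risk(1e9, 1e12))    # False
```

In practice, providers would count actual accelerator FLOPs across the full training run (the Act speaks of cumulative compute), but a back-of-the-envelope estimate like this shows why only the largest frontier models currently trip the presumption.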
Think of it like...
Systemic risk in AI is like "too big to fail" in banking — when a single model is so widely used that its failures could cascade across industries and borders, regulators impose extra safeguards.
Related Terms
General-Purpose AI Model (GPAI)
Under the EU AI Act, an AI model trained on broad data that can perform a wide range of tasks rather than being designed for a single purpose. GPAI providers face transparency obligations, with additional requirements if the model poses systemic risk — currently triggered at a training compute threshold of 10^25 floating-point operations (FLOPs).
EU AI Act
The European Union's comprehensive regulatory framework for artificial intelligence, establishing rules based on risk levels. It categorizes AI systems from minimal to unacceptable risk with corresponding compliance requirements.
Red Teaming (AI)
A structured adversarial testing exercise where testers deliberately attempt to find failures, vulnerabilities, biases, or harmful outputs in an AI system. Unlike standard testing that checks if the system works, red teaming checks how the system breaks.