Frequently Asked Questions
Common questions about the EU AI Act, high-risk AI compliance, and how AI Attest works.
What is the EU AI Act?
The EU AI Act (Regulation 2024/1689) is the first comprehensive legal framework for artificial intelligence. It entered into force on August 1, 2024, and establishes requirements for AI systems based on their risk level. High-risk AI systems listed in Annex III must comply with Articles 9 through 17, covering risk management, data governance, transparency, human oversight, accuracy, robustness, and cybersecurity.
Does the EU AI Act apply to my AI system?
If your AI system falls under one of the eight Annex III categories (biometric identification, critical infrastructure, education, employment, essential services, law enforcement, migration, or justice), it is classified as high-risk and must comply with Articles 9-17. General-purpose AI models have separate requirements under Chapter V of the Regulation. Systems that pose unacceptable risk are prohibited entirely.
When does compliance become mandatory?
For Annex III high-risk systems, the current enforcement date is August 2, 2026. The Digital Omnibus on AI may extend this to December 2, 2027. Regardless of the timeline, the compliance requirements are identical. The penalty for non-compliance is up to 3% of global annual turnover or 15 million euros, whichever is higher.
What does AI Attest do?
AI Attest is a compliance platform for the EU AI Act. It helps you audit your existing AI documentation against Articles 9-17, fill compliance gaps using article-mapped templates, and generate regulator-ready compliance reports backed by cryptographic audit trails. The platform tracks dependencies between compliance artifacts and automatically flags what needs updating when something changes.
How does the compliance audit work?
Upload your existing AI documentation in any format (PDF, Word, Markdown, HTML). The audit engine uses AI-powered semantic analysis to classify each document against the 13 compliance artifact types and evaluates field-level coverage against EU AI Act requirements. You receive a gap report showing per-article readiness scores, specific missing fields, and an actionable remediation plan.
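As a rough illustration of the gap-scoring step, here is a minimal Python sketch: each article maps to a set of required fields, and readiness is the fraction of those fields found in your documentation. The field names are illustrative assumptions, not AI Attest's actual schema.

```python
# Hypothetical per-article field requirements (illustrative, not the real schema).
REQUIRED_FIELDS = {
    "Article 9":  {"risk_identification", "mitigation_measures", "residual_risk"},
    "Article 10": {"data_sources", "bias_examination", "data_gaps"},
}

def readiness_scores(found_fields: set[str]) -> dict[str, float]:
    """Fraction of each article's required fields covered by the uploaded docs."""
    return {
        article: len(required & found_fields) / len(required)
        for article, required in REQUIRED_FIELDS.items()
    }

def missing_fields(found_fields: set[str]) -> dict[str, set[str]]:
    """Gap report: required fields still absent, grouped by article."""
    return {
        article: required - found_fields
        for article, required in REQUIRED_FIELDS.items()
        if required - found_fields
    }
```

For example, documentation covering risk identification and mitigation but no residual-risk analysis would score 2/3 on Article 9, with `residual_risk` listed in the gap report.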
What are the 13 compliance artifacts?
The 13 artifacts map to EU AI Act Articles 9-17 and Annex IV: PRD (requirements), Architecture, API Contracts, Data Governance, Bias Assessment, Model Evaluation, Test Plan, Security Review (two artifacts: a threat model and an implementation audit), QA Report, Deployment Plan, Deployer Guide, and Monitoring Plan. Standard tier uses 8 artifacts for general software governance; Enterprise tier uses all 13 for full EU AI Act compliance.
What is cascade invalidation?
Cascade invalidation is the mechanism that prevents silent compliance drift. When you change an upstream artifact (like revising your architecture), every downstream artifact that depends on it is automatically flagged as stale with specific instructions for what to re-evaluate and why. This ensures your compliance documentation stays internally consistent even as your system evolves.
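Conceptually, this is a transitive walk over a dependency graph. A minimal Python sketch, with artifact names and edges chosen purely for illustration:

```python
# Illustrative dependency graph: each artifact lists what it depends on.
DEPENDS_ON = {
    "architecture":  {"prd"},
    "api_contracts": {"architecture"},
    "test_plan":     {"api_contracts"},
    "qa_report":     {"test_plan"},
}

def downstream_of(changed: str) -> set[str]:
    """Every artifact that transitively depends on the changed one
    and must therefore be flagged stale."""
    stale: set[str] = set()
    frontier = {changed}
    while frontier:
        # Artifacts directly depending on anything in the frontier.
        nxt = {a for a, deps in DEPENDS_ON.items() if deps & frontier} - stale
        stale |= nxt
        frontier = nxt
    return stale
```

Revising `architecture` in this toy graph flags `api_contracts`, `test_plan`, and `qa_report` as stale, while a change to `qa_report` (a leaf) flags nothing downstream.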
What is the cryptographic audit trail?
Every artifact submission, approval, phase transition, and cascade event is recorded as an immutable event with a SHA-256 content hash, dependency verification, and timestamp. The events form a hash chain that can be independently verified for integrity. When a regulator asks you to demonstrate that your compliance process was followed, you can point to mathematically verifiable evidence.
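The core idea of a hash chain is that each event's hash covers both its own content and the previous event's hash, so tampering with any event breaks every later link. A minimal Python sketch (event fields are illustrative, not AI Attest's actual event format):

```python
import hashlib
import json

def event_hash(prev_hash: str, payload: dict) -> str:
    """SHA-256 over the previous hash plus the canonicalised event payload."""
    blob = prev_hash + json.dumps(payload, sort_keys=True)
    return hashlib.sha256(blob.encode()).hexdigest()

def build_chain(events: list[dict]) -> list[str]:
    """Hash each event, chaining from a fixed genesis value."""
    hashes, prev = [], "0" * 64  # genesis sentinel
    for payload in events:
        prev = event_hash(prev, payload)
        hashes.append(prev)
    return hashes

def verify_chain(events: list[dict], hashes: list[str]) -> bool:
    """Recompute the chain and compare; any edited event changes
    its hash and every hash after it."""
    return build_chain(events) == hashes
```

Because verification only recomputes hashes, anyone holding the events and the recorded hashes can check integrity independently, without trusting the platform.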
How is this different from a compliance checklist?
A checklist tells you what to do. AI Attest enforces that you did it, in the right order, with verified dependencies, and keeps track of what breaks when things change. The dependency pipeline prevents you from skipping steps. Cascade invalidation prevents compliance drift. The audit trail proves the process was followed. A checklist cannot do any of these things.
What AI providers does AI Attest support?
AI Attest supports Anthropic (Claude), OpenAI (GPT-4o), and Google (Gemini) for the compliance audit. You bring your own API key, which is encrypted at rest using AES-256-GCM and never displayed after saving. The platform is model-agnostic: the same audit can be run with any supported provider.
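For readers curious what AES-256-GCM encryption at rest looks like in practice, here is a generic sketch using the Python `cryptography` library; this illustrates the pattern, not AI Attest's actual key-storage code:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_api_key(key: bytes, plaintext: str) -> bytes:
    """Encrypt with AES-256-GCM; a fresh 12-byte nonce per encryption."""
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, plaintext.encode(), None)
    return nonce + ciphertext  # store the nonce alongside the ciphertext

def decrypt_api_key(key: bytes, blob: bytes) -> str:
    """Split off the nonce and decrypt; GCM also authenticates,
    so a tampered blob raises rather than returning garbage."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None).decode()
```

GCM mode provides authenticated encryption: decryption fails loudly if the stored blob has been modified, which matters for secrets that are never redisplayed.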
How much does AI Attest cost?
AI Attest is free during beta. Full access to the audit, build pipeline, impact analysis, and compliance report generation is included at no cost. Pricing will be introduced after the beta period.
Does AI Attest support GDPR and the Cyber Resilience Act?
Yes. GDPR DPIA (Data Protection Impact Assessment) and Cyber Resilience Act compliance are both live. Each regulation gets its own artifact pipeline, dependency graph, and audit trail. The engine architecture is regulation-agnostic: adding a new regulation requires configuration and templates, not code changes.
Is my data secure?
API keys are encrypted at rest using AES-256-GCM. Documents are processed for analysis and stored per project. The platform runs on a dedicated VPS with SSL/TLS encryption in transit. No data is shared with third parties beyond the AI provider you select for the audit analysis.
Can I export my compliance report?
Yes. Compliance reports can be downloaded as PDF, HTML, or JSON. The report maps every EU AI Act requirement to specific evidence from your artifact submissions, including timestamps, content hashes, and approval records. The report is designed for regulator review.
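To give a feel for the JSON export, here is a hypothetical sketch of one requirement-to-evidence entry; the field names are assumptions for illustration, not AI Attest's actual report schema:

```python
import hashlib

def report_entry(article: str, artifact: str, content: bytes,
                 approved_by: str, timestamp: str) -> dict:
    """One JSON report row: a requirement mapped to hashed, approved evidence."""
    return {
        "requirement": article,
        "evidence": {
            "artifact": artifact,
            "content_hash": hashlib.sha256(content).hexdigest(),
            "approved_by": approved_by,
            "timestamp": timestamp,
        },
    }
```

Embedding the content hash lets a reviewer confirm that the submitted artifact body matches what the report claims, without needing access to the platform itself.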