
The EU’s one-two punch: AI Act & GDPR

High-risk AI providers must ensure GDPR & EU AI Act compliance by Aug 2026 or face fines up to 11% of global revenue.


Tom Corbett

AI, Product & Privacy Counsel; Certified AI Governance Professional (AIGP); CIPP/E; CIPP/US

If you provide a high-risk AI system that processes the personal data of EU residents, and you intend to offer that system in the EU or allow its outputs to be used there, you must comply with both the EU AI Act and the General Data Protection Regulation (GDPR) within the next 18 months. [Don’t know whether your AI system is high risk? See Article 6 and Annex III of the Act. Some examples: AI systems used in medical devices, vehicles, children’s toys, employment and hiring, education and student assessments, banking and credit applications, and biometric identification.] You’ll need to not only manage the system’s AI risks under the EU’s landmark AI law but also ensure personal data is handled lawfully and transparently under the EU’s pivotal privacy law.

AI Act requires system-level GDPR compliance

The EU AI Act (Art. 47) requires that providers of high-risk AI systems sign a declaration of conformity. This signed declaration must be made available to EU authorities upon request, and it must be kept up to date. The Act (Annex V) also requires that where the high-risk AI system involves the processing of personal data, the declaration of conformity must include a statement that the system complies with GDPR. This requirement is easy to overlook, as it appears in just a few words in an annex to the Act. The person responsible for signing this declaration (most likely a company executive) will be looking to your legal and compliance teams to assure them that their representations, including about GDPR compliance, are accurate.

Potential “double-whammy” fines

Be aware that, for violating both laws, a company could face a “double-whammy” fine of up to 11% of its annual global revenue.

  • EU AI Act: fines up to 7%
  • GDPR: fines up to 4%
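For readers who want the arithmetic spelled out, here is a toy sketch of the combined worst-case exposure. The 7% and 4% caps are the headline maximums cited above; the revenue figure and function name are hypothetical, and actual fines also depend on fixed-amount thresholds and the specific infringement.

```python
def max_combined_fine_exposure(annual_global_revenue: float) -> float:
    """Worst-case combined exposure under the two caps cited above:
    7% (EU AI Act) + 4% (GDPR) = 11% of annual global revenue."""
    ai_act_cap = 0.07 * annual_global_revenue
    gdpr_cap = 0.04 * annual_global_revenue
    return ai_act_cap + gdpr_cap

# Hypothetical company with EUR 10 billion in annual global revenue:
exposure = max_combined_fine_exposure(10_000_000_000)
print(f"EUR {exposure:,.0f}")  # roughly EUR 1.1 billion
```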

We have seen that fines under GDPR can be eye-popping, including Meta’s €1.2 billion fine. Worse yet, these laws may be enforced by different regulatory authorities, so a business could face duplicative investigations, resulting litigation, and settlement negotiations.

Compliance tensions between the EU AI Act and GDPR

The world has changed dramatically since 2016, when GDPR was adopted, perhaps nowhere more than in developments in AI and the mainstream adoption of LLMs. Unfortunately, certain core principles behind GDPR present problems in the age of AI. For example:

When processing personal data, GDPR requires . . .

  1. Data minimization: use only what you need
  2. Purpose limitation: use personal data only for discrete, identifiable purposes and reasonably related purposes
  3. Limits on how long personal data may be stored
  4. Transparency about how personal data is being processed
  5. Accuracy and, if inaccuracy is found, correction
  6. Strict limits on automated decision-making for consequential decisions
  7. Control over one’s personal data, including a “right to be forgotten” (i.e., a right of deletion)

But AI system providers  . . .

  1. Seek to train their models on all the data available
  2. Use novel and sometimes unforeseen data techniques to discover patterns, provide insights, and generate other outputs
  3. Often want to hang on to all the data indefinitely because they believe they may need it later
  4. May produce models that are black boxes which even their developers may struggle to explain, including in terms of why and how certain outputs are generated
  5. Create probabilistic models (e.g., LLMs) where 100% accuracy is not the aim; these models may hallucinate and become less accurate over time due to model drift
  6. Push the boundaries of agentic AI, which holds the promise of making, and helping us make, better decisions
  7. Have scraped the Internet and collected personal data to create training data; removing an individual’s personal data from a training data set and retraining the model is impractical, costly, and in many cases ineffective

Fortunately, since GDPR was adopted, many advances have been made in privacy-enhancing technologies, including differential privacy, synthetic data, and anonymization. These advances can help both protect privacy and speed innovation. But as the examples above illustrate, compliance with GDPR presents fundamental challenges for AI providers.
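For the technically inclined, here is a minimal, illustrative sketch of one such technique, differential privacy, applied via the Laplace mechanism to a simple counting query. The function name, sample data, and epsilon values are hypothetical, not from any particular library; a production system would rely on a vetted implementation.

```python
import math
import random

def dp_count(records, predicate, epsilon=1.0):
    """Differentially private count via the Laplace mechanism.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so adding Laplace noise with scale
    1/epsilon yields an epsilon-differentially-private count.
    """
    true_count = sum(1 for r in records if predicate(r))
    # Sample Laplace(0, 1/epsilon) noise by inverse-CDF transform.
    u = random.random() - 0.5  # uniform on (-0.5, 0.5)
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# Hypothetical usage: release an approximate count of EU-resident records
# without letting any single individual's presence be inferred.
records = [{"country": "DE"}, {"country": "FR"}, {"country": "US"}]
noisy = dp_count(records, lambda r: r["country"] in {"DE", "FR"}, epsilon=0.5)
```

A smaller epsilon adds more noise (stronger privacy, less accuracy); a larger epsilon adds less noise, illustrating the privacy/utility trade-off these techniques manage.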

Compliance burden allocation between the EU AI Act and GDPR

Another critical difference between GDPR and the EU AI Act is where each law places most of its compliance burden. Under GDPR, when a company is acting as a service provider (i.e., a processor under GDPR), such as a SaaS vendor, the law places most of the regulatory burden on the customer (i.e., a controller under GDPR). The core obligations for processors under GDPR can be identified in just one article of that law: Article 28. The EU AI Act is just the opposite: it places most of the regulatory burden on high-risk AI system providers, not the customer (i.e., a deployer under the Act). The core obligations for deployers under the EU AI Act can be identified in just one article of that law: Article 26.

What high-risk AI system providers can do

Just like a skilled boxer, here is what you and your company can do now to defend against the EU’s tough one-two combination:

  • Make sure your footwork is up to speed. If you are not yet “floating like a butterfly” in privacy compliance, fix that now. GDPR requires being on one’s toes and practicing privacy-by-design and privacy-by-default with strength and agility. This law has been fully enforceable for over 5 years. If you do not yet have a C-suite-supported, cross-functional culture of privacy, and you’re trying to simultaneously meet the challenges of the age of AI, your weak privacy stance will leave you ill-positioned and off balance.
  • Be able to get your guard up quickly. Both GDPR and the EU AI Act require demonstrable compliance and at-the-ready compliance documentation. These are your “gloves” to deflect incoming blows. Your documentation must be kept up to date, and – it cannot be over-emphasized – your actions must align with your documentation. In other words, don’t talk the talk if you can’t walk the walk. Claims about your defensive readiness mean nothing if you can’t demonstrate it when it’s most needed.
  • Don’t just rely on bobbing and weaving. If your plan is to avoid contact altogether, that is a high-risk plan. It’s true that few companies have been investigated for GDPR compliance, but the EU AI Act presents new openings for EU regulators. Providers of high-risk AI systems must register with EU regulators and declare system-level GDPR compliance. GDPR has no such requirement. As a result, EU regulators will have a database of high-risk AI systems to review and potential targets to choose from. This can assist them in picking their fights strategically and developing their own “fight plans”.
  • Have a plan and adjust as needed. Mike Tyson famously said, “Everyone has a plan until they get punched in the mouth.” True, but note that the person getting punched at least had a plan going into the ring, which they could adjust. Tyson didn’t say, “Wait until you get punched in the mouth to come up with a plan.” Have a compliance plan and start executing it. EU regulators will issue additional guidance this year and next about the EU AI Act, so adjust your plan and efforts accordingly. If your company is investigated under GDPR and/or the EU AI Act, you will certainly need to roll with the punches, but at least you will be prepared.
  • Have the right team behind you. A boxer does not prepare alone. Getting bout-ready requires working with the right cross-functional team, including trainer, manager, and dietician. Likewise, privacy and AI governance subject matter experts and other cross-functional resources can support you in creating and seeing through your compliance readiness plan.

August 2, 2026 is when the main event is scheduled – the date the EU AI Act’s provisions governing high-risk AI systems become effective. That may seem like a long time away, but as Muhammad Ali advised, “Don’t count the days, make the days count.” Even if your company is more lightweight than heavyweight, in terms of its size and compliance resources, your steady work, dedication, and discipline can help you punch above your weight and ensure you have what it takes to go the distance.
