
Privacy by Design GDPR: Art. 25 Guide

Art. 25 GDPR requires privacy by design and by default. Learn the 7 foundational principles, implementation steps, and real enforcement cases from CNIL and ICO.


Art. 25 GDPR makes privacy by design a legal obligation, not a best practice. Controllers must implement appropriate technical and organisational measures — both at the time of determining the means of processing and at the time of processing itself — to embed data protection principles into every system that handles personal data. Failure to comply exposes organisations to fines of up to EUR 10 million or 2% of global annual turnover under Art. 83(4).

Key Takeaways

  • Art. 25(1) GDPR requires controllers to implement data protection by design, considering the state of the art, implementation costs, and risks to data subjects.
  • Art. 25(2) GDPR requires data protection by default — only personal data necessary for each specific purpose should be processed.
  • The 7 foundational principles (Cavoukian) provide a practical framework, but the legal obligation comes from Art. 25, not the principles themselves.
  • CNIL fined Clearview AI EUR 20 million in 2022 for systemic privacy by design failures; the ICO fined British Airways GBP 20 million for inadequate security architecture.
  • Privacy by design GDPR compliance requires action at the design stage — retrofitting is more expensive and less effective.

What Art. 25 GDPR Actually Requires

Art. 25 creates two distinct obligations:

Art. 25(1) — Data protection by design: The controller must, both at the time of determining processing means and at the time of processing itself, implement appropriate technical and organisational measures designed to implement data-protection principles (such as data minimisation) effectively. The regulation requires considering:

  • The state of the art of available technology
  • The cost of implementation
  • The nature, scope, context, and purposes of processing
  • The risks of varying likelihood and severity for the rights of natural persons

Art. 25(2) — Data protection by default: The controller must implement appropriate technical and organisational measures to ensure that, by default, only personal data necessary for each specific purpose is processed. This applies to the amount of data collected, the extent of processing, the period of storage, and accessibility. By default, personal data must not be made accessible to an indefinite number of persons without the individual’s intervention.

Art. 25(3) adds that an approved certification mechanism under Art. 42 can be used as an element to demonstrate compliance with these requirements.

The 7 Foundational Principles of Privacy by Design

Dr. Ann Cavoukian’s framework predates the GDPR but directly influenced Art. 25. These seven principles translate legal requirements into operational practice:

1. Proactive not Reactive — Preventative not Remedial

Anticipate and prevent privacy risks before they materialise. Do not wait for breaches to occur.

Practical example: Before deploying a new customer analytics platform, conduct a Data Protection Impact Assessment (DPIA) under Art. 35 to identify and mitigate risks. A retailer discovered during a pre-launch DPIA that its proposed loyalty programme would collect excessive location data — the scope was reduced before a single record was created.

2. Privacy as the Default Setting

Systems must protect privacy automatically, without requiring user action. Data collection should be minimal by default.

Practical example: A SaaS platform ships with analytics tracking disabled; users opt in to sharing usage data. Registration forms collect only the fields required for account creation — additional profile data is optional. This aligns directly with data minimisation under Art. 5(1)(c).
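The privacy-by-default pattern above can be sketched in code. This is a minimal illustration, not a prescribed implementation; the field names (`usage_analytics`, `profile_public`) are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    """Account preferences: every optional data flow defaults to 'off'."""
    usage_analytics: bool = False   # Art. 25(2): tracking requires opt-in
    marketing_emails: bool = False
    profile_public: bool = False    # not accessible to others by default

def create_account(email: str, display_name: str) -> dict:
    """Registration collects only the fields required for the account itself."""
    return {
        "email": email,
        "display_name": display_name,
        "settings": AccountSettings(),  # most protective defaults
    }

account = create_account("jane@example.com", "Jane")
assert not account["settings"].usage_analytics  # off until the user opts in
```

The point is structural: the most protective state is what the system produces when the user does nothing, which is exactly what Art. 25(2) demands.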

3. Privacy Embedded into Design

Privacy must be built into the architecture of systems and business processes, not bolted on after development.

Practical example: An HR system is designed with role-based access controls from day one — managers see only their direct reports’ data, payroll staff see only compensation data, and no single role has access to complete employee records. Data fields are encrypted at the column level in the database schema.
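A role-based access model like the one described can be sketched as a simple field allow-list per role. The roles and field names here are hypothetical, and a real system would enforce this at the database or API layer rather than in application code:

```python
# Role → fields each role may read; no single role sees the full record.
PERMITTED_FIELDS = {
    "manager": {"name", "team", "performance_review"},
    "payroll": {"name", "salary", "bank_iban"},
    "it_support": {"name", "email"},
}

def view_record(role: str, record: dict) -> dict:
    """Return only the fields the role is permitted to see (least privilege)."""
    allowed = PERMITTED_FIELDS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

employee = {
    "name": "A. Smith", "team": "Logistics", "salary": 52000,
    "bank_iban": "DE89...", "email": "a.smith@example.com",
    "performance_review": "meets expectations",
}

assert "salary" not in view_record("manager", employee)       # managers: no pay data
assert "performance_review" not in view_record("payroll", employee)
assert view_record("unknown_role", employee) == {}            # deny by default
```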

4. Full Functionality — Positive-Sum, Not Zero-Sum

Privacy and functionality are not trade-offs. Well-designed systems achieve both.

Practical example: A marketing platform implements consent-based personalisation that performs better than invasive tracking because users who opt in are genuinely engaged. Conversion rates improve while data collection decreases.

5. End-to-End Security — Full Lifecycle Protection

Data must be protected from collection through deletion. Security covers the entire data lifecycle.

Practical example: Implement encryption at rest and in transit, automated data retention enforcement that purges records after defined periods, and secure deletion procedures that overwrite rather than merely flag data as deleted. Align with the storage limitation principle.
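Automated retention enforcement can be sketched as a scheduled purge job that compares each record's age against a per-category retention schedule. The categories and periods below are illustrative, not legal advice:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention schedule per data category (storage limitation).
RETENTION = {
    "support_ticket": timedelta(days=365),
    "marketing_lead": timedelta(days=90),
}

def purge_expired(records: list[dict], now: datetime) -> list[dict]:
    """Keep only records still within their category's retention period."""
    return [
        r for r in records
        if now - r["created_at"] <= RETENTION[r["category"]]
    ]

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
records = [
    {"id": 1, "category": "marketing_lead",
     "created_at": now - timedelta(days=120)},   # past 90 days → purged
    {"id": 2, "category": "support_ticket",
     "created_at": now - timedelta(days=120)},   # within 365 days → kept
]
assert [r["id"] for r in purge_expired(records, now)] == [2]
```

In production the purge would delete or crypto-shred the underlying storage, not just filter an in-memory list.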

6. Visibility and Transparency

Organisations must be transparent about data practices. Users and regulators should be able to verify claims.

Practical example: Publish a machine-readable privacy notice. Maintain an up-to-date Record of Processing Activities (ROPA) under Art. 30. Provide a privacy dashboard where users can see exactly what data the organisation holds about them. Ensure data accuracy through regular verification.
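A machine-readable notice can be as simple as the ROPA entries serialised to JSON. The structure below is a hypothetical sketch (there is no mandated schema in the GDPR itself); the ISO 8601 duration `P3Y` denotes three years:

```python
import json

# Hypothetical Art. 30 ROPA entry serialised as a machine-readable notice.
processing_activities = [
    {
        "purpose": "order fulfilment",
        "legal_basis": "Art. 6(1)(b) (contract)",
        "data_categories": ["name", "address", "order history"],
        "retention": "P3Y",          # ISO 8601 duration: 3 years
        "recipients": ["payment processor", "courier"],
    }
]

notice = json.dumps(
    {"controller": "Example GmbH", "activities": processing_activities},
    indent=2,
)
print(notice)
```

Keeping the published notice and the internal ROPA generated from the same data source prevents the two from drifting apart.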

7. Respect for User Privacy — Keep it User-Centric

Design systems that empower individuals. Provide clear, accessible controls for data access, correction, and deletion.

Practical example: A healthcare portal lets patients download their complete medical records in machine-readable format, request corrections directly through the interface, and delete their account with a single confirmed action — all within Art. 15-17 GDPR timelines.
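The access and erasure flows described can be sketched as two small operations: a structured export (Art. 15/20) and a true deletion rather than a soft-delete flag (Art. 17). The record fields and store layout are hypothetical:

```python
import json

def export_record(record: dict) -> str:
    """Art. 15/20: return the record in a structured, machine-readable format."""
    return json.dumps(record, indent=2, default=str)

def erase_account(store: dict, patient_id: str) -> None:
    """Art. 17: remove the record entirely rather than flagging it deleted."""
    store.pop(patient_id, None)

store = {"p-001": {"name": "Jane Doe", "allergies": ["penicillin"]}}

exported = export_record(store["p-001"])   # patient downloads their data
erase_account(store, "p-001")              # single confirmed deletion

assert "p-001" not in store
assert json.loads(exported)["name"] == "Jane Doe"
```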

Art. 25 Implementation Checklist

Use this checklist when designing or reviewing any system that processes personal data:

Design Phase:

  • [ ] DPIA conducted for high-risk processing (Art. 35)
  • [ ] Data minimisation applied — only necessary fields collected
  • [ ] Legal basis documented for each data element
  • [ ] Retention periods defined and automated
  • [ ] Access controls designed with least-privilege principle
  • [ ] Encryption at rest and in transit specified

Development Phase:

  • [ ] Privacy settings default to most protective option
  • [ ] Consent mechanisms meet Art. 7 requirements (granular, withdrawable)
  • [ ] Data subject rights (access, rectification, erasure, portability) technically supported
  • [ ] Logging and audit trails implemented for compliance monitoring
  • [ ] Sub-processor data flows mapped and controlled

Deployment and Operations:

  • [ ] Privacy notice updated to reflect actual processing
  • [ ] Staff trained on data handling procedures
  • [ ] Incident response plan covers breach detection and Art. 33 notification
  • [ ] Regular reviews scheduled (at least annually)
  • [ ] Art. 30 ROPA maintained and current

Maintaining this level of documentation across multiple systems and processing activities is where manual processes break down. Legiscope automates ROPA generation and DPIA tracking, reducing what typically takes weeks of manual effort to a structured, audit-ready output.

Enforcement: Real Cases on Privacy by Design Failures

Supervisory authorities have imposed significant fines specifically for Art. 25 violations:

CNIL v Clearview AI (2022) — CNIL Deliberation SAN-2022-019, 17 October 2022, EUR 20 million fine. Clearview scraped billions of facial images from the internet without any data protection consideration in its system design. The CNIL found a complete absence of privacy by design measures: no legal basis analysis, no data minimisation, no transparency mechanism, and no means for individuals to exercise their rights. The system was designed to collect everything, from everyone, indefinitely.

ICO v British Airways (2020) — GBP 20 million fine (reduced from an initial GBP 183 million). The ICO found that BA’s systems lacked adequate security architecture — attackers were able to divert customer payment data through a modified JavaScript file. The ICO specifically noted failures in security design: inadequate monitoring, insufficient access controls, and no testing for the type of attack that occurred. These are Art. 25(1) failures — security should have been embedded from design.

CNIL v Amazon France Logistique (2024) — CNIL Deliberation SAN-2023-021, 27 December 2023, EUR 32 million fine. The CNIL found that Amazon’s warehouse employee monitoring system processed scanner data with excessive granularity and without adequate privacy by design safeguards. The system tracked individual productivity with second-level precision — far beyond what was necessary for legitimate warehouse management.

AEPD v CaixaBank (2021) — The Spanish DPA fined CaixaBank EUR 6 million for processing customer data for marketing purposes without embedding adequate consent mechanisms into the system design. Customers were opted in by default — a direct violation of Art. 25(2).

Common Mistakes in Privacy by Design Implementation

  1. Treating PbD as a one-time exercise. Art. 25(1) applies both “at the time of the determination of the means” and “at the time of the processing itself.” Systems must be reviewed and updated continuously.

  2. Confusing PbD with security. Security (Art. 32) is one component. Privacy by design also covers data minimisation, purpose limitation, transparency, and user control. A perfectly secure system that collects excessive data still violates Art. 25.

  3. No documentation. Without records of design decisions, risk assessments, and the rationale for chosen measures, organisations cannot demonstrate compliance to a supervisory authority.

  4. Ignoring Art. 25(2) defaults. Many organisations implement privacy controls but leave them turned off by default. Art. 25(2) requires the most protective setting as the default.

FAQ

What does privacy by design mean under GDPR?

Art. 25(1) GDPR requires controllers to implement appropriate technical and organisational measures — at the design stage and throughout the processing lifecycle — to embed data protection principles (minimisation, purpose limitation, accuracy, storage limitation) into systems and processes. It is a legal obligation, not a voluntary framework.

Is privacy by design mandatory under GDPR?

Yes. Art. 25 GDPR makes data protection by design (paragraph 1) and by default (paragraph 2) legally binding obligations for all controllers. Non-compliance can result in fines of up to EUR 10 million or 2% of annual worldwide turnover under Art. 83(4)(a).

What is the difference between privacy by design and privacy by default?

Privacy by design (Art. 25(1)) requires embedding data protection into system architecture from the outset. Privacy by default (Art. 25(2)) requires that the default settings of any system process only the minimum personal data necessary. A system can be designed with privacy in mind but still fail Art. 25(2) if users must manually enable privacy protections.

How do you demonstrate privacy by design compliance?

Document your design decisions and the rationale behind them. Maintain DPIAs for high-risk processing. Keep records of technical measures implemented (encryption, access controls, minimisation). Show that default settings are the most protective. An Art. 42 certification can serve as evidence of compliance. Legiscope’s automated compliance tracking generates the documentation trail that supervisory authorities expect during audits.

Conclusion

Privacy by design GDPR compliance under Art. 25 requires embedding data protection into every system that handles personal data — from initial architecture through ongoing operations. The seven foundational principles provide a practical framework, but the legal obligation is specific: consider the state of the art, the cost, the risks, and implement measures that make data protection principles effective in practice. With fines reaching EUR 32 million for design failures and regulators explicitly citing Art. 25 in enforcement actions, treating privacy by design as optional is a measurable financial risk.

Legiscope automates this for you

Stop doing compliance manually. Legiscope's AI handles ROPA creation, DPA audits, and gap analysis — in minutes, not weeks.

Start free trial
Written by
Founder of Legiscope and GDPR expert

Doctor of Law from Panthéon-Assas University (Paris II), with 23 years of experience in digital law and GDPR compliance. Former advisor to the Prime Minister's administration on GDPR implementation. Thiébaut is the founder of Legiscope, an AI-powered automated GDPR compliance platform.
