Data Privacy

GDPR Article 32: Security of Processing

GDPR Article 32 requires appropriate technical and organizational security measures: encryption, pseudonymization, integrity, availability, regular testing.

In one sentence: GDPR Article 32 requires controllers and processors to implement appropriate technical and organisational measures (TOMs) to ensure a level of security appropriate to the risk, including (a) pseudonymisation and encryption, (b) ongoing confidentiality, integrity, availability and resilience, (c) the ability to restore data after an incident, and (d) regular testing and evaluation. The bar is risk-proportionate (no fixed checklist), but the EDPB and national DPAs have published progressively detailed guidance that makes the practical floor measurable.

Article 32 is the most frequently invoked GDPR provision in breach notifications. When a breach happens, the first question is always: were the security measures appropriate? If yes, the breach is misfortune. If no, it is negligence, and grounds for a separate sanction. Most major GDPR breach fines (British Airways, Marriott) cite Article 32 inadequacy alongside the breach itself.

For the breach notification obligations themselves, see Article 33 GDPR. For related obligations, see Article 25 (privacy by design) and Article 35 (DPIA).

Key takeaways

  • Article 32(1) lists four illustrative measures: pseudonymisation/encryption, ongoing CIA + resilience, restoration capability, regular testing.
  • The bar is appropriate to the risk — risk-proportionate, not “best possible.”
  • Article 32(2) requires considering the risks presented by accidental or unlawful destruction, loss, alteration, unauthorized disclosure, or access.
  • Article 32(4) requires controllers to ensure that anyone with access processes data only on instructions.
  • ISO 27001/27701 + ENISA recommendations provide the practical floor.

1. Article 32 text — the four illustrative measures

Article 32(1) lists the following technical and organisational measures as appropriate:

  • (a) Pseudonymisation and encryption of personal data
  • (b) Ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services
  • (c) Ability to restore the availability and access to personal data in a timely manner in the event of a physical or technical incident
  • (d) A process for regularly testing, assessing and evaluating the effectiveness of TOMs

The four are illustrative — not exhaustive. Other measures may be appropriate depending on context.

2. Risk-proportionate — what it means

The level of security must be appropriate to:

  • State of the art
  • Cost of implementation
  • Nature, scope, context, purposes of processing
  • Risk of varying likelihood and severity for the rights and freedoms of natural persons (Article 32(2))

Translation:

  • Processing 50 customer emails for a small e-commerce → basic measures (HTTPS, MFA on admin, encrypted backups)
  • Processing 10M records of health data for a hospital → enterprise-grade (encryption with HSM keys, ISO 27001 certified, 24/7 SOC, annual penetration testing)

The Article 32 violation isn’t “you didn’t have an HSM” — it’s “you didn’t have the controls appropriate to your risk.”
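
The small-business baseline above starts with enforcing modern TLS. A minimal stdlib sketch of a client-side context that refuses anything below TLS 1.2 (the function name is illustrative, not from any cited guidance):

```python
import ssl

def baseline_client_context() -> ssl.SSLContext:
    """Client TLS context meeting the common DPA floor:
    certificate verification on, TLS 1.2 as the minimum version."""
    ctx = ssl.create_default_context()  # enables cert + hostname checks
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx
```

Server-side, the equivalent is setting the same `minimum_version` on the context passed to the listening socket, and disabling legacy cipher suites in the web server configuration.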

3. Practical floor: the consensus baseline

For any organization processing personal data, regardless of size, the practical floor accepted by EU DPAs is:

| Domain | Minimum measure |
| --- | --- |
| Encryption in transit | TLS 1.2 minimum (TLS 1.3 preferred) |
| Encryption at rest | AES-256 for sensitive data |
| Authentication | MFA mandatory for admin and remote access |
| Access control | Role-based, least privilege, periodic review |
| Logging | Access to sensitive data logged, logs retained ≥1 year |
| Patch management | Security patches applied within reasonable SLA |
| Backups | Tested restoration, retained ≥30 days, isolated |
| Incident response | Documented playbook, breach notification within 72h |
| Vendor security | DPA + due diligence (ISO 27001, SOC 2) |
| Awareness | Annual security training documented |

Below this floor → assume Article 32 violation.
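
The MFA line of the floor is most commonly met with TOTP one-time codes (RFC 6238). A stdlib-only sketch of the code-generation side, for illustration only; a production system should use a vetted authentication library rather than this hand-rolled version:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, for_time=None, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the 30-second time-step counter,
    dynamically truncated to the requested number of digits."""
    if for_time is None:
        for_time = int(time.time())
    counter = struct.pack(">Q", for_time // step)       # 8-byte big-endian counter
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                          # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

The verifier recomputes the code for the current time step (and usually one step either side, to tolerate clock drift) and compares with `hmac.compare_digest`.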

4. Encryption — the most-cited measure

Article 32(1)(a) names encryption explicitly. The CNIL and EDPB consider encryption “state of the art” — failure to encrypt sensitive data when the technology is widely available is generally treated as inadequate security.

| Data class | Encryption expectation |
| --- | --- |
| Authentication credentials | Hashed (bcrypt, argon2), not encrypted |
| Special category data (health, biometric) | Encryption at rest mandatory |
| Financial / payment data | Encryption at rest mandatory + PCI DSS |
| General personal data | Encryption at rest strongly expected |
| Backups | Encryption at rest mandatory |
| Data in transit | TLS 1.2+ universal |
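
On the credentials row: bcrypt and argon2 require third-party libraries in Python, so as a hedged stdlib-only sketch, the same salted, slow-hash principle with PBKDF2-HMAC-SHA256 (the storage format and the 600,000-iteration figure are illustrative assumptions, roughly in line with current OWASP guidance):

```python
import hashlib
import hmac
import secrets

def hash_password(password: str, iterations: int = 600_000) -> str:
    """Salted slow hash; store the result, never the password itself."""
    salt = secrets.token_bytes(16)
    dk = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return f"pbkdf2_sha256${iterations}${salt.hex()}${dk.hex()}"

def verify_password(password: str, stored: str) -> bool:
    """Recompute with the stored salt and compare in constant time."""
    _algo, iters, salt_hex, dk_hex = stored.split("$")
    dk = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), bytes.fromhex(salt_hex), int(iters)
    )
    return hmac.compare_digest(dk.hex(), dk_hex)
```

The point the table makes is that credentials must be unrecoverable even by the controller: a slow one-way hash, not reversible encryption.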

5. Pseudonymisation — the underused measure

Pseudonymisation (Article 4(5)) means processing data so it can no longer be attributed to a specific person without additional information held separately. It’s not anonymisation (which is irreversible) but it significantly reduces risk.

Common patterns:

  • Identifiers separated from observations (user_id table + activity table with only user_id)
  • Tokenization for credit card numbers (Stripe pattern)
  • Salt + hash for analytics aggregates
  • Differential privacy for published statistics
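
The keyed-token pattern behind the first two bullets can be sketched with a deterministic HMAC pseudonym, where the key is the "additional information held separately" of Article 4(5). `PSEUDO_KEY` and the 16-character truncation are illustrative assumptions, not a recommended parameter set:

```python
import hashlib
import hmac

# The pseudonymisation key must live apart from the pseudonymised
# dataset (e.g. in a KMS), otherwise the reduction in risk is lost.
PSEUDO_KEY = b"illustrative-key-store-in-a-kms"

def pseudonymise(identifier: str) -> str:
    """Deterministic keyed pseudonym: same input yields the same token,
    so records can be joined, but the mapping cannot be reversed
    without the key."""
    mac = hmac.new(PSEUDO_KEY, identifier.encode(), hashlib.sha256)
    return mac.hexdigest()[:16]
```

An analytics table then stores `pseudonymise(email)` instead of the email; only the system holding `PSEUDO_KEY` can re-link tokens to people.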

For a deeper dive, see the pseudonymisation guide (FR).

6. Regular testing (Article 32(1)(d))

Article 32(1)(d) requires the ability to verify TOM effectiveness over time. Practical implementations:

  • Penetration testing: annual minimum, more for high-risk
  • Vulnerability scanning: continuous (Nessus, Snyk, etc.)
  • Access review: quarterly minimum
  • Backup restoration test: monthly
  • Incident response drill: annual table-top
  • Internal audit: annual
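
The monthly backup restoration test lends itself to automation. A minimal sketch assuming a simple file-level backup (function names and the hash-comparison approach are illustrative, not a prescribed method):

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256_file(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup_and_verify(source: Path, backup_dir: Path) -> bool:
    """Copy source into backup_dir, 'restore' it to a scratch location,
    and confirm the restored copy is byte-identical to the original.
    A failing return is evidence the backup cannot actually be restored."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    backup = backup_dir / source.name
    shutil.copy2(source, backup)
    with tempfile.TemporaryDirectory() as scratch:
        restored = Path(scratch) / source.name
        shutil.copy2(backup, restored)
        return sha256_file(restored) == sha256_file(source)
```

Logging each run's result (date, dataset, pass/fail) is what turns the test into demonstrable Article 32(1)(d) evidence.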

The CNIL has specifically sanctioned controllers who couldn’t demonstrate regular testing — one-time setup isn’t enough.

7. Article 32(4) — instructions to staff

Article 32(4) requires the controller to ensure that any natural person acting under their authority who has access to personal data processes them only on instructions from the controller.

Translation:

  • Charter of usage signed by employees
  • Documented data handling procedures
  • Training on what’s allowed
  • Sanctions for unauthorized processing

Internal employee misuse (e.g., looking up celebrity records) is a frequent breach trigger; documented Article 32(4) compliance (signed charters, training records, explicit instructions) helps demonstrate that the controller's organisational measures were appropriate and shifts responsibility toward the individual who acted outside instructions.

8. ISO 27001 and ISO 27701 alignment

  • ISO 27001: information security management — closely aligned with Article 32
  • ISO 27701: privacy information management — extension of 27001 covering GDPR specifically
  • Certification doesn’t automatically prove Article 32 compliance but is strong evidence in audits

For the gap between ISO 27001 and full GDPR, see ISO 27001 vs GDPR compliance.

9. Sanctions for Article 32 violations

| Year | Sanction | Article 32 inadequacy |
| --- | --- | --- |
| 2020 | British Airways (ICO), £20M (reduced from £183M) | No multi-factor authentication, weak network segmentation |
| 2020 | Marriott (ICO), £18.4M | Inadequate due diligence on acquired Starwood IT |
| 2022 | Hôpital de Bourges (CNIL), €60K | Health data without adequate encryption |
| 2023 | Optical Center (CNIL), €250K | Unencrypted database accessible from the internet |
| 2024 | Multiple SaaS providers (CNIL), €50K–€500K | Various: no MFA, no access logs, no patching |

Article 83(4)(a) places Article 32 violations at the lower fine tier — up to €10M or 2% of global annual turnover. But Article 32 violations frequently combine with breach notification (Article 33) and other failures, escalating total exposure.

10. Implementation checklist

  • ☐ Risk assessment documented per processing activity
  • ☐ TOMs documented in ROPA Annex
  • ☐ Encryption: at rest (AES-256), in transit (TLS 1.2+), in backups
  • ☐ MFA mandatory for admin + sensitive accounts
  • ☐ Access logs for sensitive data, retained ≥1 year
  • ☐ Patch management SLA documented and met
  • ☐ Backup with tested restoration
  • ☐ Incident response playbook + 72h notification process
  • ☐ Annual penetration test conducted
  • ☐ Quarterly access review
  • ☐ Annual security training for all staff
  • ☐ Charter of usage signed by employees (Article 32(4))
  • ☐ Vendor DPA + due diligence (ISO 27001 / SOC 2)

11. Tooling

Legiscope maintains the TOMs documentation alongside the ROPA, alerts on missing measures per processing activity, and integrates with vendor security questionnaires.

For related deep-dives: Article 33 RGPD breach notification, Article 25 privacy by design, Article 35 DPIA, ISO 27001 vs GDPR.

Conclusion

Article 32 is risk-proportionate but not optional. The practical floor (TLS, MFA, encryption, logs, backups, testing) is achievable for any organization. Article 32 sanctions typically arrive after a breach exposes inadequate security; building proportionate measures beforehand is significantly cheaper than remediating afterwards.

FAQ

What does GDPR Article 32 require?

Appropriate technical and organisational measures to ensure security of processing — proportionate to the risk. Illustrative measures include pseudonymisation, encryption, ongoing confidentiality/integrity/availability/resilience, restoration capability, and regular testing.

Is encryption mandatory under GDPR?

Article 32(1)(a) lists encryption as an example of an appropriate measure. While not strictly mandatory, the EDPB considers encryption state of the art — failing to encrypt sensitive data when the technology is widely available is generally treated as inadequate security and sanctioned accordingly.

What’s the practical security floor for a small business?

TLS 1.2+ for all traffic, MFA on admin accounts, encrypted backups, access logging, documented incident response, annual security training, vendor due diligence with signed DPAs. Below this floor, Article 32 inadequacy is likely on first inspection.

How often must I test security measures?

Article 32(1)(d) requires “regular” testing — typically annual penetration testing, monthly backup restoration tests, quarterly access reviews, continuous vulnerability scanning. The exact cadence depends on risk.

Does ISO 27001 certification satisfy Article 32?

It’s strong evidence but not automatic compliance. ISO 27001 covers many Article 32 controls. ISO 27701 extends it for privacy specifically. The DPA still assesses actual implementation, especially in breach contexts.

Legiscope automates this for you

Stop doing compliance manually. Legiscope's AI handles ROPA creation, DPA audits, and gap analysis — in minutes, not weeks.

Written by
Founder of Legiscope and GDPR expert

Doctor of Law from Panthéon-Assas University (Paris II), with 23 years of experience in digital law and GDPR compliance. Former adviser to the Prime Minister's administration on GDPR implementation. Thiébaut is the founder of Legiscope, an AI-powered GDPR compliance automation platform.
