Privacy & data protection — DMDU approach

One coupling model, multiple privacy standards

ISO 27701 gives us the privacy control structure. The surrounding standards — 27557, 29100, 29134, 31700 — each add a methodological lens. GDPR and NIST Privacy Framework provide the regulatory and framework context. Here's how they connect.

27557 Privacy risk

ISO 27557 — Organisational privacy risk management

Extends ISO 31000 risk management to privacy. Our scenario discovery approach replaces the privacy risk matrix with computational exploration — the same methodological upgrade we apply to 27005 for information security.

29100 Principles

ISO 29100 — Privacy framework

Defines 11 privacy principles. Every 27701 control maps to one or more principles. Our coupling analysis reveals which principle violations cascade through the control structure.

29134 Impact assessment

ISO 29134 — Privacy impact assessment

Guidelines for conducting PIAs. Our ABM provides computational PIA — simulating privacy impact propagation across the PIMS rather than estimating it from a single perspective.

31700 Privacy by design

ISO 31700 — Privacy by design for consumer products

30 requirements for embedding privacy into product design. Maps directly to 27701's A.1.4 (privacy by design and default) controls, with product lifecycle coupling to development and deployment.

Regulation — future build

GDPR

Articles 5–43 map to 27701 controls via Annex D. Full article-level coupling analysis planned as separate linked project.

Framework — future build

NIST Privacy Framework

Identify-Govern-Control-Communicate-Protect structure. Cross-mapping to 27701 planned as separate linked project.

ISO 27557

Privacy risk management — scenario discovery replaces the risk matrix

ISO 27557 applies the ISO 31000 risk management framework to privacy. Traditional implementation involves identifying privacy threats, estimating their likelihood and impact on individuals, and treating the highest-ranked risks. The DMDU approach replaces every step:

| 27557 step | Traditional approach | DMDU approach | Tool |
|---|---|---|---|
| Context (5.3) | Define processing scope and stakeholders | Map the consent-chain, purpose-limitation, and data-lifecycle couplings that define the processing context | Privacy coupling → |
| Risk identification (5.4) | Enumerate privacy threats | Identify coupling pathways — consent withdrawal cascades, cross-border transfer chains, rights activation propagation | Privacy coupling → |
| Risk analysis (5.5) | Score likelihood × impact on individuals | Simulate privacy control degradation across scenario space — test how consent management failure cascades through purpose limitation into subject rights into breach notification | ABM simulation → |
| Risk evaluation (5.6) | Rank risks, compare to appetite | Scenario discovery (PRIM/CART) — identify which combinations of conditions produce privacy failures that affect individuals | ABM simulation → |
| Risk treatment (5.7) | Select 27701 controls | Design control configurations robust across the privacy risk scenario space — not optimised for the most likely breach scenario | Cluster ABM → |
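The risk analysis and evaluation steps can be sketched in miniature. The parameter names, the failure rule, and all thresholds below are illustrative assumptions, not values from ISO 27557 or from the actual ABM; the "threshold search" stands in for a full PRIM/CART scenario discovery:

```python
import random

random.seed(0)

# Hypothetical uncertain parameters spanning the privacy risk scenario space.
def sample_scenario():
    return {
        "consent_failure_rate": random.uniform(0.0, 0.5),
        "transfer_chain_depth": random.randint(1, 6),
        "processor_oversight": random.uniform(0.0, 1.0),
    }

def privacy_failure(s):
    # Toy coupling rule: exposure compounds across consent, transfer,
    # and oversight pathways rather than failing independently.
    exposure = s["consent_failure_rate"] * s["transfer_chain_depth"]
    return exposure * (1.0 - s["processor_oversight"]) > 0.4

# Explore the scenario space computationally (replaces the risk matrix).
scenarios = [sample_scenario() for _ in range(5000)]
failures = [s for s in scenarios if privacy_failure(s)]

# Crude PRIM-style box: for one parameter, find the cut-off above which
# the failure density inside the box is highest.
def best_threshold(param, candidates):
    def density(t):
        box = [s for s in scenarios if s[param] >= t]
        return sum(privacy_failure(s) for s in box) / max(len(box), 1)
    return max(candidates, key=density)

t = best_threshold("consent_failure_rate", [i / 20 for i in range(10)])
print(f"failure share: {len(failures) / len(scenarios):.2f}, "
      f"consent_failure_rate threshold >= {t:.2f}")
```

A real implementation would use a scenario discovery library (e.g. PRIM in the EMA Workbench) over all parameters jointly; the point is the shape of the workflow: sample, classify, then search for the conditions that separate failure from success.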
Traditional 27557

Estimate → rank → treat top risks

Lists known privacy threats, assigns subjective probabilities, ranks by expected harm to individuals. Misses compound failures: what happens when consent management, cross-border transfers, and processor oversight all degrade simultaneously.

DMDU 27557

Map → explore → discover failure conditions

Maps privacy coupling structure, explores computationally, discovers which parameter combinations cause privacy protection to fail. Finds the "consent withdrawal rate × processor sub-contracting depth × jurisdiction complexity" thresholds.

ISO 29100

Privacy principles — the objectives our controls must achieve

ISO 29100 defines 11 privacy principles. These are not controls — they're the objectives that 27701's controls are designed to achieve. Our coupling analysis reveals which controls serve which principles, and critically, which principle violations cascade through the control structure.

Consent & choice

Informed, freely given consent with easy withdrawal

A.1.2.3 · A.1.2.4 · A.1.3.4

Purpose legitimacy

Processing only for specified, legitimate purposes

A.1.2.1 · A.1.2.2 · A.2.2.2

Collection limitation

Collect only what is necessary for the purpose

A.1.4.1 · A.1.4.4

Data minimisation

Process the minimum PII necessary

A.1.4.2 · A.1.4.4 · A.1.4.5

Use limitation

Limit processing to stated purposes

A.1.2.1 · A.1.4.2 · A.2.2.2

Accuracy & quality

PII must be accurate, complete, up-to-date

A.1.4.3 · A.1.3.7

Openness & transparency

Clear information about processing practices

A.1.3.1 · A.1.3.2 · A.1.3.3

Individual participation

Rights to access, correct, delete PII

A.1.3.5 · A.1.3.6 · A.1.3.7

Accountability

Demonstrate compliance with policies and obligations

A.1.2.5 · A.1.2.6 · A.1.5.3

Information security

Protect PII with appropriate safeguards

27001 Annex A (full)

Privacy compliance

Comply with applicable privacy legislation

A.1.2.2 · A.1.5.1 · A.1.5.5
The coupling analysis shows that "accountability" is the most connected principle — it depends on outputs from nearly every other principle. When any principle degrades, accountability evidence becomes incomplete.

The DMDU insight: principles don't fail independently. A degradation in consent & choice (A.1.2.3 fails → consent records incomplete) cascades through purpose legitimacy (processing may no longer have valid basis) into accountability (can't demonstrate compliance) and individual participation (can't respond to subject access requests accurately). The coupling model captures this cascade structure; the ABM simulates how fast it propagates.
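The cascade described above is a reachability question on the coupling graph. A minimal sketch, with edges taken from the cascade narrative in the text (the edge set is illustrative, not the full coupling model):

```python
# Directed coupling edges between 29100 principles: A -> B means
# degradation of A feeds into B. Drawn from the consent-and-choice
# cascade described in the text; the real model has more edges.
couplings = {
    "consent_choice": ["purpose_legitimacy", "individual_participation"],
    "purpose_legitimacy": ["accountability"],
    "individual_participation": ["accountability"],
    "accountability": [],
}

def cascade(start, graph):
    """Return every principle reachable from an initial degradation."""
    seen, frontier = {start}, [start]
    while frontier:
        node = frontier.pop()
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

affected = cascade("consent_choice", couplings)
print(sorted(affected))
# → ['accountability', 'consent_choice', 'individual_participation',
#    'purpose_legitimacy']
```

The ABM adds timing and magnitude on top of this reachability structure: not just which principles a degradation can reach, but how fast and how severely.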

ISO 29134

Privacy impact assessment — computational PIA via ABM

ISO 29134 provides guidelines for privacy impact assessments. The traditional PIA is a document-based exercise: describe the processing, identify privacy risks, assess their impact on individuals, and recommend mitigations. Our ABM extends this into a computational PIA.

| 29134 phase | Traditional PIA | Computational PIA (DMDU) |
|---|---|---|
| Threshold analysis | Determine if a PIA is needed (based on data types, scale, novelty) | Map the control's coupling connectivity — high-connectivity processing operations need deeper assessment |
| Identify privacy risks | Workshop-based threat brainstorming | Trace the coupling pathways from the processing activity — consent chains, data lifecycle dependencies, transfer risks |
| Assess impact on individuals | Qualitative assessment (low/medium/high) | Simulate privacy control degradation: how does failure in this processing activity cascade through subject rights, notification obligations, and accountability chains? |
| Identify mitigations | Select additional controls from 27701 | Design mitigations robust across the scenario space — test them in the ABM before implementation |
| Document and review | Static document, periodic review | Living model — re-run with updated parameters as processing activities evolve |

The key advantage of computational PIA: you can test mitigations before implementing them. Toggle a control on in the ABM, re-run the scenario sweep, and see whether the failure conditions shift. A traditional PIA tells you a mitigation is recommended; a computational PIA tells you whether it actually works across the uncertainty space.
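The toggle-and-re-run loop can be sketched as follows. The parameters, the failure rule, and the mitigation's effect (a cap on consent error) are all illustrative placeholders for the real ABM:

```python
import random

random.seed(1)

# Toy computational-PIA sweep: count privacy failures across sampled
# scenarios, with and without a candidate mitigation enabled.
def run_sweep(mitigation_on, n=2000):
    failures = 0
    for _ in range(n):
        consent_error = random.uniform(0.0, 0.3)
        rights_backlog = random.uniform(0.0, 1.0)
        # Hypothetical mitigation (e.g. automated consent refresh)
        # caps the consent error rate.
        if mitigation_on:
            consent_error = min(consent_error, 0.1)
        if consent_error * (1 + rights_backlog) > 0.25:
            failures += 1
    return failures / n

before = run_sweep(mitigation_on=False)
after = run_sweep(mitigation_on=True)
print(f"failure rate before: {before:.2f}, after: {after:.2f}")
```

The comparison of `before` and `after` across the whole sweep, rather than at a single "most likely" scenario, is what distinguishes this from a traditional PIA judgement.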

ISO 31700

Privacy by design — from principles to testable architectures

ISO 31700 defines 30 requirements for embedding privacy into consumer product design. These map directly onto 27701's privacy by design controls (A.1.4.x) but extend them with product lifecycle considerations.

| 31700 domain | 27701 controls | DMDU application |
|---|---|---|
| Design (Cl. 4–6) | A.1.4.1 (limit collection), A.1.4.2 (limit processing), A.1.4.4 (minimisation objectives) | Test minimisation configurations in the ABM — does reducing collection actually reduce downstream risk, or does it create data quality issues that cascade into accuracy failures? |
| Production & release (Cl. 7) | A.1.2.5 (PIA), A.1.4.5 (de-identification at end of processing) | Pre-deployment simulation: run the ABM with the product's specific data flows and processing activities to identify privacy failure modes before launch |
| Operation (Cl. 8–14) | A.1.3.x (obligations to principals), A.1.2.3/A.1.2.4 (consent), A.1.5.x (transfers) | Operational monitoring: track ABM early warning indicators (consent withdrawal rate, subject rights volume, processor compliance scores) during live operation |
| End of life (Cl. 15) | A.1.3.9 (retention and disposal), A.1.4.5 (de-identification and deletion) | End-of-life simulation: test whether disposal procedures actually achieve complete erasure across all data stores, backups, processor systems, and cached copies |

31700 is specifically aimed at consumer-facing products. The DMDU approach is particularly valuable here because consumer products operate at scale — a privacy design flaw affects millions of individuals simultaneously. The ABM can model this scale effect: how a single consent mechanism failure at 10% of users propagates through purpose limitation, data quality, and subject rights at population scale.
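The population-scale propagation can be sketched as a simple per-user cascade. The population size, the 10% failure share from the text, and the per-stage propagation probabilities are illustrative assumptions, not calibrated model parameters:

```python
import random

random.seed(42)

N_USERS = 100_000        # consumer-scale population (illustrative)
FAILURE_SHARE = 0.10     # share of users hit by the consent flaw
CASCADE_P = {            # hypothetical per-user propagation odds
    "purpose_limitation": 0.8,  # processing continues without valid basis
    "subject_rights": 0.5,      # SAR responses become inaccurate
    "data_quality": 0.3,        # stale preference data
}

affected = int(N_USERS * FAILURE_SHARE)
downstream = {stage: sum(random.random() < p for _ in range(affected))
              for stage, p in CASCADE_P.items()}

for stage, count in downstream.items():
    print(f"{stage}: ~{count} of {affected} affected users")
```

Even this toy version shows the scale effect the text describes: a single mechanism failure touching 10% of users turns into tens of thousands of downstream principle violations per stage.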

Integration

One model, multiple privacy standards

Like the 27000 family, these privacy standards aren't separate systems — they're different lenses on the same PIMS. Our coupling model captures the privacy control structure once; each standard provides a specific analytical perspective:

| Standard | What it adds | Primary tool | Key coupling types |
|---|---|---|---|
| ISO 27701 | The base — 47 privacy controls, 9 coupling types, controller/processor distinction | Privacy coupling → | All 9 types |
| ISO 27557 | Privacy risk methodology → scenario discovery on the same model | ABM simulation → | breach-cascade, rights-cascade, transfer-risk |
| ISO 29100 | Principle compliance → principle-to-control mapping + cascade analysis | Privacy coupling → | consent-chain, purpose-limitation, accountability |
| ISO 29134 | PIA methodology → computational PIA via ABM scenario testing | ABM simulation → | data-lifecycle, rights-cascade, breach-cascade |
| ISO 31700 | Product lifecycle PbD → pre-deployment simulation + operational monitoring | Privacy coupling → | purpose-limitation, data-lifecycle, consent-chain |
Future build-outs

GDPR & NIST Privacy Framework

These are planned as separate linked projects with full control-level coupling analysis:

GDPR — Article-level coupling analysis

Full mapping of GDPR Articles 5–43 to 27701 controls with coupling analysis. Article 5 principles map to 29100 principles (already covered). Articles 12–23 (data subject rights) map to A.1.3.x controls. Articles 24–31 (controller/processor obligations) map to A.1.2.x and A.2.2.x controls. Article 35 (DPIA) maps to A.1.2.5. Articles 44–49 (international transfers) map to A.1.5.x and A.2.5.x controls.

Scope: ~40 article provisions → 27701 control mappings with coupling weights · Status: Planned

NIST Privacy Framework — Cross-mapping

Identify-P (inventory, mapping, risk assessment) → A.1.2.1, A.1.2.5, 27557. Govern-P (policies, risk strategy, awareness) → A.1.2.2, A.1.3.1, A.1.3.2. Control-P (data processing management, disassociated processing) → A.1.4.x. Communicate-P (communication policies, data processing awareness) → A.1.3.x, A.1.5.x. Protect-P (data protection, identity management, security) → 27001 Annex A (shared security controls).

Scope: 5 functions, ~18 categories, ~100 subcategories → 27701 + 27001 mappings · Status: Planned

Explore the privacy tools

Start with the privacy coupling discovery to see how the 47 controls interact, or explore the 27001 base layer that underpins every privacy control.

Privacy coupling → 27001 coupling → 27000 family →