April 27, 2026
Approx 9 min read

DPIA: The UK Guide to Data Protection Impact Assessments

A data protection impact assessment is the document that establishes, under Article 35 of the UK GDPR, whether a new processing activity is likely to put individuals at high risk and how that risk will be brought down before it goes live. In 2024 the Information Commissioner's Office issued enforcement notices and warnings against the Home Office and Serco Leisure for DPIA failures, with the regulator's criticism falling into two camps: either no DPIA existed, or the assessment had been run without real rigour. A reprimand against Chelmer Valley High School followed in July of that year.

This guide covers what a DPIA is under UK law in 2026, when one is required, and how to run one so it holds up to ICO scrutiny. It is written for founders and product leaders in regulated SMEs. Worked examples are drawn from healthtech and fintech, with defence supplier guidance where the ICO's high-risk criteria bite hardest.

What is a DPIA under UK GDPR?

A data protection impact assessment is the process by which a controller identifies and mitigates data protection risks before a new processing activity begins. The obligation sits in Article 35 of the UK GDPR, which operates alongside the Data Protection Act 2018 and has been amended by the Data Use and Access Act 2025 (source: legislation.gov.uk, eur/2016/679/article/35).

Article 35(1) sets the threshold. A DPIA is required "prior to the processing" where processing "is likely to result in a high risk to the rights and freedoms of natural persons," particularly where new technologies are involved. The test is forward-looking, completed before personal data starts moving.

Article 35(11) requires the controller to keep the DPIA under review "at least when there is a change of the risk represented by processing operations." Treat the DPIA as a living document that gets reviewed when processing or risk changes. A static PDF signed off two years ago and never reopened will not satisfy the regulator.

The UK regime tracks the EU text closely, with one carve-out: Article 35(6) on cross-border consistency does not apply in the UK.

When is a DPIA required in the UK?

Three categories of processing automatically trigger a DPIA under Article 35(3):

  1. Systematic and extensive automated evaluation, including profiling, that produces legal or similarly significant effects on individuals.
  2. Large-scale processing of special category data under Article 9, or criminal convictions data under Article 10.
  3. Systematic monitoring of publicly accessible areas on a large scale.

Beyond those three, the ICO maintains a list of high-risk processing types under Article 35(4). The combination of any Article 35(3) trigger with any of the ICO criteria puts the DPIA obligation beyond doubt. Even without an Article 35(3) trigger, two or more of the ICO criteria will usually require a DPIA.

A frequent source of confusion is the relationship between a DPIA and a legitimate interests assessment. An LIA is required where Article 6(1)(f) is the lawful basis. A DPIA is triggered by the risk profile of the processing, regardless of lawful basis. A high-risk Article 6(1)(f) activity needs both.

Transfer impact assessments work similarly. A TIA is required for restricted international transfers under Chapter V. Its output feeds the risk section of the DPIA and sits alongside it.

The ICO's 10 criteria for high-risk processing

The ICO's list, published under Article 35(4) and based on the European Data Protection Board's endorsement of the WP29 Guidelines (WP248), sets out ten indicators of high risk (source: ico.org.uk, examples of processing likely to result in high risk):

  1. Innovative technology, including AI and machine learning
  2. Denial of service, where automated decisions control access to a service or benefit
  3. Large-scale profiling
  4. Biometrics used to uniquely identify an individual
  5. Genetic data
  6. Data matching across sources
  7. Invisible processing, where individuals are unaware their data is being used
  8. Tracking of geolocation or behaviour
  9. Targeting of children or other vulnerable individuals
  10. Risk of physical harm, including where a breach could enable violence or stalking

Two or more of these indicators generally point to high risk. A new healthtech product running an AI triage model against patient symptoms is likely to hit three of them before the developer has written a single line of processing documentation.
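
For teams screening new features through an intake form or ticket template, the decision rule above is simple enough to encode directly. The sketch below is illustrative only: the flag names are our own shorthand, not ICO terminology, and the thresholds follow the guidance described above (any Article 35(3) trigger, or two or more ICO criteria).

```python
# Minimal DPIA screening sketch. Flag names are illustrative, not ICO terminology.
# Rule: any Article 35(3) trigger requires a DPIA; otherwise two or more
# ICO high-risk criteria generally mean a DPIA is required.

ARTICLE_35_3_TRIGGERS = {
    "automated_evaluation_with_significant_effects",  # Art. 35(3)(a)
    "large_scale_special_category_or_criminal_data",  # Art. 35(3)(b)
    "large_scale_public_monitoring",                  # Art. 35(3)(c)
}

ICO_HIGH_RISK_CRITERIA = {
    "innovative_technology", "denial_of_service", "large_scale_profiling",
    "biometrics", "genetic_data", "data_matching", "invisible_processing",
    "tracking", "children_or_vulnerable", "risk_of_physical_harm",
}

def dpia_required(flags: set[str]) -> tuple[bool, str]:
    """Return (required, reason) for a set of screening flags."""
    triggers = flags & ARTICLE_35_3_TRIGGERS
    criteria = flags & ICO_HIGH_RISK_CRITERIA
    if triggers:
        return True, f"Article 35(3) trigger(s): {', '.join(sorted(triggers))}"
    if len(criteria) >= 2:
        return True, f"{len(criteria)} ICO high-risk criteria met"
    return False, "Below screening threshold; record the decision and rationale"

# Example: an AI symptom-checker handling patient data at scale
print(dpia_required({
    "large_scale_special_category_or_criminal_data",
    "innovative_technology",
    "invisible_processing",
}))
```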

The four statutory elements a DPIA must cover

Article 35(7) sets a hard minimum. Every DPIA must contain four things:

  1. A systematic description of the envisaged processing operations and the purposes of the processing.
  2. An assessment of the necessity and proportionality of the processing in relation to those purposes.
  3. An assessment of the risks to the rights and freedoms of the data subjects.
  4. The measures envisaged to address those risks, including safeguards, security measures and mechanisms to ensure the protection of personal data.

These four elements are the legal floor. A document presented to the ICO that misses any of them sits outside the Article 35(7) definition, and enforcement at that point tends to follow as a matter of process.
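
Where DPIA records live in a structured repository rather than a Word document, the four elements can be treated as required fields so that an incomplete draft is caught before sign-off. A minimal sketch, using field names of our own invention rather than anything prescribed by Article 35(7):

```python
from dataclasses import dataclass, fields

@dataclass
class DpiaRecord:
    """The four Article 35(7) minimum elements, modelled as required fields."""
    processing_description: str      # systematic description and purposes
    necessity_proportionality: str   # assessment against those purposes
    risk_assessment: str             # risks to data subjects' rights and freedoms
    mitigation_measures: str         # safeguards, security measures, residual risk

def missing_elements(record: DpiaRecord) -> list[str]:
    """Return the names of any statutory element left empty."""
    return [f.name for f in fields(record) if not getattr(record, f.name).strip()]

draft = DpiaRecord(
    processing_description="AI triage of patient symptoms ...",
    necessity_proportionality="",
    risk_assessment="Misdiagnosis risk, re-identification risk ...",
    mitigation_measures="Human review of all triage outputs ...",
)
print(missing_elements(draft))  # ['necessity_proportionality'] -> block sign-off
```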

The statutory minimum sits separately from the ICO's practical nine-step process covered next, which sets out how to build a DPIA that satisfies those four elements in practice.

How to run a DPIA: the ICO's nine-step process

The ICO publishes a nine-step process for running a DPIA in practice. It reflects the statutory minimum and adds the operational steps regulators expect to see evidenced:

  1. Identify the need for a DPIA. Screen the activity against Article 35(3) and the ICO criteria.
  2. Describe the processing. Data flows, categories of data, categories of data subjects, retention, recipients.
  3. Consider consultation. The DPO if one is in place, data subjects or their representatives where proportionate, and processors where they hold material information.
  4. Assess necessity and proportionality. Lawful basis, purpose limitation, data minimisation, transparency and rights.
  5. Identify and assess risks. Likelihood and severity of impact on data subjects.
  6. Identify measures to address the risks. Technical and organisational controls, safeguards, and residual risk commentary.
  7. Sign off and record outcomes. Formal approval by the accountable decision-maker.
  8. Integrate outcomes into the project. Controls actually implemented, carried through beyond the document.
  9. Keep under review. Triggered by change in processing, risk profile or external guidance.

The ICO publishes a Word template that maps cleanly onto these steps (source: ico.org.uk, DPIA template). Using it does not make the assessment correct, but it reduces the risk of missing one of the statutory elements.

If residual risk remains high after mitigation, Article 36 requires prior consultation with the ICO before processing begins. The regulator has eight weeks to respond, extendable by six weeks for complex cases (source: legislation.gov.uk, eur/2016/679/article/36). That timeline has to sit inside product and procurement plans from the outset.
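
That eight-plus-six week window translates into a simple worst-case calculation when planning a go-live date. A quick illustration with made-up dates (only the week counts come from Article 36):

```python
from datetime import date, timedelta

# Article 36: the ICO has 8 weeks to respond to a prior consultation,
# extendable by a further 6 weeks for complex processing.
submission = date(2026, 5, 1)                        # illustrative submission date
standard_deadline = submission + timedelta(weeks=8)
worst_case = submission + timedelta(weeks=8 + 6)

print(standard_deadline)  # 2026-06-26
print(worst_case)         # 2026-08-07
```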

DPIA triggers in regulated sectors

The Article 35 test is sector-neutral in theory. In practice, different sectors trip different triggers.

Healthtech and NHS suppliers. Health data is special category under Article 9(1). Any large-scale processing triggers Article 35(3)(b) automatically. The NHS Data Security and Protection Toolkit v8, with a 30 June 2026 deadline for compliance, requires documented DPIAs within the toolkit evidence. The Digital Technology Assessment Criteria v2, in force since 6 April 2026, requires DPIA evidence in the data protection section. A symptom-checker app using AI triage to inform a care pathway hits special category data and innovative technology in a single product feature.

Fintech. Automated credit decisions, fraud scoring and behavioural underwriting trigger Article 35(3)(a). Consumer Duty reinforces the expectation of DPIA-backed evidence that outcomes are fair. Onboarding flows combining KYC biometrics, liveness checks and fraud profiling typically hit the biometrics, automated evaluation and innovative technology criteria at once. The FCA's February 2025 research note on AI in credit decisions frames explainability as a supervisory concern.

Defence suppliers. MOD Secure by Design applies to any capability handling defence data. DPIA workbooks sit inside the Secure by Design pathway for activities involving personnel or personal data (source: digital.mod.uk, secure by design). Biometric access control, supply chain personnel tracking and clearance management typically hit Article 35(3)(a) or (c), along with the ICO biometrics and tracking criteria.

The practical point across all three sectors is that if you are selling into regulated buyers, your DPIA becomes part of the sales evidence pack.

AI and DPIAs: what the ICO expects in 2026

The ICO's AI guidance states plainly that AI, machine learning and deep learning are examples of innovative technology that may require a DPIA (source: ico.org.uk, guidance on AI and data protection). Any product where outputs influence decisions about individuals should assume a DPIA will be required and work backwards from there.

The ICO's AI DPIA expectations extend the standard risk section into specific areas: provenance of training data, statistical accuracy, bias and discrimination, explainability, human oversight and security of the AI system itself. ISO 42001 for AI management systems is complementary. It provides governance infrastructure for the AI system, while the DPIA assesses the data protection impact of using it.

The Data Use and Access Act 2025 has shifted the edges of this in 2026. The Act received Royal Assent on 19 June 2025, with key commencement on 5 February 2026 via SI 2026/82. The Article 22-style prohibition on solely automated decisions now applies only where decisions rest entirely or partly on special category data. The scientific research definition has broadened, and recognised legitimate interests has been introduced as a narrow lawful basis. Article 35 is not repealed. DPIAs remain mandatory. ICO guidance is under review pending further DUAA updates, so 2026 DPIA work should be checked against live ICO guidance before sign-off.

Penalties for missing a DPIA: recent ICO enforcement

The ceiling for a UK GDPR infringement is £17.5m or 4% of global annual turnover, whichever is higher. For DPIA failings, the ICO has leaned towards enforcement notices, reprimands and formal warnings, with the financial penalty held in reserve.
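
For a rough sense of what that ceiling means for a given business, the rule is simply the larger of the two figures. A trivial illustration with a made-up turnover figure:

```python
def max_uk_gdpr_fine(global_annual_turnover_gbp: float) -> float:
    """Higher of £17.5m or 4% of global annual turnover."""
    return max(17_500_000, 0.04 * global_annual_turnover_gbp)

print(max_uk_gdpr_fine(600_000_000))  # £24m ceiling for a £600m-turnover business
```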

In early 2024 the ICO issued an enforcement notice and formal warning to the Home Office over GPS electronic monitoring of migrants, citing Articles 35 and 5(2) and finding that the department had "failed to sufficiently assess the privacy intrusion of the continuous collection of people's location information" (source: ico.org.uk, March 2024).

On 23 February 2024 the ICO ordered Serco Leisure and seven trusts to stop using facial recognition and fingerprint technology to track 2,000+ employees across 38 facilities. A DPIA had been completed. The regulator concluded it failed to identify a lawful basis under Article 9(2)(b) and did not demonstrate necessity or proportionality (source: ico.org.uk, February 2024).

In July 2024 the ICO reprimanded Chelmer Valley High School for introducing facial recognition for canteen payments without any DPIA, without consulting the DPO, and using an opt-out slip rather than affirmative consent. An SME-scale failure drew a public, named enforcement outcome.

On 6 October 2025 the Upper Tribunal confirmed that the ICO had jurisdiction to issue its notices against Clearview AI and remitted the case to the First-tier Tribunal. The substantive findings on Clearview's DPIA failings fall to be re-examined, so any enforcement outcome remains live.

Across these cases the regulator looked past whether paperwork existed. The test was whether the assessment identified real risks and shaped how the processing was designed.

Making DPIAs part of the operating model

DPIAs fail most often where they sit outside the product and procurement rhythm. Assessments tend to trigger late and end up filed without subsequent review. Three practical moves address that:

  • Trigger DPIAs from product and procurement tickets. If a new feature or supplier changes the risk profile, that is the signal to open the assessment (a minimal sketch of this check follows the list).
  • Reuse evidence across frameworks. The security measures that satisfy ISO 27001:2022 Annex A 5.34, Cyber Essentials and NHS DSPT v8 all feed the measures section of a DPIA. A control mapped once should support many assessments.
  • Version the DPIA alongside the product. Material change in processing means a reviewed DPIA, updated in place against the live risk profile.
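
One way to make the first and third bullets concrete is a small check that runs whenever a product or procurement ticket carries a label that changes the processing picture, recorded against the current DPIA version. The sketch below is a thought experiment rather than a reference implementation: the ticket labels and register fields are hypothetical and would map onto whatever tooling a team already runs.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical ticket labels that change the processing picture.
DPIA_REVIEW_LABELS = {"new-data-category", "new-processor", "new-purpose",
                      "new-ai-feature", "international-transfer"}

@dataclass
class DpiaVersion:
    version: str
    reviewed_on: date
    residual_risk: str  # "low" | "medium" | "high"

def needs_dpia_review(ticket_labels: set[str], current: DpiaVersion) -> bool:
    """Reopen the DPIA when a ticket changes the risk profile,
    or when residual risk was last assessed as high."""
    return bool(ticket_labels & DPIA_REVIEW_LABELS) or current.residual_risk == "high"

current = DpiaVersion(version="1.3", reviewed_on=date(2026, 2, 5), residual_risk="low")
print(needs_dpia_review({"new-processor", "ui-polish"}, current))  # True -> reopen the DPIA
```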

The Naq platform is built to automate compliance across UK GDPR, ISO 27001, Cyber Essentials, NHS DSPT v8 and DTAC v2 from a single dashboard. Controls are mapped across frameworks, so one piece of evidence satisfies UK GDPR, ISO 27001 and Cyber Essentials at the same time, rather than being collected three times. The platform simplifies Data Protection Impact Assessment generation and keeps assessments current against live processing.

Where teams want named expert support, Naq's in-house Clinical Safety Officers and virtual DPOs sit alongside the platform for the controller decisions a regulated launch cannot delegate to software.

To see how DPIA evidence maps across your existing tooling and frameworks, book a 15-minute demo at naqcyber.com.

Written by
The Naq Team