What does the Data Protection Community consider to be challenging in the DUAA?
ProvePrivacy published a LinkedIn poll to gather DUAA 2025 insights, asking the data protection and information governance community what they felt would be the toughest challenges going forward.
Whilst many aspects of the DUAA are designed to reduce red tape and enable digital innovation, concerns remain about how certain aspects of our roles will be delivered in future.
The biggest concern for professionals is AI, which might come as no surprise given that it is still a largely misunderstood technology.
Strategic Insights: 2026 LinkedIn Poll Results
The following table summarises the key findings from our research into industry readiness and concerns.
| Category | Key Metric / Finding |
|---|---|
| Primary Concern | AI Governance and ADM (16 of 28 votes, 57%) |
| Participant Seniority | 48% Senior or Director level |
| Enterprise Engagement | 36% have over 1,000 employees |
| Industry Balance | 14% Legal Services / 14% IT Services |
| Low Priority Area | Internal Complaints Handling (selected by 14%) |
How does the Data (Use and Access) Act 2025 change UK data protection?
The Data (Use and Access) Act 2025 (DUAA) represents the most significant shift in UK data protection since the GDPR took effect in 2018. It should help organisations transition from defensive compliance to a posture of strategic oversight.
One of the stated benefits of the legislation was that it would help create a frictionless regime to support innovation.
Recent research via ProvePrivacy’s LinkedIn poll highlights the urgency of this transition for senior leaders. The data shows that AI Governance is the dominant concern for nearly 60 percent of professionals.
The poll demographics were notable:
- Boardroom engagement was high, with nearly half of respondents holding Senior or Director-level positions.
- Over one-third of poll participants represent enterprises with more than 1,000 employees.
- Interest is balanced between legal and IT services. This reflects the need for technical and legal alignment.
How does the DUAA 2025 regulate AI and Automated Decision-Making?
The DUAA moves the UK away from a general prohibition on solely automated decision-making. It introduces a permissive framework under Article 22A for non-special category data. This change allows for increased efficiency in areas such as recruitment and service allocation.
The DUAA reformulates Article 22 of the UK GDPR, replacing it with a more flexible approach that permits solely automated decisions with legal or similarly significant effects, provided specific safeguards are in place.
- Scope of Permission: Organisations can now deploy AI-driven tools for significant decisions (such as recruitment or credit scoring) more confidently using “standard” personal data.
- Special Category Data Restriction: The permissive framework does not apply to sensitive data (e.g., health or biometric data), which remains subject to stricter legacy requirements such as explicit consent or specific legal authorisation.
- Defining Solely Automated: The Act clarifies that a decision is only “solely automated” if there is no meaningful human involvement.
What is the Mandatory Safeguard Architecture?
To manage AI and ADM lawfully under the new framework, professionals must implement a four-stage safeguard architecture:
- Information: Individuals must be provided with clear details about how automated decisions are reached.
- Representation: Individuals must have the ability to express their views on the automated decision.
- Human Review: Controllers must enable individuals to request and obtain human intervention for significant decisions.
- Contestability: Individuals retain the right to challenge the final outcome of an automated process.
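The four safeguards above lend themselves to a simple compliance checklist. As an illustrative sketch only (the record fields and method names below are our own, not taken from the Act), an ADM decision record might verify each safeguard before a decision is treated as compliant:

```python
from dataclasses import dataclass

@dataclass
class ADMDecisionRecord:
    """Illustrative record of a solely automated decision and its safeguards."""
    information_provided: bool    # clear details of how the decision is reached
    views_invited: bool           # individual able to make representations
    human_review_available: bool  # human intervention can be requested and obtained
    contestable: bool             # final outcome can be challenged

    def missing_safeguards(self) -> list[str]:
        """Return the names of any safeguards not yet satisfied."""
        checks = {
            "information": self.information_provided,
            "representation": self.views_invited,
            "human review": self.human_review_available,
            "contestability": self.contestable,
        }
        return [name for name, ok in checks.items() if not ok]

# Example: a decision missing the contestability safeguard
record = ADMDecisionRecord(True, True, True, False)
print(record.missing_safeguards())  # -> ['contestability']
```

A log of this kind would let a DPO report, at any time, which automated decisions still lack one of the four mandatory safeguards.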
What is the guidance on “Meaningful Human Involvement”?
A critical aspect of future AI management is moving away from “symbolic” oversight to substantive review. Sources indicate that to meet the statutory threshold, professionals should:
- Audit the Decision Path: Verify human intervention occurs at a stage where it can actually influence or override the AI output.
- Document Competency: Maintain records of the training and expertise of the staff members tasked with performing human reviews.
- Define Criteria: Establish clear internal criteria for what constitutes a “substantive” review rather than a superficial “rubber-stamp”.
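As a hedged illustration of the three steps above (all field names here are invented for the example), a human-review evidence record could capture the decision path, reviewer competency, and criteria applied, with a rough proxy check for reviews that look like rubber-stamps:

```python
from dataclasses import dataclass, field

@dataclass
class HumanReview:
    """Illustrative evidence record for a human review of an AI decision."""
    can_override_output: bool        # review occurs where it can change the outcome
    reviewer_trained: bool           # reviewer competency is documented
    criteria_applied: list[str] = field(default_factory=list)  # substantive criteria used

    def is_meaningful(self) -> bool:
        """Rough proxy: all three evidence points must be present."""
        return (self.can_override_output
                and self.reviewer_trained
                and len(self.criteria_applied) > 0)

review = HumanReview(True, True, ["checked input data quality",
                                  "compared outcome with similar past cases"])
print(review.is_meaningful())  # -> True
```

Whether a review is “meaningful” in the statutory sense is ultimately a legal judgement; a check like this only flags records where the documented evidence is obviously incomplete.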
Is a DPIA still required for AI implementations following the DUAA?
A Data Protection Impact Assessment (DPIA) remains a statutory requirement for AI and other high-risk processing activities. The Data (Use and Access) Act 2025 (DUAA) retains existing DPIA obligations despite earlier reform proposals. Controllers must carry out a DPIA whenever processing is likely to result in a high risk to individuals.
AI systems frequently involve Automated Decision-Making (ADM) that produces significant effects on individuals. These activities are typically classified as high-risk, making a DPIA an essential compliance tool. The DUAA requires DPOs to document mandatory safeguards for significant decisions within these assessments. These safeguards include information, representation, human intervention, and contestability.
DPIAs provide a common legal language for assessing the significant effect of an algorithm. Professionals should use DPIAs to define and document what constitutes meaningful human involvement. This documentation ensures that AI systems remain accountable and transparent as they scale.
The Information Commission now has expanded powers to demand specific documents, including DPIAs. Failure to maintain detailed assessments could lead to enforcement action or Interview Notices. Organisations must ensure DPIAs are robust enough to support the technical reports regulators can now request.
What were the reactions to non-ADM aspects of the DUAA?
Data professionals currently exhibit a selective focus regarding the DUAA 2025. While AI governance dominates the agenda, operational changes like Recognised Legitimate Interests (RLI) offer efficiency gains. Recognised Legitimate Interests are statutory processing purposes that exempt organisations from performing the balancing tests normally undertaken in a legitimate interest assessment.
Professionals view Recognised Legitimate Interests as a welcome efficiency win. These interests cover fraud prevention and safeguarding tasks. DPOs can now bypass complex balancing tests for these routine activities. Whilst other legitimate interest tests will remain, recognising these activities in statute removes the need for professionals to strike that balance themselves.
Other ProvePrivacy research suggests that reactions to new DSAR standards are generally positive. The Act moves toward reasonable and proportionate search requirements. A new “stop the clock” mechanism helps manage vague requests. This prevents the statutory deadline from expiring during clarification periods.
The new requirements for data protection complaints handling could be a challenge for many. A well-defined data subject rights procedure should ensure organisations are prepared to manage them, but if volumes follow a path similar to recent DSAR trends, there may be issues here. A solution for logging and tracking complaints will become essential.
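To make concrete what logging and tracking complaints involves, here is a minimal in-memory sketch (the class, statuses, and field names are invented for illustration): each complaint gets a reference, a received date, and a status that can be reported on.

```python
from datetime import date
from enum import Enum

class Status(Enum):
    RECEIVED = "received"
    IN_PROGRESS = "in progress"
    RESOLVED = "resolved"

class ComplaintsLog:
    """Illustrative in-memory log for data protection complaints."""

    def __init__(self):
        self._complaints = {}
        self._next_ref = 1

    def log(self, summary: str, received: date) -> int:
        """Record a new complaint and return its reference number."""
        ref = self._next_ref
        self._complaints[ref] = {"summary": summary,
                                 "received": received,
                                 "status": Status.RECEIVED}
        self._next_ref += 1
        return ref

    def update_status(self, ref: int, status: Status) -> None:
        self._complaints[ref]["status"] = status

    def open_complaints(self) -> list[int]:
        """References of all complaints not yet resolved."""
        return [ref for ref, c in self._complaints.items()
                if c["status"] is not Status.RESOLVED]

log = ComplaintsLog()
ref = log.log("Complaint about marketing emails", date(2025, 9, 1))
log.update_status(ref, Status.IN_PROGRESS)
print(log.open_complaints())  # -> [1]
```

A production solution would of course persist this data and add deadlines and reporting, but even this sketch shows why a spreadsheet struggles once volumes grow: status and reference tracking need to be systematic.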
Conclusion: Becoming an Architect of Compliance
The DUAA 2025 is a mandate to modernise your data protection strategy. Professionals must transition from reactive administration to proactive strategic oversight. This requires a deep focus on accountability and risk mitigation.
Whilst the rules on AI and ADM have been simplified, a Data Protection Impact Assessment (DPIA) will remain a vital tool for professionals managing ADM requirements. DPIAs provide a common legal language for assessing the significant effects of an algorithm. Using them ensures that safeguards are built into the technical architecture from the start.
Professionals must remain vigilant regarding sensitive personal data. The permissive ADM framework under Article 22A applies only to non-special category data. Special category data still represents a high-risk area requiring additional protections.
The ProvePrivacy platform serves as the essential partner in this transition. It replaces manual solutions such as spreadsheets or SharePoint lists with a unified and collaborative digital environment, providing MI to help gain the senior stakeholder support data protection teams need. This technology helps transform the DPO into a true Architect of Compliance.
External Sources
- DUAA 2025 legislation: https://www.legislation.gov.uk/ukpga/2025/18/contents
- ICO Resource: https://ico.org.uk/about-the-ico/what-we-do/legislation-we-cover/data-use-and-access-act-2025/






