On 3 October, the Information Commissioner's Office (ICO) held its annual Data Protection Practitioners' Conference (DPPC), covering a wide range of data protection and privacy issues.
The DAC Beachcroft Data, Privacy and Cyber team attended the event, and a number of key overarching themes were apparent from the keynote addresses and panel discussions.
The 'people dimension' of data protection developments will be crucial in coming years. Shared literacy across privacy, data, communications and technical teams will assist with collaboration. Organisations need a common vocabulary around AI, data, and risk to bridge technical and non-technical perspectives. Standard templates, common frameworks and clear audit trails (logs, agreements, policies) all help operationalise accountability across teams and organisations.
More generally, the law should be seen as an enabler of innovation, not a barrier to it. Updates introduced via legislation such as the Data (Use and Access) Act ("DUA Act") provide pathways for data protection practitioners to implement responsible data use, allowing for sustainable data and artificial intelligence innovation. These developments depend on public trust: transparency must be built into user journeys so that individuals are not surprised by how their data is used.
Keynote speeches
The Information Commissioner
John Edwards, speaking at his last DPPC as Information Commissioner, emphasised the importance of agility in a rapidly evolving data protection landscape. Ahead of the organisational transition to the Information Commission, Edwards assured the audience that minimal disruption to the regulator's functions was expected. The ICO will continue to provide clarity and certainty for data protection practitioners, including upcoming guidance on complex areas such as Automated Decision-Making (ADM) following the changes introduced by the DUA Act.
The ICO remains responsive to emerging trends and challenges, deploying the full range of regulatory tools such as guidance, audits, enforcement actions, and, if necessary, criminal prosecutions. The Commissioner emphasised there will be an increased focus on emerging technologies such as artificial intelligence and biometrics which add to the complexities of the data protection environment, and ICO staff will be upskilled to future-proof for these changes.
Reflecting the theme of agility, Edwards highlighted the need for organisations to proactively address cybersecurity threats, such as testing staff with phishing emails. Organisations must expect to be targeted, and address vulnerabilities before they can do harm. Close collaboration between Data Protection Officers and information security teams is crucial. This should include regular vulnerability assessments, patch management, data flow monitoring and engagement with initiatives such as the NCSC Cyber Essentials scheme.
Answering questions on a number of issues, the Commissioner noted that the ICO does not yet have enough legislative or technical detail in respect of the recently announced plans for digital ID. Discussing subject access requests (SARs), Edwards noted that these are fundamental rights for individuals and that organisations should be more effective in how they deal with SARs.
Edwards concluded that effective data protection is ultimately about people, and that public trust is maintained by ensuring individuals feel confident their data is secure.
Keynote on the power of AI and privacy
The keynote speech, delivered by Ivana Bartoletti, focused on 'AI power and privacy: Building trust for the future'.
From a regulatory and organisational perspective, Bartoletti noted that as AI advances rapidly toward general intelligence, there is pressure to avoid over-regulation. However, the intrinsic privacy risks of AI, such as bias, surveillance, and the erosion of democratic values, mean that a balance must be found between sustainable innovation and meaningful guardrails that create accountability for ethical AI use. This is particularly important as existing laws continue to lag behind the evolving capabilities of AI.
Innovation requires stronger governance, and future collaboration is required between organisations and regulators. Bartoletti noted that although best practices such as audit logs and assessment criteria are emerging, organisations' privacy teams have been left carrying a disproportionate burden of mitigating deep AI risks with technical fixes alone.
Future oversight must therefore involve both private and public actors, ensuring cross-border accountability. Bartoletti closed by emphasising that organisations need to lead with literacy. Teams must be encouraged to question, test, and critically engage with AI systems. Privacy professionals will be crucial to achieving trustworthy and sustainable use of AI within organisations. They will bring awareness of how to implement legislation, combine business needs with safeguarding people's rights, and manage trade-offs, bringing together different areas of the business.
Data protection and privacy
ICO response to the DUA Act
A panel of data protection experts discussed the DUA Act and its expected impact on the operation of the ICO.
As an amending piece of legislation, the DUA Act introduces a few areas of change which the ICO is responding to through policy and guidance for organisations. Updated guidance on PECR and breach reporting has already been published, and 'health alerts' have been added to other materials to confirm whether the guidance has been updated to reflect the provisions of the DUA Act. A number of consultations on updated guidance are underway and expected in the coming months, such as the recently commenced call for views on 'charitable purpose soft opt-in rules'.
The DUA Act does not amend the principal duty of the ICO: protecting personal data and promoting trust and confidence. A number of new and amended powers have been introduced, such as the power to compel witnesses to interview, which should make the ICO more effective. However, the panel emphasised that no fundamental change in regulatory approach is expected, with the ICO expected to continue to use the full range of powers available to it.
The structural changes to the ICO introduced by the DUA Act will bring the regulator into line with other regulators, bringing greater resilience, transparency and representation.
Storage and access technologies
A number of ICO colleagues discussed the regulator's ongoing approach to storage and access technologies. This is an area that we have discussed extensively in recent months, in our analysis of the UK and EU cookie regimes, the EU's digital simplification drive, and as part of our Data, Privacy and Cyber Digest podcast series.
The ICO’s Online Tracking Strategy builds on two decades of work, from the 2003 cookie guidance to Consent or Pay guidance. The focus has shifted beyond cookies to a wider ecosystem of storage and access technologies such as pixels, tags, and fingerprinting.
The Updated Draft Guidance on Storage and Access Technologies clarifies that PECR Reg 6 applies to all tracking technologies. The forthcoming final version (which is expected to be published in 2026) will incorporate provisions from the DUA Act, including new exemptions for statistical purposes and website appearance.
The ICO expects organisations to ensure consent mechanisms are informed, balanced, and revocable. Non-compliant practices include choices that are deceptive, uninformed, undermined, or irrevocable. Cookie banners must make rejecting tracking as easy as accepting it.
The 2024 update adopts a structured “must / should / could” framework and expands detail on:
- The strictly necessary exemption
- Storage duration and re-consent timing
- Specific guidance on online advertising practices
The ICO is pairing guidance with audits, enforcement, and public reporting to promote fairer online tracking. Organisations are expected to design transparency into user journeys and to treat the law as a facilitator of trust, not a compliance checkbox.
The life of a data protection officer
A panel of data protection officers discussed their challenges in navigating a number of risks such as AI. The numerous forms of AI available, combined with supply chain risks, must also be considered alongside regulatory uncertainty. There is now a considerable overlap between GDPR and a variety of EU legislation, and developments in the UK too.
Ensuring fairness is difficult due to inherent data bias, and proving AI is unbiased often requires sensitive data, creating a paradox. Transparency and understanding of AI are essential, though challenging.
The data protection officer’s role remains stable but will demand increasingly close collaboration with technology teams to balance innovation and privacy risks. While AI may offer efficiency gains, successful and safe implementation requires substantial effort. The panel recommended that DPOs should be involved in strategic discussions with senior management on these developments.
Understanding the impact of AI
There was a discussion on general issues around AI, such as the type of machine learning models, generative AI and advanced AI capabilities. The risks of using AI in the workplace were also identified, such as the need to fact-check outputs from generative AI. The session noted that there were well-known general risks that accompany the use of AI, including automation bias, security vulnerabilities and lawful basis concerns.
However, for data and privacy professionals, AI presents a number of novel risks emphasising the significance of AI literacy. It is apparent that AI is here to stay, and therefore knowledge of guidelines from regulators and legislation such as the AI Act is important. Professionals should look to identify risks such as bias, misuse and overreach early, with communication in the 'same' language as engineers and vendors likely to be crucial.
Everyday data sharing
Individuals from a number of organisations discussed the importance of data sharing initiatives, with their discussion on best practice in data sharing highlighting several vital themes. The speakers collectively stressed the importance of building trust through transparency and accountability, and through clear communication of the reasons and methods for data sharing.
Another major theme was the significance of organisational culture and cross-team collaboration. The speakers noted that culture is critical in demystifying data use and can help to build public confidence in data practices.
Proportionate and practical governance was also emphasised. Speakers recommended tailoring governance frameworks and controls according to the scale and nature of an organisation. They pointed to the benefits of standard templates, shared frameworks such as WASPI, and well-defined data sharing agreements in expediting safe, compliant collaboration.
The panel called for legislation to be viewed as a facilitator or enabler of responsible data sharing, rather than as a restrictive force. By reframing compliance as a path to innovation and improved outcomes, organisations can better support trust and progress.
In terms of practical examples, the panel agreed that it takes time to put robust data sharing agreements into place, with both data and back-office teams needed as part of any design process.
'The ripple effect' of data breaches
There was a panel discussion on the effect of data breaches on people's lives. The panel highlighted the profound and lasting impact data breaches can have on individuals, especially those in vulnerable circumstances, as their lives can be significantly affected. Research was cited showing many adults have experienced data breaches, with a significant proportion reporting emotional distress and a lack of adequate support from organisations.
An ICO campaign on this issue is aiming to shift organisational attitudes from merely managing data breaches to genuinely supporting affected individuals, with empathy and open communication at the forefront. The ICO itself has simplified its communications, offered clearer support resources, and adopted trauma-informed approaches.
The panel discussed examples of best practice on this issue. NHS England has revised its guidance and notification templates, acknowledging the harm caused rather than downplaying incidents as mere inconvenience. Internally, updated checklists for handling data breaches have also been adopted. A panel member from the Trussell Trust, a major foodbank charity, illustrated the importance of privacy by design and accountability in their work with vulnerable communities.
Public authorities
Sessions discussing the Freedom of Information Act and Environmental Information Regulations provided guidance from the ICO on how public authorities should respond to requests made for information under these pieces of legislation.
Public authorities and freedom of information requests
The landscape of Freedom of Information (FOI) request handling is rapidly evolving, presenting new challenges for public authorities. Informal communication channels such as private messaging apps can fall within the scope of the Freedom of Information Act if they contain official business. Reflecting these changes, social media has become an increasingly common route for submitting FOI requests.
The ICO highlighted that delays in responding to requests frequently occur due to the complexity of the public interest test, which can necessitate multi-stakeholder consultation. The ICO stressed the importance of strong justification for any extension and timely partial responses. Authorities should also ensure that responses are composed in clear, accessible English to minimise internal reviews and complaints.
Looking to the future, the discussion highlighted that the rise of AI introduces both operational advantages and challenges. AI-generated requests and new information types add regulatory complexity, while AI tools can aid in search and triage tasks, though human oversight remains essential.
Practical application of the Environmental Information Regulations exceptions
Discussing the practical application of the EIR, the ICO emphasised their pivotal role in ensuring environmental information is accessible to the public. Routine documents, such as maintenance logs related to an environmental activity, may fall under EIR if their purpose links to environmental matters. This broad interpretation prevents disputes from being misclassified under other information regimes.
Authorities should expect to conduct thorough and documented searches, recording methodologies and results. Each request is unique and should not be judged based on previous, similar queries. Accurate documentation is vital, as failure to conduct or record proper searches can result in delays, loss of credibility, and adverse findings in the event of ICO scrutiny.
A key distinction of the EIR is its higher threshold for adverse-effect exemptions compared to the Freedom of Information Act (FOIA). It was noted that authorities must provide clear, specific evidence that disclosure would probably cause harm, rather than merely suggesting it could. Vague or speculative arguments are insufficient and will likely result in disclosure.
In terms of recommendations, the ICO noted that the EIR's presumption in favour of disclosure means authorities must actively demonstrate why information should be withheld, with the presumption acting as a tiebreaker when public interest arguments are evenly balanced. Maintaining an audit trail, consulting third parties where appropriate, and offering advice on refining burdensome requests further strengthen compliance. Ultimately, careful application and documentation are crucial for withstanding regulatory and tribunal scrutiny.
Cyber security issues
Actionable insights from recent attacks
Considering actionable insights for organisations resulting from recent cyber attacks, cyber specialists at the ICO highlighted a number of key takeaways. Robust, proactive security measures and continual organisational vigilance were identified as crucial. Multi-factor authentication (MFA) should be implemented wherever possible as part of data protection by design and by default, aligning with best practices such as Cyber Essentials and the NCSC guidance on MFA.
Regular vulnerability scanning and penetration testing were identified as fundamental too; over 40,000 new vulnerabilities were identified in 2024 alone. Organisations must categorise these vulnerabilities by risk and document their risk assessments; mitigation strategies such as patch management and verification should be integrated into broader security programmes such as monitoring and alerting systems. Staff, such as dedicated security operations teams, must be appropriately trained to respond swiftly and effectively to incidents, tailored to organisational size and complexity. Other steps such as maintaining an accurate Information Asset Register should be undertaken.
Reflecting on cyber insurance, the ICO was unable to comment on the benefits, noting that the associated costs can be problematic for some organisations. However, consideration of appropriate cyber insurance is expected to be part of an organisation's incident response plan, particularly as consideration should be given to supply chain disruption and business interruption issues.
How to defend against social engineering
A panel of cyber specialists also discussed social engineering, a specific type of cyber security incident in which individuals are deceived into revealing sensitive information or performing actions that compromise security.
Discussing the effectiveness of these attacks, the panel agreed that attempts are becoming more sophisticated due to the use of AI, with attention shifting from senior management to developers and IT professionals. The costliest cyber attacks since July last year have been social engineering incidents. However, focusing on social engineering attacks via email and phishing is increasingly misguided. Many attacks have originated from LinkedIn, verified Teams accounts or legitimate browser advertising.
In response, organisations can invest in individuals, whether through training or education. Businesses, however, are at greater risk where complex supply chains are in place. Significant attacks in the UK in the past year have involved outsourced managed service providers, and organisations affected by social engineering attacks are often faced with less sympathy than those impacted by state-sponsored attacks.
Culture is important for defending against these kinds of attacks. Employees should feel at ease reaching out for help, even if falling for a social engineering attack makes them feel embarrassed. The impact of attacks can be mitigated through the speed of response. Heather Toomey, ICO Principal Cyber Specialist, reiterated that attackers only have to get lucky once, but defenders have to be lucky all of the time.
