AI tools are arriving in workplaces at pace: Microsoft 365 Copilot, ChatGPT, Claude, and specialist AI assistants built into everyday business software. The choice is broad, the barrier to adoption is low, and for most organisations it is becoming a question of when, not if.
Before any of those tools are put to work on personal data, though, there is a step the Information Commissioner’s Office (ICO) strongly recommends you do not skip: a Data Protection Impact Assessment, or DPIA.
This post is a short look at what the ICO expects, when a DPIA is needed, and what our role at ADM is in helping customers think about this responsibly.
Please note: this post is for information only and does not constitute legal advice. You should always refer to official guidance from the relevant authority.
Why AI triggers the DPIA conversation
Under UK GDPR, a DPIA is required when processing is likely to result in a high risk to the rights and freedoms of individuals. The ICO has specifically named artificial intelligence, machine learning and deep learning as examples of "innovative technology": exactly the kind of processing that can trigger the need for a DPIA.
It is worth noting that this is not about whether the technology is new to your organisation. The ICO's guidance is explicit that "innovative" refers to new developments in technological knowledge more broadly, not to whether you have used it before. Early adopter or cautious follower, if you are putting personal data through an AI tool, the question "do we need a DPIA?" should be on the table.
When a DPIA is likely to be needed
The ICO’s guidance lists several types of processing where a DPIA becomes a requirement rather than just good practice. For AI specifically, this tends to apply when it is combined with things like large-scale processing, automated evaluation or scoring, systematic monitoring, or the processing of special category data such as health or biometric information.
In practice, that covers many of the scenarios businesses are now actively considering: recruitment tools, customer sentiment analysis, meeting transcription services, intelligent search across internal documents, and so on.
If you are not sure whether your intended use crosses the line, the safer approach is to carry out a DPIA anyway. The ICO has been clear that it is good practice to do so, even where the strict legal test has not been met.
What a DPIA should contain
At a high level, the ICO expects a DPIA to:
- Describe the nature, scope, context and purpose of the processing
- Assess necessity and proportionality
- Identify and assess risks to individuals
- Set out measures to address or reduce those risks
- Record the outcome and any residual risk
For AI systems, the ICO recommends treating the DPIA as a living document, one that is reviewed regularly and updated as the system or its use evolves.
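To make those elements concrete, here is a minimal illustrative sketch (not an official ICO template, and the field names are our own) of how the contents of a DPIA could be captured as a structured record, including a simple check to support the "living document" point about regular review:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DPIARecord:
    """Illustrative only: the high-level elements the ICO expects a DPIA to cover."""
    processing_description: str        # nature, scope, context and purpose
    necessity_and_proportionality: str
    risks_to_individuals: list[str]
    mitigating_measures: list[str]
    residual_risk: str                 # outcome after measures are applied
    last_reviewed: date                # a DPIA is a living document

    def review_due(self, today: date, interval_days: int = 365) -> bool:
        """Flag when the record is overdue for its periodic review."""
        return (today - self.last_reviewed).days >= interval_days

# Hypothetical example: a record last reviewed over a year ago is due for review.
record = DPIARecord(
    processing_description="AI-assisted meeting transcription",
    necessity_and_proportionality="No less intrusive alternative identified",
    risks_to_individuals=["Accidental capture of special category data"],
    mitigating_measures=["Retention limits", "Access controls"],
    residual_risk="Low",
    last_reviewed=date(2024, 1, 1),
)
print(record.review_due(date(2025, 6, 1)))  # prints True
```

However you record it, the substance matters more than the format: the point is that each element is written down, owned, and revisited as the system or its use changes.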
Where ADM fits in
ADM Computing does not carry out DPIAs on behalf of customers. A DPIA is a document owned by the data controller, and it needs to reflect that organisation’s own processing, context and risk appetite. It would not be appropriate for us to step into that role.
What we can do is help you understand what a DPIA should contain, how it should be recorded, and how those principles align with the AI and Microsoft 365 tools we help customers introduce. If you are planning to roll out an AI tool and want to talk through the data protection questions before you go any further, we are always happy to have that conversation.
Start with Integrity. Build with Confidence.
ADM Computing is well-placed to provide expert guidance for setting up strong security policies and data loss prevention, helping organisations control information handled by AI tools. Through risk assessments and close collaboration, we create customised policies to protect sensitive data and address threats. Our services include advice on access management, data classification, and monitoring, allowing businesses to deploy AI confidently while staying compliant and safeguarding key information.
To discuss how this applies to your business, or if you need any other guidance or support, please get in touch at marketing@adm-computing.co.uk or on 01227 473500.
About ADM
Founded in 1984, ADM Computing is Kent’s largest and longest established IT services company specialising in IT support services that help to reduce IT costs as well as improve network efficiency. We have a long history of charity work and won’t be slowing down any time soon!
To keep up to date with all our latest updates, follow us on LinkedIn: ADM Computing LinkedIn
Blog Author
Andy Cox – Associate Technical Director
Andy joined ADM in 2010 as a First Line Engineer, bringing with him over a decade of IT experience from his time in the NHS. Since then, he has become a cornerstone of the ADM team, progressing to the role of Senior Engineer.
Andy works on customer projects with a focus on Microsoft 365, Azure, and security solutions. In addition to his client-facing role, Andy plays a pivotal part in ADM’s internal operations, particularly in security and compliance. He contributes as a key member of the ADM Security, Compliance, and ISO teams, ensuring the company meets the highest standards.
When he’s not solving complex technical challenges, Andy enjoys building scale model aircraft and playing guitar, showcasing his creativity and precision both in and outside of the workplace.

