Schools are trained to be vigilant. Visitors are checked thoroughly, cupboards are locked, and everyone keeps a watchful eye on the photocopier. But it is increasingly apparent that a new visitor is getting into schools, poorly disguised and definitely not always wearing a lanyard.
Artificial intelligence (AI) is no longer a theoretical concept; it is already in schools, writing letters, organising lesson plans and analysing student data. But has anyone noticed? And does anyone have a full understanding of where data is stored or processed? AI is starting to become embedded before anyone has formally approved its use, and it can feel like entry through the back door, or maybe even a ground floor window.
In this post we explore how school and trust leaders can respond proactively, confidently, and in line with existing guidance. The technology may be new but the governance principles are not. AI usage raises questions for:
Data protection and GDPR
AI tools frequently process sensitive personal data, including pupil records and staff information. Some tools process or store data outside the UK and EU and may not comply with the UK GDPR.
Safeguarding
Any tool that interfaces with pupils or stores their information must meet safeguarding expectations and not collect or infer information without clear oversight.
Risk management
Introducing new technology, especially tools that process data or influence decision-making, requires proper checks, risk assessments and governance.
Workforce development
Staff need to be aware of when and how they are using AI, and what guardrails are in place.
What should schools be doing?
Schools need to approach AI with the right balance of caution and curiosity: understanding the risks without ignoring the opportunities. Key things that school and MAT leaders should do include:
Map out existing use
Conduct a short audit: Which AI tools are already being used? By whom? For what purpose?
Develop or update AI guidance
Ensure this covers data protection, acceptable use, procurement, and accountability. Include AI within your wider digital, safeguarding and other relevant policies, or if needed create something more bespoke.
Train staff
Help staff understand what AI is, what it can do and, just as importantly, what it shouldn't do. Include scenarios and risks.
Conduct DPIAs
Conduct a Data Protection Impact Assessment for any tool that processes personal data.
Use trusted systems
Choose technology partners who are transparent, secure, and committed to supporting governance, not bypassing it.
Where can you find out more?
Below are some useful links to get you started:
DfE guidance on AI in education provides an overview of the opportunities, risks and implementation considerations for schools and colleges.
ICO guidance on the use of AI in data processing
We developed Scriba to help schools and MATs use new technology like AI to reduce manual workload without compromising trust, compliance or governance. We put in place the necessary guardrails for data processing and storage and ensure that there is a clear governance and audit trail, for example with set data retention periods and secure, role-based access controls. We don't just plug AI in and hope for the best; we build systems that bring it in properly, through the front door.