As the academic year draws to a close and further provisions of the EU AI Act become applicable, the need for educational institutions to review their use of AI across systems and classrooms has never been more pressing. From August 2025, organisations will face additional obligations under the EU AI Act in relation to AI governance, accountability and transparency. For example, AI will be governed at Union level by the AI Office as well as by Member State competent authorities, obligations relating to General-Purpose AI Models will become applicable, and Conformity Assessment Bodies must be established.

With AI increasingly relied upon for personalised learning and tutoring, language support, and plagiarism detection, among many other uses, we look at how educational institutions can ensure data privacy for students and how a Data Protection Officer (DPO) can support them.

Being transparent about the use of AI

When it comes to using AI in educational institutions, clarity is a must. It is more critical than ever to ensure students, parents, and teachers know exactly what kind of data is being collected, why it is needed, and who is going to see it. This information should not be buried in pages of legal jargon; it must be easy to understand and easy to locate.

Privacy notices should be user-friendly and easy to understand, particularly for students whose first language differs from the local language. When any new AI tool is rolled out, it is important to explain to users how it works and what types of personal information it might use. Additionally, clear consent must be obtained from students or their parents before collecting or sharing any sensitive data. Ultimately, open communication builds trust and helps everyone feel more comfortable with new technology in the classroom.

Choosing the right AI solutions that ensure data minimisation, security and transparency

Educational institutions, such as colleges, schools and universities, should opt for tools that have been built with student data in mind. Given that these institutions often process sensitive information, tools that offer private cloud options are also essential for strengthening data security and, where possible, personal data should be pseudonymised to protect identities.
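To illustrate what pseudonymisation can look like in practice, the sketch below replaces a student identifier with a keyed pseudonym and strips unnecessary fields before a record is shared with an AI tool. This is a minimal, hypothetical example: the field names, the key-handling approach and the idea of pseudonymising at the point of export are assumptions for illustration, not a description of any particular product.

```python
import hmac
import hashlib

# Secret key held by the institution and stored separately from the data
# (e.g. in a key vault), so pseudonyms cannot be reversed by the AI vendor.
SECRET_KEY = b"replace-with-a-securely-stored-key"  # illustrative placeholder


def pseudonymise(student_id: str) -> str:
    """Return a keyed, non-reversible pseudonym for a student identifier."""
    return hmac.new(SECRET_KEY, student_id.encode("utf-8"), hashlib.sha256).hexdigest()


def prepare_record_for_ai_tool(record: dict) -> dict:
    """Replace direct identifiers with pseudonyms and drop fields the tool does not need."""
    return {
        "student_ref": pseudonymise(record["student_id"]),  # pseudonym instead of real ID
        "year_group": record["year_group"],
        "reading_score": record["reading_score"],
        # Full name, date of birth and similar identifiers are deliberately
        # omitted, reflecting the data minimisation principle.
    }


# Example: the AI tool only ever receives the pseudonymised, minimised record.
record = {"student_id": "STU-00123", "full_name": "A. Student",
          "year_group": 9, "reading_score": 72}
print(prepare_record_for_ai_tool(record))
```

Because the pseudonym is derived with a secret key held by the institution, the vendor cannot link records back to individual students, yet the institution can still re-identify a record internally if it needs to act on the tool's output.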

Solutions that train AI models on sensitive personal data should be avoided, as such processing risks breaching data protection law unless a valid lawful basis and appropriate safeguards are in place. Information that can identify a student, such as student ID numbers and full names, should only be input when absolutely necessary. To determine when such processing is needed, a Data Protection Impact Assessment (DPIA) should be conducted to identify the purpose of the processing and then be reviewed by the DPO.

Educating users on how AI solutions will interact with student data

Educating the educators is just as important as educating students. Teaching staff should be trained on how to use AI solutions to model good practice and empower students to do the same. In addition, it may be helpful to provide training on why and how AI is being used. The DPO can also deliver training tailored to the school's or university's needs, ensuring that, where consent is required to process student data, that consent is informed.

Oversight and review of AI solutions

The summer months can be an opportune time to review AI use before the new academic year begins in September. Educational institutions should establish a clear AI governance framework that sets out who is responsible for decisions related to AI; for example, this could be a digital learning officer or a dedicated AI ethics committee. An AI Use Policy should set out acceptable and prohibited uses of AI, and vendor risk management procedures should include considerations unique to AI. Continuous review and monitoring will ensure accountability and flag any unexpected outcomes, such as bias or discrimination, so that they can be mitigated.

HewardMills supports organisations in the education sector and beyond with AI risk assessments, from the initial procurement phase through to ongoing review. As a global DPO, we are also equipped to help establish AI governance frameworks and to provide bespoke AI and privacy training.