Artificial intelligence (AI): Dos and don'ts for data protection

Take a look at our tips to make sure you stay compliant with data protection requirements when using artificial intelligence (AI) tools. Share them with your team so everyone is on the same page.

Last reviewed on 25 May 2023
School types: All | School phases: All | Ref: 46617
Contents
  1. Don't enter sensitive information into a tool you don't trust
  2. Don’t click on suspicious links or reply to an email you don’t trust
  3. Do make sure staff training is up to date
  4. Do share AI privacy information with pupils
  5. Do update your data protection policy 
  6. Do look out for updates from your suppliers
  7. Do think about the impact of new apps and services that use AI

Don't enter sensitive information into a tool you don't trust

You shouldn’t enter sensitive data into a generative artificial intelligence (AI) tool if you’re not sure how it will use or store that data. Many of these tools are available as personal or consumer products, so they may not meet the legal requirements for data handling.

Technology platforms and products (such as MIS and cloud storage) increasingly use AI. However, many of these are designed for organisational use and will comply with your data handling requirements.

You should avoid entering data into:
  • Consumer products that aren't designed for sharing institutional data
  • Tools that don't align with your data handling processes, for example those that allow your data to be used for AI training
See the section below for advice on making sure your current suppliers remain compliant.

Personal data: information from which someone can be directly identified, or