The Safer Technologies for Schools (ST4S) assessment evaluates digital products and services used in educational environments, ensuring they meet high standards for security, privacy, and interoperability. The following categories of services are not assessed by ST4S because of their specialised nature, because the framework does not yet have controls covering their features or functions, or because the service or any of its functions has been deemed high risk or non-compliant.
Use cases on this list are subject to change as the ST4S framework evolves and new modules are implemented to accommodate them. However, not every use case on this list will necessarily be removed or adjusted.
General Exclusions #
Self-Hosted Services:
- Software hosted and managed by the school on their own cloud service provider or local infrastructure. However, if a service is self-hosted but fully managed by the service provider (including security scanning, support, and backups), it may be considered for assessment.
Networking Products or Services:
- Firewalls.
- Internet filters, including internet and web browsing monitoring tools.
- VPNs, proxies, and other networking services.
Monitoring / Surveillance:
- Services, applications, or devices that monitor or conduct surveillance of students, including tracking online activity, device behaviour, or physical locations. Examples include browser extensions, installable applications, and other tools that record, monitor, or log students’ interactions across various services, websites, and device activities.
Remote Device Controlling Software:
- Any software that allows for remote access and control of student or teacher devices.
Password Managers and Credential Storage Services:
- Services specifically for managing and storing passwords and other credentials.
Public Social Media Networks and Services:
- Social media platforms not specifically designed for school use (e.g., Facebook, Twitter, Discord). Forums and social media platforms offering schools their own tenancy and isolated environment may be considered.
Human Resource Services:
- Services such as reference checks or background checking services.
Purchase Management Software, Vendor Management, and Procurement Software:
- Software related to purchasing and vendor management not directly related to educational services.
Physical Building Management Services:
- Services such as physical security systems, smoke alarms, surveillance cameras, and CCTV hardware.
Professional Services:
- Consultants and agencies delivering professional services without a specific application that students, teachers, or parents register and log into.
Services with Unrestricted Communication Features:
- Services with communication features accessible to the general public, which a school cannot restrict or limit to their school or classroom. Examples include multiplayer games or social media services that do not provide an administrator account for schools to control student communication and account discovery or privacy features.
Privacy and Ethics:
- Services which may pose a risk to student privacy, online safety, and human rights/ethics as determined by the ST4S Team and/or ST4S Working Group.
Artificial Intelligence (AI) Exclusions #
The following exclusions apply only to AI systems, models, features and functionality.
Designed to Process Personal Information:
- Designed to process personal information (e.g. the service prompts or requests student names, gender, racial/ethnic origin or other personal information including sensitive information). Information must be de-identified before being exposed to an AI model.
- Speech to text (STT) or text to speech (TTS) services that use a machine learning (ML) or AI model may be granted an exemption for accessibility reasons. Further information is provided under the exemptions section on this page.
Data Profiling:
- Processes data that may be used to create profiles of individuals or groups, where those profiles are used to further develop the model or for other purposes not within the primary interest of the school.
- This does not include services which may form a profile of a student to enhance learning experiences or recommend learning pathways (e.g. adaptive learning), provided the profiles are exclusively for that purpose and the benefit of the student or school.
Biometric and Attribute Processing:
- Processing biometrics, human attributes, motions, metrics or attributes whether physical or mental (e.g. facial recognition and scanning, eye tracking, detecting movement, determining or predicting emotions, student disability, learning difficulties etc.).
- Services which may produce a digital recreation of a person (e.g. voice cloning).
- Exemptions may be considered for speech to text (STT), text to speech (TTS) and on device biometric authentication.
Student Monitoring:
- Student monitoring, behaviour management and observation services. This includes monitoring activity across the internet and web browsing behaviour.
- Services which monitor physical or mental attributes or behaviour such as analysing emotions, level of concentration in the classroom etc.
Administrative Decision Making:
- Administrative support and decision making (e.g. automatically vetting enrolment or scholarship applications, complaint handling, disciplinary action, reviewing a benefit or award etc).
Not Suitable for School or Educational Use:
- General chat bots used by students that have not been designed for school or educational use and/or do not offer controls or features for schools to restrict and limit student access. Chat bots are assessable provided they are designed with students and educational use in mind.
- Other use cases of AI that may not be suitable for school or educational use.
NSFW AI Models:
- Utilising NSFW AI models, or producing or outputting NSFW content or any other content that may be objectionable or deemed offensive by a reasonable member of the school community (this includes text-based models).
Health and Wellbeing Data Processing:
- Applications or services which process health and wellbeing data (e.g. physical or mental health, fitness, meditation, food consumption, body metrics etc).
- Services which may provide or output medical, health (including mental health) or wellbeing guidance or advice. This includes information that may be reasonably interpreted by an end user including a young person to be health or wellbeing guidance or advice.
Prohibited Uses:
- Prohibited uses as defined under the European Union’s Artificial Intelligence Act (the EU’s AI Act).
Screen Monitoring:
- Monitoring or recording screen or browsing activity that cannot be controlled by the student or end user of the device.
- Examples include connecting to Microsoft SharePoint, Department systems, etc. in order to ingest information, documents and other content.
Ethical and Safety Concerns:
Any application of AI that may be considered by the ST4S Team, a ST4S Working Group member or a reasonable member of the school community to be:
- Invasive
- Unethical
- A risk to safety, human rights or privacy
- Not within the best interests of the student and/or school
Artificial Intelligence (AI) Exemptions #
As we assess more services and guidance is released by industry bodies, including the Privacy Commissioner, we may update the exclusions list or add exemptions here.
The use cases below are not guaranteed an exemption and require an initial inspection by the assessment team to confirm the service can be assessed. Please contact the team using the contact form on our website.
Exemptions are only considered for limited use cases, which currently relate to accessibility.
Speech to Text (STT) and Text to Speech (TTS):
- Services using machine learning (ML) or AI models to provide captioning, voice assisted typing and other similar accessibility use cases.
- Services must ensure they continue to follow other ST4S requirements in ensuring transparency in the privacy policy and sub processor lists where required.
- Services which retain information or process information to improve STT or TTS for end users (e.g. personalising and developing a unique profile for the user that recognises their voice) must provide clear mechanisms to remove or delete such profiles or templates.
References and Definitions #
NSFW is a general term used to describe any content that may be deemed inappropriate to create, view or access in the workplace or at school. NSFW content includes content that may be deemed inappropriate for younger audiences (e.g. persons under 18).
High risk and prohibited use cases as defined under the EU AI Act are summarised here.
Notes #
Information published on this page is subject to change. As new use cases are evaluated and the framework is updated, this list will change.