Appropriate Filtering for Education Settings
Schools and colleges in the UK are required to establish appropriate levels of filtering to ensure children are provided with safe access to the internet without over-blocking. Schools and colleges in England must adhere to the Department for Education's Keeping Children Safe in Education statutory guidance; those in Wales are governed by the Welsh Government's Keeping Learners Safe; in Scotland the requirements are laid down by the Scottish Government's National Action Plan on Internet Safety for Children and Young People; and in Northern Ireland they are set out in the Department of Education's Safeguarding and Child Protection in Schools.
The guidance allows schools considerable freedom, to be exercised with a "risk based approach". Whilst schools benefit from the freedom they have been afforded, further guidance is essential to allow them to properly assess the risks and design appropriate policies. To this end, the UK Safer Internet Centre has issued detailed Appropriate Filtering for Education Settings guidance, which is cited by both Keeping Children Safe in Education and the National Action Plan on Internet Safety for Children and Young People as an example of what constitutes "appropriate filtering".
Although the guidance affords schools the freedom to design their own policies from scratch, we feel that both the Department for Education's Filtering and Monitoring Standards for Schools and Colleges and the UK Safer Internet Centre's standards should form the basis of all schools' filtering policies. Where schools feel the need to deviate from those standards, we strongly recommend that they complete a risk assessment so that the reasons for deviating and associated risks can be understood and documented.
We are committed to supporting schools in carrying out their safeguarding duties, and have outlined below how we meet these standards. Our official UK Safer Internet Centre certification is also available for download.
It is important to recognise that no filtering system can be 100% effective; filtering needs to be supported by good teaching and learning practice and effective supervision.
Illegal Online Content
Our Web Gateway and UTM online safety systems ensure that access to illegal content is blocked. The UK Safer Internet Centre advises that providers:
Aspect | Rating | Explanation |
---|---|---|
Are IWF Members | Pass | We have been IWF members since 2016. |
Block access to illegal Child Abuse Images (by actively implementing the IWF URL list) | Pass | The IWF Child Abuse Image Content URL list is integrated into the Child Abuse Images filtering category and we have successfully completed the IWF's certification process.
Our systems go beyond the basic protection by also utilising the IWF's keywords list and Non-Pornographic Child Abuse Images URL list. As well as directly blocking content that the IWF has listed, all of these resources are also used to dynamically identify and block offending content which has not yet been reported to the IWF. |
Integrate the ‘police assessed list of unlawful terrorist content, produced on behalf of the Home Office’. | Pass | The police assessed list of unlawful terrorist content, produced on behalf of the Home Office is integrated into the Radicalisation filtering category. |
Confirm that filters for illegal content cannot be disabled by the school | | We have always sought to give our customers as much control as possible over their own systems, so whether to enable or disable any filter is currently the school's choice. We would, however, advise that it would be negligent for a school to disable the illegal content filters, except as a temporary measure for debugging purposes.
In light of this new requirement, a prohibition on disabling the illegal content filters will be implemented in the coming months. |
Inappropriate Online Content
Recognising that no filter can be guaranteed to be 100% effective, the following table confirms and describes how Opendium Web Gateway and Opendium UTM manage the following content:
Content | Description | Rating | Explanation |
---|---|---|---|
Discrimination | Promotion of the unjust or prejudicial treatment of people on the grounds of race, religion, age, or sex. | Pass | We provide a Discrimination category which covers content that promotes the unjust or prejudicial treatment of people on the grounds of race, religion, age, or sex.
We also provide a Hate category which covers content promoting religious or racial hate. |
Drugs / Substance abuse | Promotion of the illegal use of drugs or substances. | Pass | We provide a Drugs category which covers content that promotes or facilitates recreational drug use, including "legal highs". This category does not include educational material about recreational drugs or information about medicinal drugs. |
Extremism | Promotion of terrorism and terrorist ideologies, violence or intolerance | Pass | We provide a Radicalisation category which covers radicalisation, extremism and terrorism. This includes the police assessed list of unlawful terrorist content, produced on behalf of the Home Office. |
Gambling | Enables gambling | Pass | We provide a Gambling category which covers online gambling web sites. This does not include information about offline gambling, such as instructions for card games, etc. |
Malware / Hacking | Promotion of the compromising of systems including anonymous browsing and other filter bypass tools as well as sites hosting malicious content. | Pass | We provide an Anonymisers / Proxies / VPNs filtering category to control anonymous browsing systems which could be used to bypass filtering and monitoring.
We also provide a Cracking category, which covers information about how to gain illicit entry to computer systems, and a Malware category, which covers malware, spyware, viruses and URIs related to their operation; the latter also aims to catch adverts designed to trick users into downloading malware. |
Pornography | Sexual acts or explicit images. | Pass | We provide a Pornography category which covers pornographic content and erotic text. This does not include non-sexualised images (e.g. medical information).
We also provide a Sexualised Text filtering category which covers textual content which is sexual in nature but falls short of being considered pornographic. |
Piracy and copyright theft | Illegal provision of copyrighted material. | Pass | We provide a Copyright Infringement category which covers content that promotes and facilitates illegal downloading of copyrighted content, such as software, music, movies, etc. |
Self Harm | Promotion or display of deliberate self harm (including suicide and eating disorders). | Pass | We provide a Self Harm category which covers content that promotes self harm and suicide. |
Violence | Promotion or display of the use of physical force intended to hurt or kill. | Pass | We provide a Violence category which covers content that promotes violent acts. |
This list is not exhaustive. We maintain a selection of predefined categories, and updates to the categorisation criteria are downloaded every hour. Websites and web searches are categorised using a variety of methods, including through a database of known web addresses and by real time content analysis. By analysing content on the fly, the system can effectively filter new content and websites that tailor dynamic content to the individual user, such as social networking sites. School system administrators can add filtering criteria to the categories to either augment or override the predefined criteria. School administrators can also add their own custom categories.
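As a purely illustrative sketch of how a database of known web addresses and real-time content analysis can be combined, the Python example below categorises a request by first consulting a URL database and then scoring the page text against per-category keywords. The category names, keyword weights, threshold and function names are hypothetical and are not taken from our products.

```python
# Illustrative sketch only: combining a known-URL database with real-time
# content analysis to categorise a request. All names and values are
# hypothetical examples, not Opendium's actual implementation.

URL_DATABASE = {
    "example-gambling-site.test": {"Gambling"},
    "example-proxy.test": {"Anonymisers / Proxies / VPNs"},
}

KEYWORD_WEIGHTS = {
    "Drugs": {"buy cannabis": 3, "legal highs": 2},
    "Self Harm": {"how to self harm": 5},
}

def categorise(hostname: str, page_text: str, threshold: int = 3) -> set[str]:
    """Return the set of categories matched for a request."""
    # 1. Fast path: look the hostname up in the predefined URL database.
    categories = set(URL_DATABASE.get(hostname, set()))

    # 2. Real-time analysis: score the page text against per-category keyword
    #    weights so that new or dynamically generated content is still caught.
    text = page_text.lower()
    for category, keywords in KEYWORD_WEIGHTS.items():
        score = sum(weight for phrase, weight in keywords.items() if phrase in text)
        if score >= threshold:
            categories.add(category)
    return categories

print(categorise("example-gambling-site.test", "Try these legal highs and buy cannabis online"))
```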
Data Protection
Opendium Web Gateway and Opendium UTM are on-premises systems. These systems store internet history data on the school's server. By default, log data, including the user's identification, is retained for 2 years, but the retention period can be adjusted to meet the school's needs.
Internet history data that is stored on our internal systems will be retained for no longer than 3 years. This includes any log extracts, reports, etc. that the school may need to send to our technical support team.
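As an illustration of how a configurable retention period might be applied to on-premises log data, the sketch below deletes log entries older than a cut-off date. The database, table and column names are hypothetical and do not reflect our products' internal storage.

```python
# Illustrative sketch only: pruning internet history logs older than a
# school-configured retention period. Schema and names are hypothetical.
import sqlite3
from datetime import datetime, timedelta

RETENTION_DAYS = 365 * 2  # default retention of 2 years, adjustable per school

def prune_logs(db_path: str, retention_days: int = RETENTION_DAYS) -> int:
    """Delete log rows older than the retention period; return rows removed."""
    cutoff = datetime.utcnow() - timedelta(days=retention_days)
    with sqlite3.connect(db_path) as db:
        cursor = db.execute(
            "DELETE FROM web_log WHERE request_time < ?",
            (cutoff.isoformat(),),
        )
        return cursor.rowcount
```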
Some filtering providers rely on contractual clauses that place an onus on schools to ensure that they do not pass on personal data to the provider. We strongly believe that it is not possible to provide the level of support that schools expect whilst adhering to those restrictions, and they ultimately lead to data protection law being routinely broken, with the school carrying the liability. Instead, we provide schools with a standard data processing agreement, which allows us to better support the school whilst ensuring that the personal data is properly protected and that the relevant legislation can be adhered to.
All schools should have a suitable data processing, or data sharing, agreement with any third parties that have access to personal data, including the company that supports their filtering system and any outsourced ICT provider, to ensure that personal data is always handled in a secure and legal way.
Over Blocking
Opendium Web Gateway and Opendium UTM give school administrators considerable scope to tune the system to meet their needs. The sensitivity of the filters can be adjusted, and administrators can decide whether or not repeat offenders should have their web access automatically disabled. Miscategorised websites can be manually recategorised instantly, or the filters can be completely disabled for educational websites. Users can be given the option to override the filters after being shown a warning, and can report miscategorised pages directly to us for recategorisation.
The systems can generate real-time alerts for concerning behaviour, ensuring early intervention from staff in the most serious circumstances, and comprehensive reports can be generated on an automatic or ad-hoc basis so that staff can spot and follow up on concerning behaviour.
Our systems also support Location Aware Filtering, which can be used to relax filters in supervised parts of the school, or in classrooms that have specific requirements.
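The sketch below illustrates the general idea behind location aware filtering: the policy applied to a request is chosen according to the network location it originates from. The subnets and policy names are hypothetical examples, not our actual configuration format.

```python
# Illustrative sketch only: choosing a filtering policy based on where on the
# network a request originates, so that supervised locations can be given more
# relaxed filtering. Subnets and policy names are hypothetical.
from ipaddress import ip_address, ip_network

LOCATION_POLICIES = [
    (ip_network("10.10.1.0/24"), "supervised-classroom"),  # relaxed filtering
    (ip_network("10.10.2.0/24"), "boarding-house"),        # stricter, out-of-hours rules
]
DEFAULT_POLICY = "whole-school"

def policy_for(client_ip: str) -> str:
    """Return the filtering policy name for the client's network location."""
    addr = ip_address(client_ip)
    for subnet, policy in LOCATION_POLICIES:
        if addr in subnet:
            return policy
    return DEFAULT_POLICY

print(policy_for("10.10.1.25"))  # -> supervised-classroom
```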
Schools may decide that, for some categories, rather than risk over-blocking it is better to allow access and to follow up concerning behaviour that is highlighted by the reporting system. A variety of reporting tools are provided to facilitate this, such as real-time alerts and our unique Word Cloud report, which flags search phrases that fall into concerning categories and gives staff an easy, understandable way to drill down into the data.
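The following sketch illustrates the kind of aggregation that could sit behind a word cloud style report: search phrases that fall into concerning categories are counted so that the most frequent phrases stand out. The data shapes and category names are hypothetical.

```python
# Illustrative sketch only: counting search phrases that fall into concerning
# categories, the sort of data a word-cloud report could be built from.
from collections import Counter

CONCERNING_CATEGORIES = {"Self Harm", "Drugs", "Radicalisation"}

def flagged_phrase_counts(search_log: list[dict]) -> Counter:
    """search_log items look like {'user': ..., 'phrase': ..., 'categories': {...}}."""
    counts = Counter()
    for entry in search_log:
        if entry["categories"] & CONCERNING_CATEGORIES:
            counts[entry["phrase"].lower()] += 1
    return counts

log = [
    {"user": "pupil1", "phrase": "how to self harm", "categories": {"Self Harm"}},
    {"user": "pupil2", "phrase": "chemistry homework", "categories": set()},
    {"user": "pupil1", "phrase": "how to self harm", "categories": {"Self Harm"}},
]
print(flagged_phrase_counts(log).most_common(5))
```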
Filtering System Features
The following table describes how Opendium Web Gateway and Opendium UTM meet the principles set out by the UK Safer Internet Centre:
Principle | Rating | Explanation |
---|---|---|
Context appropriate differentiated filtering, based on age, vulnerability and risk of harm – also includes the ability to vary filtering strength appropriate for staff | Pass | Opendium Web Gateway and Opendium UTM both integrate with the school's existing user directory and provide a hierarchical system to configure and refine filtering policies, filter sensitivity and real-time alert triggers on a per-usergroup, per-network and per-user basis. |
Circumvention – the extent and ability to identify and manage technologies and techniques used to circumvent the system, specifically VPN, proxy services and DNS over HTTPS. | Pass | Opendium Web Gateway and Opendium UTM provide a variety of tools to prevent circumvention of the system:
We provide an Anonymisers / Proxies / VPNs category to control anonymous browsing systems. Both Opendium Web Gateway and Opendium UTM incorporate anti-spoofing technologies and utilise deep packet inspection to restrict VPN connections whilst allowing other applications. Opendium UTM provides additional protection by providing numerous predefined firewall rule bundles for common applications, which utilise deep packet inspection to prevent VPN connections from misusing ports that are required by legitimate services. Our online safety systems do not rely on DNS filtering, so are unaffected by technologies such as DNS-over-HTTPS (DoH) and DNS-over-TLS (DoT). Opendium UTM also performs DNS and NTP interception to prevent VPNs from taking advantage of these important ports without getting in the way of legitimate systems that rely on them. New VPNs are appearing all of the time and use a wide variety of techniques to mask their traffic. It is important for schools to understand that no system can block them with 100% accuracy, but we work closely with schools to rapidly provide a solution whenever a new threat is identified. |
Control – has the ability and ease of use that allows schools to control the filter themselves to permit or deny access to specific content. Any changes to the filter system are logged, enabling an audit trail that ensures transparency and that individuals are not able to make unilateral changes | Pass | The web based user interface allows school administrators to adjust settings from anywhere in the school, with immediate effect. All customers have direct access to our experienced engineers, who endeavour to provide high quality telephone and email support.
Any changes to the system's configuration are recorded in an audit log, and comments can be attached to most configuration items so that they can be documented and understood at a later date. |
Contextual Content Filters – in addition to URL or IP based filtering, the extent to which (http and https) content is analysed as it is streamed to the user and blocked, this would include AI generated content. For example, being able to contextually analyse text on a page and dynamically filter. | Pass | Real time content analysis has been a core part of our filtering technology from its inception.
A URL filter can tell that a user is looking at an online messaging forum, for example, but not that the specific message that they are looking at is extremist or promoting drug use. Nor can a URL filter spot when a legitimate website has recently been hacked and now contains links to pornographic websites. So much of the modern web is made up of dynamic content that a filter cannot be fit for purpose if it is unable to analyse content in real time to catch these types of scenario. We use a combination of techniques to categorise content, including HTTPS decryption, content analysis and URL lists to provide the most accurate filtering. |
Filtering Policy – the filtering provider publishes a rationale that details their approach to filtering with classification and categorisation as well as over blocking | Pass | Our filtering rationale is described in our knowledgebase. A description for each category, outlining the categorisation criteria, is provided through the system's user interface. |
Group / Multi-site Management – the ability for deployment of central policy and central oversight or dashboard | | Opendium Web Gateway and Opendium UTM are designed for single-school installations and we therefore do not provide multi-site management. However, individual systems can be independently managed remotely from anywhere in the world.
We expect to provide a comprehensive multi-site management solution in the future. |
Identification - the filtering system should have the ability to identify users | Pass | Opendium Web Gateway and Opendium UTM both support a variety of user identification methods, such as Kerberos single sign on for workstations and RADIUS accounting, WISPr and captive portal for mobile devices / BYOD. |
Mobile and App content – mobile and app content is often delivered in entirely different mechanisms from that delivered through a traditional web browser. To what extent does the filter system block inappropriate content via mobile and app technologies (beyond typical web browser delivered content). Providers should be clear about the capacity of their filtering system to manage content on mobile and web apps | Pass | By providing a comprehensive transparent proxy service with HTTPS decryption, Opendium Web Gateway and Opendium UTM both allow the school to control apps that communicate using HTTP and HTTPS, and these comprise the vast majority of apps. Where apps have been designed to disallow active HTTPS decryption, the app can still be identified and either allowed or blocked, by means of passive inspection.
A minority of apps use entirely different delivery mechanisms, and Opendium Web Gateway provides a firewall that can control these on a per-network basis. Opendium UTM extends this capability to allow fine grained control over these apps by user group or individual user, in a similar way to web traffic. |
Multiple language support – the ability for the system to manage relevant languages | Pass | The use of a wide variety of categorisation methods makes the system largely language agnostic, filtering both English language and foreign language websites alike.
Our textual content analysis system uses Unicode to support all languages and character sets. |
Network level - filtering should be applied at ‘network level’ ie, not reliant on any software on user devices whilst at school (recognising that device configuration/software may be required for filtering beyond the school infrastructure) | Pass | Opendium Web Gateway and Opendium UTM both provide network level filtering and do not require software to be installed on user devices. This is provided through a combination of deep packet inspection, transparent proxying and both active HTTPS decryption and passive HTTPS inspection. |
Remote devices – with many children and staff working remotely, the ability for school owned devices to receive the same or equivalent filtering to that provided in school | Pass | Remote devices can be configured to route their network traffic via the school's Opendium UTM through a secure VPN. Children and staff working from home can therefore receive the same level of filtering whether they are at home or on the school's premises, as well as being able to interact with other on-premises services as if they were physically at school. |
Reporting mechanism – the ability to report inappropriate content for access or blocking | Pass | When access to a website is blocked, the user is given an option to report a miscategorisation of the website directly to us. All reported web sites are manually examined and, if necessary, recategorised.
We also take underblocking very seriously and welcome reports of such instances. We continually work with customers to address any concerns and improve the accuracy of the filters. |
Reports – the system offers clear historical information on the websites users have accessed or attempted to access | Pass | Opendium Web Gateway and Opendium UTM keep historical logs and can generate a variety of reports to allow staff to drill down into the data.
Additionally, the systems can be configured to automatically alert relevant staff, in real time, to any seriously concerning behaviour. |
Safe Search – the ability to enforce ‘safe search’ when using search engines | Pass | Opendium Web Gateway and Opendium UTM can be configured to enforce Safe Search on a variety of search engines, as well as Restricted Mode on YouTube. With YouTube Restricted Mode enforced, schools can delegate to specific staff members the ability to white list additional videos through their Google dashboard. |
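As an illustration of the Safe Search principle above, the sketch below shows two commonly documented enforcement mechanisms: forcing Google's safe=active search parameter, and injecting the YouTube-Restrict header that Google publishes for network administrators. This is a sketch of the general technique only, not a description of how our products implement it.

```python
# Illustrative sketch only: commonly documented Safe Search / Restricted Mode
# enforcement, expressed as simple URL and header rewrites at a proxy.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def enforce_safe_search(url: str) -> str:
    """Force the 'safe=active' parameter onto Google search queries."""
    parts = urlsplit(url)
    if "google." in parts.netloc and parts.path.startswith("/search"):
        query = dict(parse_qsl(parts.query))
        query["safe"] = "active"
        parts = parts._replace(query=urlencode(query))
    return urlunsplit(parts)

def enforce_youtube_restricted(request_headers: dict, host: str) -> dict:
    """Inject Google's documented 'YouTube-Restrict' header for YouTube hosts."""
    if host.endswith("youtube.com") or host.endswith("googlevideo.com"):
        request_headers = {**request_headers, "YouTube-Restrict": "Strict"}
    return request_headers

print(enforce_safe_search("https://www.google.co.uk/search?q=example"))
```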
Supporting Schools
Filtering systems are only ever one tool in helping to safeguard children online, and schools have an obligation to “consider how children may be taught about safeguarding, including online, through teaching and learning opportunities, as part of providing a broad and balanced curriculum”. Our products have always been developed hand-in-hand with schools. Schools are on the front line and are best placed to know what tools they need, and we always try to listen and to develop those tools.
We provide a holistic service which goes above and beyond filtering. This includes training and advice for school IT and safeguarding staff, and consultancy services to improve schools' network infrastructure to cater for their ever changing requirements. However, we will never pressure schools into purchasing additional services and are equally happy to work with third parties to bring about any infrastructure improvements that our customers require.
We also run webinars from time to time, to help schools to better understand their obligations and how to improve the safety of the school environment. Many of these events, such as our recent "Online Safety for Boarding Schools" webinar, are not specific to our products and are open to all schools to attend at no cost.
Capacity
Schools are now expected to ensure that there is sufficient capability and capacity in those responsible for, and those managing, the filtering system (including any external support provider).
All customers have direct access to our experienced engineers, through both email and telephone. As we recognise that school ICT staff are extremely busy and don't have time to wait in a telephone queue, we do not employ a queuing system. Instead, we endeavour to ensure that we have enough capacity to answer the vast majority of calls immediately, and on the infrequent occasions when all of our staff are busy, customers are invited to leave a voicemail and are called back as soon as possible.
To help schools evaluate our capacity, and to underscore our commitment to high quality customer support, we are pleased to publish the following customer support statistics for the period 1st June 2022 - 1st June 2023:
- Telephone support:
- Proportion of telephone support calls answered directly, rather than going to voicemail: 96%.*
- Average time to respond to voicemails: 1 working hour, 13 minutes.†
- Average time to respond to urgent calls: 23 minutes.†
- All support:
- Average time to resolution: 2 working days, 5 hours, 13 minutes.
* Excludes voicemails which were left outside of our "standard support" hours (09:00 - 17:00 Monday - Friday).
† Voicemail response times are measured from when our staff annotate the support ticket, which usually happens shortly after they have responded to the voicemail; this figure is therefore an overestimate.
Certification Declaration
In order that schools can be confident regarding the accuracy of the self-certification statements, we confirm:
- that our self-certification responses have been fully and accurately completed by a person or persons who are competent in the relevant fields
- that we will update our self-certification responses promptly when changes to the service or its terms and conditions would result in the existing compliance statement no longer being accurate or complete
- that we will provide any additional information or clarification sought as part of the self-certification process
- that if at any time, the UK Safer Internet Centre is of the view that any element or elements of our self-certification responses require independent verification, we will agree to that independent verification, supply all necessary clarification requested, meet the associated verification costs, or withdraw our self-certification submission.