The Online Safety Act 2023 is a piece of UK legislation that aims to protect children and adults online. It covers a wide range of issues including minimising the risk of children seeing harmful and age-inappropriate content, removing illegal content like child sexual abuse material (CSAM), criminalising fraudulent and scam ads, and introducing age verification for certain online services.
This new legislation has far-reaching implications, particularly for businesses offering age-restricted content. Gone are the days of simply stating content is "for mature audiences only" as companies now have a legal obligation to implement age verification checks before granting access to restricted content.
The following outlines a roadmap for navigating the Online Safety Act's age assurance requirements. We'll break down the specifics of the legislation and explore what it means for organisations across different sectors. Whether you're in gaming, entertainment or any other field handling sensitive content, this guide will equip you with a strong foundation of knowledge to confidently achieve compliance and responsible user access through robust age verification and Know Your Customer (KYC) checks.
Enacted in October 2023, the Online Safety Act is a piece of UK legislation intended to create a safer online environment for both children and adults. The Act implements a multifaceted approach to achieving this goal with a focus on:
Protecting children: Minimising the risk of children encountering age-restricted content is a central pillar of the Online Safety Act.
Combating illegal material: The Act prioritises the removal of illegal content, particularly child sexual abuse material (CSAM), by placing stricter reporting and takedown obligations on online platforms.
Tackling deceptive content: The Online Safety Act cracks down on fraudulent and scam advertisements, aiming to create a more trustworthy online space for users.
Age verification: A key aspect of the Act is the introduction of mandatory age assurance checks for specific online services that offer age-restricted content. This ensures users meet the minimum age requirement before being permitted access to such content.
As the appointed regulator, Ofcom is charged with ensuring affected companies are proactively assessing the risks of harm to their users and introducing safeguards (such as age verification checks) to protect them online.
To achieve this, Ofcom is implementing the Online Safety Act in phases:
Phase 1: Illegal harms duties (November 2023–present): This phase prioritises addressing the most severe online dangers, including terrorism, fraud and child sexual exploitation and abuse (CSEA). Ofcom has already published draft codes and guidance for these duties and is currently gathering feedback. Finalisation and parliamentary approval of the "illegal harm codes" are expected in Autumn 2024.
Phase 2: Child safety and vulnerable users (consultation expected mid-2024): This phase will focus on protecting children from legal but potentially harmful content, such as pornography, content related to suicide, self-harm and eating disorders, and content that may exploit or abuse women and girls. Ofcom has also published guidance on how to use age verification or age assurance to prevent children from accessing pornographic content.
Phase 3: Transparency and user empowerment (consultation expected mid-2024): This final phase focuses on additional duties for categorised services, including transparency reporting, user empowerment tools, tackling fraudulent advertising and upholding user rights. Consultation for this phase is expected to begin in mid-2024.
Ofcom has the authority to scrutinise companies' compliance measures. Failure to comply with the Act can lead to a range of penalties, including:
Financial penalties: Fines can reach up to 10% of a company's global annual turnover or a maximum of £18 million, whichever is greater.
Criminal action: Companies and senior managers failing to comply or disregarding Ofcom's requests may face criminal prosecution.
Platform shutdown: In extreme cases, Ofcom can instruct service providers to cease working with non-compliant websites, effectively hindering their revenue generation and accessibility within the UK.
The potential financial, regulatory and reputational risks associated with non-compliance are significant. As a result, businesses that proactively ensure compliance will benefit from stronger customer trust and a positive return on investment.
The Online Safety Act affects companies with substantial user bases in the UK; services accessible to UK users that pose potential online risks also fall under the Act's purview. This includes platforms facilitating user-generated content, such as:
Gaming operators
Video gaming sites
Dating sites
Ecommerce websites
Social media platforms
Adult websites
Ofcom estimates over 100,000 online services could be subject to the new regulations.
In short, the Online Safety Act seeks to establish a more secure online environment for all users, with a particular focus on protecting children. By understanding who falls under its umbrella and the expectations outlined, businesses can proactively adapt and better ensure compliance with this new regulatory landscape.
1. Risk assessments and proactive solutions: Regulated services must proactively assess the risks of harm to their users and introduce safeguards, such as age verification checks, to mitigate them.
2. Combating illegal content: Platforms face stricter reporting and takedown obligations for illegal content, particularly child sexual abuse material (CSAM).
3. Child safety measures: Services must minimise the risk of children encountering harmful and age-inappropriate content.
4. Transparency and reporting: Categorised services face additional duties, including transparency reporting and user empowerment tools.
The implications of the Online Safety Act are far-reaching, meaning many businesses who operate online will need to take proactive steps to achieve compliance. Discover what the Act means for businesses like yours in the section below:
In order to achieve compliance with the Online Safety Act, tech companies and social media platforms must take measures to keep users safe online.
To achieve this, they should consider developing systems for detecting harmful content, provide filtering tools that give users more control over the content they do and do not see, and enforce stricter age limits in order to protect young users. How organisations achieve this must be clearly explained in their terms of service. As the issue of cyberbullying continues to grow in scale, companies should also take steps to provide better protection – especially for children – from cyberbullying, online harassment, hate speech and exploitation.
The Act also introduces a number of new criminal offences, including cyberflashing.
On 31st January 2024, the first person in England and Wales was convicted of cyberflashing under the Online Safety Act.
According to research from the Children’s Commissioner, the average age at which children first see pornography is 13, with 38% of 16 to 21-year-olds having accidentally been exposed to pornographic content online.
To address these shocking statistics, the government has elected to require services that publish or permit pornography on their sites to implement robust age verification or age assurance measures to prevent underage users from accessing such content.
Such platforms will be held to a higher standard of accountability. If a service contains pornographic content — including video, images and audio — it must take steps to ensure children cannot access it. Ofcom has stated weaker methods will not be allowed, including self-declaration and online payment methods that don’t require ID, such as debit cards.
In its guidance, Ofcom has indicated facial age estimation and digital identity wallets can be used as highly effective methods to help protect children from accessing pornography online, and that both methods are more protective of privacy than uploading physical identity documents or credit cards. In fact, over 80% of adults select facial age estimation when given a choice of options.
Ninety-three percent of children in the UK play video games, a statistic that will come as little surprise to many. But with such a large percentage of children using video gaming platforms, it is likely many video gaming companies will be impacted by the Act and have to play their part in keeping children safe online.
Online video games fall within the scope of the Act if they allow players to interact with one another, for example through chat functions or user-generated content.
Like other companies impacted by the Online Safety Act, gaming platforms will need to comply with the general duties imposed on all regulated services. Gaming companies will also need to carry out child risk assessments to determine whether their game is accessed by children – whether that game is intended for children or not.
As a result, gaming companies will need to implement age assurance measures to better understand the real ages of their players – and not just the ages they may claim to be. Once this has been achieved, video gaming providers will then be able to deliver age-appropriate experiences in-line with the Act. This may include limiting certain features for specific age groups, such as voice chat or file sharing, or age-gating content that is deemed unsuitable for players under a certain age.
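To illustrate the kind of feature gating described above, here is a minimal sketch in Python. The feature names echo those mentioned in the text, but the age bands and thresholds are hypothetical assumptions, not values drawn from the Act or Ofcom guidance.

```python
# Hypothetical sketch of gating in-game features on a player's assured age.
# Feature names follow the examples in the text; the minimum ages are
# illustrative only, not regulatory values.

MIN_AGE_FOR_FEATURE = {
    "voice_chat": 16,      # assumed threshold for unmoderated voice chat
    "file_sharing": 16,    # assumed threshold for sharing files with other players
    "mature_content": 18,  # content age-gated to adults
}

def allowed_features(assured_age: int) -> set[str]:
    """Return the gated features a player may access, given the age
    established through an age assurance check (not self-declared)."""
    return {
        feature
        for feature, min_age in MIN_AGE_FOR_FEATURE.items()
        if assured_age >= min_age
    }
```

A 13-year-old would receive no gated features under this sketch, while a 17-year-old would unlock voice chat and file sharing but remain barred from mature content.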
Like other organisations impacted by the Act, gaming operators will need to comply with the general duties imposed on all regulated services. There will also be a requirement for operators to carry out child risk assessments to determine whether their services are accessed by children, and implement age assurance measures to ensure they know the age ranges of their players. Once this information has been gained, platforms can deliver age-appropriate experiences — which may involve age gating or limiting certain high-risk features for certain age groups or barring some players from their platforms entirely.
The ICO’s guidance on the issue indicates that if children are likely to access a service, even if it’s not designed for use by children, the site should introduce robust age assurance measures to conform with the standards in the Children’s Code. This sentiment is mirrored in Ofcom’s latest guidance, which states facial age estimation is a highly effective method of helping protect children from accessing age-restricted services and goods online. In fact, this method is favoured by over 80% of consumers, who select facial age estimation when given a choice of age verification options.
Ofcom, the enforcing authority under the Act, will scrutinise the measures taken by companies to adhere to the legislation, with non-compliance leading to the penalties outlined earlier, including substantial fines, criminal action and platform shutdown.
With these measures in mind, the financial, regulatory and reputational impacts of non-compliance cannot be overstated. To tackle this far-reaching issue and protect their customers from harm, operators should review their platforms, considering what percentage of users might be underage or examining evidence indicating underage people may be likely to engage with their sites. This will help ensure gaming operators are in a strong position to meet legislative requirements, enabling them to foster stronger customer relationships and provide an attractive return on investment.
The Online Safety Act also applies to online retailers selling age-restricted goods, such as knives or vapes. It is no longer sufficient to have users self-declare that they are old enough to access an age-restricted product or service; sites will be expected to take proactive measures to verify the age of their customers in order to achieve compliance with the Act.
Previously, many sites used details from bank accounts and payment cards to verify age, but this approach is not without its flaws. At a time when many underage users may use the payment details of a parent or guardian when making a transaction, ecommerce sites should consider capturing facial biometrics with an accurate age estimation feature at the point of transaction to verify age for age-restricted products. This will help ensure ecommerce businesses are in a strong position to meet legislative requirements and offer safer, more trusted transactions to their customers.
At TransUnion, we make trust possible by helping organisations confidently interact with genuine customers. Our TruValidate solutions encompass identity and device insights to help organisations confidently and securely engage consumers at each stage of the customer journey, helping improve conversions, reduce fraud losses and deliver enhanced, friction-right user experiences.
The Online Safety Act introduces new age verification requirements for a number of industries, but achieving compliance doesn’t have to come at the expense of user experience within already established platforms and websites.
Our TruValidate Document Verification and Facial Biometrics solution offers powerful age verification and estimation features that help you meet the Act's demands while ensuring a smooth onboarding and engagement process for your customers. The solution also allows for re-authentication at future trigger points, such as downloading a new age-restricted game or making a withdrawal, to ensure ongoing protection across the customer journey.
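The trigger-point idea above can be sketched as a simple event check. The event names come from the examples in the text; the session logic is a hypothetical simplification for illustration, not how any TruValidate product actually works.

```python
# Hypothetical sketch of re-authentication at sensitive trigger points.
# Trigger events follow the examples in the text; the session model is illustrative.

REAUTH_TRIGGERS = {
    "download_age_restricted_game",
    "make_withdrawal",
}

def needs_reauthentication(event: str, session_age_verified: bool) -> bool:
    """Require a fresh age/identity check at sensitive events, even when the
    session was already verified at onboarding; always re-check unverified sessions."""
    return event in REAUTH_TRIGGERS or not session_age_verified
```

The design choice here is that verification is not a one-off gate at sign-up: sensitive actions later in the customer journey prompt a fresh check regardless of session state.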
TransUnion's age assurance technology provides a user-friendly Know Your Customer solution that prioritises both robust safety measures and a positive customer experience, striking the delicate balance between safety and convenience.
Fulfilling the Online Safety Act's age verification requirements can be daunting. TransUnion's age assurance technology, embedded within TruValidate, offers a valuable evidentiary benefit. By implementing this technology, you proactively demonstrate your commitment to user safety. Employing both age estimation and age verification steps at multiple touchpoints showcases a clear effort toward compliance. This can be critical evidence in the event of future scrutiny, potentially mitigating penalties or reputational damage.
Contact a member of our team today to learn more about how TruValidate Document Verification and Facial Biometrics can help streamline your compliance efforts under the Online Safety Act and empower you to build a trusted online environment.
Alternatively, you can download our latest guide — Authenticating Digital Identities: Optimising Business Performance and Mitigating Fraud Risks — for a full view into how our solutions can help your business thrive in the age of online safety regulations.