No matter what side of the political spectrum you fall on, we can all agree that keeping children safe online should be a top priority. But if the recent reaction to the Online Safety Act is anything to go by, this is easier said than done.
From the MAGA base labelling it the “UK’s online censorship law” to everyday users raising concerns about where their sensitive information is being stored (or who it’s being sold to), it’s clear the Online Safety Act is still flawed – even if it may have been born from good intentions.
Feeling overwhelmed by the discourse? We get it. Let’s explore what the Online Safety Act is and what it means for you.
What is the Online Safety Act, and which sites are affected?
Passed in 2023 and enforced by Ofcom, the Online Safety Act aims to make the internet safer for everyone, but especially children. From 25th July 2025, the Children's Code, introduced under the Act, requires platforms and websites to prevent young people from seeing harmful content relating to suicide, self-harm, eating disorders, and pornography.
While this kind of content can be found across social media, the Code notably affects porn sites. Pornhub, one of the most infamous platforms, now presents an age verification prompt automatically whenever users attempt to enter the site. Anyone wanting to view its content has to upload a photo of a valid ID – or simply use a VPN to bypass these checks altogether.
The financial consequences are already being felt. Pornhub recently announced that it would be pulling out of France – its second-biggest market – after similar age verification rules took effect there.
Censorship or safety?
But not all the content being blocked is as clear-cut. Some social media users have reported that informative posts – for example, non-graphic content about Gaza, Ukraine, parliamentary debates on grooming gangs, and even an image of Francisco de Goya’s famous painting, “Saturn Devouring His Son” – now require similar ID checks to view.
Instances like this highlight how quickly the Online Safety Act can turn from helpful digital protector to censor. Sandra Wachter, Professor of Technology and Regulation at the Oxford Internet Institute, told BBC Verify: “[The new bill was] not supposed to be used to suppress facts of public interest, even if uncomfortable.”
But perhaps it’s unsurprising that many organisations are exercising caution. Any platform found to have failed to prevent underage users from viewing harmful content can be fined up to £18 million or 10% of its global revenue – whichever is greater.
For many campaigners, however, the new Children’s Code has been a cause for celebration – and some believe the measures still don’t go far enough to protect young people. Families who have lost children to online bullying and dangerous social media challenges have become some of the loudest voices for stricter rules, with many calling for under-16s to be banned from social networking sites altogether.
Where is the data going?
Most of the criticism doesn’t stem from a disregard for children’s safety. It’s about the lack of transparency from the platforms collecting sensitive data from their users. Are the images of driving licences, passports, and other ID cards deleted once the age check is complete, or are they stored? And if they are stored, what’s stopping a company from selling that data on to other platforms, as has become common practice in recent years?
It’s these unanswered questions that have left many people suspicious of the Online Safety Act – and why some users wishing to access blocked content have simply turned to VPNs. As one user quoted in The Guardian put it: “We can choose which VPN to use, but the choice of which age assurance technology to use is with the platform.”
Others have rightly pointed out that the Act could lead to Ofcom exerting too much control over independent sites. In theory, any website that hosts user-generated content – whether a discussion forum or a volunteer-run blog with an open comment section – is subject to the same checks. The difference is that if these smaller sites are found to be in breach of the Online Safety Act, they won’t have the legal backing that huge corporations like Reddit, X, and Pornhub can call on.
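For a small site owner, being “subject to the same checks” could in practice mean putting an interstitial gate in front of user-generated areas. The sketch below is purely illustrative, assuming an Express-based site: the /forum route, the /verify-age page, and the ageVerified cookie are hypothetical placeholders, not an approved age-assurance method.

```typescript
// Minimal sketch of an age gate for user-generated content (hypothetical).
// A real deployment would hand off to an accredited age-assurance provider
// rather than relying on a self-set cookie.
import express, { Request, Response, NextFunction } from "express";
import cookieParser from "cookie-parser";

const app = express();
app.use(cookieParser());

// Middleware: only let visitors flagged as age-verified reach gated routes.
function requireAgeVerification(req: Request, res: Response, next: NextFunction) {
  if (req.cookies?.ageVerified === "true") {
    return next();
  }
  // Send unverified visitors to an interstitial check first.
  res.redirect("/verify-age");
}

// Hypothetical interstitial – this is where a third-party age check would run.
app.get("/verify-age", (_req: Request, res: Response) => {
  res.send("Age verification placeholder – integrate an age-assurance provider here.");
});

// User-generated content (forum, open comments) sits behind the gate.
app.get("/forum", requireAgeVerification, (_req: Request, res: Response) => {
  res.send("Forum content for verified visitors only.");
});

app.listen(3000, () => console.log("Listening on http://localhost:3000"));
```

Notably, a setup like this only answers the “who gets in” question – it says nothing about how the verification data itself is handled, which is exactly the transparency concern raised above.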
Secure your content, secure your website
We understand if you’re worried about what the changes under the Online Safety Act could mean for your website. Despite initial reactions, it’s not as simple as age checkers popping up on porn sites – the depth and breadth of content that stands to be affected by the Children’s Code is far less clear-cut.
At Fasthosts, we have over 25 years of experience navigating the complexities of compliance on the world wide web. Our expert tech support team is here to answer your burning questions – and ensure your website is protected against cyber vulnerabilities at the same time. Get in touch to chat today.