Digital Rights & Technology · 20 March 2026 · 12 min read

Websites Pulling Out of the UK

The Online Safety Act is forcing websites to exclude visitors from the UK

✍️ By UKPoliticsDecoded Editorial Team

A growing number of websites, from small community forums to educational platforms and niche social spaces, are blocking UK users, shutting down entirely, or removing interactive features. This is not a coordinated protest. It is a structural response to the legal, financial, and technical burdens created by the Online Safety Act (OSA).

The Act applies to any site that allows user-generated content, regardless of size, purpose, or risk profile. That includes hobbyist forums, comment sections on blogs, educational communities, support groups, creative platforms, open-source projects, non-profits, and volunteer-run sites.

For many, compliance is simply not feasible.

Key Facts at a Glance

  • The Online Safety Act applies to any site with user-generated content, regardless of size or purpose
  • Penalties can reach £18 million or 10% of global turnover
  • Microcosm, a free hosting service supporting over 300 community websites since 2007, shut down rather than attempt compliance
  • Ofcom fined 4chan £520,000 for failing to implement age assurance and conduct proper risk assessments
  • Data shows some unregulated offshore sites saw traffic increases of over 800% after OSA age verification requirements came into force
  • Nearly half of UK porn watchers accessed unrestricted sites after the OSA took effect, according to research by the Lucy Faithfull Foundation

Why Websites Are Withdrawing

Disproportionate Compliance Duties

The OSA treats a two-person forum the same as a global social network. All must conduct illegal content risk assessments, implement content moderation duties, carry out age assurance checks, establish reporting and governance processes, and update their terms of service in line with Ofcom codes.

For small operators, this is unmanageable. Dee Kitchen, the operator of Microcosm, a free hosting service that had supported over 300 community websites since 2007, described the law as "too vague and too broad," adding that the personal risk of compliance was simply too great. The service shut down rather than attempt to navigate the Act's requirements.

Vague Legal Definitions

Key obligations rely on undefined or subjective terms such as "reasonable steps," "harmful content," and "highly effective age assurance." When penalties can reach £18 million or 10% of global turnover, and when senior managers can face personal criminal liability, that ambiguity becomes a risk in itself. For volunteer administrators and small non-profits, the personal exposure alone is enough to prompt closure.
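The penalty ceiling is the greater of the two figures, which means the cap scales with company size but never drops below £18 million, a floor that dwarfs the entire budget of most small sites. A quick sketch of how the cap works (the turnover figures below are invented for illustration):

```python
def osa_max_penalty(global_turnover_gbp: float) -> float:
    """Maximum OSA fine: the greater of £18 million or
    10% of global annual turnover (illustrative model)."""
    return max(18_000_000.0, 0.10 * global_turnover_gbp)

# Hypothetical firms, purely for illustration:
small_firm = osa_max_penalty(50_000_000)      # 10% would be £5m, so the £18m floor applies
large_firm = osa_max_penalty(5_000_000_000)   # 10% of turnover dominates: £500m
print(f"small firm cap: £{small_firm:,.0f}")
print(f"large firm cap: £{large_firm:,.0f}")
```

For a volunteer-run forum with effectively zero turnover, the £18 million floor still applies, which is why the ambiguity of the duties translates directly into existential personal risk.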

Age Verification Requirements

Sites that contain or could contain adult or sensitive content must implement intrusive age checks. For many platforms, this means collecting sensitive user data, paying third-party verification providers, and redesigning entire systems. Some choose the simpler option: block the UK.

The Cost of Age Verification Systems

A major driver behind websites withdrawing from the UK is the financial burden of age verification (AV) requirements. The Online Safety Act requires sites hosting adult or potentially sensitive content to implement "highly effective" age assurance systems. In practice, this means outsourcing verification to specialist third-party companies.

For large platforms, this is expensive but manageable. For small websites, communities, and independent creators, it is often prohibitively costly.

AV providers typically charge for identity checks, document verification, biometric analysis, ongoing compliance reporting, and integration and maintenance. Even modest per-user fees become unsustainable for small sites with low margins or volunteer administrators.
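To see why per-user fees bite, consider a back-of-the-envelope estimate. All figures below are assumptions for illustration, not quoted provider rates:

```python
def annual_av_cost(per_check_fee: float, checks_per_month: int,
                   monthly_platform_fee: float) -> float:
    """Estimate yearly spend on an outsourced age-verification service.
    All inputs are hypothetical; real provider pricing varies."""
    return 12 * (per_check_fee * checks_per_month + monthly_platform_fee)

# A hobbyist forum verifying 400 users a month at £0.15 per check,
# plus a £50/month integration fee (invented numbers):
cost = annual_av_cost(per_check_fee=0.15, checks_per_month=400,
                      monthly_platform_fee=50.0)
print(f"£{cost:,.2f} per year")  # → £1,320.00 per year
```

For a site run by volunteers with no revenue at all, even a four-figure annual bill, before counting integration work and data-protection obligations, can be decisive.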

A hobbyist forum, a small educational site, or a niche community platform cannot pay for enterprise-grade AV systems, redesign its infrastructure, store or process sensitive user data, or manage the legal risk of handling identity documents. For many, the only viable option is to block UK visitors entirely.

Some sites also refuse to collect sensitive data on principle. Others cannot meet the security requirements for storing or transmitting identity information. And the Act does not distinguish between a global adult platform, a small art community, a forum with occasional mature discussions, or a site with user comments that might include adult content. This lack of proportionality pushes low risk sites out of the UK ecosystem.
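Mechanically, blocking the UK is far simpler than complying: a site looks up the visitor's country from their IP address and refuses service, often with HTTP status 451 ("Unavailable For Legal Reasons"). A minimal sketch of that logic, where the GeoIP lookup is a stub (real deployments would use a GeoIP database or a CDN-provided country header):

```python
from http import HTTPStatus

# Stub lookup table standing in for a real GeoIP database.
STUB_GEOIP = {"203.0.113.10": "GB", "198.51.100.7": "DE"}

def country_for_ip(ip: str) -> str:
    """Hypothetical lookup returning an ISO 3166-1 alpha-2 country code."""
    return STUB_GEOIP.get(ip, "??")

def handle_request(ip: str) -> int:
    """Return the HTTP status a geoblocking site would send this visitor."""
    if country_for_ip(ip) == "GB":
        # 451 Unavailable For Legal Reasons: a legally motivated block.
        return int(HTTPStatus.UNAVAILABLE_FOR_LEGAL_REASONS)
    return int(HTTPStatus.OK)

print(handle_request("203.0.113.10"))  # UK visitor → 451
print(handle_request("198.51.100.7"))  # German visitor → 200
```

That a few lines of routing logic can substitute for an entire compliance programme is precisely why geoblocking has become the default exit for low-margin sites.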

The Unintended Consequence: Offshore Sites Are Booming

Data from internet monitoring company Similarweb, shared with LBC, shows that platforms created since July 2025, when the OSA's age verification powers came into force, have seen dramatic traffic increases. Some unregulated offshore sites experienced increases of over 800% in a single month, with one site growing from 1.5 million to 13.7 million users. Meanwhile, major compliant porn sites lost up to half their visitors. Research by the Lucy Faithfull Foundation found that nearly half of UK porn watchers said they had accessed unrestricted sites after the OSA took effect.

Who Is Affected

Small Websites and Forums

Many are shutting down or geoblocking UK users because they cannot meet the Act's requirements. The closure of Microcosm, which hosted cycling forums, local community hubs, and other niche groups, is one documented example of this pattern.

Educational and Support Communities

Spaces dealing with mental health, identity, or sensitive topics face heightened duties. Some are withdrawing to avoid misinterpretation of "harmful content" rules, even where their purpose is clearly supportive or educational.

Creative and Open Source Platforms

Interactive features such as comments, uploads, and collaboration tools are being removed or restricted for UK visitors. Open-source projects that rely on community contributions face particular difficulties, as any user-generated input can trigger compliance obligations.

Adult Content Platforms

Some major sites have implemented age checks. Others have blocked the UK entirely. Meanwhile, unregulated offshore sites are gaining significant traffic, as the LBC and Similarweb data shows. Ofcom has investigated dozens of sites and issued over £3 million in fines in the past year, including a £520,000 fine against 4chan for failing to protect children from pornographic content, failing to carry out a proper illegal content risk assessment, and failing to update its terms of service. 4chan has engaged a US law firm to contest the requirements.

Why This Matters

Loss of Educational and Community Resources

The UK risks losing access to peer support communities, specialist educational platforms, niche knowledge bases, creative collaboration spaces, and grassroots civic forums. These are not "big tech" harms. They are public interest losses that affect ordinary people who relied on these spaces.

Fragmentation of the Open Internet

Geoblocking creates a two-tier internet in which UK users have reduced access to global knowledge and communities. Rather than making the internet safer, this fragments it and pushes users towards less regulated spaces.

Incentives for VPN Use

Users bypass restrictions using VPNs, undermining both safety and regulatory intent. The Lucy Faithfull Foundation's research suggests this is already happening at scale in the adult content sector, with significant potential safety implications.

Reduced Digital Resilience

By focusing on restriction rather than education, the Act may leave young people less prepared to navigate the real online world, including the unregulated offshore spaces that are now growing rapidly as a direct consequence of the legislation.

What Experts Recommended Instead

Before the Online Safety Act was passed, a wide range of specialists, including child safety researchers, psychologists, digital rights groups, educators, technologists, and regulatory advisers, recommended a different approach. Their proposals were grounded in decades of evidence about how children use the internet, how harm occurs, and what actually works to reduce risk.

These recommendations were not adopted.

Education for Parents and Carers

Experts emphasised that the strongest protective factor in a child's online life is an informed, supportive adult, not a filter or an age gate. A national education-first strategy would have focused on practical training for parents, open and non-judgemental conversations, support for families, and reducing reliance on surveillance tools. This approach strengthens the whole ecosystem around a child.

Digital Literacy for Children

Experts recommended embedding digital literacy into the curriculum, covering critical thinking, media literacy, social literacy, help-seeking skills, and resilience. Digital literacy protects children everywhere, not just on UK-regulated platforms. It is the one intervention that travels with a child regardless of which site they visit or which country hosts it.

Targeted Intervention for High Risk Harms

Rather than blanket duties on every website, experts recommended specialist law enforcement capacity, platform-specific risk assessments, stronger cooperation mechanisms, rapid response pathways, and evidence-based prioritisation. This focuses resources where harm is actually occurring (grooming, child sexual abuse material, and extremist recruitment) rather than spreading compliance burdens across thousands of low-risk community sites.

Proportionate Regulation

A tiered model similar to the EU's Digital Services Act would avoid crushing small websites, preserve community spaces, encourage innovation, focus regulatory effort where it matters, and reduce incentives for geoblocking or shutdowns. The OSA instead applies uniform duties across the entire internet, regardless of a platform's size, reach, or actual risk profile.

Why These Recommendations Were Not Adopted

Despite broad expert consensus, the UK chose a control-based, platform-centric model that prioritises restriction over resilience, treats all online spaces as equally risky, places enterprise-level duties on micro-platforms, and incentivises geoblocking and shutdowns.

The alternative (education, empowerment, targeted intervention, and proportionate regulation) would have strengthened safety without shrinking the digital public sphere. The evidence from the first months of enforcement suggests that the current approach is not only failing to achieve its goals in the adult content sector, but is actively redirecting users towards less regulated, potentially more harmful spaces.

Whether the Act will be revised in light of these consequences remains to be seen. What is already clear is that a significant number of websites have concluded that serving UK users is no longer worth the risk and that the communities those sites supported have lost something that cannot easily be replaced.

AI Use: AI tools were used to support source discovery and to structure the article for clarity. All research, verification, drafting, and final editorial decisions are fully human led.