
British Chat Forums Close to Avoid New Internet Policing Law

"The Act—which was celebrated as a world-first online safety law—was designed to ensure that tech companies take more responsibility for the safety of their users."

Was it? Or was it designed to do exactly what was predicted: strong-arm social media into shutting down discourse for them?

As with any bad medicine, the “side effects” are already showing.

ED

British Chat Forums Close to Avoid New Internet Policing Law
Critics warn of ‘unintended consequences’ of internet laws, as niche forums shut down entirely, and some UK users are blocked from foreign-hosted sites.

By Owen Evans

March 24, 2025 | Updated: March 24, 2025

British chat forums are shutting themselves down rather than face the regulatory burdens of a recently enforced internet policing law.

On March 17, the United Kingdom’s Online Safety Act, a law that regulates internet spaces, officially came into force.

The law means that online platforms must immediately start putting in place measures to protect people in the UK from criminal activity, with far-reaching implications for the internet.

However, for some forums—covering everything from cycling, hobbies, and hamster ownership to support for divorced fathers and more—the regulatory pressure is proving too much, and the law’s myriad rules are causing chat forums that have, in some cases, been operating for decades to call it a day.

Conservative Peer Lord Daniel Moylan told The Epoch Times by email that “common sense suggests the sites least likely to survive will be hobby sites, community sites, and the like.”

‘Small But Risky Services’

The Act—which was celebrated as a world-first online safety law—was designed to ensure that tech companies take more responsibility for the safety of their users.

For example, social media platforms, including user-to-user service providers, have the duty to proactively police harmful illegal content such as revenge and extreme pornography, sex trafficking, harassment, coercive or controlling behavior, and cyberstalking.

But what the government calls “small but risky services”—which are often forums—have to submit illegal-harms risk assessments to the Online Safety Act’s regulator, Ofcom, by March 31.

Ofcom first published its illegal harms codes of practice and guidance in December 2024 and gave providers three months to carry out the assessment.

Riverside House is seen along the waterfront on Bankside in London on July 27, 2010. It houses the United Kingdom’s Office of Communications. Jim Linwood/Flickr, CC BY 2.0

Ofcom was given enforcement powers under the law and has warned that providers who fail to comply may face enforcement action.

“We have strong enforcement powers at our disposal, including being able to issue fines of up to 10 percent of turnover or £18 million ($23 million)—whichever is greater—or to apply to a court to block a site in the UK in the most serious cases,” said Ofcom.

Some of the rules for owners of these sites—which are often operated by individuals—include keeping written records of their risk assessments, detailing levels of risk, and assessing the “nature and severity of potential harm to individuals.”

While terrorism and child sexual exploitation may be relatively straightforward to assess and mitigate, offenses such as coercive and controlling behavior and hate offenses are more challenging to manage on forums with thousands of users.

‘No Way To Dodge It’

LFGSS (London Fixed Gear and Single Speed), a popular cycling forum and resource for nearly two decades, shut down in December.

“We’re done … we fall firmly into scope, and I have no way to dodge it,” the site said, adding that the law “makes the site owner liable for everything that is said by anyone on the site they operate.”

“The act is too broad, and it doesn’t matter that there’s never been an instance of any of the proclaimed things that this act protects adults, children, and vulnerable people from … the very broad language and the fact that I’m based in the UK means we’re covered,” it said.

Dee Kitchen, developer of the Microcosm forum software used to power 300 online communities, including LFGSS, said he deleted them all on March 16, a day before the law took effect.

More recently, the Hamster Forum shut down.

On March 16, it wrote that while the forum has “always been perfectly safe, we were unable to meet the compliance.”

The forum dadswithkids—a resource for single dads and fathers going through divorce or separation, which also offered guidance on maintaining relationships with their children—has also shut down.

(Top) London Fixed Gear and Single Speed (LFGSS), a popular cycling forum for nearly two decades, announces its shutdown on Dec. 16, 2024. The forum officially closed on March 16, 2025, one day before the UK’s Online Safety Act took effect.
(Bottom Left) The Hamster Forum, a site offering tips and discussions on hamster care, announces its closure on March 14, 2025, just days before the legislation took effect.
(Bottom Right) Dads with Kids, an online community for single and divorcing fathers seeking help with child access after separation, announces its closure on March 15, 2025. LFGSS, The Hamster Forum, Dads with Kids/Screenshot via The Epoch Times

UK users are also being blocked from accessing sites hosted abroad.

The hosts of the lemmy.zip forum, hosted in Finland, said that to ensure compliance with international regulations while avoiding any legal risks associated with the Act, they had made the difficult decision to block UK access.

“These measures pave the way for a UK-controlled version of the ‘Great Firewall,’ granting the government the ability to block or fine websites at will under broad, undefined, and constantly shifting terms of what is considered ‘harmful’ content,” it said.

‘Not Setting Out to Penalize’

An Ofcom spokesman told The Epoch Times by email: “We’re not setting out to penalize small, low-risk services trying to comply in good faith, and will only take action where it is proportionate and appropriate.”


“We’re initially prioritizing the compliance of sites and apps that may present particular risks of harm from illegal content due to their size or nature—for example, because they have a large number of users in the UK, or because their users may risk encountering some of the most harmful forms of online content and conduct,” he said.

Critics of the law said that the ongoing changes to the way British people use the internet are the “law of unintended consequences” in action.

Professor Andrew Tettenborn, a scholar of common-law and continental jurisdictions and an adviser to the Free Speech Union, told The Epoch Times that smaller sites “might well shut down under the pressure. Or simply get hosted abroad.”

He also suggested that people will continue to turn to privacy tools such as virtual private networks (VPNs).

A girl browses the internet in London on July 10, 2007.
Experts say the legislation could risk pushing young people toward unsafe websites using privacy tools such as virtual private networks. Chris Jackson/Getty Images

“There’s not much Ofcom can do about an outfit abroad, especially as if anyone knows how to use VPNs. It’s the young who are meant to be protected. Indeed Ofcom has to be careful lest it drive young people to decidedly dodgy sites abroad,” he said.

“Law of unintended consequences and all that,” he added.

Moylan had previously warned that the UK may be “in danger of ending up in a little enclosed island” like China is behind its internet firewall.

He told The Epoch Times by email that survey work in advance of legislation might have helped legislators incorporate those considerations into their thinking.

“But nobody was interested,” Moylan said.

He said that the government was committed to a regulatory structure in which “everything would be devolved to Ofcom.”

“I suppose they can go back to putting notices in church porches and sending out [photocopied] newsletters by post,” he said.

Digital Services Act

The UK law goes even further than the Digital Services Act, an EU-wide regulation that requires social media platforms to remove, or take other specified steps to deal with, what is deemed “disinformation.”

This is because the EU regulation’s policing obligations apply only to what it calls “very large online platforms”—platforms or search engines with more than 45 million users per month in the EU.

Norman Lewis, a visiting research fellow at the think tank MCC Brussels, a former PwC director, and former director of technology research at Orange UK, told The Epoch Times that rules such as the UK’s could, in theory, be adopted into European legislation.

He suggested that with so many regulations “platforms that don’t generate millions, hundreds of millions of dollars or pounds in advertising are not going to be able to operate.”


…any smaller/irksome sites will mysteriously be plagued by the 77th’s sock puppets doing racisms…
