Britain’s communications regulator, Ofcom, has launched an investigation into an online suicide forum reportedly connected to at least 50 deaths in the UK, according to local media.
The probe aims to determine whether the site’s service provider failed to implement appropriate safety measures to protect UK users from illegal content and harmful activity.
This marks the first investigation under the UK’s Online Safety Act 2023, which requires service providers to promptly remove illegal content once notified. Providers were expected to comply by the middle of last month.
Ofcom stated that it had made several attempts to engage with the service provider and had issued a legally binding request for a report assessing the risk of illegal harm. After receiving limited and unsatisfactory responses, the regulator said it had no choice but to open a formal investigation.
Due to the sensitive nature of the content, Ofcom is not disclosing the name of the website or the service provider. The BBC reports the forum is hosted in the US and has tens of thousands of users, including minors. The platform reportedly contains discussions on suicide methods and instructions for obtaining toxic chemicals.
If the provider is found in violation of the law, Ofcom could seek a court order to remove the harmful content and impose fines of up to £18 million ($23 million) or 10% of the provider’s global turnover, whichever is greater.
AFP
