Summary
The United Kingdom’s communications regulator, Ofcom, has initiated nine formal investigations under the recently enacted Online Safety Act 2023, targeting a range of online platforms for potential failures in protecting users from illegal and harmful content. Among the services under scrutiny is 4chan, an anonymous imageboard notorious for hosting controversial content linked to misogyny, violence, cyberattacks, and the dissemination of child sexual abuse material. These probes represent some of the first major enforcement actions under the Act, which seeks to impose stringent safety obligations on online services to safeguard children and vulnerable users across the UK.
The Online Safety Act empowers Ofcom with broad regulatory authority to require online platforms to conduct detailed illegal content risk assessments, implement effective safety measures, and maintain transparency and accountability. Failure to comply with these duties can result in significant penalties, including fines of up to £18 million or 10% of global turnover, whichever is greater, criminal liability for senior management, and business disruption orders such as payment blocking or site access restrictions. The investigations, launched in 2025 after the Act’s illegal content duties became enforceable, focus on whether these platforms, including 4chan and several file-sharing and adult content services, have adequately met their statutory responsibilities to protect UK users from illegal content, especially child sexual exploitation and abuse.
The investigation into 4chan has drawn particular attention due to the platform’s long history of hosting harmful and illegal material despite its age restriction policies. Ofcom has expressed concerns about 4chan’s responsiveness to regulatory inquiries and its compliance with risk assessment requirements, underscoring wider challenges in regulating anonymous online communities. The ongoing probes illustrate the UK government’s and Ofcom’s broader commitment to enforce robust online safety standards and hold platforms accountable for preventing the spread of illegal content.
Reactions to the enforcement actions have been mixed, balancing calls for stronger user protections with debates over the scope of platform responsibility and free expression. Ofcom continues to engage with service providers and the public, emphasizing its dual approach of encouraging voluntary compliance and imposing sanctions where necessary to create safer digital environments under the Online Safety Act. These investigations mark a critical phase in the evolving regulatory landscape aimed at addressing complex online harms in the digital age.
Background
4chan, an anonymous imageboard, has been at the center of numerous controversies over the years. The platform has been linked to incidents such as Gamergate, various cyberattacks, threats of violence within the United States, and the distribution of child sexual abuse material. Concerns have been raised regarding the site’s promotion of misogyny and violence, which some observers argue have contributed to the growth of incel culture and the broader spread of online hate. Despite these risks, the site continues to attract teenage users.
In response to increasing concerns about online harms, the UK government enacted the Online Safety Act, which came into law on 26 October 2023. This legislation assigns Ofcom as the regulator responsible for overseeing online safety, particularly focusing on protecting children and vulnerable users. The Act requires online service providers to conduct thorough illegal content risk assessments, maintain up-to-date measures to mitigate those risks, and comply with transparency and accountability duties depending on their classification under the Act’s categorization system.
Ofcom’s regulatory powers include enforcing compliance with these duties, requiring platforms to take corrective actions, and, in severe cases, pursuing business disruption measures such as blocking access to non-compliant sites or preventing payment providers and advertisers from working with them. The regulator can also hold companies and senior managers criminally liable for failures related to child safety and the prevention of child sexual abuse and exploitation on their platforms. These provisions enable Ofcom to take decisive action against online services that fail to protect users effectively, regardless of where the companies are based, provided they have relevant links to the UK.
Overview of the Nine Probes Initiated by Ofcom
In response to concerns over illegal content and user safety, Britain’s media regulator Ofcom has launched nine investigations under the UK’s Online Safety Act. These probes target a range of online services, including prominent platforms such as the internet message board 4chan and several file-sharing and pornographic websites. The investigations aim to determine whether these services have failed to implement adequate safety measures required by the Act to protect UK users from illegal content and activity.
Among the services under scrutiny, 4chan is being investigated due to complaints about potential illegal content, including the possible sharing of child sexual abuse material. Ofcom has noted that 4chan has not responded to requests for information, heightening concerns about compliance with online safety duties. The platform has a history of controversies involving misogyny, violence, and child exploitation, which have raised ongoing worries about the risks it poses to users, especially minors.
In addition to 4chan, several file-sharing platforms are being examined for possible involvement in distributing child sexual abuse imagery. The regulator has also targeted three pornographic service providers—Itai Tech, Score Internet Group, and Kick Online Entertainment—which operate sites such as Undress.cc, Scoreland.com, and Motherless.com, respectively. These sites are under investigation for inadequate age assurance measures and other compliance failures under the Online Safety Act.
Ofcom’s enforcement powers under the Act include the ability to seek court orders to disrupt businesses that fail to comply, such as requiring payment providers or advertisers to withdraw services or ordering internet service providers to block access to offending platforms in the UK. The regulator has emphasized the importance of completing illegal harms risk assessments, which all in-scope services were required to finish by mid-March 2025, as a fundamental step toward identifying and mitigating risks associated with illegal content.
The investigations form part of a broader enforcement programme that also monitors compliance with duties related to illegal content risk assessments, record-keeping, and age verification, particularly in the adult content sector. Ofcom plans to announce further enforcement actions in the coming months as additional provisions of the Online Safety Act come into effect, including child protection safety duties expected to be enforceable by mid-2025.
Detailed Examination of the 4chan Probe
Ofcom, the United Kingdom’s media regulator, has initiated a formal investigation into the internet message board 4chan as part of nine separate probes under the UK’s Online Safety Act 2023. This investigation aims to determine whether 4chan has failed to implement appropriate safety measures to protect users, particularly minors, from illegal content and activity.
4chan is an imageboard platform intended for users aged 18 and over, largely due to the prevalence of inappropriate content that poses significant risks to children’s online safety. Despite its adult-only designation, concerns persist that 4chan’s content—including material linked to misogyny, violence, and child sexual abuse—remains easily accessible, raising questions about the platform’s effectiveness in preventing underage access.
The Ofcom investigation will assess whether 4chan has complied with its duties under the Online Safety Act, including the implementation of adequate safety measures to protect UK users from illegal content, the completion of risk assessments regarding illegal harms, and the maintenance of accurate records of these assessments. Additionally, Ofcom is examining whether 4chan has appropriately responded to statutory information requests made by the regulator, which platforms are legally required to address. Failure to comply with these requests can lead to significant penalties, including fines of up to £18 million or 10% of global turnover, whichever is greater.
Ofcom’s enforcement process involves an initial assessment of alleged infringements, followed by a formal investigation if the allegations appear credible. Should the investigation confirm non-compliance, Ofcom may issue a provisional notice of contravention, allowing the platform to respond and take corrective actions. The investigation into 4chan is ongoing, with Ofcom committed to providing updates as more information becomes available.
Given the platform’s history and the nature of complaints received, Ofcom’s scrutiny of 4chan underscores the regulator’s emphasis on child protection and the broader mandate of the Online Safety Act to combat illegal content online. The investigation represents part of a wider effort to ensure that digital platforms take robust and effective measures to safeguard users from harm.
Overview of the Other Eight Probes
In addition to the widely publicized investigation into 4chan, Ofcom has launched eight further probes targeting various online platforms under the UK’s Online Safety Act, enacted in 2023. These investigations primarily focus on whether these services have failed to implement adequate safety measures to protect users, particularly children, from illegal and harmful content.
Among the scrutinized platforms are several file-sharing services, which have faced complaints regarding the potential dissemination of child sexual abuse material. Ofcom’s inquiries aim to determine whether these providers complied with their statutory duties to safeguard UK users by carrying out sufficient illegal content risk assessments and updating them as required. Failure to meet these obligations could result in enforcement actions, as Ofcom emphasizes the necessity for platforms to understand and mitigate risks related to illegal content.
Furthermore, investigations include pornographic services operated by companies such as Itai Tech and Score Internet Group, which manage websites like Undress.cc and Scoreland.com, as well as Kick Online Entertainment, the provider of Motherless.com. These probes evaluate whether the adult content platforms have met their duties to provide appropriate safety measures, including effective age assurance, and to reduce users’ exposure to harmful material such as content promoting suicide, self-harm, and eating disorders.
Ofcom has indicated that it will continue to monitor compliance rigorously and expects to announce further enforcement actions in line with additional duties that will come into force under the Online Safety Act. These duties include the mandatory implementation of child protection measures and comprehensive risk assessments by set deadlines throughout 2024 and 2025. The regulator underscores the importance of transparency and thorough compliance to safeguard UK users across all online services within its jurisdiction.
Legal and Regulatory Framework
The Online Safety Act 2023 establishes a comprehensive legal and regulatory framework to ensure online platforms operating in the UK take responsibility for user safety, with a particular focus on protecting children and tackling illegal content. The Act designates Ofcom as the primary regulator tasked with enforcing these provisions and overseeing compliance across a broad range of internet services.
Under the Act, Ofcom has been granted extensive powers to regulate online services, including the authority to require platforms to conduct detailed illegal harms risk assessments. These assessments require providers to evaluate the likelihood of users encountering illegal content or using the service to facilitate criminal offences. Providers must complete these risk assessments, and keep records of them, by specified deadlines, failing which they risk enforcement action, including fines and other sanctions. Ofcom’s guidance on the required contents of these risk assessments was published in December 2024, further clarifying expectations for service providers.
Enforcement mechanisms under the Act are robust. From 17 March 2025, Ofcom can impose financial penalties of up to £18 million or 10% of a company’s global turnover, whichever is greater, on non-compliant entities. In addition to fines, Ofcom may pursue criminal liability against companies and senior managers for failures related to child safety duties or responses to child sexual abuse and exploitation. The regulator also has powers to seek court orders imposing “business disruption measures,” such as compelling payment providers and advertisers to withdraw support or requiring internet service providers to block access to offending sites within the UK.
Ofcom’s enforcement programme actively monitors compliance with the Online Safety Act’s requirements, focusing on illegal content risk assessments, record-keeping duties, and specific sectors such as adult content providers and file-sharing services. This programme has included high-profile investigations into allegations of illegal content, including child sexual abuse material on platforms like 4chan and various file-sharing services. Where violations are confirmed, Ofcom’s sanctions aim to hold platforms accountable and mitigate harms to users, especially vulnerable populations such as children.
Responses and Reactions
Following the initiation of nine probes under the Online Safety Act, including scrutiny of 4chan, a range of responses has emerged from regulators, platform representatives, and the public. Ofcom has emphasized its commitment to proactive communication with online platforms to drive continuous improvement in safety measures. While its preference is to encourage voluntary compliance through guidance and supervision, Ofcom retains the authority to launch investigatory and enforcement actions when necessary, particularly if compliance failures are identified.
Ofcom’s regulatory approach aims to balance user protection with proportionality, ensuring that safety duties correspond to the risk of harm, as well as the size and capacity of the platform. This approach avoids imposing the same stringent requirements on smaller services as on large corporations, taking users’ rights into account when determining necessary steps. In line with this, Ofcom has set out guidance on risk assessments and protective measures, with certain safety duties, including child protection, expected to become enforceable around mid-2025.
Regarding 4chan specifically, the platform has long been controversial due to the easy accessibility of inappropriate content on the site and its association with extremist and offensive material. It is officially designated for users aged 18 and over; however, concerns remain about the potential risks posed to children and vulnerable groups by the site’s content. The ongoing investigation reflects wider anxieties about how such platforms respond to statutory information requests, maintain risk assessment records, and comply with the new online safety framework.
Public reactions have been mixed, reflecting both support for stronger regulation to protect users and debates about the limits of platform responsibility and free expression. Ofcom continues to provide updates as investigations progress and stresses the importance of compliance to safeguard online communities effectively. The regulator also highlights available support services for individuals affected by illegal or harmful online content, reinforcing the broader goal of creating safer digital environments under the Online Safety Act.
Impact and Future Developments
The initiation of nine probes by Ofcom under the Online Safety Act marks a significant step in the enforcement of the UK’s new digital regulatory framework. These investigations underscore Ofcom’s commitment to ensuring that online platforms comply with their statutory duties to protect users from illegal and harmful content, particularly in areas such as child sexual exploitation, abuse, and the distribution of illegal material. Platforms found to be non-compliant may be required to implement specific corrective measures to align with the Act’s requirements, reflecting a proactive regulatory stance that seeks not only to identify breaches but also to drive sustained improvements in online safety practices.
Looking ahead, Ofcom plans to publish draft proposals regarding additional duties on categorized services by early 2026, aiming to further clarify and strengthen the obligations of service providers under the Online Safety Act. The regulator is also tasked with developing comprehensive guidance and codes of practice through public consultations before these are finalized and enacted, ensuring that service providers have clear frameworks to meet their responsibilities effectively. This phased approach emphasizes transparency and collaboration, balancing enforcement with industry engagement.
The enforcement programme extends beyond compliance checks to include scrutiny of providers’ risk assessments and record-keeping related to illegal content, highlighting the importance of thorough internal controls within platforms. For example, services had to complete illegal harms risk assessments by March 2025, an essential process to evaluate the likelihood of users encountering illegal content and to inform the implementation of mitigation strategies. Ofcom’s ongoing evaluations and targeted enforcement actions are expected to continue, focusing on both current adherence and the evolving challenges posed by new technologies and content types, such as generative AI and chatbots.
The spotlight on platforms like 4chan, which has been linked to various controversies including the promotion of misogyny, threats of violence, and potential dissemination of child abuse material, illustrates the complex nature of regulating online communities where harmful content can proliferate. These investigations may lead to significant consequences for platforms that fail to uphold their legal obligations, signaling a broader industry-wide push for accountability and enhanced user protections.
