Ofcom faces judicial review over alleged failure to act on location-based intimate image abuse forums

Ofcom has been warned it could face a judicial review over its alleged failure to act against platforms hosting non-consensual intimate images and child sexual abuse material (CSAM).

Posted on 05 March 2026

These forums group images by location, with intimate images of women and girls arranged by area as specific as their town, village or university halls of residence.

Leigh Day is acting for a survivor of non‑consensual intimate image abuse, referred to as Jane, and the End Violence Against Women Coalition (EVAW). EVAW is a coalition of organisations and individuals campaigning to end all forms of violence against women.

Last year, Jane formally asked Ofcom to exercise its enforcement powers against platforms which hosted non-consensual explicit images of her and other women and children. She urged the regulator to open a formal investigation into these platforms perpetuating intimate image abuse and to consider using its powers of investigation and enforcement.

Ofcom has previously said it is aware of Jane's case and that it was "considering any appropriate next steps", adding that it has "a broad range of enforcement powers to hold tech firms accountable" and that it "won't hesitate to use them where necessary".

Leigh Day human rights solicitors Tessa Gregory and Claire Powell have now sent Ofcom a formal Pre‑Action Protocol Letter on behalf of Jane and EVAW setting out their intention to seek a judicial review.

Jane and EVAW claim that Ofcom has failed, and is failing, to intervene against platforms hosting non‑consensual intimate images and child sexual abuse material, and that in doing so it is contravening its own guidance on when it should act.

The letter warns that one such forum hosts thousands of intimate images, including CSAM. It says the images form part of a “collector culture” that fuels misogynistic abuse.

Jane’s case demonstrates Ofcom’s ongoing failure to act on these sites and shows that these are systemic issues affecting the safety of women and girls both online and offline. Jane first raised these issues with Ofcom in February 2025 but has received no substantive response from the regulator, nor any information on what it is doing to tackle these sites.

EVAW has engaged with Ofcom for many years, including during the drafting of the Online Safety Act 2023 and on the duties and powers the Act gives the regulator. Jane and EVAW say that Ofcom is not living up to its commitments to protect women and girls and that the failure in this case demonstrates that urgent, systemic action must be taken.

The letter draws on evidence provided to Ofcom by EVAW and other expert organisations in the VAWG sector, highlighting the consequences of failing to act against collector culture sites, the importance of prompt and decisive enforcement action, and the impact the regulator's silence has on the safety of women and girls.

EVAW was one of the organisations which successfully lobbied for the inclusion of women and girls in the Online Safety Act (which on first drafting contained no mention of women and girls) and has a long history of engagement with Ofcom and advocating on the importance of better protections and swift action against these sites.

In failing to act to protect women and girls from intimate image abuse, Ofcom places a burden on charities and support services, such as the Revenge Porn Helpline and #NotYourPorn, to support survivors and assist in the rapid removal of non-consensual content from the internet.

Jane and EVAW’s case on Ofcom’s alleged failings is that:

  1. By not acting, Ofcom is failing to comply with the purpose of the Online Safety Act 2023, which is intended to protect women and girls.
  2. Ofcom’s approach fails to follow its own policies, which prioritise taking action against sites such as these under its established regulatory framework.
  3. Ofcom’s failure to act breaches Jane’s human rights, including her rights under Articles 3 (inhuman and degrading treatment), 8 (right to private and family life), and 14 (freedom from discrimination).
  4. Ofcom has acted irrationally in deciding which sites to investigate, for example investigating X and Grok but not these sites, without a coherent justification.

Jane and EVAW are calling on Ofcom to confirm whether an investigation has been opened, to disclose its decision‑making documents, or to explain why no action has been taken. They allege that Ofcom has allowed “collector culture” image boards to operate with impunity, despite having the regulatory power to stop them.

Jane is also receiving assistance from Mishcon de Reya who have been corresponding with Google to demand that these sites are urgently delisted. So far Google has failed to remove many of the iterations of this site, despite these sites clearly violating Google’s own policies and terms of service.

Jane said:

“Responsibility for overseeing these sites and ensuring they comply with their legal duties under the Online Safety Act 2023 rests with Ofcom, yet their repeated inaction makes it feel like they are turning a blind eye to the abuse I have experienced.

“Like thousands of other women, I had my intimate images and personal information shared without my consent on a forum named after the place where I grew up. These platforms facilitate a particularly harmful form of degradation and humiliation, where women’s images are exchanged like trading cards - traded for sexual gratification, peer networking, and the social status derived from their abusive context.

“Despite repeated attempts to report and flag these harmful sites to Ofcom, these sites have continued to operate. The burden has fallen on me, someone directly affected by this abuse, to push for stronger regulation. I struggle to see how this reflects any meaningful commitment by Ofcom to tackling online harms against women and girls.”

Janaya Walker, Interim Director of the End Violence Against Women Coalition (EVAW), said:

“Image-based abuse is a violation of women and girls’ rights, and the existence of ‘collector culture’ is a particularly horrible example of misogyny and violence against women. Tech platforms have responsibilities to prevent this abuse, rather than enable it. And where they fail to do so, the burden should not rest on survivors like Jane to force the regulator’s hand. We are supporting Jane because we expect Ofcom to take proactive action against sites who cause this harm, and to ensure that all women and girls’ rights are upheld.”

Claire Powell, solicitor at Leigh Day, said:

“Collector culture sites are vile and misogynistic. They place women and girls at harm, both in the online and offline world. Despite its public commitments, our clients allege that Ofcom is failing to tackle these sites and failing in its obligations to protect women and girls like Jane who are left without the support and resources of the regulator. Ofcom and the UK government must live up to their promises to tackle violence against women and girls and must exercise their powers of enforcement without further delay."

Jane has launched a crowdfunder on the CrowdJustice website to fund the legal action.
