Are you affected by online hate?

We support you!

Send an email to: 

Give us a call during office hours:

030 / 252 088 38

Mon 10 a.m. – 1 p.m. | Tue 3 p.m. – 6 p.m. | Thu 4 p.m. – 7 p.m.

Chat with us: Wed 3 p.m. – 6 p.m. | Fri 11 a.m. – 2 p.m.

Report hate via our form:

Reporting Form 

Dickpic received?

File a complaint directly! 


[Image: a graphic of a naked woman, repeated over and over.]

Image-based abuse is a crime – and no one helps. Stop it now!

HateAid and Anna Nackt call on EU politicians to stop image-based abuse now! This month marks five years since Italian national Tiziana Cantone took her own life. She was a victim of revenge porn, failed by the justice system, the police, the law and other structures of society, and she saw no way out of the nightmare. Five years and hundreds of thousands of victims later, all eyes are on Europe as it paves the way to regulate social media platforms.

HateAid and Anna Nackt argue that the Digital Services Act (DSA), the EU’s new content moderation rules, should include additional requirements for pornography platforms to help put an end to the nightmare faced by victims of image-based abuse. A nightmare that Tiziana Cantone was not able to wake up from.

Several EU Member States have criminalised image-based abuse, with Ireland and Belgium among the most recent, and perpetrators now face high fines and prison sentences. However, researchers and experts call for a more kaleidoscopic view of justice, acknowledging that reforming criminal law, while necessary, is not enough. Victim-survivors call out porn platforms that are making money off their abuse. In 2020, xHamster, one of the world’s largest porn platforms with more visitors than LinkedIn and TikTok, excitedly reported an increase in demand, in some cases of more than 100%, for categories such as “Exposed”, “Hidden Cam” and “Public” – exactly the categories where abuse is shared.

Victims are left alone

Why is this even possible, one should ask. First, because current policies, or rather the lack of proper regulation, allow for it. Second, because the police lack the resources to identify and prosecute perpetrators. And third, because victims do not receive the support they so urgently need.

And while most online platforms have put tools in place to report illegal content and run largely automated assessments of such reports, removal of abusive material is often slow, complicated and reliant on personal connections. If a platform does not react, or its algorithm does not detect the abuse, nothing happens. But in reality, we know that the impact on victims is appalling: withdrawal, psychological distress, insomnia, fear for physical safety, losing jobs. The list goes on.

Act now to stop image-based abuse

It is time to stop abuse on porn platforms. We call on decision-makers to include additional requirements for porn platforms in the DSA.

Imposing additional requirements, such as double user verification, trained human content moderators and time-bound take-downs of non-consensually shared images, is the bare minimum that European policymakers should stand by.

The EU can do more for victims of online violence

There are more improvements that the DSA should introduce to support all victims of online violence.

  • Introduce summary proceedings for content decisions, so that court decisions can be taken swiftly and at lower cost, ensuring that users can seek effective and affordable judicial redress.
  • Codify an obligation for online platforms to delete illegal content, so that authorities can hold platforms accountable. At the moment, the burden of enforcement is borne by users.
  • Require large online platforms to provide legal representatives accessible to users in each EU Member State, in one of the official languages of that country. Victims and organisations like HateAid should not have to pay 2,400 EUR in translation fees.

It’s time to stand with the victims of digital violence and stop image-based abuse. The time is ripe for a change.