Are you affected by online hate?

We support you!

Send an email to: 

Give us a call during office hours:

030 / 252 088 38

Mon 10 a.m. – 1 p.m. | Tue 3 p.m. – 6 p.m. | Thu 4 p.m. – 7 p.m.

Chat with us:* Wed 3 p.m. – 6 p.m. | Fri 11 a.m. – 2 p.m.

*Please accept the cookies to chat with us.

Report hate via our form:

Reporting Form 

Dickpic received?

File a complaint directly! 


Dissatisfied and helpless: New poll on digital violence shows how social media platforms are failing users

Anyone who experiences or observes digital violence on social media platforms can report it to the platforms and request its removal. A new representative survey by HateAid and the Landecker Digital Justice Movement offers insight into users’ experiences of reporting digital violence. For this purpose, 10,000 people from France, Germany and Sweden were surveyed. The results are worrisome: users are largely dissatisfied with the quality of content moderation and often find the platforms’ content decisions incomprehensible. Moreover, respondents demand low-threshold options for complaining about wrongful content decisions directly to the platforms.

Hate speech, insults, and threats: digital violence has become a mass phenomenon on platforms like Facebook and Twitter. Every second young adult between 18 and 35 in the European Union has already experienced online violence first-hand. Social media platforms continuously claim to spare no effort to protect users from online violence. However, this new representative poll shows that the majority of users see it differently.

Reporting mechanisms: Every second user is dissatisfied

Users feel let down by the platforms’ reporting channels. One in two respondents who have reported violent content are dissatisfied with the notification system and the way platforms handled their notifications. The main reasons: lack of transparency, incomprehensible responses, and platforms’ inaction. 48% of those who have reported violent content criticize that the platforms took no action. As a result, victims are often left alone to deal with the violent content. A recent investigation by HateAid and LICRA in the run-up to the presidential elections in France confirms this experience: Facebook systematically failed to remove violent content.

Users call for a low-threshold way to challenge platforms’ content decisions

Users expect online platforms to react quickly when abusive content is reported. 82% of respondents agree that all users must have the right to challenge content decisions made by platforms through an internal complaint mechanism if the platform has not taken action or has made a wrongful decision.
The poll also found that only 3% of respondents have turned to the courts, mainly because of the high financial risk and the length of proceedings, signalling that low-threshold options are needed to challenge the current power asymmetries.

The European Union (EU) now has a once-in-a-generation chance to oblige online platforms to provide such mechanisms with the Digital Services Act. This new set of rules for online platforms could give millions of Europeans more rights on social media and more protection against digital violence.

Anna-Lena von Hodenberg, executive director of HateAid:

“The Europeans surveyed have formulated a clear assignment for the EU: They demand protection through effective reporting and complaint channels in order to defend themselves against digital violence. So far, the platforms have often ignored their responsibility and left victims unprotected. The EU cannot sit still and watch its citizens being targeted and increasingly pushed out of the public digital space. The EU must guarantee their safety and rights.”

Based on these findings, HateAid recommends:

  • Give all users access to internal complaint mechanisms and out-of-court dispute settlement, including in cases where the platform has rejected a notification or not reacted.
  • Make reporting channels user-friendly and located close to the content in question.

Member of the European Parliament Evin Incir, S&D, Sweden:

“Online violence is a threat both to our individual well-being and to our democracy. As a young female politician of colour, I have unfortunately experienced it myself many times. I believe that many of my female colleagues in the European Parliament would agree that, as in real life, we are more vulnerable online than our male counterparts. Already now, many women and people of colour hesitate to run for office because of experiences of sexism, hatred, violence, discrimination and racism. It is crucial that online platforms ensure effective complaint mechanisms for users, that they live up to their responsibility to report unlawful actions to the police, and that they make their platforms a safe space for all by taking down harmful content. Online platforms can’t continue to be a place of impunity. Last year, through two different reports, the Parliament demanded that the EU Commission propose common rules in our Union to combat gender-based violence – both online and offline. As a result, the Commission has now proposed a regulation implementing minimum standards to tackle rising gender-based violence. I hope that the proposal will be an important tool to combat the current situation, in which women and girls are being deprived of their fundamental rights online.”

Read the full report here.