Mixed signals in the DSA

A historic call against sexual abuse material, but a clear failure to strengthen hate victims’ rights on platforms

On Thursday the European Parliament (EP) adopted its report on the Digital Services Act (DSA). The newly introduced provisions, if backed by the other European Union (EU) institutions during the upcoming Trilogue negotiations, could change the internet for thousands of victims of digital violence. HateAid welcomes the fact that some of its demands for more protection against online hate have been met. But the counseling center for victims of digital violence strongly warns that one of the most critical issues has been left out of the EP’s position: victims are still left without a right to complain to the platform if hateful content against them has not been taken down.

Milestone for victims of image-based sexual abuse: New regulations for porn platforms

For the first time, Parliament’s decision would trigger serious action against the abuse of intimate images on porn platforms. The new provisions include an identification obligation for content uploaders, trained content moderators, and separate reporting channels for notifying and taking down abuse material. This would be a huge step forward for those affected by so-called revenge porn, deepfakes, and other forms of image-based sexual abuse, a crime whose victims are overwhelmingly women. Researchers have warned that image-based sexual abuse is alarmingly common and that victims report significant delays in removal on porn platforms. HateAid, alongside other civil society organisations, has campaigned to ensure that MEPs recognize image-based sexual abuse as an issue to be specifically addressed by the DSA.

Anna-Lena von Hodenberg, CEO of HateAid:

“Sexual abuse of women through porn platforms needs to end. That’s why this is a milestone that everyone who has advocated together with HateAid, from victims to NGOs, to academia and MEPs, should celebrate. The scope and consequences of image-based sexual abuse are devastating, and I am pleased that the European Parliament has taken the matter into its hands. Now it is time for the governments and the Council to recognize the significance of the issue and act by backing the European Parliament’s position.”

Improved users’ rights: Victims can turn to authorities to seek orders against platforms

Furthermore, HateAid is pleased that the EP has supported our call to protect users’ rights when they are personally affected by illegal content. The EP wants to enable users to turn to authorities and seek orders against online providers to remove illegal content. For example, if a user is attacked online with a death threat or racist hate speech and the platform fails to remove it despite notifications, the affected person can turn to authorities to request a content removal order addressed to the platform.

Insights into the black box: Parliament votes for more transparency over risks and impacts

The European Parliament has adopted a strong position on platform transparency and on platform data access for vetted NGOs. Both the content moderation practices and the algorithms used by the platforms are currently protected as trade secrets. This is not justified, argued HateAid and PeopleVsBigTech, a broad coalition of civil society organisations, because both have a significant impact on public discourse online as well as on users’ wellbeing. Revelations by the former Meta employee Frances Haugen showed how platforms like Facebook use toxic algorithmic amplification to generate profits. The EP wants to change this: very large online platforms will need to critically assess the risks that their services (from design to algorithms) pose to public safety, health, fundamental rights, and a range of other areas. If implemented earlier, measures of this kind could potentially have prevented the negative impact on teen mental health that Instagram’s algorithms have reportedly caused.

Parliamentarians fall short of promises: Victims of hate speech are deprived of direct complaint opportunities

Regrettably, despite wide support from MEPs at the preliminary stages of the draft report, victims of digital violence will continue to suffer from wrongful content decisions by the platforms. If a platform refuses to take down a hateful comment, a death threat, or a defamatory statement, these victims will have no right to complain to the platform directly. They will also not be able to use an out-of-court dispute settlement mechanism. These privileges are reserved for users who want to complain about content that the platforms might have wrongfully deleted. HateAid finds the two-class system of victims that this creates incomprehensible and unjust: it leaves millions of users affected by hate speech and disinformation vulnerable. HateAid is calling on the Council to uphold its position concerning equal access to the mechanisms laid out in Articles 17 and 18 during the Trilogues.

Press contact:

presse@hateaid.org, Tel. +49 30 252 088 37


