Digital Services Act
... 52% of women express their opinions online less often for fear of hate. Our commitment to a better Digital Services Act (DSA) is realised through the financial support of the Alfred Landecker Foundation. This is how we advocated for...
The Judgement in the case Künast vs. Facebook
...moderators. The court expressly rejected Facebook’s claim that the use of so-called upload filters would be necessary to identify and delete the content in question. Instead, Facebook is required to use human content moderators to decide whether a post violates the claimant’s personality rights....
Transparency reports: How social media platforms fail on users’ rights
...for its quality. Our findings raise reasonable doubt about how the platforms assess reports, and users confirm that content moderation practice is opaque and unsatisfactory. In the run-up to the French presidential election in March 2022, Facebook failed to remove reported illegal hate comments in...
The Digital Services Act is coming: Important step to tackle digital violence in the EU
...DSA. In the future, social media platforms must offer direct and quick contact options for users. HateAid and other civil society organizations had advocated hard for this. However, this obligation only applies to electronic communication channels. HateAid fears that the platforms might fob users off, for example with chatbots. It...
Mixed feelings: Digital Services Act replaces NetzDG
...reporting channels for illegal content and complaint mechanisms. Josephine Ballon, CEO of HateAid, comments: “The DSA provides users with specified tools to defend themselves against digital violence. But our experience with the platforms shows: there is reason to fear that Musk, Zuckerberg, and others will seek and exploit loopholes...
#Kriegstreiber: How warmonger accusations reach the middle of society
...in order to react quickly to new developments and to prevent the strategic spread of propaganda. Examples could include the temporary blocking of particularly active accounts or the hiding of comments. We also consider Twitter’s already announced labelling of tweets and accounts that...
Digital violence
...and how to report content. Guidelines of the platforms: Facebook community standards, Instagram community guidelines, Twitter community rules, YouTube community guidelines, TikTok community guidelines, Snapchat community guidelines. Strengthening human rights online: HateAid is committed to a better internet. We rely on donations for this work. If...
Digital Services Act brings new rules for TikTok, X and Co.
...respond to a user’s report or rejects it, there are now ways to complain. For example, users can request a second review of their report up to six months after receiving the information about its rejection. Such a complaint can be made electronically and free of charge. Trusted Flaggers:...
Close the gap: Securing women’s voices in politics
...for politically active women. This includes demeaning remarks about appearance, rape threats, and misogynistic comments that attempt to relegate female politicians to the kitchen or perpetuate outdated gender roles. Antifeminists and other misanthropes aim to intimidate women in crucial positions through digital violence. YouGov survey Insults acceptable to...
Digital Services Act: Barely Any Protection for Victims of Digital Violence
...for users. Instead, the DSA relies too heavily on measures against abusive reporting and potentially unlawful deletion of content on social media. The aim is to prevent restrictions of users’ freedom of expression. But freedom of expression is also at risk when victims of digital violence cannot enforce the...