The Digital Services Act is coming: Important step to tackle digital violence in the EU
With today’s vote in the European Parliament, the new internet law of the European Union (EU), the Digital Services Act, was adopted. With this, the EU is launching a far-reaching legislative package intended to make social media platforms more accountable in tackling hate speech and other forms of digital violence. HateAid is now calling for a strong enforcement regime and pledging to support victims in accessing their rights. The German counselling center for victims of digital violence also criticizes the rushed and non-transparent negotiation process, which has created many legal uncertainties, and warns that women on porn platforms are not sufficiently protected.
HateAid welcomes the adoption of the Digital Services Act (DSA) as an important milestone in the fight against digital violence and commends that users’ rights will be strengthened on social media platforms. HateAid actively engaged in the legislative process over the past year and fought for effective protection against hate speech and other forms of digital violence.
Anna-Lena von Hodenberg, executive director of HateAid:
“Social media networks have provided a platform for the massive spread of hate and agitation. With the Digital Services Act, we have for the first time an EU-wide law that attempts to stand up to Facebook, Twitter and other platforms alike. We will be keeping a very close eye on the social media platforms to assess how they implement the new obligations and how regulators hold them accountable. For us, it’s clear: if the measures don’t bring improvements, we will call on the EU to tighten the obligations for platforms so that users are effectively protected from digital violence.”
Important success: Platforms must enable direct contact points
Write a direct message to Twitter or YouTube to solve a problem? That was not included in the original draft of the DSA. In the future, social media platforms must offer direct and quick contact options for users, something HateAid and other civil society organizations had advocated hard for. However, this obligation only applies to electronic communication channels. HateAid fears that the platforms might fob users off, for example with chatbots. It also remains unclear in which language the contact point must be provided.
With rights against hate: strengthened rights to complain
From the results of our poll on users’ experiences with complaint mechanisms on social media platforms, as well as from our counselling services, we know: Reporting digital violence often leads nowhere. More than one in three respondents (36%) state that their report of abusive content was never answered by the platform, be it due to AI errors or safety staff shortages. Now users will get the chance to file a second complaint if they disagree with the platform’s decision. This option will be complemented by an out-of-court dispute settlement body that, thanks to efforts by civil society, will be available to all users across the EU, regardless of whether their content was wrongly removed or a notification was rejected. Legal action at the users’ own cost and risk is thus no longer the only option.
Rushed negotiation process leaves many questions unanswered
The extraordinarily rapid drafting and adoption of the DSA not only massively hampered the participation of HateAid and other civil society organizations, favouring the well-resourced Big Tech lobby machine; it also left considerable legal ambiguity. As a result, the interpretation of the legal text in the courts could take several years.
It remains to be specified what standard will be used to define illegal content. There are major differences between the Member States in this regard. For example, an insult may be classified as illegal in Germany, but legal in France. Another challenge is that the horizontal approach chosen for the DSA treats all platforms according to their size on a simple two-tier scale, regardless of their field of activity. However, special regulations for certain platforms would have been necessary to guarantee the fundamental and human rights of all users. As a result, the protection of women and particularly vulnerable groups on specific platforms simply comes up short.
Access to justice remains cumbersome for victims
Although the DSA makes an effort to create low-threshold options for users, it has one major shortcoming. Unless mandated by a court, the platforms retain the final say in all of these measures, whether it concerns removing potentially illegal content or blocking profiles. At the same time, users will not gain any easier access to the courts on content decisions. Stringent supervision will therefore be essential to ensure that the platforms fulfil this responsibility conscientiously and with the necessary resources.
Political will to combat image-based sexual abuse falls short
The DSA leaves a blank spot regarding the massive distribution of stolen and faked intimate images on porn platforms. A partial success: The issue did end up on the agenda of the EU chief negotiators for the last round of DSA negotiations. However, it was dropped at the last second without a meaningful replacement. Victims of image-based sexual abuse will remain in a vulnerable position without sufficient protection.
The Digital Services Act is expected to come into force at the beginning of 2024. For very large online platforms, however, likely including Facebook and YouTube, the regulations are expected to apply from mid-2023. HateAid will continue to critically monitor the implementation of the DSA as part of the Landecker Digital Justice Movement.
Press contact: firstname.lastname@example.org, Tel. 0049 30 / 252 088 37