All about the Twitter case
Protection from incitement to hatred?
Twitter promises to protect all its users. Yet antisemitism is commonplace on the platform. Insults against Jews, trivialisation and denial of the Shoah: antisemitic statements, images and videos can be found all over Twitter. The platform’s lack of moderation has led to an unprecedented spread of antisemitic ideology.
Now we’ll fight back. HateAid and European Union of Jewish Students (EUJS) are taking Twitter to court.
Antisemitism in Germany
% of Jews in Europe believe that antisemitism is on the rise and poses a problem, especially online (1).
% of adults in Germany hold antisemitic views. Among 18- to 29-year-olds, this tendency is even stronger, at 30 % (2).
84 % of posts containing antisemitic hate weren’t reviewed by social media platforms (3).
Every year on Shoah memorial day, policymakers make promises to take the lessons learned from the Shoah seriously and to resolutely stand against antisemitism. Today, this antisemitism is mainly spreading on the internet. To counter it therefore inevitably means holding the social media platforms accountable.
Hate speech against Jews is continuously increasing on the internet. Antisemitic content is prevalent on social media and easily accessible via search functions, according to a 2021 report by the Amadeu Antonio Stiftung. Experts state that more antisemitic content is spread via social media today than at any point in history (4). They describe the quality of online posts as appalling – as if the “breach of civilisation” that was Auschwitz (5) had never occurred. And this antisemitic hate does not stay on the internet: it also paves the way for antisemitic attacks in the analog world, which have increased massively over the past ten years (6). Still, no social network has ever been held accountable.
Antisemitism on Twitter
When opening a Twitter account, every user enters into a contract with Twitter. The corresponding Rules and Policies set out the rules that apply on the platform. Twitter prides itself on not permitting antisemitic content such as denial of the Shoah.
“We prohibit content denying that a genocide […] has taken place […]. This rule applies, among other things, to events such as the Holocaust […].”
But although Twitter prohibits antisemitic hostility in its Rules and Policies, the platform leaves much of this content online – even when users alert it to such posts. According to a study by the Center for Countering Digital Hate, 84 % of posts containing antisemitic hate speech were not reviewed by social media platforms. In other words: Twitter knows that Jews are publicly attacked on the platform every day and that antisemitism is becoming normalised in our society – and the platform’s response is by no means adequate.
The safety and dignity of Jews are thus left in the hands of companies operating according to their own rules. We as German society must commit ourselves to safeguarding a secure and dignified life for all Jews in Germany – in the analog as well as the digital world.
Photo: Andrea Heinsohn Photography
“Twitter’s actions are based solely on its own opaque rules, and the platform relies on the fact that users have no way to appeal – for example, against the non-deletion of incitement to hatred. There has not been a single case in which a social network was prosecuted for this by the authorities. This is why civil society has to get involved and look for ways to demand the removal of such content. As an NGO, we act as a representative of the affected communities, which are subject to hostility and incitement to hatred on a daily basis. In this way we can build up pressure on the platforms in the long term.”
Josephine Ballon, Head of Legal HateAid
What we want to achieve in court
Jewish youths won’t accept this anymore and are now taking action.
On 24 January 2023, EUJS and HateAid jointly filed an action against Twitter before the Landgericht Berlin. The complaint concerns punishable, antisemitic and inciting content, including, among other things, the trivialisation and denial of the Shoah.
Whether we can demand this is to be decided by the court. To date it is unclear to what extent Twitter users, on the basis of Twitter’s Rules and Policies, are entitled to demand the deletion of such content in cases where they are not themselves affected. We believe that Twitter has to abide by its own rules which it boasts about in its contract terms – to remove antisemitic posts and make sure that Jews can feel safe on the platform.
With our action, we are taking Twitter up on its contractual promises. We believe that platforms must delete antisemitic content – and evidently, they need to be compelled to do so.
European Union of Jewish Students
The case is conducted together with European Union of Jewish Students (EUJS). EUJS is the European umbrella organisation of Jewish student unions.
What we want to have settled:
1. We want clarification by the court whether users can demand removal of punishable content such as, for example, denials of the Shoah, even when they are not themselves insulted or threatened.
2. We want clarification by the court whether NGOs such as HateAid or EUJS are likewise entitled to demand deletion of punishable content in this way.
Photo: Alexandre Liebhaberg
“For young people, social participation also means getting involved online. Some Jews accept online hate and violence as unavoidable. Many lapse into silence. We will tolerate none of this any longer! We want to relieve them of their feelings of powerlessness and resentment and offer hope for a fair-minded Germany and Europe built on solidarity. The lawsuit is our response to the failure of Twitter and of politics.”
Avital Grinberg, President European Union of Jewish Students
Forced out of public discourse
Twitter has long been known for not doing enough against digital violence on its platform. This is particularly alarming because Twitter is a place where crucial public debates take place. Here, journalists, activists, politicians and other key decision-makers discuss political positions and current events.
The headlines about Elon Musk and his actions as head of Twitter give a striking picture of what will happen if we leave public debate in the hands of a private company, and now of a single individual.
Against this background, more and more Jewish voices are withdrawing from Twitter, because a growing number of Jewish users have to experience that Twitter will not protect them from violence and antisemitic hostility. Twitter is on the way to becoming a mouthpiece for disinformation, hate and agitation. We have all grown used to this, and Twitter can be confident that no one will take it to court at their own risk.
The laws that apply to all of us are made in the government district. They apply to platforms like Twitter too – yet the platforms don’t comply with them. Photo: HateAid
Panel discussion at the start of trial
What needs to be done so that Jews can feel safe on the internet? What are the responsibilities of politics and online platforms like Twitter to achieve this? At the start of the trial, Marina Weisband will discuss this with Josephine Ballon, Head of Legal HateAid, Avital Grinberg, President EUJS, Markus Weiß, research fellow TU Berlin, and lawyer Dr. Torben Düsing.
What the trial means for users
What’s the significance of this trial for all users?
Much of the antisemitic content in social networks is illegal, but affected persons experience difficulties in enforcing their rights to deletion and prosecution. One reason for this lies in the platform operators’ lack of cooperation and unwillingness to assume responsibility.
To date, there is hardly any way to sue social networks for the non-deletion of incitement to hatred, because unlike insults or death threats, such content does not infringe the rights of individual users. Large private companies accept the small risk that individual users will file legal action to enforce their rights. In many areas there is virtually no case law on basic user rights, so affected persons are forced to pursue their actions through several court instances. That is extremely laborious and cost-intensive, which is why many users have so far refrained from taking this step.
With this landmark case we want to obtain a court order which clearly establishes that platforms like Twitter are legally bound by their own Rules and Policies to protect users from antisemitic digital violence. Such a court decision will make it easier for users to enforce their rights against the big platform operators in the future. The underlying principle is quite simple: If the contractual terms state that incitement to hatred is prohibited, then Twitter owes it to its users to remove such content. This could, for example, also be enforced by NGOs like HateAid in order to make the internet a safer place.
Photo: Wolf Lux / Alfred Landecker Foundation
“Strategic lawsuits have proven to be very effective in changing the legislative landscape and in enforcing the digital rights of all those who previously were at the mercy of the social media platforms. We appreciate and support the joint action by HateAid and EUJS because together they enable a society under attack to stand up against systemic hate. Rule of law can no longer be absent from the big platforms – it’s the only way to fight defamation, arbitrariness and discrimination in the long term.”
Silke Mülherr, Co-CEO Alfred Landecker Foundation
Alfred Landecker Foundation
The lawsuit was realised as part of the Landecker Digital Justice Movement – an initiative by HateAid and the Alfred Landecker Foundation.
Frequently asked questions concerning the lawsuit
The terms “Holocaust” and “Shoah” both describe the systematic murder of around six million Jews during the era of National Socialism between 1933 and 1945.
The term “Holocaust” comes from the Greek word “holókaustos”, which means “burnt offering” or “completely burned”. In ancient times the term referred to religious rituals in which a sacrificial animal was burned. To avoid this reference to sacral offerings, the Jewish community in particular prefers to use the Hebrew word “Shoah” (“great catastrophe”, “downfall”, “destruction”).
We speak of incitement to hatred in particular when somebody calls for violent measures against, or incites hatred against, segments of the population. Persons trivialising or denying certain events – such as the Shoah – may also render themselves liable to prosecution. This is regulated in § 130 of the German Criminal Code, a provision intended to protect public peace, peaceful coexistence in society and human dignity.
It is important to report illegal posts to the platform operators, because only once they are aware of such posts are they obliged to take action. But the large platform providers’ transparency reports from 2022 show that just under 15 % of reported content is actually deleted. In fact, many users either receive no reply at all or get an automated rejection within minutes. It appears that content is only reviewed by human beings in exceptional cases. Moreover, content can be reproduced very quickly, which means that many perpetrators aren’t deterred by the deletion of their posts. If possible, persons affected by hate speech should therefore also file a criminal complaint. Civil action for damages or compensation is also possible. Only in this way can consistent criminal prosecution on the internet be achieved.
Why does the lawsuit relate to Twitter’s Rules and Policies when one could file a suit under German law?
To date, users can only demand the deletion of content if they are personally affected – i.e. if they were insulted or threatened. Criminal offences such as incitement to hatred or the use of anti-constitutional symbols, however, are not directed at individual persons but at the general public. Individual users are therefore in many cases not personally affected. The only option is to file a complaint with the supervisory authority, whose possibilities of penalising the non-deletion of particular content are limited: a systematic failure in moderation would have to be proven before a fine could be imposed, and this proof is difficult to provide. In recent years there has been no known case in which a social network was prosecuted for the non-deletion of content. That’s why we’re taking matters into our own hands by relying on the user contract that every user enters into when creating a profile. This contract includes the internal guidelines (Rules and Policies), which make great promises regarding the protection of users from hate and agitation.
More on the topic
(1) Agentur der Europäischen Union für Grundrechte 2018
(2) World Jewish Congress
(3) CCDH 2021
(4) Schwarz-Friesel, Monika (2019): Judenhass im Internet. Antisemitismus als kulturelle Konstante und kollektives Gefühl
(6) Bundesverband der Recherche- und Informationsstellen Antisemitismus e. V. (2022): Jahresbericht Antisemitische Vorfälle in Deutschland 2021