All about the “Travis vs. X” landmark case
Independent research in danger
Data analyst Travis Brown researches how social media platforms work. His research shows that right-wing extremists are spreading hate and disinformation unhindered on the platform X, formerly Twitter. Instead of taking consistent action against the perpetrators, X has repeatedly blocked Brown’s account. The accusation: unauthorised data collection. HateAid therefore took legal action against X together with Brown. The goal: to get the analyst’s account back and enable free research.
The verdict
Hurdles in court
However, the court did not address the issue of data access, because it declined jurisdiction in this case and declared that the courts at the platform’s location had jurisdiction – an assessment shared by the Berlin Court of Appeal. This means that Brown would have to enforce his claim to have his account unblocked before a court in Ireland. The same could apply to many other users who wish to take legal action against platforms: they could have to sue at the location of the company in question.
The problem: the hurdles are enormous. They include finding legal representation abroad, language barriers, and costs of 150,000 to 300,000 euros per lawsuit. For Travis Brown and many other victims, this is unacceptable. Even for an organisation like ours, proceedings before Irish courts are hardly manageable. For this reason, we will not be filing a lawsuit in Ireland and unfortunately cannot continue with Travis Brown’s case.
What will happen next
Demands on the EU
We are approaching the European Commission with a letter. Our demand: Users must be able to sue social media platforms at their place of residence. The EU must ensure that the rights of users do not only apply on paper, but can actually be claimed by those affected under reasonable conditions. Social media platforms such as X have legal representation in Germany anyway and therefore have the means for trials in local courts. Affected parties often do not have these resources for a lawsuit abroad.
The EU Commission can and must act now. That’s why we are increasing pressure on EU policymakers to consistently implement the DSA to guarantee freedom of research.
Photo: Shutterstock/MDart10
The Travis Brown case
What it is about
In July 2023, X blocked the account of data analyst Travis Brown without warning. Brown, together with HateAid, then obtained a preliminary injunction against X, and his account was reactivated. The court’s reasoning: X had neither heard Brown nor given sufficient justification for blocking the account.
In October 2023, Brown’s account was blocked again – this time with an advance warning and citing alleged violations of the platform’s guidelines. Again with the support of HateAid, Brown filed a motion for a preliminary injunction. The result was surprising: the Berlin Regional Court dismissed the action, considering itself to lack jurisdiction. Brown appealed against this decision. HateAid covered the costs of the proceedings and legal representation as part of its legal costs funding.
Photo: HateAid
“It is extremely dangerous if independent research and reporting on social networks can no longer take place. Especially in the run-up to important elections, we often see right-wing extremists mobilising massively on social networks and spreading disinformation. We have supported this case with Travis Brown on behalf of all researchers who make public what goes on behind the scenes on the platforms. Because only if we know what is happening there and who is doing what can we protect our democracies.”
Anna-Lena von Hodenberg, CEO HateAid
Our goal
This is what we’re committed to
We want to show that platforms like X cannot arbitrarily block people they don’t like. With this landmark case, we want to strengthen the work of researchers in the long term. The data collected is crucial so that we can take action against disinformation and hate.
Researchers must therefore be protected from the platforms’ attempts at intimidation. So far, taking action against the platforms in these cases involves an enormous cost risk that many of those affected cannot bear. The EU Commission must therefore ensure that users are able to sue social media platforms at their place of residence. Otherwise, they will hardly be able to defend themselves against arbitrary decisions by the platforms due to the high hurdles.
Why now?
Critical voices are locked out
The leaders of social media platforms are increasingly locking out researchers and their content. In the USA, for example, X is suing the nongovernmental organisation Center for Countering Digital Hate for publishing several reports showing an increase in hate speech on the platform.
The organisation AlgorithmWatch even had to suspend a project on Meta’s Instagram platform in the face of a threatening lawsuit.
The danger: for fear of intimidation lawsuits, in which expensive lawyers smother them in hundreds of pages of briefs, researchers may soon stop publishing important information altogether.
But we urgently need these insights to counter hate, disinformation and extremism.
Photo: Wolf Lux / Alfred Landecker Foundation
“Social media platforms are part of the public space in which political opinion-forming and agenda-setting take place. That is why the platforms fulfil a democratic function – and we must not ignore it when extremists and antisemites mobilise there unhindered.”
Silke Mülherr, Co-CEO Alfred Landecker Foundation
Alfred Landecker Foundation
This landmark case is realised as part of the Landecker Digital Justice Movement – an initiative by HateAid and the Alfred Landecker Foundation.
Frequently asked questions concerning the lawsuit
Travis Brown is a software developer and data analyst from the USA. He has been active on X, formerly Twitter, since 2007 and now lives in Berlin.
Between 2014 and 2015, Brown himself worked as an open source advocate at X. Since 2021, Brown has been developing open-source programs that track hate speech on the platform and collect statistics using freely available data.
The results of his work are cited around the world by major news outlets such as the BBC and CNN.
Brown’s goal is to use the data collected to investigate the extent to which X has developed into a platform for the far-right scene.
A preliminary injunction is a provisional decision by a court in summary proceedings. It is intended to secure claims or settle disputed legal relationships until the court has reached a final decision.
In urgent cases, a preliminary injunction can be issued within a few days or weeks.
In this case, three jurisdictions come into consideration: Germany, Ireland and the USA. Travis Brown lives in Berlin, but created his account on X in the USA. In addition, the platform’s headquarters for the European region are located in Dublin.
For the question of jurisdiction, it is relevant where the place of performance for the contract between Brown and X is located and whether Brown is a consumer within the meaning of European law.
In both cases, these are very complex legal issues.
What is surprising here: in the first injunction proceedings, the Berlin Regional Court deemed itself competent. The second application, on the other hand, was rejected for lack of jurisdiction.
X argues that Brown violated platform policies by collecting and analysing data on the platform’s use and functioning. However, the open-source programs Brown developed do not draw on original data sets from the platform at all.
Instead, Brown works with data from the Wayback Machine, an internet archive that stores various versions and states of web pages from around the world.
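For readers curious what querying such an archive looks like in practice, here is a minimal, purely illustrative sketch. It builds a request URL for the Internet Archive’s public CDX API, which lists the stored snapshots of a web page; the target page and the helper function name are hypothetical examples, not part of Brown’s actual tooling.

```python
from urllib.parse import urlencode

# Public endpoint of the Internet Archive's CDX snapshot index.
CDX_ENDPOINT = "https://web.archive.org/cdx/search/cdx"

def build_snapshot_query(page_url: str, year: str, limit: int = 10) -> str:
    """Return a CDX API URL listing archived snapshots of page_url.

    Hypothetical helper for illustration: fetching this URL would
    return one row per stored snapshot of the page in the given year.
    """
    params = {
        "url": page_url,        # page whose archived history we want
        "from": year + "0101",  # earliest timestamp (YYYYMMDD)
        "to": year + "1231",    # latest timestamp
        "output": "json",       # machine-readable response format
        "limit": limit,         # cap the number of rows returned
    }
    return CDX_ENDPOINT + "?" + urlencode(params)

# Example: list up to 10 snapshots of an example page from 2023.
query = build_snapshot_query("example.com", "2023")
```

Because the archive holds independently stored copies of pages, researchers can study how content changed over time without scraping the live platform itself.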
For this reason, Brown’s research is not subject to the platform’s rules and therefore cannot violate them.
In fact, the action against Brown is a clear attempt at intimidation. Its aim is to drive researchers like Brown off the platform and make them stop their work.
The Digital Services Act is a new European Union regulation that aims to regulate online platforms and reduce the spread of disinformation and digital violence. Online platforms are, for example, social media or search engines that allow users to distribute their own content (e.g. Facebook, YouTube or Google).
Since the DSA came into force, these platforms have had to comply with new rules: for example, they must make it easier to report criminal content and submit regular transparency reports. In addition, each EU member state must set up its own Digital Services Coordinator to monitor compliance with the new regulations. The Digital Services Act has applied to all online platforms since February 2024. You can find our DSA user guide here.
Online platforms have an immense influence on our communication and information behaviour. However, at the same time, they promote the rapid spread of hate speech, disinformation and other forms of digital violence.
To counter these dangers, we need robust insights into the functioning and risks of online platforms and their algorithmic systems.
However, researchers are repeatedly denied access to the relevant data by the platforms. Article 40(4) of the DSA stipulates that platforms must allow researchers to access this data under certain conditions.
This is to ensure that research to identify systemic risks can actually be carried out.
However, this regulation is very restrictive and there are still many ambiguities, such as when and how researchers can request data access.