Sunday, October 05, 2025

CfP: Conference, "Online Platforms: Private Actors with State-Like Power?", organized by Maria Diory Rabajante and Daniel Buchmann, Max Planck Institute for the Study of Crime, Security and Law
I am delighted to pass along this CfP for the conference "Online Platforms: Private Actors with State-Like Power?", organized by Maria Diory Rabajante and Daniel Buchmann of the Max Planck Institute for the Study of Crime, Security and Law.

📚 CALL FOR ABSTRACTS 📢

According to a famous quote by Mark Zuckerberg, the online platform Facebook is more like a government than a company. Many scholars have picked up on this comparison, calling online platforms the "new governors" or "speech police" of the digital public sphere. These remarks underscore the increasing role of online platforms in regulating speech, enforcing laws, and shaping political and economic landscapes—functions traditionally associated with sovereign states. As these platforms wield unprecedented power not only in influencing human behavior and interactions, but also in shaping public discourse, political economy, and democratic processes, critical questions arise about their impact on fundamental rights, governance, and democracy.

Our conference, entitled "Online Platforms: Private Actors with State-like Power?", explores the extent to which platforms like Meta, Google, X, and TikTok exercise state-like powers. Through keynote speeches, panels, and discussions, we will examine issues such as:
• regulation of online speech,
• the platforms’ influence on public discourse and democratic processes including elections,
• the platforms' influence on the exercise of fundamental rights and the issue of effective fundamental rights protection in the digital sphere,
• regulatory approaches ranging from self-regulation to “hard” governmental regulation, and
• the role of private platforms as regulators and norm-enforcers.

By fostering discussions across various controversial yet timely topics surrounding the enormous power of online platforms, this conference aims to critically investigate these platforms’ responsibilities in the digital age and chart pathways toward more transparent, accountable, and equitable governance in the digital public sphere. Contributions can focus on any aspect of the issue, including EU law, other legal frameworks, policy issues, and theoretical foundations.

The conference will take place on 21–22 May 2026 at the Max Planck Institute for the Study of Crime, Security and Law in Freiburg, Germany. You can apply by sending an abstract (max. 600 words) as a PDF to the organizers' e-mail addresses (m.bovermann@csl.mpg.de; m.rabajante@csl.mpg.de; d.buchmann@csl.mpg.de) by 19 December 2025 at the latest. Funding for travel and accommodation within the EU is available to all speakers, subject to confirmation by the Max Planck Institute for the Study of Crime, Security and Law. More information is available in the Call for Abstracts below.

Maria Diory Rabajante, Daniel Buchmann, Max Planck Institute for the Study of Crime, Security and Law, Max Planck Law
The CfP follows below.

Mark Zuckerberg once famously remarked that “in a lot of ways Facebook [was] more like a government than a traditional company”.[1] This comment captures the growing role of online platforms in regulating speech, enforcing laws, and shaping political and economic landscapes—functions historically reserved for sovereign states. Today, platforms wield unprecedented power not only in influencing human behavior and interaction, but also over public discourse, political economy, and democratic processes. This raises urgent questions about their impact on fundamental rights, governance, and democracy.

Scholars have described these companies as “new governors of online speech,”[2] “guardians,”[3] “custodians of the internet,”[4] and “speech police.”[5] Some have even suggested that online platforms could soon be considered actual states.[6] While such claims may veer toward internet exceptionalism,[7] they usually lead to the opposite conclusion: the depiction—sometimes exaggeration—of platform power is frequently invoked to justify more expansive governmental regulation.

[1] David Kirkpatrick, The Facebook Effect (Simon & Schuster 2010) 254.
[2] Kate Klonick, ‘The New Governors’ (2018) 131 Harvard Law Review 1600, 1669.
[3] Niva Elkin-Koren and Maayan Perel, ‘Guarding the Guardians: Content Moderation by Online Intermediaries and the Rules of Law’ in Giancarlo Frosio (ed), Oxford Handbook of Online Intermediary Liability (OUP 2020) 669.
[4] Tarleton Gillespie, Custodians of the Internet (Yale University Press 2021).
[5] David Kaye, Speech Police (Columbia Global Reports 2019).
[6] Moritz Holzgraefe and Nils Ole Oermann, Digitale Plattformen als Staaten (Herder 2023) 220 ff.
[7] In the sense of John Perry Barlow, A Declaration of the Independence of Cyberspace (Electronic Frontier Foundation 1996).

Indeed, recent initiatives, such as the European Union’s Digital Services Act (DSA), the United Kingdom’s Online Safety Act, and Canada’s Online Harms Act, ostensibly aim to curb online platforms’ powers and strip them of any “state-like” position they may hold. But what if these laws actually do the opposite?[8] By recognizing and codifying existing private powers—through mandated appeals procedures, transparency requirements, and risk assessments—they may inadvertently strengthen and legitimize platform authority within legal frameworks. Far from dismantling platform control, these measures often formalize it, granting platforms a recognized role in governance.

In doing so, regulation frequently portrays and treats online platforms as (de facto) exercising the traditional tripartite functions of the state—legislative, executive, and judicial—within their own digital territories, inhabited by billions of users across the globe. The difference is that they do so without a separation of powers, and under a private framework driven by corporate interests rather than public mandates.

Executive Function
Beyond setting rules, platforms perform an “executive” function by enforcing them on an industrial scale. They remove content and restrict or even suspend user accounts. This executive function relies heavily on automated systems that filter, flag, and remove content for policy violations, supplemented by human content moderators who review escalated cases. The scale and speed of these operations exceed the capacity of most regulators, positioning platforms as powerful enforcers of online norms. Unlike states, however, their actions are not guided by principles such as the protection of fundamental rights, the rule of law, or democratic participation, but by incentives to maximize profits. This divergence explains persistent concerns over the “overblocking”[9] of content, where lawful content is disproportionately removed in an attempt to comply with regulations in the least resource-intensive way possible.
[8] Cf. Petros Terzis, ‘Against Digital Constitutionalism’ (2024) 3 European Law Open 336, 346.
[9] Especially surrounding the German Netzwerkdurchsetzungsgesetz, see e.g. Mathias Hong, ‘The German Network Enforcement Act and the Presumption in Favour of Freedom of Speech’ (Verfassungsblog, 22 January 2018) <https://verfassungsblog.de/the-german-network-enforcement-act-and-the-presumption-in-favour-of-freedom-of-speech/> accessed 13 August 2025.

Legislative Function
In their “legislative” capacity, platforms unilaterally create rules through instruments such as Community Standards and Terms of Service. They define what is allowed and not allowed within their spaces. These platform “laws” generally apply uniformly across jurisdictions, with little regard for local legal or cultural differences. Often, platforms go beyond state laws by prohibiting “lawful but awful” speech—content that is legally protected yet perceived as problematic or harmful.[10] While merely contractual in nature, these rules exert influence far beyond mutual rights and obligations.

Moreover, platforms legislate through a more subtle but also more powerful mechanism: code.[11] Their algorithms and user interface designs regulate user behavior and determine which information becomes visible, amplified, or marginalized.

Judicial Function
Completing the tripartite structure of governance, platforms also exercise “judicial” functions. In response to the growing volume of problematic speech online and mounting public pressure, they have developed internal complaint and appeal mechanisms. Meta’s Oversight Board represents a particularly sophisticated model of adjudication and appellate review: an independent body designed to hear appeals from users challenging Meta’s content moderation decisions. Sometimes colloquially dubbed Meta’s “Supreme Court”, the Oversight Board issues decisions that set precedents and influence Meta’s broader content policies. In the EU, the emergence of out-of-court dispute resolution bodies under Article 21 of the DSA, such as User Rights and Adroit, adds a new dimension and poses the threat of a parallel judicial system.

[10] Simon Chesterman, ‘Lawful but Awful: Evolving Legislative Responses to Address Online Misinformation, Disinformation, and Mal-Information in the Age of Generative AI’ (2025) 20 The American Journal of Comparative Law <https://doi.org/10.1093/ajcl/avaf020> accessed 13 August 2025.
[11] Cf. Lawrence Lessig, Code and Other Laws of Cyberspace (Basic Books 1999).

Call for Abstracts
The concentration of legislative, executive, and judicial powers within online platforms raises profound questions about how they influence society. The assumption of state-like powers by companies such as Meta, Google, X, and TikTok invites critical examination of their impact on fundamental rights, governance, and democracy.

This conference explores the extent to which major platforms exercise functions historically associated with sovereign states, and the implications this has for law and policy. Through keynote speeches, panels, and discussions, we will examine how all three branches of state power are exercised in the digital sphere, and the legal consequences this might have.

We welcome abstracts engaging with, but not limited to, the following themes:
• Regulation of online speech and the privatization of content governance
• Platforms’ influence on public discourse, democratic processes, and electoral integrity
• The platforms’ impact on the exercise and protection of fundamental rights in the digital sphere
• Different regulatory approaches such as self-regulation, “hard” governmental regulation, and co-regulation
• The evolving role of platforms as private regulators and norm-enforcers
• Platform accountability and user redress mechanisms, and their effectiveness
• Legal and theoretical implications of the blurring boundaries between “public” and “private” power in platform regulation
• The ways in which regulation may legitimize, rather than constrain, platform power

By fostering discussion on these timely and often controversial issues, the conference aims to critically investigate platforms’ responsibilities in the digital age and chart pathways toward more transparent, accountable, and equitable governance in the digital public sphere. Contributions may engage with any relevant aspect, including but not limited to EU law, other domestic or regional legal frameworks, comparative or international law, policy debates, and theoretical or doctrinal foundations.

The conference is designed to bring together early career researchers. We especially encourage applications from current PhD candidates. Abstracts should not exceed 600 words and should be submitted (as PDF) to the organizers’ emails (m.bovermann@csl.mpg.de; m.rabajante@csl.mpg.de; d.buchmann@csl.mpg.de) by 19 December 2025 at the latest. Please use the subject line “Early Career Platform Regulation Conference 2026.”

We plan to publish the papers presented at the conference and are in the process of identifying suitable publication venues. Papers will be around 6,000 words in length, depending on publisher requirements. Funding for travel and accommodation is, in principle, available for all speakers, subject to confirmation by the Max Planck Institute for the Study of Crime, Security and Law.

Key Facts

Date: 21–22 May 2026 (full days)

Place: Max Planck Institute for the Study of Crime, Security and Law, Fürstenbergstraße 19, 79100 Freiburg, Germany

Organizers: Marc Bovermann (m.bovermann@csl.mpg.de); Maria Diory Rabajante (m.rabajante@csl.mpg.de); Daniel Buchmann (d.buchmann@csl.mpg.de)

Deadline for Abstracts: 19 December 2025

Notification of Acceptance: 7 January 2026

Deadline for Completed Papers: 7 May 2026
