Digital platforms as private governance systems
This project aims to identify and analyse challenges created by digital platforms as private governance systems from both private and public law perspectives, including consumer protection, anti-discrimination, contractual fairness, human rights and access to justice.
Digital platforms play an increasingly important role in society. So-called collaborative platforms are used for commercial exchange but also for providing services of broader societal interest, for instance in the housing and transportation markets, and social platforms such as Facebook serve as communication channels for billions of people around the world. Moreover, platforms appear to create their own private norm systems: they define the rules applicable on the platform, implement these rules through rating systems and user exclusion, and offer dispute resolution mechanisms to their users. By taking on the roles of regulator, implementer and dispute resolution body, platforms undertake governance functions comparable to those of a state.
Platform private governance systems have been met with regulatory and other governmental push-backs of different kinds. More recent examples include the EU Regulation on platform-to-business relations (Regulation 2019/1150), establishing rules on unfair contract terms in the platform-to-business relationship; the EU Copyright Directive (Directive 2019/790), imposing new obligations on online content-sharing service providers; the judgment of the CJEU in Case C-18/18 (Glawischnig-Piesczek) (2019), supporting injunctions against Facebook with worldwide effect; the German Network Enforcement Act (Netzwerkdurchsetzungsgesetz), aiming to combat agitation and fake news in social networks; and the expected proposal for a revision of the liability exemption rules in the e-Commerce Directive in a “Digital Services Act” (2020). There are also examples of consensual agreements between states and platforms to support important state policies, such as the collaboration agreement between the Danish Tax Authority and Airbnb, which requires Airbnb to share certain information with the Danish Tax Authority.
It is the hypothesis of this project that digital platforms create challenges in both private and public law. The aim of the project is to identify these challenges and to analyse them through the lenses of private governance.
Social Media Platforms – entangled between private and public regulation of online offensive speech
Giant social media platforms, such as Facebook and Twitter, have increasingly become “gatekeepers” of the online information flow due to their essential role in regulating online speech. These private entities have built their own governance systems to address objectionable online content and perform three roles simultaneously, acting as: (1) a legislature, defining what constitutes legitimate content on the platform; (2) a judge, determining the legitimacy of content; and (3) an administrative agency, acting on adjudications to block illegitimate content. Under these private governance systems, private entities now perform a task that traditionally belonged to the state – governing free speech. Concurrently, governments around the world put pressure on these online intermediaries to remove undesirable site content in order to suppress hate speech, defamatory statements, privacy violations, etc.
The primary objectives of this PhD project are to examine the interplay between private and public regulation of online offensive speech and to uncover the circumstances under which online intermediaries such as social media platforms can be held liable. The project will focus on online intermediary liability for defamation, but will also examine how speech types closely related to defamation, such as hate speech, privacy violations and “fake news”, are regulated under the private governance systems of social media platforms. In short, the project will explore on what basis online intermediaries can be held liable to third parties for “under-removal” of illicit content as well as “over-removal” of protected speech.
Limiting online access to legal content in the European Union
This PhD project focusses on how user-term agreements on (large-scale) online platforms limit users' protection under European fundamental rights. It takes both a holistic approach to defining the current online world in a legal sense (as a public utility, a service, or a hybrid) and a more dogmatic approach to answering whether private companies can exclude users' lawful content through unilaterally imposed contracts (user terms).
In the latter part, the first focus is on the possibility of excluding a person from a platform through the application of user terms, and on whether such an application could be regarded as an unfair contract term; the second focus is on anti-discrimination principles and the obligation to contract.
For a general introduction to digital platforms as private governance systems, see
- Clement Salung Petersen, Vibe Ulfbeck & Ole Hansen, 'Platforms as Private Governance Systems – The Example of Airbnb', Nordic Journal of Commercial Law No 1 (2018)
PI: Vibe Garf Ulfbeck, Professor, Director of Centre
Faculty of Law
University of Copenhagen
South Campus, Building: 6B.3.64
Karen Blixens Plads 16
DK-2300 Copenhagen S
Phone: +45 35 32 31 48