Intermediary Liability

The legal question of when a platform, application, or service becomes responsible for content created, shared, or transmitted by its users — rather than only the users themselves being responsible.


What is it?

Intermediary liability deals with a fundamental question for anyone building a platform: If a user does something harmful through your system, are you liable?

The answer depends on what your platform does with user content. There’s a spectrum from passive conduit (like a postal service delivering a sealed letter) to active publisher (like a newspaper choosing what to print). Where your application falls on this spectrum determines your legal exposure.1

This question has become central to modern technology law. The internet was built on the assumption that platforms are neutral intermediaries — Section 230 of the US Communications Decency Act famously established that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” But this US-centric model is being challenged worldwide. The EU’s Digital Services Act (DSA), fully applicable since February 2024, creates a tiered system of responsibilities. Switzerland has no Section 230 equivalent at all — liability follows from general criminal and civil law principles.2

For developers, this means that architectural decisions — whether your app sends messages on behalf of users, whether it moderates content, whether it amplifies some content over others — directly determine legal risk.

In plain terms

Intermediary liability is like the difference between being a postman and a newspaper editor. A postman delivers letters without reading them — if a letter contains threats, the postman isn’t liable. A newspaper editor chooses what to publish — if an article is defamatory, the newspaper is liable. Your platform sits somewhere between these two extremes, and your design choices determine where.


How does it work?

The three tiers of intermediary (EU DSA model)

The DSA establishes three categories of intermediary service, each with increasing obligations:3

1. Mere conduit

The service transmits information without selecting or modifying it (like an ISP or VPN).

  • No liability for the transmitted content
  • Must not initiate the transmission, select the receiver, or modify the content
  • No obligation to monitor

2. Caching

The service temporarily stores information to make transmission more efficient (like a CDN).

  • No liability if it doesn’t modify the content and complies with removal requests
  • Must act expeditiously to remove or disable the cached copy once notified that the content has been removed or disabled at the original source

3. Hosting

The service stores information provided by a user at their request (like a social platform, forum, or file host).

  • Conditional immunity — not liable for stored content IF:
    • It does not have actual knowledge of illegal content, AND
    • Upon obtaining knowledge, it acts expeditiously to remove or disable access
  • This is where most applications fall

Think of it like...

A landlord is not responsible for what tenants do inside their apartments — unless the landlord knows about illegal activity and does nothing. The “knowledge + inaction = liability” formula applies to platform hosting in the same way.
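The hosting safe-harbour condition above (“knowledge + inaction = liability”) can be sketched as a tiny predicate. This is an illustrative model, not a legal test — the names and the two-field structure are assumptions made for the sketch:

```python
from dataclasses import dataclass

# Illustrative model of the DSA hosting safe harbour: a host keeps its
# conditional immunity while it lacks actual knowledge of illegality,
# or if it acts expeditiously once it gains that knowledge.

@dataclass
class StoredItem:
    flagged_illegal: bool        # platform has actual knowledge of illegality
    removed_expeditiously: bool  # access was disabled promptly after notice

def hosting_immunity(item: StoredItem) -> bool:
    """Return True if the host retains conditional immunity for this item."""
    if not item.flagged_illegal:
        return True  # no actual knowledge: immunity holds
    return item.removed_expeditiously  # knowledge + inaction = liability

print(hosting_immunity(StoredItem(flagged_illegal=False, removed_expeditiously=False)))  # True
print(hosting_immunity(StoredItem(flagged_illegal=True, removed_expeditiously=False)))   # False
```

In practice “actual knowledge” and “expeditiously” are contested legal standards, not booleans — the sketch only captures the shape of the rule.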

The curation problem

Here’s where it gets complicated for modern applications. If your platform recommends, ranks, amplifies, or algorithmically curates user content, you may be moving from “hosting” toward “publishing.”4

| Platform action | Liability implication |
| --- | --- |
| Store and display user content chronologically | Hosting — conditional immunity |
| Rank content by engagement metrics | Curation — increased scrutiny |
| Recommend content based on user profiles | Active curation — potential publisher status |
| Generate content using AI | Publisher — full liability |
| Send content to third parties on behalf of users | Transmission with amplification — high risk |

Developer rule of thumb

The more your platform shapes, selects, or amplifies content rather than passively displaying it, the closer you move to publisher liability. Every algorithmic decision is a potential editorial decision in the eyes of the law.

Switzerland: no safe harbour

Switzerland has no Section 230 and no DSA. Intermediary liability is instead determined by general civil law (Code of Obligations, Art. 41 ff.) and criminal law (StGB Arts. 173–174 on defamation and calumny, Art. 180 on threats).2

Key implications:

  • No blanket immunity for platforms
  • Knowledge + inaction can create civil and criminal liability
  • Facilitating harmful communications (providing the tool and the audience) may be sufficient for accessory liability
  • The 2025 draft platform law aims to create DSA-like obligations for Swiss platforms

The notice-and-action framework

Under the DSA (and increasingly expected elsewhere), platforms must implement a process for handling reports of illegal content:3

  1. Notice — receive reports from users or authorities
  2. Assessment — evaluate whether the content is illegal
  3. Action — remove, disable, or restrict access to illegal content
  4. Transparency — inform the content creator and publish transparency reports
  5. Appeal — provide a mechanism for content creators to contest removal
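The five steps above map naturally onto a processing pipeline. The sketch below is a hypothetical minimal handler — the `Notice` structure, function names, and log strings are all invented for illustration:

```python
from dataclasses import dataclass, field

# Hypothetical notice-and-action pipeline following the five DSA steps.
# All names and structures are illustrative assumptions.

@dataclass
class Notice:
    content_id: str
    reporter: str
    reason: str
    log: list[str] = field(default_factory=list)

def handle_notice(notice: Notice, is_illegal) -> Notice:
    notice.log.append("notice received")                              # 1. Notice
    illegal = is_illegal(notice)                                      # 2. Assessment
    notice.log.append(f"assessed illegal={illegal}")
    if illegal:
        notice.log.append(f"disabled access to {notice.content_id}")  # 3. Action
    notice.log.append("creator informed; transparency report queued") # 4. Transparency
    notice.log.append("appeal mechanism offered to creator")          # 5. Appeal
    return notice

result = handle_notice(Notice("post-42", "user-7", "threats"),
                       lambda n: n.reason == "threats")
print(result.log)
```

Note that steps 4 and 5 run regardless of the assessment outcome: the transparency and appeal obligations attach to the process, not only to removals.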

Concept to explore

When AI generates the content on your platform, intermediary liability intersects with AI content liability — you may be both the platform and the “author.” See ai-content-liability.


Why do we use it?

Key reasons

1. It determines your legal exposure. Understanding where your platform sits on the liability spectrum is essential for risk management. A wrong architectural choice can turn your entire platform into a source of legal exposure.

2. It shapes product design. “Build the feature, handle liability later” doesn’t work. The decision to send messages on behalf of users versus letting users send them themselves is a product decision with legal consequences.

3. It’s evolving rapidly. The DSA, the 2025 Swiss draft platform law, and ongoing US Section 230 reform mean the rules are actively changing. Building with liability awareness means building adaptably.


When do we use it?

  • When your platform stores, displays, or transmits user-generated content
  • When users can send messages, comments, or reviews through your system
  • When your platform uses algorithms to rank, recommend, or surface content
  • When you provide tools that users can employ to communicate with others
  • When AI generates content that is displayed to or sent to third parties
  • When designing content moderation policies and processes

Rule of thumb

If content flows through your platform from one person to another, you’re an intermediary. The question is then: are you a passive pipe or an active participant? Design accordingly.


How can I think about it?

The venue owner analogy

You own a venue where people give speeches. As a venue owner:

  • If someone gives a hate speech, you’re not liable if you didn’t know and had no reason to know
  • If you were told in advance and did nothing, you may be liable for providing the platform
  • If you curated the speakers, set the agenda, and promoted the event, you’re a co-publisher — fully liable

Your application is the venue. Your users are the speakers. Your algorithms are your event programming. The more you curate, the more you’re responsible.

The courier vs ghostwriter analogy

A courier delivers a sealed package — they don’t know what’s inside, and they’re not liable for the contents. A ghostwriter creates the content on behalf of someone else — they share responsibility for what’s written.

Most platforms are somewhere in between:

  • Courier = mere conduit (ISP, email relay)
  • Courier who opens and sorts packages = hosting with algorithmic curation
  • Ghostwriter = AI-generated content sent under the user’s name

Your design choices determine which role your platform plays.


Concepts to explore next

| Concept | What it covers | Status |
| --- | --- | --- |
| ai-content-liability | What happens when the platform itself generates the content | complete |
| algorithmic-transparency | How content curation algorithms affect liability | complete |
| personal-data-protection | Moderation processes involve processing personal data | complete |

Some cards don't exist yet

A broken link is a placeholder for future learning, not an error.


Where this concept fits

Position in the knowledge graph

graph TD
    A[Data Governance] --> B[Intermediary Liability]
    A --> C[AI Content Liability]
    A --> D[Algorithmic Transparency]
    B --> E[Content Moderation]
    B --> F[Notice and Action]
    B --> G[Safe Harbour Provisions]
    style B fill:#4a9ede,color:#fff

Footnotes

  1. Wray Castle. (2026). Intermediary Liability: Rules, Risks and Reforms in the Digital Age. Wray Castle.

  2. Swiss Criminal Code (StGB) Arts. 173, 174, 180; Code of Obligations (OR) Art. 41 ff., as referenced in the legal compliance analysis for pol.yiuno.org (2026).

  3. Kinstellar. (2026). The Digital Services Act: An Overview of the New Online Intermediary Liability Rules. Kinstellar.

  4. Scelta, D. (2026). Towards a New Paradigm of Platforms’ Liability. MediaLaws.