Section 230 Debate

This cluster focuses on debates about Section 230 of the Communications Decency Act, including its protections for online platforms from liability for user-generated content, the role of moderation, and misconceptions about the publisher vs. platform distinction.

Trend: 📉 Falling (0.3x) | Category: Legal
Comments: 3,524
Years Active: 18
Top Authors: 5
Topic ID: #6355

Activity Over Time

2008: 2
2010: 2
2011: 4
2012: 8
2013: 11
2014: 12
2015: 3
2016: 16
2017: 27
2018: 106
2019: 160
2020: 1,119
2021: 839
2022: 343
2023: 385
2024: 296
2025: 166
2026: 25

Keywords

e.g, FANNG, www.eff, LSB10306, TOS, LSB, E.g, CAN, EFF, CDA 230, section 230, section, liable, content, publisher, moderation, liability, platform, immunity

Sample Comments

tmaly · Sep 29, 2021

how does this jive with Section 230?

linuxftw · Jan 11, 2021

Section 230 makes them not liable. So, not really.

dwaltrip · Jun 30, 2020

My limited understanding is that Section 230 of the Communications Decency Act (which is apparently one of the most important laws for this topic), passed in 1996, provides very broad protections to web platforms:

1) They can't be held liable for user-generated content, e.g. Facebook can't be sued for a defamatory statement that I make in a post on their platform. A newspaper that authors and publishes an article making a similar defamatory statement could be held liable. I believe

nobody9999 · Jan 9, 2021

> Yes, i'm saying that this law/rule (or whatever it's called, I'm not from USA) should be changed.

It's a section of a law. Specifically, Section 230[0] of the Communications Decency Act of 1996.

> You should be able to choose a status, either "platform" or "published", and in the case of the first, you'd have to leave everything not directly illegal (with rules in place, how to contest the legality in court if the post was removed and

walthamstow · Apr 24, 2024

Wouldn't that position also invalidate section 230 protections?

krapp · Jun 6, 2023

Sorry, I thought you would actually read it, forgot where I was for a second - Section 230 protection doesn't require a platform to act as a "common carrier" or to only moderate strictly illegal content, so it would not "come into play" under the circumstances you described.

DeonPenny · Oct 16, 2020

Except for the fact that section 230 argues they shouldn't be censoring anything. So they should be liable to be sued for doing things like this like any other publication.

owl_troupe · Sep 13, 2021

The distinction lies in whether the service provider has rendered themselves a "publisher" under 230. The protection has historically been broadly interpreted but, in theory, Facebook could lose the protection if it chose, selectively, what content to promote or remove in violation of its own public TOS. Generally: https://crsreports.congress.gov/product/

gonehome · Jul 24, 2020

The person you’re replying to is mistaken - 230 protects owners by allowing them to moderate; without it the existing legal precedent would force them to do nothing, because any moderation would make them responsible for everything (as was the case before 230).

This point is often confused and misunderstood. People think without 230 owners would be liable, but they’d only be liable if they moderated (so they wouldn’t). Without 230 owners would not moderate, which would be worse for everyone

basch · Oct 30, 2024

You have it backwards. 230 is about who is NOT liable. Platforms are NOT liable for what they don't moderate, just because they moderated other things. It protects them from imperfect moderation being used to claim endorsement.