Platform Content Liability

The cluster debates legal protections like Section 230 and DMCA safe harbors for online platforms hosting user-generated content, questioning whether platforms should be liable for illegal user posts or required to moderate proactively.

Trend: 📉 Falling (0.4x)
Category: Legal
Comments: 4,861
Years Active: 20
Top Authors: 5
Topic ID: #8459

Activity Over Time

2007: 1
2008: 10
2009: 33
2010: 97
2011: 70
2012: 147
2013: 94
2014: 100
2015: 112
2016: 143
2017: 251
2018: 395
2019: 503
2020: 631
2021: 729
2022: 482
2023: 358
2024: 337
2025: 350
2026: 18

Keywords

MS ISP FB US allreviews.com reuters.com AWS CDA GBP niemanlab.org content liable legal illegal host platform generated content dmca hosting law

Sample Comments

sosuke · Nov 11, 2016

I think what you're missing is the safe harbor parts of the DMCA in the USA. They offer protection for any user-generated content on the web, provided the operators respond to DMCA requests within a certain time (24h?). Without these safe harbor provisions, the idea is that the web would crumble under the weight of liability. Think about shared hosting companies, AWS, a public forum like this, and then consider how much effort it would take to police the amount of content being created by users.

8note · Sep 18, 2022

Sounds like a weird requirement. If they host the content, they should be able to be sued, minus Section 230 protections.

dahfizz · Jul 16, 2021

Platforms are not publishers. The publisher of these things should face legal action (including the removal of their content). It's not for the platform to pick and choose.

numpad0 · Jan 12, 2021

Boom, you're a social media company. And you'll be naturally liable for hosting content. Isn't that how Winny worked?

leptoniscool · Dec 17, 2020

The platform shouldn't be held responsible for their content, similar to social media right?

sieabahlpark · Oct 15, 2020

Have it removed so no other company can get to their scale? We always have to have a way to defer liability to the user who posted the content. If a website is always responsible for whatever its users post, you won't have much of a free internet anymore. However, I think the issue with FB and Twitter is that they haven't had any repercussions for breaking the law in the first place.

root_axis · Jun 6, 2019

This is wrong. The law very explicitly states that companies are not liable for content uploaded by users and are also legally able to moderate their platform as they see fit. Google section 230.

cbsmith · Jan 11, 2021

Yeah, but I think this is a case where it is at best illegal because of a belief that they cannot adequately police their content... that seems like the kind of thing you could give them a lot more lead time on and give them some practical transition options (like, your content isn't deleted, but we're not going to route any traffic to public IPs). I mean, it's a belief, not even something that they have really evaluated and dealt with legal consequences for. You might think they are enti…

ajsnigrutin · Jan 9, 2021

Yes, I'm saying that this law/rule (or whatever it's called, I'm not from the USA) should be changed. You should be able to choose a status, either "platform" or "publisher". In the case of the first, you'd have to leave up everything not directly illegal (with rules in place for how to contest the legality in court if a post was removed and the author thinks it's legal speech - sort of like with DMCA claims); or you're a publisher, where you c…

ineedasername · Jan 10, 2021

You're talking about services that are "common carriers". They have special legal obligations to, within reason, facilitate every legal transaction that comes their way. The scenarios you describe therefore cannot happen with them. If you want internet platforms to have that same level of "anything goes", then you're arguing for them to be classified as common carriers.