Content Moderation Challenges
This cluster centers on the difficulties of content moderation at scale on platforms like Facebook, Twitter, and YouTube, including high labor costs, psychological trauma for human moderators, scalability issues, and debates over AI versus human approaches.
Sample Comments
You're aware that this will result in _more_ content being moderated off the sites, right?
Some outsource the problem: https://www.wired.com/2014/10/content-moderation/
You're right. (But they do their own doubleplusgoodthink very well, methinks...) PS: More seriously, this is a concerning and latent issue. Moderation is a "cost" I am sure they'd like to streamline, and AI scales, as opposed to poor, psychologically "breakable" humans now moderating... so I am sure they'll unavoidably come up with something.
Never thought of it this way, but it is a content moderation problem, isn't it? The human collateral for moderation / quality enforcement is just ferocious on the internet, whether it's porn or NSFL content.
How does the content moderation work?
Content moderation is a labor problem, not a technical or "operations" problem. There's simply not enough profit in any online forum/platform to cover the labor costs of effective moderation, which is why Facebook/Google use AI-based pre-review of reports, plus very cheap labor from countries where both language and cultural barriers lead to lots of problems. How are you intending to fix that? Especially since you have a lot of extra overhead because you have to handle
I think this article glosses over some big issues with user generated content: auto-moderation is extremely fickle, humans are expensive, and if your platform is large enough people will upload horrifying stuff. There have been several stories about the poor treatment of content moderators by companies like Google (YouTube) and Facebook, but to reiterate: the work can be literally traumatizing. Content moderators will be exposed to gore, child abuse, and worse, frequently. If your startup will
I don't want to be a naysayer, but how would content moderation regarding spam and illicit things work in this scenario?
Content moderation at scale is hard.
Naive comment ignoring other obvious problems with human moderation: https://www.thebureauinvestigates.com/stories/2022-10-20/beh...
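The AI-based pre-review described in the comments above (a classifier auto-actions the confident cases and routes only ambiguous reports to the costly human queue) can be sketched as below. All names and thresholds here are illustrative assumptions, not any platform's actual system or API.

```python
from dataclasses import dataclass

# Assumed thresholds for illustration only: near-certain violations are
# removed automatically, near-certain benign reports are dismissed.
AUTO_REMOVE_THRESHOLD = 0.95
AUTO_DISMISS_THRESHOLD = 0.05

@dataclass
class Report:
    content_id: str
    violation_score: float  # classifier's estimated probability of a policy violation

def triage(report: Report) -> str:
    """Route a report: auto-action the confident ends, queue the rest for humans."""
    if report.violation_score >= AUTO_REMOVE_THRESHOLD:
        return "auto_remove"
    if report.violation_score <= AUTO_DISMISS_THRESHOLD:
        return "auto_dismiss"
    return "human_review"  # the expensive, potentially traumatizing queue

reports = [Report("a", 0.99), Report("b", 0.01), Report("c", 0.60)]
print([triage(r) for r in reports])  # → ['auto_remove', 'auto_dismiss', 'human_review']
```

The design tension the comments raise lives in the middle band: widening the auto-action thresholds cuts labor costs but increases wrongful removals and missed abuse, while narrowing them pushes more traumatizing material onto human reviewers.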