What Reddit Got Wrong | Electronic Frontier Foundation

Content moderation doesn’t work at scale. Any scheme that attempts it is bound to fail. For sites that need continuous user growth, that is a problem. So what can they do? Well, we know what doesn’t work:

Simply having minimal or no moderation results in a trash fire of bigotry and illegal content, quickly hemorrhaging revenue and potentially landing the platform in legal trouble.
Automating moderation inevitably blocks legitimate content it was never meant to target, while bad actors quickly learn to game their way around it (see the sketch below).
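
A toy sketch (entirely hypothetical, not from the article) makes both failure modes concrete: a naive substring blocklist flags innocent posts that happen to contain a banned term (the classic Scunthorpe problem), while a trivial character swap lets the targeted content slip right past it.

```python
# Hypothetical example: a naive substring blocklist filter.
BLOCKLIST = {"grape"}  # stand-in for a banned term


def naive_filter(post: str) -> bool:
    """Return True if the post should be blocked."""
    text = post.lower()
    return any(term in text for term in BLOCKLIST)


# False positive: an innocent post gets blocked because it contains the term.
print(naive_filter("I love grapefruit juice"))  # True -- blocked anyway

# Evasion: a bad actor dodges the filter with a one-character swap.
print(naive_filter("gr4pe"))  # False -- slips through
```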

Every approach comes to the same conclusion: a platform needs workers, lots of them, around the clock. Sites are then stuck trying to minimize that labor cost. The worst version of this is a system of poorly paid, typically outsourced workers who merely review user reports and automated moderation decisions. These mods invisibly compare out-of-context posts against a set of ever-changing, arbitrary rules. It’s grueling work in which they see only the worst the internet has to offer while remaining totally alienated from the community.

Yep. This is the key thing!
#reddit #socialMedia #platforms

https://www.eff.org/deeplinks/2023/06/what-reddit-got-wrong