These are strange days for people who care about trust and safety on platforms. Historically, many people have suggested that either more effective central moderation (a platform owner intervening directly in policing the content of the platform) or better decentralized moderation (allowing users to curate their spaces through community-driven moderation) could pave the way to a better social media landscape—or, ideally, some alchemically balanced combo of the two. But, in true Silicon Valley fashion, one platform is centralizing in the worst way possible, while the other is decentralizing catastrophically.
Of late, Reddit and Bluesky have been showing how to fail at both: one in pursuit of an IPO that is destroying the very thing that made the site valuable to its users; the other in pursuit of a dream of decentralization that quickly tarnished the site and threw its grandiose claims into doubt.
The problems at Reddit are complex, but, in brief, the company decided to charge users for access to its application programming interface (API), which had been free since 2008. The financial motivations for this, its knock-on effects on the site’s army of volunteer subreddit moderators, and how poorly Reddit has handled the situation, taken together, constitute a crisis for the site. The original API change prompted a mass “blackout,” in which moderators restricted access to their subreddits, blocking off large, popular parts of the site from users as a kind of strike.
For mods, the stakes were especially high. The API changes threatened to gut the third-party applications and bots that had made their jobs significantly easier—popular third-party mobile apps for reading Reddit like RiF or Apollo were especially useful for the visually impaired moderators of /r/Blind due to their accessibility features. Mobile moderation features, in particular, were hit hard, and Reddit has further limited access to its own internal tool, AutoModerator. Now, in the words of longtime /r/GirlGamers moderator Jaime Klouse, volunteer mods are “thrown back to 2015 when there were no useful tools and you had to moderate by literally reading every single comment submitted to your communities.”
Reddit has always been a byword for toxicity, but /r/GirlGamers, an inclusive gaming community, was one of several subreddits that provided an alternate model, through careful enforcement of collaborative norms guided by a strong sense of ethics.
With a rather apposite metaphor, moderator Swan Song described the tools that free API access had made possible. “We weren’t just volunteer knights, we were volunteer blacksmiths and armorers too, crafting our own powerful tools to aid us in defense efforts,” she told me. Another mod, iLuffhomer, said that Reddit’s saving grace was that it “allowed us to moderate as we saw fit. Now, it feels like Reddit doesn’t respect what we do.”
If a community is to moderate itself, giving its regular users a stake in the day-to-day happenings of their online watering hole, it stands to reason that “empowering” them (that much-beloved corporate buzzword for outsourcing responsibility) requires giving them the tools to do so. For the moment, Reddit’s moves seem designed to retain only a veneer of community moderation, undermining the very things that made that moderation, and the site, worthwhile. It’s a warning to other, newer sites that will one day go in search of more money as well.