
Reddit’s I.P.O. Is a Content Moderation Success Story

A decade ago, no one in their right mind would have put “Reddit” and “publicly traded company” in the same sentence.

At the time, Reddit was known as one of the darkest parts of the internet — an anything-goes forum where trolls, extremists and edgelords reigned. Light on rules and overseen by an army of volunteer moderators, Reddit — which once hosted popular communities devoted to nonconsensual pornography, overt racism and violent misogyny, among other topics — was often spoken of in the same breath as online cesspools like 4chan and Something Awful.

Few could have predicted back then that Reddit would eventually clean up its act, shed its reputation for toxicity and go public, as it is expected to do on Thursday at a $6.4 billion valuation.

Today, Reddit is a gem of the internet, and a trusted source of news and entertainment for millions of people. It’s one of the last big platforms that feel unmistakably human — messy and rough around the edges, sure, but a place where real people gather to talk about real things, unmediated by algorithms and largely free of mindless engagement bait. Many people, me included, have gotten into the habit of appending “Reddit.com” to our Google searches, to ensure we actually get something useful.

There are a lot of lessons in Reddit’s turnaround. But one of the clearest is that content moderation — the messy business of deciding what users are and aren’t allowed to post on social media, and enforcing those rules day to day — actually works.

Content moderation gets a bad rap these days. Partisans on the right, including former President Donald J. Trump and Elon Musk, the owner of X, deride it as liberal censorship. Tech C.E.O.s don’t like that it costs them money, gets them yelled at by regulators and doesn’t provide an immediate return on investment. Governments don’t want Silicon Valley doing it, mostly because they want to do it themselves. And no one likes a hall monitor.
