We Know What Facebook Knew, and When It Knew It. Now What?
Two weeks ago, The Wall Street Journal published “The Facebook Files,” a damning series based on a cache of leaked internal documents that revealed how much the company knew about the harms it was causing and how little it did to stop them.
In a hearing on Thursday, senators on the consumer protection subcommittee accused Facebook of hiding vital information on its impact on users. “It has attempted to deceive the public and us in Congress about what it knows, and it has weaponized childhood vulnerabilities against children themselves,” Senator Richard Blumenthal, the chairman of the subcommittee and a Democrat from Connecticut, charged.
I’ve spent the last six years researching how platforms govern speech online, including a year inside Facebook following the development of its Oversight Board. While the “factory floor” of the company is full of well-intentioned people, much of what the series reported confirmed what I and other Facebook watchers have long suspected.
The Journal’s reporting showed that Facebook regularly gave preferential treatment to elites if their speech was flagged on the platform; that it implemented shoddy solutions to mitigate the harmful mental and emotional health effects of its products on teenagers; and that it underinvested in enforcing its own rules about what is allowed on the site outside of the United States. The series has stirred the now familiar outrage at Facebook for failing to take responsibility for how people use its platform. While these revelations are disturbing, they also point to some opportunities for reform.
One of those opportunities is redefining how Facebook determines what a “good” product is. For much of its history, the company’s key metric has been user engagement: how long users stay on the site, the pages they spend time on, which ads they click. The greater the user engagement, the more valuable Facebook’s ads, and the more profit for shareholders. But the Facebook Files stories have put to rest any doubt that this narrow concept of engagement fails to capture the platform’s real impact, both the bad and, yes, the good.
Facebook is perfectly capable of measuring “user experience” beyond the narrow concept of “engagement,” and it is time those measurements were weighted more heavily in company decision-making. That doesn’t mean just weighing harmful effects on users; it could also mean looking at and measuring the good things Facebook offers: how likely you are to attend a protest or give to a charitable cause you hear about on Facebook. However it ends up being calculated, this broader measure needs to be transparent, and it needs to become a bigger part of the company’s decisions going forward.
The series also revealed that Facebook had conducted its own research into the harmful effects of Instagram, the popular photo-sharing platform it acquired in 2012, on the mental health of teenage girls but downplayed the results. For social-media researchers, these revelations confirmed much of what we already knew from multiple third-party studies showing that cellphones and social media are bad for teenage mental health. (And long before smartphones and Instagram, social science placed similar blame on fashion magazines and television.)
While calling out Facebook for its mistakes and omissions may seem like a win, berating the company for its flawed internal and external research projects does not mean this type of work will become more ethical or transparent. The more likely outcome is that it doesn’t get done at all, not by Facebook or anyone else, and that when it does, the results stay hidden.
Other popular platforms are also part of the problem. Snapchat supposedly studied the effect of its platform on its users’ mental health, but never released the results. Instead, it announced new intervention tools. Following the publication of the Facebook Files series, TikTok rolled out “mental health guides” for users.
These moves reveal what companies are trying to avoid. If you look inward and investigate the harms your platform has caused, and those harms turn out to be too expensive or too hard to fix, you stir up exactly the kind of public relations storm Facebook is now enduring. From these companies’ perspective, the alternative is simpler: If you don’t study it, there’s nothing to reveal.
Between Facebook’s internal research and reports last month on the company’s failed program to share its data with outside social scientists, executives at other companies across Silicon Valley are most likely breathing a sigh of relief: They’ve managed to dodge pressure from outside researchers to interrogate their own practices.
The series’ most damning takeaways were the revelations about how Facebook has handled content issues in Africa, Latin America and Asia. While Facebook applies its community rules globally, those rules can’t possibly account for the wide range of cultural norms among Facebook users around the world. Understanding those differences requires more and better people to constantly revise the rules and enforce them.
Last week, Facebook announced that it has spent more than $13 billion on safety and security since 2016 and currently employs 40,000 full- and part-time safety and security workers. For 2020 alone, that puts the costs in this area at between $5 billion and $6 billion, or about one-tenth of the company’s overall costs. To put this in perspective, in the United States there is roughly one law enforcement officer for every 500 people. Facebook has 2.8 billion global monthly active users; that means roughly 1.4 people working in safety and security for every 100,000 users.
There is no quick fix for content moderation. The only way to do it better is to hire more people to do the work of “safety and security,” a term that encompasses everyone who directly or indirectly writes, revises and enforces Facebook’s community standards. According to Facebook’s SEC filings, the average revenue per user in the United States and Canada in the last quarter of 2020 was $53.56. Europe, its next-largest market, accounted for only a fraction of that, at $16.87, with Asia-Pacific users at just $4.05 and “Rest of World” at $2.77 per user. Those numbers don’t necessarily reflect where Facebook ultimately ends up investing in safety and security. But they do help explain one powerful set of incentives that might motivate the company’s priorities.
The Facebook Files series is motivating change. But it will take more than breathless reporting to make sure that reform happens in effective ways. That will require laws demanding transparency from platforms, a new agency to specialize in online issues and more science. Whistle-blowing gets us halfway there. We have to do the rest.
Dr. Kate Klonick (@klonick) is a lawyer and an assistant professor at St. John’s University Law School. She is a fellow at Yale Law School’s Information Society Project and the Brookings Institution, and is currently writing a book on Facebook and Airbnb.