
We’re Smarter About Facebook Now

This article is part of the On Tech newsletter. Here is a collection of past columns.

In Facebook’s major scandals of the last five years, some of the scary details or breathless conclusions have been off base. But each one has moved us closer to essential truths about how Facebook affects our lives.

In 2016, the worst fears were that a wildfire of Russian propaganda on Facebook persuaded a bunch of Americans to vote for Donald Trump. In 2018, people spun yarns that the political consulting firm Cambridge Analytica brainwashed us with data it vacuumed up from Facebook users. Neither was quite right.

In the firestorms, there may have been too much credit given to the Kremlin, Cambridge Analytica and Facebook — and too little to human free will.

And in Facebook’s crisis du jour, kicked off by a whistle-blower’s claims that the company repeatedly chose its short-term corporate interests over the good of humanity, some nuance has likely been lost. Instagram’s internal research about the app’s influence on teenage girls’ mental health doesn’t appear conclusive, as some researchers told me and as NPR has reported.

So yes, we’ve all gotten stuff wrong about Facebook. The company, the public and people in power have at times oversimplified, sensationalized, misdiagnosed the problems or botched the solutions. We focused on how the heck Facebook allowed Macedonian teenagers to grab Americans’ attention with fabricated news, and did less to address why so many people believed it.

Each public embarrassment for Facebook, though, is a building block that makes us a little savvier about the influence of these still relatively new internet technologies in our lives. The real power of the scandals is the opportunity to ask: Holy moly, what is Facebook doing to us? And what are we doing to one another?

Kate Klonick, a law school professor, told me that when she started as a Ph.D. student at Yale Law School in 2015, she was advised that her interest in internet companies’ governance of online speech wasn’t a subject for serious legal research and publication. Online life was not considered real life, she explained. Russian election propaganda, Cambridge Analytica and other Facebook news in the years that followed changed that perception.

“Those stories have done one huge thing: They’ve started to make people take the power of technology companies seriously,” Dr. Klonick said.

That is one thing that makes this Facebook episode different from all the ones that came before. We are wiser. And we are ready. There is a coterie of former tech insiders and outside professionals who have studied Facebook and other tech superpowers for years, and they are armed with proposed fixes for the harms that these companies perpetrate.

Another difference in 2021 is the presence of Frances Haugen, the former product manager at Facebook who seems to be the right messenger with the right message at the right time.

I want to resist the comparisons that some senators and Facebook critics have made between the company and cigarette makers. The products are not analogous. But the comparison is apt in a different way.

For decades, there were warnings about the harmful effects of smoking and big tobacco companies’ covering it up. In the 1990s, a whistle-blower — Jeffrey S. Wigand, a former executive from Brown & Williamson Tobacco — crystallized and confirmed years of suspicions and helped compel U.S. government authorities to act.

Haugen, like Wigand, went public with damning firsthand knowledge and documents, and a compelling story to tell a public that was ready to hear it. That magical formula can change everything for a company or industry.

“We are moved by stories,” Erik Gordon, a professor at the University of Michigan business school, told me. “The facts don’t have to be bulletproof. They have to be enough to give a good story credibility.”

I don’t know if this is Facebook’s Big Tobacco moment. Haugen was not the first former Facebook insider to sound alarms about the company. After Wigand’s bombshell disclosures, it took a couple more years for the U.S. government’s crackdown on the tobacco industry to get real. And, of course, people still smoke.

Blame is a blunt instrument, but at each Facebook crossroad, we learn to wield blame more judiciously. Facebook and other online companies are not responsible for the ills of the world, but they have made some of them worse. We get it now.

The answers aren’t easy, but Haugen is directing our attention straight at Facebook’s molten core: its corporate culture, organizational incentives and designs that bring out the worst in humanity. And she is saying that Facebook cannot fix itself. A wiser public must step in.


Before we go …

  • Imagine if your co-workers’ salaries and performance reviews were public: Years of data from Twitch, the popular livestreaming website, leaked online in recent days. The data included the website’s computer code and its payments to people who broadcast themselves playing video games, my colleague Kellen Browning reported. Vice News explains what is worrying Twitch streamers.

  • How to protect yourself from garbage products online: A Washington Post writer shares research techniques and tips to sort out the good from the bad in the sea of merchandise online. (A subscription may be required.)

  • Why listening to books is the best: “Audiobooks aren’t cheating,” writes Farhad Manjoo, my New York Times Opinion colleague. Some books “achieve a resonance via the spoken word that their text alone cannot fully deliver.”

Hugs to this

This dog in Istanbul loves traveling on public transit, and the authorities tracked his favorite commuter haunts.


We want to hear from you. Tell us what you think of this newsletter and what else you’d like us to explore. You can reach us at [email protected].

If you don’t already get this newsletter in your inbox, please sign up here. You can also read past On Tech columns.
