The only safe prediction to make about the Senate’s Facebook hearing today is that, for the first time in a long time, it will be different. Over the past three and a half years, the company has sent a rotating cast of high-level executives, including CEO Mark Zuckerberg, to Washington to talk about Facebook and its subsidiaries, Instagram and WhatsApp. This has calcified into a repetitive spectacle in which the executive absorbs and evades abuse while touting the wonderful ways in which Facebook brings the world together. Today’s testimony from Frances Haugen, the former employee who leaked thousands of pages of internal research to The Wall Street Journal, Congress, and the Securities and Exchange Commission, will be decidedly not that.

Haugen, who revealed her identity in a 60 Minutes segment on Sunday, is a former member of the civic integrity team: someone whose job was to tell the company how to make its platform better for humanity, even at the expense of engagement and growth. In her nearly two years working there, however, Haugen concluded that it was an impossible job. When conflicts arose between business interests and the safety and well-being of users, “Facebook consistently resolved those conflicts in favor of its own profits,” as she puts it in her prepared opening statement. So she left the company, taking a trove of documents with her. Those documents, she argues, prove that Facebook knows its “products harm children, stoke division, weaken our democracy, and much more” but chooses not to fix those problems.

So what exactly do the documents show? The Wall Street Journal’s reporting, in an ongoing series called “The Facebook Files,” is so far the only window into that question. According to one story, Facebook’s changes to make its ranking algorithm favor “meaningful social interactions” (a shift that Zuckerberg publicly described as “the right thing” to do) ended up boosting misinformation, outrage, and other kinds of negative content. It did so to such a degree that European political parties told Facebook they felt compelled to take more extreme positions just to get into people’s feeds. When researchers brought their findings to Zuckerberg, the Journal reported, he declined to take action. Another story documents how Facebook’s “XCheck” program applies more lenient rules to millions of VIP users around the world, some of whom exploit that leniency by posting content in flagrant violation of the platform’s rules. Yet another, perhaps the most important published so far, suggests that Facebook’s investment in safety in much of the developing world, where its platforms are essentially “the internet” for many millions of people, is anemic or nonexistent.

You can see the challenge here for both Haugen and the senators questioning her: Such a wide range of revelations doesn’t coalesce easily into a single clear narrative. Perhaps for that reason, the committee apparently plans to focus on a story whose headline declares, “Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show.” The committee already held one hearing on that subject last week. As I wrote at the time, the documents in question, which the Journal has posted publicly, are more equivocal than that headline suggests. They are also based on ordinary surveys, not the kind of internal data that only Facebook has access to. In other words, they may be politically useful, but they don’t greatly enhance the public’s understanding of how Facebook’s platforms operate.

Some of the other documents in the cache, however, apparently do. Crucially, at least according to the Journal’s reporting, they illustrate the gaps between how Facebook’s executives describe the company’s motivations in public and what actually happens on the platforms it owns. So does Haugen’s own experience as an integrity worker pushing back against the more mercenary impulses of Facebook leadership. Conveying that dynamic might do more to advance the conversation than any particular finding from the research.

