Frances Haugen, then a social media algorithm expert at Facebook, began to have concerns about how the company was deploying its proprietary algorithms during the 2020 Iowa presidential caucuses.
“Facebook understood the danger of misinformation long before that,” said Haugen, the featured speaker at the January 25 Presidential Lecture at the Staller Center for the Arts. “They formed a panel they called the Civic Integrity team after 2016 and built it up for four years. Then less than a month after the 2020 election, they dissolved it. That was the moment where I thought, ‘I don’t think Facebook can save itself.’ If they could come out of that election and think ‘we’re done,’ then it was clear to me that they didn’t understand the potential danger of their own product. That was the moment I decided that I had to do something.”
Haugen’s worst fears would be realized in the January 6, 2021, attack on the U.S. Capitol, an attack coordinated in part by groups leveraging the capabilities of Facebook.
Haugen, now an outspoken advocate for social media transparency and accountability, addressed the campus community in the Presidential Lecture, which was moderated by Stony Brook University Executive Vice President and Provost Carl Lejuez. (President Maurie McInnis was not feeling well and could not host as planned.)
“This is a great opportunity to hear a story of advocacy and bravery that illuminates the need for transparency,” Lejuez said in his opening remarks. “It’s because of Frances and advocates like her that the world is beginning to grasp the critical importance of reforms to keep young people safe using social media as a tool for good, and a tool that unites instead of divides.”

An expert in algorithmic ranking and co-founder of Beyond the Screen, a coalition of technologists, designers and thinkers addressing online harm, Haugen has built a career focused on ranking algorithms for technology companies and platforms including Google, Pinterest and Yelp. In 2019, she joined Facebook as lead product manager on the company’s civic misinformation team.
Acting on those concerns after the 2020 U.S. presidential election, Haugen shared thousands of internal documents, now known as the Facebook Files, with members of the U.S. Congress and offices of attorneys general. After leaving Facebook in May 2021, she embarked on an international advocacy mission aimed at highlighting the danger of Facebook and other social media companies that prioritize profit over public safety.
“When you read through those files, you realize how visceral some of the things we’re discussing are, and Facebook knew this,” said Haugen. “Algorithms are embedded to maintain control, but they knew that kids’ mental health was being harmed. Children were describing their behavior and their relationships with social media like an addict would. They couldn’t stop themselves, they felt horrible using it, but they thought that they would be marginalized if they didn’t. Facebook knew their algorithms were influencing people.”
Haugen likened today’s social media danger to that of the American auto industry in the 1960s, a time when unsafe designs were too often tolerated and ignored, until lawyer and consumer advocate Ralph Nader published his groundbreaking 1965 book Unsafe at Any Speed: The Designed-In Dangers of the American Automobile. The book’s central argument was that manufacturers had long resisted safety features such as seat belts because they were unwilling to spend money on them.
“Ralph Nader’s book says, ‘Guess what? The automotive industry knows how to make cars safer, and they’ve known for years. They choose not to,’” said Haugen. “That’s kind of where Facebook is today. We know lots of ways we can easily make the platform safer. So why won’t they do it?”

In 1966, the year after Nader’s book was published, the National Traffic and Motor Vehicle Safety Act was enacted in the United States, empowering the federal government to set and administer new safety standards for motor vehicles and road traffic. The adoption of seat belts, booster seats for children and stricter enforcement of drunk-driving laws are just a few of the safety initiatives that followed.
As in the auto industry 60 years ago, much of today’s social media danger is due to a lack of transparency, Haugen said.
“Nader’s book inspired other concerned citizen groups like Mothers Against Drunk Driving (MADD) to take action,” she said. “We don’t have anything like MADD for social media, mostly because most mothers have no contacts and can’t even learn how these platforms work other than going to work for them. We’re standing in a world where there are only a few thousand people, and maybe as few as a few hundred, who understand the physics of this. Yet no one is allowed to study these algorithms. Facebook has refused to provide data for at least 10 years.”
Haugen is encouraged by the recent Digital Services Act — a European regulation adopted in October 2022 to address illegal content, transparent advertising and disinformation — calling it a landmark law.
“The Digital Services Act is trying to really zoom in and explore the trade-offs that went into designing and building these algorithms,” she said. “There weren’t enough people at the table.”
Remedies, Haugen said, can be as simple as requiring people to open links before they share them. “That sounds really easy, doesn’t it?” she said. “Saying you have to actually look at something before you spread it would decrease misinformation by 10 to 15 percent. Twitter did it, but not Facebook. There are at least 2 billion users on Facebook who are very minimally literate. This is not about what I think is right or what Mark Zuckerberg thinks is right. To make a positive difference, we need the input of a large number of people with diverse perspectives who can contribute to the conversation in a meaningful way.”
“The work I do is aimed at figuring out how to get information out there so that we can have these conversations,” Haugen added. “We have no norms established around what’s adequate when it comes to a social media company. We need to collectively figure out what we want the ‘norm’ to be.”
— Robert Emproto