How Facebook’s News Feed Became A Political Propaganda Machine

In his new book, journalist Steven Levy unpacks how Facebook’s news feed influenced a presidential election in the Philippines—and how it was a symptom of the rise of fake news.

The following is an excerpt from Facebook: The Inside Story by Steven Levy.


In fact, halfway around the world, there was terrifying proof of those fears. In the Philippines.

By 2015, nearly all inhabitants of that Pacific island country of 100 million had been on Facebook for several years. A major factor in making this happen was the Internet.org Facebook program—hatched from the Growth team—known as Free Basics. It was designed to increase Internet activity in poor countries where many people could not afford to pay for data charges. Free Basics allowed people to use Facebook—for no fee. While the program had run into trouble in India, in its 2013 test bed in the Philippines, it was a “home run,” said Zuckerberg in a 2014 conference appearance. (A couple of years later, Zuckerberg heard that 97 percent of Philippine Internet users were on Facebook. His joking reaction was, What about the other 3 percent?)

The country also got the bulk of its news from Facebook. This is why, when one of the country’s leading journalists, Maria Ressa, started a publication called Rappler in 2012, she designed it specifically to run on Facebook. “I had always thought that this technology would help to solve problems bottom-up,” she says. “And it did for a while, up until 2015.”


That was when a candidate in the Philippines’ May 2016 presidential election, a populist authoritarian named Rodrigo Duterte, spread misinformation about his opponents and misrepresentations about conditions in the country at large. A body of pro-Duterte bloggers flooded Facebook with horrific posts that took full advantage of the viral power of the News Feed. Its design visually treated marginal or unscrupulous “news” sites the same as the most highly vetted publications. And because those dicey operations commonly dealt in sensational content that was hard to ignore, Facebook rewarded them.

“Newspeople don’t tell lies, but lies spread faster,” Ressa says. She had bet her entire publication on Facebook but was now being eclipsed by the false information from Duterte bloggers. The country was inundated with posts like a fake sex tape in which the head of Duterte’s female opponent was digitally grafted onto the body of a porn actress. Facebook was also empowering the Duterte mob to attack his critics on the platform, putting them in danger from his angry supporters. Ressa was personally targeted.

And despite her multiple complaints, Facebook was doing nothing to stop this.

Ressa thought that after Duterte won the election in May 2016, things might calm down. But then he began using the same tactics on Facebook to push his strong-arm approach to governing.

Ressa understood that the Duterte forces were drawing a road map for future political abusers around the globe to use Facebook. She pushed for a meeting to warn the company. In August 2016, she met with three senior Facebook officials in Singapore. She had identified 26 fake accounts that were able to amplify their hateful and false information to 3 million people. “I began showing them lies, the attacks against anyone who attacked [violent acts by Duterte supporters],” she says. One example was a post from the Duterte campaign spokesperson, showing a photo of a girl he claimed was raped in the Philippines. “We did a check and it showed that the photo was a girl from Brazil,” says Ressa, speaking to me in 2019. “And yet that post was allowed to stay up. It’s still up there today.”


It seemed to Ressa that the Facebook officials were in total denial of what she was pointing out with clear evidence. “I felt like I wasn’t talking to people who use Facebook as well as I do,” she says. Despite her handing over the names, Facebook did not act on them for months, even after Ressa published a three-part series about the misinformation, and she was personally targeted with thousands of hate messages. (Facebook says that when it got the necessary information, it acted on the accounts.) Later she would recall a moment in the meeting when, frustrated, she reached for the biggest hyperbole she could think of to portray what might happen if such practices continued. “If you don’t do something about this,” she said in August 2016, “Trump could win!”

The Facebook people laughed, and Ressa joined them. It was just a joke. Nobody thought that could happen.

In the fall of 2016, Facebook still wasn’t thinking of the News Feed as a propaganda machine. But the Trending Topics experience had made it impossible to ignore how many low-quality stories and outright hoaxes were being spread on Facebook. Every Monday, Facebook’s top managers—the small group—gather in Zuckerberg’s conference room for a long meeting. The first hour is devoted to the topic du jour, and the rest of the time the focus turns to specific projects. That first hour is freewheeling—a time when anything goes. The fake-news topic came up one Monday as the election approached. While the company surely had to address it, the small group decided it was too risky to do so in the heat of the contest. “We didn’t want to overreact to it and create a political snafu for ourselves,” says Bosworth. “We’re worried about taking action and spawning a big flame-up. We were aware of our natural perception as being aligned to Democrats. So we assume that’s the bias. We didn’t want to interfere with an election. We figure anything that looks like we’re playing one side of this against the other is off-limits.”

So in order to avoid interfering with the election, Facebook effectively gave a green light to misleading, sensationalistic posts that themselves arguably interfered with the election.


The ultimate justification for this lay in the engineering mentality that Mark Zuckerberg celebrated in his company. It was a matter of metrics. Those on the product side viewed the issue from a data perspective and noted that fake news made up a tiny fraction of the billions of stories posted to Facebook every day. The numbers did not convey the urgency of the problem.

“These people had all the power,” says one Facebook executive. “All their metrics were better ad metrics, more growth, more engagement. That’s all they cared about. And [on the Sheryl side] they’re dealing with the downsides of all that. And that was effectively how the company ran.”

In short, Zuckerberg’s inner circle had no clue that misinformation was thriving in their system because, well, where was the data? “We do a lot of work to understand what the top 25 things are that people are concerned about or the things where people are having bad experiences,” says Chris Cox. “We asked them what are the bad experiences you’re having and then we rate the bad experiences and then we get things like sensationalism, click bait, hoaxes, redundant stories, and stuff like that. But as a practical matter, [misinformation] wasn’t on our radar. We missed it.”

“The dirty secret no one talks about is that the stuff was really small,” says Bosworth. “So we’re just like, How can we deal with this? Can we build good policies that we think are uniform? And so we’re talking about it but it’s not urgent. I would say honestly it was just business as usual until the election, which all of us thought Hillary would win. Like a lot of other people, I assume.”


From Facebook: The Inside Story by Steven Levy, published by Blue Rider Press, an imprint of Penguin Random House, LLC. Copyright © 2020 by Steven Levy.


About Steven Levy

Steven Levy is the author of Facebook: The Inside Story (Blue Rider Press, 2020) and an editor at large for Wired in New York, New York.
