The Evolution Of Facebook
Facebook is a household name globally with nearly 2 billion users. Mark Zuckerberg’s goal was to connect the entire world online when he founded the company in 2004. But 16 years later, Facebook has evolved into more than a social media platform. The company has been involved in debates and scandals around user privacy, outside interference in elections, and the spread of fake news. Last summer, the Federal Trade Commission fined Facebook $5 billion, saying the company “repeatedly used deceptive disclosures and settings to undermine users’ privacy preferences in violation of its 2012 FTC order.”
Journalist Steven Levy has been following Zuckerberg and the company since the beginning. In his new book, Facebook: The Inside Story, he chronicles Zuckerberg’s growth and data-driven approach, and how that influenced the tactics the company applied to the problems that resulted from the platform.
Read an excerpt of Levy’s new book about how Facebook’s news feed allowed political propaganda and misinformation to spread during a presidential election in the Philippines.
Steven Levy is author of Facebook: The Inside Story (Blue Rider Press, 2020) and an Editor at Large for Wired in New York, New York.
IRA FLATOW: Facebook– oh, you know all about Facebook. It is wildly successful and has billions of members. It’s known around the world. And that was the goal, to connect the entire world online when Mark Zuckerberg founded the company in 2006.
But 14 years later, Facebook has evolved into more than just a social media platform. The company has been involved in debates and scandals around user privacy, outside interference in elections, and the spread of fake news. And the person who has watched all of this and followed all of this over the years is my next guest, Steven Levy.
He has been following Mark Zuckerberg and the company since the beginning. He has chronicled Facebook and its growth and data-driven approach to develop products and the problems that resulted. And he has a new book in which he chronicles the whole thing. It’s called– what else– Facebook, The Inside Story. We have an excerpt of that book on our website at sciencefriday.com/facebook. Steven Levy, welcome back. It’s been a while.
STEVEN LEVY: It’s good to be back. Yeah, yeah, here I am. I’ve been working on this book.
IRA FLATOW: I think we go back 40 years or more talking about stuff. OK, so let’s get right into the book. Mark Zuckerberg has a reputation of not being the most interview-friendly. He goes into moments of silence– you talk about these icy stares in your book. How did you convince him and Facebook to let you follow him around all this time?
STEVEN LEVY: It was almost four years ago now. The day he announced that a billion people were on Facebook on the same day– that was in 2015– I knew I had to write this book. So I approached Facebook. First, they said no. No one had ever done this before. They’d never given access to their executives for such an extended period of time. But I just kept at it.
And because I had covered Facebook for so long, and I knew Mark and I knew Sheryl Sandberg, they felt, well, OK, maybe an outsider coming in here would be OK to tell our story. Now, this was before the 2016 election. It got a little dicier after that. But I got the OK, no strings attached. They didn’t get to look at an advanced copy or anything. And I started talking to everybody, both inside and people outside of Facebook of course, too.
IRA FLATOW: He outlined his vision in notebooks that he called, quote, “The Book of Change.” I want to read part of a quote where he sums up his vision. He says, “using Facebook needs to feel like you’re using a futuristic, government-style interface to access a database full of information linked to every person. User needs to be able to look at information at any depth. User experience needs to feel full.” I mean, that idea that it’s a futuristic, government-style interface– doesn’t that set off bells in your own mind?
STEVEN LEVY: Well, yeah, it did. And it was amazing to me. I really wanted to get a hold of one of his notebooks. It was tough, because I had heard it was legendary that he kept these notebooks. But at a certain point, he destroyed them. But I heard that he had made copies of parts of them for some employees. And I just kept trying really hard to figure out who those people were and whether I could get a piece of it. And I did.
And at the same time– right around the time I first met him in the spring of 2006, when, not able to get an answer out of him, he would just stare at me when I asked him a question– he was creating these notebooks of great detail to redesign Facebook and, for the first time, really figure out how he was going to scale it to the world. And the thing you just read shows how ambitious he was. And this wasn’t stuff that he was sharing with outsiders at that point. But these notebooks are really a glimpse into his psyche.
IRA FLATOW: Our number, 844-724-8255, if you’d like to talk to Steven Levy, author of Facebook, The Inside Story on Science Friday from WNYC Studios. During these early days when he was sketching his grand vision out, how was he thinking of privacy? He talked about these databases. I’m just thinking it’s a geek tackling a giant thing that he may not be able to hold on to, like a tiger by the tail, after a while.
STEVEN LEVY: Right. Well, Mark has always had this push-pull relationship with privacy. Even before Facebook, at Harvard, he did this application, this project which became famous later on called Facemash, which compared how attractive one coed or Harvard student was to another. And he got in trouble for that.
And then Facebook itself actually had some privacy built in. And he didn’t steal the information from the Harvard servers. He made people bring their own, and they chose what to put on Facebook. But what happened to that information after they put it on Facebook has become a source of huge controversy, because Mark just kept pushing things. He always believed that sharing was, in itself, a good thing. Whether or not that turned out to be a good idea, we could talk about.
But he would often release products that pushed privacy issues farther than his users wanted to go. And his motto was, move fast and break things. So sometimes he released something, and people would object to it, and he might either scale it back or add a little tool to help out, or sometimes just say, hey, I think you’re going to like this in the long run.
And some of the things he introduced were things that people around him said, Mark, that’s not a good idea. That isn’t very privacy friendly, and we can get in trouble for that, or it’s just wrong. And he would think, well, let’s do it anyway. Let’s see what happens. And then a couple of times, some of his employees even refused to be involved with the launch of some of these projects.
IRA FLATOW: I’ve got a minute before the break. But that’s what engineers say. Let’s just do this thing, right? Right? Engineers– they say, let’s do this and see if it breaks or what happens. But that’s not something you might do with people’s lives, playing it that way.
STEVEN LEVY: That’s right, yeah. Yeah, and at a certain point, the consequences of these projects became bigger and bigger and bigger as Facebook went around the world. So it wasn’t a dorm room anymore. It was something where people can get really hurt, and they did get hurt.
IRA FLATOW: All right, we’re going to talk more about that. Our number– 844-724-8255. You can also tweet us @scifri. Talking with Steven Levy, author of Facebook, The Inside Story. He’s also an editor at large at Wired here in New York. Again, number 844-724-8255. We’ll be right back after the break. Stay with us.
This is Science Friday. I’m Ira Flatow. In case you just joined us, we’ve been talking with Steven Levy, editor at large for Wired, about his new book, Facebook, The Inside Story. And Steve, one of the stories that you have really fleshed out in depth, I think, like no one else has is the Cambridge Analytica scandal two years ago. Give us a profile of what you found there and a little bit about the scandal.
STEVEN LEVY: Sure. Actually, I had a lot of fun putting that together as a narrative, which I don’t think was done in a clear way before like this. Actually, I believe that Cambridge Analytica– which was the scandal where the information of 87 million Facebook users, their profiles, fell into the hands of a researcher at Cambridge University, who then handed them over and licensed them to this company in the UK that was co-funded by an extreme right-wing financier, Robert Mercer, and then was used to help elect Donald Trump.
And I actually feel it began in 2010. And that was when Facebook decided to give away this massive amount of information to people who wrote programs, [INAUDIBLE] Facebook, software developers, for this operating system called Platform. And Mark had an idea for a product, Instant Personalization, which required a lot of information.
So if you clicked on one of these apps that ran on it, like Spotify or something like that, you were not only giving your own information away but also the likes and relationship status and political proclivities of all your friends. And since each person has 130 friends on average, it doesn’t take many people to get a database of millions. So I follow it as this researcher in Cambridge learned how he could make these surveys that a few people would take and then get this massive amount of information, then met with a fellow at Cambridge Analytica and handed it over to the company, and the company went and used it, at first, to help Ted Cruz.
And a journalist actually broke that story in 2015, and Facebook didn’t do very much with it. Even though Facebook learned of the scandal in December 2015, they just sent these forms to Cambridge Analytica and the researcher saying, could you verify you deleted the data, and weren’t very aggressive in following up. And during the whole election, Cambridge Analytica never bothered to say, yeah, we deleted the information. Meanwhile, Facebook was selling Cambridge Analytica millions of dollars of ads. So they were basically asleep at the wheel.
IRA FLATOW: Did they apologize for any of these things?
STEVEN LEVY: Yeah, they do apologize. But once it happens again, you wonder– don’t say sorry. Do something about it. And the last three years, it’s been one thing after another– and I write about all of them– where Facebook is not getting ahead of what’s around the corner.
And I think if you look at the history of Facebook in a nutshell, they would release products, and they wouldn’t be asking themselves what could go wrong because they were doing unprecedented things in terms of scaling out their system to the world. Facebook has almost 3 billion people now. If it were a country, it would be the biggest country in the world. And people use it in all sorts of ways in all kinds of countries where people aren’t very sophisticated about digital literacy. They aren’t sophisticated about digital literacy here, clearly. And it’s up to Facebook, really, to own what it’s created.
IRA FLATOW: And you write that they have a threat intelligence team and other departments that focus on security, but it doesn’t seem like they’re doing much.
STEVEN LEVY: Yeah, in this case, we’re talking about the discovery that the Russians used Facebook as a platform to spread misinformation. Basically, this disinformation campaign in the 2016 election– Donald Trump is still saying it didn’t happen, but the intelligence agencies said it did. And Facebook understands that it did. And Facebook didn’t discover this until well after the election was over.
It was actually one of their smart researchers in threat intelligence in DC. I tell the story of how, one day, he put it together. It led to this office in St. Petersburg, where this disinformation team that was part of the Russian intelligence agencies was putting all this fake stuff on Facebook and trying to get people who were anti-immigration and people who were pro-immigration to gather in a demonstration in a city and fight each other, things like that.
IRA FLATOW: So how do you read that Facebook is approaching the 2020 election? Are they making any patches or changes or an overhaul of the system?
STEVEN LEVY: Well, they’re under such pressure that they’ve had to do quite a lot. And they have put in a lot of extra monitors. And they’re a little more aggressive now about trying to identify what fake news is.
A problem is that some fake news is perfectly acceptable on Facebook. If you or I were to post something totally made up, Facebook wouldn’t say, uh-uh-uh, you can’t do that. They would wait to see if people complained about it. And then, if they did, they might hire some fact checkers. They have these fact checkers on call to say whether it was true or not.
And even then, they wouldn’t take it down. They might put extra information for people when they came across it to say, hey, you might want to think carefully whether this is true or not. So given that the basic problem with this kind of manipulation is the fake news appearing, cutting it down doesn’t really solve the problem 100%.
IRA FLATOW: Twitter announced that they’re considering putting a red flag under political tweets that could be misleading. Might Facebook think about something like that?
STEVEN LEVY: Well, at first they did. They said, well, if the fact checkers find this is fake, we’ll put a little extra little note on it saying, hey, this is under dispute. But they found that people clicked on it more in that case.
IRA FLATOW: Do you still have a Facebook account?
STEVEN LEVY: Yeah, I do.
IRA FLATOW: Do you ever think about erasing it?
STEVEN LEVY: Well, I mean, look, I write on Facebook. It’s a good way to find people on Facebook. But I do. I’m very careful about setting the privacy settings. And Facebook has responded to claims– one person who worked in privacy at Facebook early on told me you almost needed a PhD to figure out how to use their privacy controls. And they have improved that. So if you find those things in settings, privacy settings, you can go and limit who sees your stuff. Of course, Facebook will see all your stuff.
IRA FLATOW: And here’s a tweet from Wolfgang, who says, “If there are laws against propaganda speech, would Facebook be banned for promoting false information in the countries wise enough to have such laws?”
STEVEN LEVY: Well, I guess any law would have to follow our First Amendment. And that’s the issue. When you’re a platform which distributes what people say in a wide manner, legally all you’re required to do is follow the First Amendment protections. And the limitations are things like libel or terrorism, things like that.
Facebook has to actually do more to make their platform a place that people would even think of going on. So they try to get rid of hate speech and bullying and things like that. And they have a rulebook that they set up.
I described the evolution of that rulebook from– at first, it was almost like a dorm room conversation in their office between a few people who were in charge of fielding complaints. But now they have tens of thousands of people around the world trying to moderate these things. And making these decisions in bulk is tough. They make a lot of mistakes. But in short, I can’t imagine, without changing the Constitution, putting an edict against, quote, “propaganda.”
IRA FLATOW: Let’s go to the phones to Fresno with Karen. Welcome to Science Friday.
KAREN: Oh, hello.
IRA FLATOW: Hi there. Go ahead.
KAREN: Well, I was just asking how– with Facebook, they claim that it’s actually swayed the election and stuff like that. And with that kind of platform, I mean, if it’s a social media, how could it possibly– where would that come into–
IRA FLATOW: Swaying an election.
STEVEN LEVY: Yeah, so no one is 100% sure whether in particular the fake news or the Russians’ information– what impact that had on it. It’s very tough to measure. But we do know that Facebook probably did have a very big impact in that one campaign, the Trump campaign, used it much better than the Clinton campaign.
And I talk about this, how the Trump campaign played it like a Stradivarius and the Clinton campaign played Facebook maybe like a homemade banjo. The Trump campaign would sometimes put as many as 175,000 ads a day on Facebook, using a lot of information they had on people, which is the way you circulate ads on Facebook. You use all the information that Facebook has and sometimes merge it with voter registration data or things like that.
So they were able to target ads to the people saying, hm, are you pro-gun? So let’s see if you’ll respond to this. Or you care about immigration? You don’t like immigration? Let’s see if we can get you to vote for Trump this way. And the people they figured who were hopeless in terms of voting for Trump– then they would send ads to them that made them disgusted with the whole situation in general in hopes that they wouldn’t vote at all. So they did a really good job of using Facebook the way it’s supposed to be used, which is disturbing in itself.
IRA FLATOW: And so Facebook knew all about this?
STEVEN LEVY: Yeah, they did. They sat there and watched as Trump used it so brilliantly. They’ve described to me in awe about how Trump was using it, but they really didn’t think he was going to win. Like a lot of people, they thought, well, he’s doing a great job on Facebook, but he couldn’t possibly win. So they just let it happen and weren’t upset about it. And to be fair, it wouldn’t be right for Facebook to say, we’re going to give you our services on one side and not you on the other. They offered it to both.
IRA FLATOW: And so I would imagine, since it worked so well for the Trump campaign in the last election, everybody is going to try it in 2020.
STEVEN LEVY: Yeah, they are, but the Trump people have a head start. And it looks like the Democratic side still– you have to go candidate by candidate, but still hasn’t caught up on really using Facebook to the maximum, like the Trump campaign did.
IRA FLATOW: Does one candidate have a better team than the other?
STEVEN LEVY: Well, probably of all the candidates on the Democratic side, I would think the Bloomberg campaign would know how to use this stuff, because they put terminals in all the campaign offices.
IRA FLATOW: They did. I’m Ira Flatow. This is Science Friday from WNYC Studios, talking with Steven Levy, author of the book Facebook, The Inside Story. He’s also editor at large for Wired. We have an excerpt on our website at sciencefriday.com/facebook. Why did Facebook succeed, when you had all those others before it, the other social– that we don’t even remember their names?
STEVEN LEVY: Well, Friendster, Myspace.
IRA FLATOW: Why did Facebook become such a big hit and none of those survive?
STEVEN LEVY: Well, I think one reason was it started in a small community. From the get-go, it didn’t try to connect the whole world. That wasn’t something that occurred to Mark until he was a couple years in. So it was a community where people knew each other, so there was a degree of trust. And the people you didn’t know were only a couple friends away.
And it was limited. It had a privacy benefit built in, because it was only the people on your college internet domain who could see things like your profile. So he was able to test out and perfect his product within this little Petri dish. And then, when he extended it outwards, he had a good grasp on how to do that stuff.
IRA FLATOW: There are a lot of people– there are so many, I don’t want to call on any one caller, online or tweeting– saying, you have free will. If you don’t like Facebook, just get off the system.
STEVEN LEVY: Well, whether you’re on Facebook or not, Facebook is big in your life. So many people are on it that the discussions that are on it affect you. And as we talked about on things like elections, that affects you whether you like it or not.
But the thing is that for everything Facebook does that we complain about– this is what I found in the book that was frustrating– they had been warned about it. So things like anti-vax posts– someone told me, gee, in 2015 I was complaining about that. And then things like they went into other countries before people at Facebook even knew how to speak the language.
They hadn’t hired native speakers in the countries they went into. And they didn’t know when content was put on there that, in some cases, would start riots, and people would die. It wasn’t until 2015, for instance, a few years after violence had broken out over Facebook posts, that they even bothered to translate the rulebook into Burmese.
IRA FLATOW: Well, it’s all in Steven Levy’s huge tome of a book. It’s called Facebook, The Inside Story. Thank you, Steven, for taking time to be with us today. Great work.
STEVEN LEVY: Always a pleasure, Ira.
IRA FLATOW: And you can read an excerpt, as I say, up on our website at sciencefriday.com/facebook. One last thing before we go– former NASA mathematician Katherine Johnson passed away this week. She was one of a group of African-American human computers.
Remember her featured in the 2016 film and book Hidden Figures? In her decades at NASA, Johnson’s calculations charted the safe paths of early space flights for astronauts like Alan Shepard and John Glenn. And in 1969, her trajectories took the Apollo 11 crew to the moon and back. We spoke with the Hidden Figures author, Margot Lee Shetterly, about Johnson’s story back in 2016.
MARGOT LEE SHETTERLY: It’s an almost unbelievable thing. I mean, we have a situation. We’re in Virginia, which is still a segregated state, and yet here is Katherine Johnson, a black woman, working with white male engineers and saying, listen, I’m the one to finish the report that describes the orbital flight that was upcoming.
When they were counting down to the 1962 orbital flight of astronaut John Glenn, you can imagine the kind of checklists and anxiety around that. This was a very complicated mission that they had. One of the checklist items was having Katherine Johnson take a set of data that had gone through the computer, basically simulating the upcoming flight– take the raw numbers and run them, by hand, through all of the equations that had been programmed into the computer, to make sure that the computer’s results were the same as her results.
IRA FLATOW: Margot Lee Shetterly talking about Katherine Johnson on Science Friday back in 2016. Katherine Johnson passed away on Monday at the age of 101. We also got word today of the passing of the physicist and mathematician Freeman Dyson, known scientifically for his work on quantum electrodynamics but more widely for his outside-the-box ideas on topics from energy to space exploration to the nature of humanity. He wanted to, for example, genetically engineer a turtle that had teeth of steel so he could do away with waste. Freeman Dyson died this week at the age of 96.