02/17/2017

The Price of ‘Free’ Internet Services? Your Privacy

17:19 minutes

Credit: Shutterstock

What does Facebook know about you? Your political views? Smartphone make and model? Whether you live in your hometown? Try all of the above. (Click on “Your Information,” then select “Your categories.”)

As we trade more and more of our personal data to big companies in exchange for their services, internet users must decide for themselves where to draw the line on internet privacy. In this segment, Manoush Zomorodi, host of WNYC’s Note to Self podcast, discusses privacy online—the subject of her latest project, The Privacy Paradox.

And Eric Umansky of ProPublica joins in to talk about ProPublica’s Facebook advertising experiment, in which they discovered that the social media giant allowed advertisers to exclude certain users based on racial categories.


Segment Guests

Manoush Zomorodi

Manoush Zomorodi is host of the TED Radio Hour, and author of the book Bored and Brilliant. She’s based in New York, New York.

Eric Umansky

Eric Umansky is Deputy Managing Editor at ProPublica in New York, New York.

Segment Transcript

IRA FLATOW: This is Science Friday. I’m Ira Flatow. Facebook now has 1.2 billion daily users. Are you one of them? If so, what do you think the social media giant knows about you? I checked on it this morning in the ad preferences panel. I looked up my own Facebook account and found that Facebook has pigeonholed me in the following categories– one, that I’m like people who primarily use a 4G connection. I don’t know how or why that’s important, but that’s what they decided. I’m like people whose birthdays are in the month of March. Wow, surprise, surprise again. And this one really was surprising: I prefer roommates, people who room with each other, to people who live with family members.

OK, now you know everything you need to know about– I don’t even remember telling Facebook that I have roommates. And in fact, I don’t have any. So this, I guess, is the price we pay for all these free services, isn’t it? You pay with your privacy and what they think your life is about.

Our friends over at WNYC’s Note to Self podcast have been exploring some of those tradeoffs, what our smartphones track, how algorithms sell us stuff, and how to claw back a little personal space. It’s their latest project, the Privacy Paradox. The host and managing editor of Note to Self, Manoush Zomorodi, is here to share what they found. Welcome back, Manoush.

MANOUSH ZOMORODI: It’s great to be here, Ira. Thanks for having me.

IRA FLATOW: And we do want to hear what you think. How much of your privacy do you like to give out? Do you want to give it out? Do I want to give mine out? You can give us a call at 844-724-8255. That’s 844-SCITALK. Or tweet us at @SciFri.

Manoush, this is the latest boot camp from Note to Self?

MANOUSH ZOMORODI: Yeah, that’s right. This is our third year in a row that we’ve done something interactive with tens of thousands of our listeners. And this time around, Ira, we wanted to take on this idea, this big vast idea, of digital privacy because we spend so much time online these days. And it seems as though our civil rights don’t necessarily extend to the virtual world where we live a lot of the time.

IRA FLATOW: And you developed a personalized privacy personality quiz to gauge what people are comfortable with in terms of sharing data. So what type of privacy personality are you?

MANOUSH ZOMORODI: Well, it is interesting that you ask. Before we did the project and I took the quiz, I was a realist because I felt– I answered all these questions. I kind of understood that for a service to be really personalized, I have to give up some of my information, my personal information, my data, so that the service does what I want it to, so that Twitter knows who I want to see next. Right?

But after having done this five-part plan, the Privacy Paradox, I have to say, I have come around. And my score went up. I am now a believer.

IRA FLATOW: Wow.

MANOUSH ZOMORODI: Yeah, a believer. I actually went to the National Archives, and I went to see the Fourth Amendment. It actually wasn’t the Fourth Amendment originally. But in any case, there’s this idea that our right to privacy is something incredibly innate, written into the Founding Fathers’ doctrine for how we want to live.

And I came up with the idea for the project before the latest administration came into office. But now, as we talk more and more about marginalized populations being concerned about being targeted for their beliefs, their affiliations, or their origins, this idea of privacy is less ephemeral and more real, very real.

IRA FLATOW: Mm-hmm. I’d like to play a clip, an experience of one of your listeners, an experience that they shared with you. And this listener wanted to remain anonymous. So this isn’t her real voice. But it’s really a powerful example of internet ad targeting gone wrong. Let’s hear that.

[AUDIO PLAYBACK]

SPEAKER: I felt recently that I’ve had some problems with drinking a little bit more than I should. I don’t know if it’s a huge problem. But this morning, I was actually just kind of googling to see what my options were and if I could talk to someone. And then I logged into Facebook. And the first ad that I saw was for my local liquor store. And there’s just something so inherently offensive in that. I don’t even really know how to react to that. I mean, it wasn’t like I was getting ads for therapy or something helpful. You know, they just took this information that was so shameful and exploited it.

[END PLAYBACK]

IRA FLATOW: Wow.

MANOUSH ZOMORODI: Yeah, I mean, Ira, to me– we did a call out to our listeners. Tell us your stories about privacy. And there were, of course, some people who told us the classic stories, that their Social Security number was stolen or they were hacked in some way. Other people said, you know, it hasn’t happened yet. I feel a little creepy about my data being out there, but nothing has happened.

But then we had all these stories in the middle, this idea of personal privacy being stepped on. Another guy married into a very religious family. Facebook changed its privacy settings and revealed that he was part of an atheist group to his wife, who subsequently divorced him.

IRA FLATOW: Wow.

MANOUSH ZOMORODI: All kinds of stories where you think, well, now that you share your ideas, your most intimate thoughts with the web, it doesn’t get protected there. Right now, the Fourth Amendment only covers what happens in your home. Well, our home is now virtual. It is online.

IRA FLATOW: I know that one of your episodes in the series focused on a project from ProPublica, all about algorithms and machine bias, what they know about us, and what they do with all that information. So I want to bring on another guest to talk about that. Eric Umansky is deputy managing editor at ProPublica, and he joins us here in our New York studios. Welcome.

ERIC UMANSKY: Thanks for having me.

IRA FLATOW: Welcome. You’ve done a lot of work investigating how algorithms can discriminate. And one of the really shocking things you all uncovered was how Facebook was allowing advertisers to exclude specific racial groups from seeing certain ads. Explain. Tell us about that.

ERIC UMANSKY: Sure, so Facebook, if you look at the underpinnings of what Facebook does and what it serves, it’s great for showing photos of family and kids and newborns and your, of course, wonderful vacations that never go wrong. But what in fact it is also doing, its whole purpose really as a business, is to gather information about you as a user and then to offer that very detailed information to advertisers so that they can do what’s called microtargeting. If you’re an advertiser, you really only want to advertise to the people who are most likely to buy your product because that’s most efficient. And that is something that Facebook can do with a wealth of data that it has about users in a way that network TV–

IRA FLATOW: Can’t come close to.

ERIC UMANSKY: But they would argue, you know, you’re going to get ads. That’s our business. You’re going to get ads. So why not just give you ads for stuff that you’re interested in?

IRA FLATOW: Sure.

ERIC UMANSKY: So that actually can be a very powerful tool. It can be very helpful.

IRA FLATOW: But you’re saying that they leave stuff out.

ERIC UMANSKY: Right. So the question, of course, is always with tools that can be helpful, well, it’s a powerful tool. And what is the tool used for? And what are the possible uses of the tool? And what are the protections to make sure it’s not abused?

So in this case, Facebook was offering this to advertisers– you have your little pull-down menu: I want only people from Portland who like medium roast coffee, and so on and so forth. And there’s a logical extension– if you’re an engineer, a software coder, you think, well, I’m going to include a menu that says, here are things– right? It’s just like a search on the web. You can say, here are things I want to include in the search, and here are things I want to exclude.

And so it’s a logical thing. It says you can exclude all sorts of things. And on the pull-down menu is: exclude people with quote-unquote “affinities” to African-Americans, people with “affinities,” quote-unquote, to Hispanics. Interestingly, there was no pull-down option for people with ethnic affinities to whites, which is a telling thing about the values. You think of these as all zeros and ones. Maybe that’s how the coders think about it. But of course, there are values involved.

And I am sure that people at Facebook and the coders at Facebook thought, well, this is all perfectly logical. This is the tool that we are giving. And gee, when we learned about that, we thought, well, it feels like other people should know that. And there are some real serious questions there. And you can tell by the response to it, which was really significant, we weren’t the only ones who thought that.
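A concrete way to picture the include/exclude mechanics Eric describes is a few lines of filter code. The sketch below is a toy written for this transcript: the field names, sample users, and function are invented for illustration, and it is not Facebook’s actual ad-targeting system.

```python
# Hypothetical sketch of an include/exclude audience filter like the
# pull-down menu described above. All names and data are invented;
# this is not Facebook's actual ad API.

USERS = [
    {"name": "A", "city": "Portland", "interests": {"medium roast coffee"}, "affinity": "African American"},
    {"name": "B", "city": "Portland", "interests": {"medium roast coffee"}, "affinity": None},
    {"name": "C", "city": "Seattle", "interests": {"tea"}, "affinity": "Hispanic"},
]

def audience(users, include_city=None, include_interest=None, exclude_affinities=frozenset()):
    """Return users who match every include filter and no exclude filter."""
    matched = []
    for user in users:
        if include_city and user["city"] != include_city:
            continue
        if include_interest and include_interest not in user["interests"]:
            continue
        if user["affinity"] in exclude_affinities:
            continue
        matched.append(user)
    return matched

# The filtering mechanics are neutral; the values show up in which
# exclusion options the menu offers at all. Only user B survives here.
print(audience(USERS, include_city="Portland",
               include_interest="medium roast coffee",
               exclude_affinities={"African American", "Hispanic"}))
```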

IRA FLATOW: Manoush?

MANOUSH ZOMORODI: Yeah, I love ProPublica’s work, and so that was our thing on Day 2: we told people about this ethnic affinity targeting. And what I also love is that ProPublica tested it. They bought an ad for a housing event, and they specifically left out people who had certain ethnic affinities. And from what I understand, Julia Angwin, the investigative reporter who works with Eric, bought the ad, no problem, and then went to a civil rights lawyer, who said, that is absolutely illegal. It is discrimination.

ERIC UMANSKY: I believe he gasped.

IRA FLATOW: So how did this end? Did it have a happy ending at all?

ERIC UMANSKY: Well, what happened is– and one of the contexts here is, of course– let’s understand both the positive targeting that can happen and the negative targeting that can happen. There is certainly nothing wrong with somebody putting an ad on a certain TV station because they think those people might like the ad. But our government and our society have created rules in the past 45 or 50 years to protect certain classes for certain things. There is a long history of racial discrimination in housing, a long history of racial discrimination in hiring. And so our government, during the Civil Rights era, made laws that said you can’t discriminate on the basis of, for example, race when it comes to housing.

So that’s why, as Manoush says, we decided to test this tool, and not only pose it as a hypothetical, but to do a housing-related ad. And we put it in not simply because it raises ethical questions, but because it raises legal questions.

And what happened? Well, what happened was first, the ad was approved in 15 minutes. And then, in what was, I think, a very unsurprising development, we published our story, and there was a wave of criticism from the Congressional Black Caucus, from actually the federal government, from many places. And Facebook, in a short period of time, announced that they were going to put protections in place to make sure this wouldn’t happen.

IRA FLATOW: Interesting. Manoush, in your series, you talked to Sir Tim Berners-Lee, who is considered the inventor of the World Wide Web, about a project he’s working on called Solid, a new way to store personal data on the web. We have a clip here in which he lays out his vision for how data should be treated.

[AUDIO PLAYBACK]

TIM BERNERS-LEE: I own all my data. I can have direct access to it. Anything that you as a company, as a bank or a supermarket, anything you keep about me, I can access at any point. And if it’s personal data, I can fix it. If they spelt my name wrong or they’ve got my blood type wrong, then I can fix it. So in the future, I’ll tell the bank, by the way, this is where my data lives. Every time a bank statement goes out, I am going to give you a key so you can put it in there. And I’ll trust you. And in fact, I require that anything in the relationship between us– every wire transfer, every change of terms of service on your part– you have to put in here. And that information, if I want to give access to anybody else, then that’s up to me.

[END PLAYBACK]

IRA FLATOW: Is that the future, do you think?

MANOUSH ZOMORODI: Well, that’s what Tim wants it to be. So that clip that you just played was actually from Day Number 5 of the Privacy Paradox. And on that day, I had the pleasure of going to visit him at his MIT lab. And we talked long and short term. In the long term, he thinks that the web that he built has been corroded, essentially, that it has a social problem, as he calls it, where we don’t have any transparency. We have no control over our data.

So what he really wants is a paradigm shift where, instead of me logging into Facebook, Facebook would log into me. And I would decide how much data I would give them. And I would be able to take it back. Now, is that going to happen immediately? No. That’s a big, tall order. But I mean, Ira, if there’s anyone who can do it, it’s the guy who invented the web in the first place. Right?
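To make the paradigm shift Manoush describes a bit more tangible, here is a conceptual sketch of a user-owned data store that issues and revokes keys to services, so the bank deposits statements with you rather than the other way around. It is a toy model of the idea, not Solid’s actual API or protocol; every class and method name here is invented.

```python
# Conceptual sketch of "Facebook logs into me": services hold revocable
# keys to a store the user controls. A toy model, not Solid's real API.

import secrets

class PersonalDataStore:
    def __init__(self):
        self._records = []   # e.g. bank statements, wire transfers
        self._keys = {}      # key -> service name

    def grant(self, service):
        """Issue a key so a service can deposit data into my store."""
        key = secrets.token_hex(8)
        self._keys[key] = service
        return key

    def revoke(self, key):
        """Take access back at any time; the data stays with me."""
        self._keys.pop(key, None)

    def deposit(self, key, record):
        """A service pushes a record to me, if its key is still valid."""
        if key not in self._keys:
            raise PermissionError("key revoked or never granted")
        self._records.append((self._keys[key], record))

store = PersonalDataStore()
bank_key = store.grant("bank")
store.deposit(bank_key, "statement 2017-02")  # the bank logs into me
store.revoke(bank_key)                        # and I decide when that ends
```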

IRA FLATOW: Right. This is Science Friday from PRI, Public Radio International. We’re talking about data and privacy. You know, I was listening to what he said, and what struck me is, well, if they want this data so much, why don’t they pay us for it? They’re going to use it. They make all kinds of money from it. And here we say, wow, we’re so lucky to have it for free. No. We should be paid for it. What do you think, Eric?

ERIC UMANSKY: Well, I actually think we are paying for it. Right? And we are paying for it because they are paying us in in-kind ways. Right? So we give Facebook our data. And in turn, we get these really handy tools to keep track of what’s going on with our friends and what movies we should go see and so forth. The real question there is, is that transaction transparent or not? And too often, it isn’t. I mean, I was listening to Tim Berners-Lee and thinking about how far away it is from the world we are in now.

I mean, one of the stories that we did around Facebook as well is we pointed out that– so you looked up your information, Ira, and you found, well, gee, they think I like roommates. I don’t really like roommates. Right? But it turns out that’s the information that Facebook is telling you it has about you. It’s actually not all the information it has about you.

One of the things they are not showing you on that page, fundamentally, is that Facebook buys information about users from third-party brokers, and in particular, information brokers who often gather financial information about your life– the size of your mortgage, your cars, your housing, and on and on, your credit score. And they put it all together. And what that adds up to on Ira Flatow is not just that he likes 4G, not just that he does or does not like roommates. The problem is, they’re not telling you about that.
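The “put it all together” step Eric describes is, mechanically, just a join of two profiles keyed to the same person, where only one half is ever disclosed back to the user. A hedged sketch, with every field name hypothetical:

```python
# Toy illustration of profile assembly: the categories a platform shows
# you, merged with records bought from a data broker. All fields are
# invented placeholders, not real Facebook or broker data.

shown_to_user = {          # what the ad-preferences page displays
    "connection": "4G",
    "birthday_month": "March",
}
bought_from_broker = {     # what the page does not display
    "mortgage_size_band": "...",
    "vehicles_owned": "...",
    "credit_score_band": "...",
}

# The full profile is the union of both dictionaries; the asymmetry is
# that only the first one is ever shown back to the user.
full_profile = {**shown_to_user, **bought_from_broker}
print(full_profile)
```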

IRA FLATOW: This is all interesting stuff. Do you think we can take back any of our privacy here? How do we do that?

MANOUSH ZOMORODI: Yes, so that’s what we tried to do with our series. This is a five-part plan. And we tried to take this sometimes terrifying, very overwhelming topic and break it down into small parts because that’s the only way I can wrap my head around it. So for example, Day 1, Bruce Schneier, the cryptographer, who I think has been on Science Friday– he talks us through something that a lot of us know we should do, but we don’t, and we never make the time– digging through the settings of all the apps on our phones, very simple things. So one listener told me he went through, and he found out that this flashlight app was collecting his contacts and his location and had access to his microphone. OK? Simple thing. Flashlight app, be gone. Something very, very simple that we can do.

And I think we have to start at very small points to understand what it feels like to take back a little bit of our privacy, and then maybe we can start to look into the future about how we do something more systemically.

IRA FLATOW: But you hope people are skilled enough to be able to go through all the menus and menus and menus you have to on a cell phone to do that.

ERIC UMANSKY: Well, I think one of the challenges there is that there is a massive power disparity. I mean, the tool that we built with you guys on the project with Note to Self– the thing you used, Ira– allows people to find at least what Facebook is telling them it knows about them, which actually people could find anyway. It is on Facebook’s site. It’s just buried and buried and buried. And so one of the things that journalism and other outfits can do is take a little bit of this power back.

IRA FLATOW: Yeah, and hopefully, after people listen today, they will get a little bit more power back. And we have a link today to the Privacy Paradox and the ProPublica series. It’s at sciencefriday.com/privacy.

MANOUSH ZOMORODI: Come do it with us. It’s fun too. We try to make it fun.

IRA FLATOW: There you go. Manoush Zomorodi and Eric Umansky, thank you both for taking time to be with us today.

MANOUSH ZOMORODI: Thank you, Ira.

Copyright © 2017 Science Friday Initiative. All rights reserved. Science Friday transcripts are produced on a tight deadline by 3Play Media. Fidelity to the original aired/published audio or video file might vary, and text might be updated or amended in the future. For the authoritative record of Science Friday’s programming, please visit the original aired/published recording. For terms of use and more information, visit our policies pages at http://www.sciencefriday.com/about/policies/

Meet the Producer

About Christopher Intagliata

Christopher Intagliata was Science Friday’s senior producer. He once served as a prop in an optical illusion and speaks passable Ira Flatowese.
