Sorting Out Social Media Feeds
This week, Instagram announced a proposed change to how its photos will be filtered—algorithmically, instead of chronologically. The proposal set off an uproar from users fearing that the app would become part of a social media algorithmic apocalypse. Technology writer Will Oremus and data scientist Hilary Mason break down the public outcry and the different ways social media is sorted across platforms.
Will Oremus is a senior technology writer for Slate in New York, New York.
Hilary Mason is founder and CEO of Fast Forward Labs in New York, New York.
IRA FLATOW: This is Science Friday. I’m Ira Flatow. This week Instagram announced a proposed change to its photo feed. The photo sharing app is going to ditch the chronological ordering of photos, you know, showing them as they come in, and turn to an algorithmic filter.
We’re going to talk about what that means, because Instagram has not made the changes yet and, already, users are up in arms on social media. A Change.org petition now sits at over 300,000 signatures, and the hashtag #KeepInstagramChronological has started trending.
So what is an algorithmic filter? Why the outcry? How are posts being sorted across platforms? How does this determine what we see online? Those are the questions we’re going to try to answer with Will Oremus, senior technology writer at Slate here in New York, and Hilary Mason, founder and CEO of Fast Forward Labs, also in New York. They’re in our CUNY studios here. Welcome. Good to see you both again.
HILARY MASON: Great to see you too.
WILL OREMUS: Thanks for having us.
IRA FLATOW: And we have a number to call for folks who want to talk about it. Our number is 844-724-8255, 844-SCI-TALK. You can also tweet us at SciFri. I would like to know, do you like a certain type of feed for specific purposes? A chronological feed for keeping up with sports, do you want that? Or does an unfiltered dump of posts feel overwhelming to you?
Will, you’re gritting your teeth almost about this. What’s going on here? What’s the difference? Why do this?
WILL OREMUS: Well, it’s so interesting. Every time people hear about their favorite social media service changing how it shows them the posts in their feed, they get so, so upset. I mean, it shows you, doesn’t it, how personally we take this, and how important it is to us really. Whether it’s Facebook or Instagram.
IRA FLATOW: Yeah.
WILL OREMUS: Twitter had a similar backlash recently when it announced that it would start using some algorithmic filtering in its feed. But, you know the funny thing is that, as much as we hate the idea of this, the massive success of Facebook, which is really the sort of prime example of a social media feed that is ordered by a complex algorithm, shows us that it works. I mean, people keep coming back to Facebook. We say we don’t like Facebook making these choices for us, and yet we use it every day.
IRA FLATOW: Hilary, what does this term, algorithm, mean? How does that show up differently from what’s happening now?
HILARY MASON: So an algorithm is really just a recipe. In this case, it’s a mathematical way of processing the data, and what they’re doing here is actually not that complicated. They’re looking at the things that cause you to engage more, to like photos, to spend more time looking at them. They’re using that to train a system that will predict which photos in your feed you’re going to like the most, and then they’re going to show you that set of photos rather than simply the chronological set you would have seen otherwise.
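The recipe Mason describes can be sketched in a few lines of Python. This is a toy illustration, not Instagram’s actual system: the feature names (`past_likes_from_you`, `avg_view_seconds`) and the hand-picked weights are assumptions standing in for signals a real platform would learn from data.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int             # minutes since some epoch
    past_likes_from_you: int   # hypothetical signal: likes you've given this author
    avg_view_seconds: float    # hypothetical signal: how long you linger on their posts

def relevancy_score(post: Post) -> float:
    """Toy 'relevancy' score: a weighted sum of engagement signals.
    The weights are made up; a real system would learn them from behavior."""
    return 2.0 * post.past_likes_from_you + 0.5 * post.avg_view_seconds

def algorithmic_feed(posts):
    """Rank by predicted engagement instead of recency."""
    return sorted(posts, key=relevancy_score, reverse=True)

def chronological_feed(posts):
    """Plain reverse-chronological ordering: newest first."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)
```

The two orderings diverge whenever an older post from a friend you engage with heavily outscores the newest post in the feed, which is exactly the trade-off the petition signers object to.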
IRA FLATOW: The fact that Instagram was bought by Facebook and, you know, is Facebook trying to make it more look like Facebook?
HILARY MASON: Well, I can’t speak for Facebook’s intent, but I do know that both Instagram and Facebook have wonderful teams of machine learning engineers and they have strong incentives to keep us all engaged with looking, and clicking, and liking those photos.
IRA FLATOW: Will, is this all about money? I mean, everything in this game is getting more clicks, or advertising, things like that.
WILL OREMUS: Yeah, it is and it isn’t all about money. I mean, it’s all about money in the sense that Facebook is a public company. It’s trying to keep making money hand over fist as it’s been doing with extraordinary success for the past few years. But it’s a little too simple to say that a change like this is just about maximizing revenue in the short term.
Because a company like Facebook, especially where Mark Zuckerberg actually controls a majority of the voting shares, they’re looking at the long game. And, so, yes, they care about making money. They think that sorting your feed according to what they call a relevancy score, that’s the output of this machine learning algorithm they have, sorting your feed according to how relevant each post might be to you, helps them to maximize revenue in various ways, but most importantly, what they’re always looking at is, does it keep you happy with Facebook?
Does it keep you coming back? And they’re going to want to know the same thing about Instagram. From my experience with Facebook, they’re going to be studying it from 25 different angles: what are all the effects of these changes to the algorithm, and are they making people happy or unhappy? Because Instagram would lose money in the long term by turning people off.
IRA FLATOW: About 300,000 signatures on a petition sounds to me, Hilary, like unhappy. At this point.
HILARY MASON: So it’s particularly interesting because the tension here, I think, is not so much that people are resistant to algorithms. These are similar to the algorithms they already use to target ads at you, for example, on all of these platforms and we don’t hear people really complaining about that every day.
What’s different here, though, is that this algorithm is now mediating a social action, so it’s sitting between me and you and our friendship. And it’s deciding, sort of, what part of that friendship I’m going to see and you’re going to see, and I think that’s where the tension comes from. It’s not from the mathematics. It’s from the social interruption.
IRA FLATOW: If it was a real person doing it, do you think it would be different, people would have a different reaction? The fact that they’re being processed by some algorithm. You know, I’m a person.
HILARY MASON: And I think that if there were actually a human being at Instagram HQ sitting there deciding, oh, is this a good enough picture to put in Hilary’s feed, I would probably be more annoyed. I think people are less consistent than algorithms.
IRA FLATOW: Yeah, we– yeah, go ahead Will.
WILL OREMUS: What’s interesting, though, is that actually– so Facebook’s algorithm is only as good as the data that goes in. That’s why it’s always studying your behavior on the site. It’s looking at what you click on, what you like, what you share, over time it’s been looking at more and more advanced metrics.
Like when you click on a post in your feed, how long do you spend reading that post on a different website before you come back? And do you hit like on it before or after you’ve read it because if you hit like before you’ve read it, maybe it was just click bait. Maybe it wasn’t actually that great of a story.
And one of the things they’ve turned to recently is actually a more personalized form of data. They have now hired thousands of people around the world to go through their Facebook feed every day and say, which stories they actually would like to see at the top of their feed. They rate everything they see and give that sort of human feedback that Facebook thinks maybe this is the kind of data that will make people feel better about having an algorithm make those choices for them.
We took a poll of our Twitter followers, and among the first 2,000 people who voted, we found that 92% prefer a chronological timeline and only 8% want an algorithmically filtered feed. Not surprising.
HILARY MASON: It’s not that surprising. Again, because people want to have a sense of control over what they see in their social network. And I think one thing we’ve lost here is that the algorithm is not transparent. It is, by definition, a black box, and if they told us what was going on, we’d exploit it. So they can’t do that.
And we can worry about what biases the creators of that algorithm, the people doing the feature engineering, might have. Think about what if, you know, a Facebook machine learning engineer really likes dogs and really hates cats and that gets built into the algorithm. Is that a world we’re happy with? It’s very unlikely, but still these are questions that are reasonable to ask, because we don’t know what’s happening under the hood.
IRA FLATOW: We have some tweets coming in with people reacting to it. Christine Rogers says, chronologically, I want it so I can follow breaking news. I use a hashtag for relevance. [? Nolonger ?] says, a good algorithm that I can tweak, please. What happened to the tailorable interfaces and transparent personalization?
Maybe, is there a middle ground that you can flip a switch and say, I’ll take it either way? Choose which way you want to have a [INAUDIBLE].
WILL OREMUS: You know, Twitter has really been trying to find that middle ground. Facebook has it. Facebook’s all the way on one side. They’re all about the algorithm. It works for them. They’re going to stick with that.
Also, I should note Facebook doesn’t really care that much, at this point, about following along with events in real time. That has always been Twitter’s game. Its edge is that, because the feed is chronological, you can have an ongoing conversation; you can discuss what you’re watching on TV right at that moment. I don’t think Twitter is ever going to get rid of that entirely. And if they do, I think it will doom the company.
So they’ve been looking for a middle ground where they show you some algorithmically selected tweets at the top of your feed, and then they put the rest of the tweets in chronological order. They’ve tried the things that some of your Tweeters talked about. They’ve tried giving people options. Facebook has tried giving people control over their feeds.
People don’t do it. By and large, people say they want it, but they actually don’t use that control when it’s given to them. And they don’t toggle between an algorithmic feed and a chronological feed, and that’s why Twitter is trying to do it all in a single feed.
IRA FLATOW: Is this an effort to take on Twitter at all? Do you see this as a competition?
WILL OREMUS: Yeah, you know, it’s an effort to take on Twitter. I mean, I think Twitter feels like it’s getting crushed by Facebook and I think that is why Twitter has been looking at more ways to surface the tweets at the top of your feed that it thinks will appeal to you. It sees how well that has worked for Facebook.
Twitter is in a real bind because, when it went public a few years ago, the idea was that it was the next Facebook. And so all its investors were counting on it to get that big. Well, it turns out that about 20% of American adults love Twitter and its chronological timeline, and the other 80% don’t really care that much. And Twitter has to find a way to capture some of that other 80% that Facebook has really managed to compel with its algorithmic feed.
IRA FLATOW: And its stock has taken a real beating since then, hasn’t it? Hilary, how predictive can these algorithms become in the future?
HILARY MASON: So one of the really common misconceptions about this sort of machine learning is that it can perfectly model your individual behavior as a unique human being into the future and that’s really not the case. These things tend to be quite accurate at the population level, or when doing an aggregate analysis, but that doesn’t mean that it can say that you specifically are going to engage, just that you may be more likely because you’re part of this population.
And so this is really not something you need to worry about in terms of being able to say that precisely this time of day you’re going to click on that photo of a puppy or a baby, but rather that it may choose to bias the things that you see by what people like you actually do and how they behave.
IRA FLATOW: Let’s see if I can get a couple of phone calls in. Our number 844-724-8255. Let’s go to Birmingham, Alabama. Tesla, is it?
TESLA: It is Tesla, yes.
IRA FLATOW: Very appropriate name.
WILL OREMUS: Great name.
TESLA: As the man and not the car.
IRA FLATOW: Do you have lightning bolts going everywhere around you? Oh, God, I’m sorry. Go ahead.
TESLA: Yeah. Yeah. I just had a question in regard to something Will had said, you know, about Facebook users staying on and not leaving and stopping using Facebook because of the algorithm. And I just feel like, if there were another option that was up to par with Facebook, I’m not sure that would be accurate.
And even with Instagram. There’s not anything that’s really like Instagram, so, of course, we’re staying because we love these sites and services, but we still don’t like the way that it’s happening. And as far as toggling between chronological and most recent, sometimes when you go to your most recent, you’ll see a post at the top that says eight hours ago, and then five posts down, one that says two minutes.
So it’s not even really in order. I just kind of felt his comment’s probably accurate, but if there were other options, I’m not sure that we would all still be using these services.
IRA FLATOW: All right. Good point. Will, you want to comment on that?
WILL OREMUS: Yeah. I think your caller is absolutely right that there is genuine anger and genuine resentment among users of these services when these changes happen. They don’t like the fact that some engineers in Silicon Valley are deciding for them which friends’ posts they see at any given time.
That said, you know, Facebook has huge teams of data analysts who are looking all the time at this immense array of metrics, and every time it changes anything in the algorithm, it’s going to look at 15 different ways that that’s affecting people’s engagement. And what the data in aggregate tells Facebook, again and again, is that if you use the right data and you tweak the algorithm in the right way, it will keep people coming back.
That doesn’t mean that some people might not be turned off. It doesn’t mean people might not be using the service and be unhappy about it, but that’s actually something Facebook’s been working on as well: they are now directly asking more and more users, are you happy with Facebook? Did this post make you happy or did it make you upset? And they are trying to address that real problem, I think.
IRA FLATOW: I’m Ira Flatow. This is Science Friday from PRI, Public Radio International. Here in New York talking with Hilary Mason and Will Oremus. We have a question from Twitter. MJ [? Mahameto ?] tweets, maybe we should ask, can we come up with an algorithm that best represents time? Because he just heard what the caller said: I get one that says eight hours ago after I got one that said two minutes ago.
WILL OREMUS: I want to hear Hilary’s thoughts on this, but I just want to point out, I mean, a chronological feed, there’s still an algorithm there.
HILARY MASON: That is an algorithm.
WILL OREMUS: It’s an algorithm that is sorting all the possible tweets from the people you follow. It’s only sorting them on one criterion, which is time.
HILARY MASON: Right. And we might be able to add some other simple things into that. Like maybe we’d sort it by time and geography, or by time and geography plus a rule for that friend of mine who posts 10 pictures of their food before they eat it: let me only see one of those. One is sufficient. And these kinds of heuristics are algorithms themselves, but time itself is pretty straightforward.
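A heuristic like the one Mason describes, chronological order plus a cap on posts per friend, fits in a few lines. This is an illustrative sketch of that kind of rule, not any platform’s actual logic; posts are represented here as simple `(author, timestamp)` pairs.

```python
from collections import defaultdict

def capped_chronological_feed(posts, max_per_author=1):
    """Newest-first feed that keeps at most `max_per_author` posts
    from each friend. `posts` is a list of (author, timestamp) tuples;
    higher timestamps are newer."""
    shown = defaultdict(int)   # how many posts we've kept per author
    feed = []
    for author, ts in sorted(posts, key=lambda p: p[1], reverse=True):
        if shown[author] < max_per_author:
            feed.append((author, ts))
            shown[author] += 1
    return feed
```

Because the rule is explicit, it stays transparent and predictable in a way a learned relevancy model does not, which is the trade-off the discussion keeps circling back to.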
WILL OREMUS: And the drawback to doing it just by time is, this is what Instagram said when it announced this change that has caused such a backlash, they said, look, you’re missing, on average, 70% of the posts in your feed.
Now we can’t change the fact that we’re going to miss 70% of the posts, but what if we could come up with a way to make sure that the 30% you are seeing are the ones you care about more. What if it isn’t just the most recent 30% that you want to see. What if there are some others that might be more important to you.
IRA FLATOW: Yeah, because you’re not going to see those posts. They go by so fast, right? So you’re saying, if we make an algorithm that you get to see the same people that you deal with all the time, maybe you’d like that instead of just chronologically.
HILARY MASON: Yeah, it’s a pretty compelling argument. The idea that you’re going to see some 30% of the set of photos that exist in your feed and why don’t you see the best 30%? Why do you only see the 30% that happen to be there when you’re paying attention?
IRA FLATOW: Let me see if I can get one more call in. From Gainesville, Florida. John, hi, welcome to Science Friday.
JOHN: Ira, thanks for taking my call. The thing that struck me when I was listening is actually a different thing. We’re worried about the interface, and I work in IT, and the interface is really important; people care a lot about it. But what really struck me is that this is another feed that understands us.
So my comment was really that we are getting to a place where these services know more and more about us. I know they already hold the data, but now, with Instagram, you have another feed that’s like another window. So who understands who I am?
Some people know me really well, but where’s that data being held? Where’s that understanding of me, of what I’m going to click on and what I’m going to like? And how is that used? It’s privacy, but it’s not just privacy.
Now it’s taking the level of marketing and advertising of Mad Men to a very, very different place. And so there’s a bigger question to me about the results of that algorithm, not just the fact that it’s using you.
IRA FLATOW: OK. Hilary, do you want to react to that?
HILARY MASON: Yeah. That’s a super important point, specifically for people who build algorithmic systems, to keep in mind that these sorts of algorithms are not just in our social media feeds. These are the same algorithms that run our spam filters, your Netflix movie selections, your Amazon.com and other e-commerce product suggestions.
And so there’s quite a bit of information about what you are likely to do that exists out there in the hands of various commercial entities. That said, again, I’ll repeat the point that it’s not particularly accurate when modeling any individual. It’s much more accurate at the level of the population.
IRA FLATOW: Big data.
HILARY MASON: Indeed. But this is a conversation that we do need to have about who owns that data and who owns that model.
IRA FLATOW: We’re going to have that many, many more times. Hopefully, with both of you. Will Oremus, senior technology writer at Slate. Hilary Mason, founder and CEO of Fast Forward Labs. Thanks for taking time to be with us today.
WILL OREMUS: Thanks, Ira.
HILARY MASON: Thank you.