Eli Pariser: What Search Engines Are Hiding From Us
Eli Pariser will appear tonight at 7 at the Revelle Forum at the Neurosciences Institute.
You could be getting too much of what you want from the internet, and not enough of what you need. That's one of the concerns in a new book by former MoveOn.org director Eli Pariser. "The Filter Bubble" warns that algorithmic editing is transforming the internet experience into a "web of one."
This is a rush transcript created by a contractor for KPBS to improve accessibility for the deaf and hard-of-hearing. Please refer to the media file as the formal record of this interview. Opinions expressed by guests during interviews reflect the guest’s individual views and do not necessarily represent those of KPBS staff, members or its sponsors.
CAVANAUGH: I'm Maureen Cavanaugh, and this is KPBS Midday Edition. You could be getting too much of what you want from the Internet, and not enough of what you need. That's one of the concerns in a new book by former MoveOn.org director, Eli Pariser. "The Filter Bubble" warns that algorithmic editing is transforming the Internet experience into a web of one. Hi, Eli.
CAVANAUGH: Welcome to Midday Edition.
PARISER: Thanks for having me on.
CAVANAUGH: Now, I want to let our listeners know that they're welcome to jump into this conversation. Do you find your Internet experiences getting narrower because of all the personalization going on? You can give us a call. Our number is 1-888-895-5727. That's 1-888-895-KPBS. Or you can send your tweets to KPBS Midday, or go online at KPBS.org/Midday Edition. I used the term algorithmic editing in the intro. I'm not sure I know what I mean by that. Can you tell us what that means?
PARISER: Sure, well, you know, I first ran into this on Facebook. I had gone out of my way to meet people who had lots of different political views. And, you know, I wanted to hear from people who were thinking very differently from myself. And I logged on one morning and noticed that my more conservative friends had disappeared from my Facebook feed. They were just gone. And it turned out that what Facebook was doing is it was looking at what I was actually engaging with, and it was saying, well, you say that you want to hear from these people, but actually you're clicking more on the links that you agree with, so we're gonna show you more of those. And this kind of invisible editing of your experience of the web, where you don't see it happen and you don't know what's being left out, is being baked into more and more websites these days. It's not just Facebook. It's also Google, which shows you the search results that it thinks you're most likely to click on, and not necessarily the ones other people see. And it's also a lot of news websites now.
CAVANAUGH: Now, I think people are really familiar -- people who know about the Internet know that ads are tailored to your specific desires, and your music is kind of tailored to your desires. But I think this aspect of it, the idea that Facebook and Google are actually tailoring news results, or the kinds of input that you get on your Facebook page, by your previous desires, your previous clicks -- not a lot of people know about this, right?
PARISER: No, and in fact, I was surprised when I started researching the book to find how many different popular websites this is happening on. So, for example, Yahoo News looks like a normal news website, and you would never know that different people see different versions of it unless you actually sat them down with computers side by side and looked. But it is tailored. And it's tailored for a very simple reason, which is that if you can show people what they're more likely to click and more likely to engage with, then they'll keep coming back. And there's a psychological phenomenon called confirmation bias: they'll feel good about themselves. It feels great to have your own views reflected back to you, and you feel so right. But actually it's very dangerous, because to make good decisions, you need to have a clear view of what all the options are.
CAVANAUGH: Now, in a presentation that you gave that I saw on YouTube, you actually showed two people who typed the word Egypt into Google and got very different results. Why did that happen?
PARISER: Well, you know, it's hard to say exactly how the algorithm works. That's one of the challenges here, that all this kind of stuff happens behind the curtain, and we don't get to see it. But basically what Google is doing is it's trying to guess, based on the information that people have put in before, what they're going to click the most. And so apparently one of my friends seemed very interested in what was happening in Egypt, and the protests, and so he got a lot of results about that. The other friend got, basically, travel-to-see-the-pyramids websites, which would have been an interesting trip at the time. And Google and Facebook and these other companies are in this incredible arms race to basically get as much data about every one of us as they can to support this kind of filtering.
CAVANAUGH: I'm speaking with Eli Pariser. He is formerly of MoveOn.org. His new book is "The Filter Bubble," and he's gonna be talking about that at the Neurosciences Institute in La Jolla this evening at seven. So a lot of people might just say, well, that's great. I love that. They're finding out what I like so I don't have to deal with a bunch of stuff that I have no interest in. What do you find is not good about that?
PARISER: Well, we're all sort of torn between two parts of ourselves. There's the part that just wants what it wants and doesn't want to be bothered by anything else, the sort of short-term, more compulsive self. And then there's the longer-term, aspirational self that wants to be informed about the world and wants to be a good citizen. The best media basically helps us strike a balance between those two things. So you get some of the kind of nourishing information, your information vegetables, and you get some information dessert. And what this filtering allows you to do, very easily and without even noticing it, is you can just get surrounded by information junk food. You just get the stuff that's sort of the most compulsively clickable, and all of the things that are really important to know but not as clickable, or just sort of further afield, you know, get dropped out.
CAVANAUGH: The stuff that might expand your horizons just doesn't appear anymore.
PARISER: Right. Well, when I was having these conversations with engineers at these companies, I said maybe what you need is an "it was a hard slog at first, but it changed my life" button. Because many of the best experiences that you have are these things that you would never think you would be interested in, that you would never click on immediately, and yet often the great books, and the great movies, and the great experiences are of that kind.
CAVANAUGH: We are taking your calls. I think we had a caller on the line who couldn't make it, so let me tell you: we are taking your calls at 1-888-895-5727. That's 1-888-895-KPBS. So what kind of a reception have you gotten? You said you've been speaking with the people who actually developed this kind of algorithmic editing process. What kind of reception have these guys given you?
PARISER: Well, different companies have -- and different people have different takes. I think, you know, the most blunt conversation I had was with someone at Facebook who said, look, you know, what we do very well is maximize the number of minutes that people spend on Facebook. And it's fun, and it makes us money. What you're asking us to do is something that's much more complicated, that involves these sort of messy decisions about what's important. And we'd rather just focus on the fun stuff. Other engineers at other places, and especially Google, I think they really would like to dig into this, and in some ways, it's one of the most interesting problems that they face: How do you actually give people the right blend of information? But it's not a priority, because all of the money to be made is in, let's just show people this very tight, very narrowly relevant stuff.
CAVANAUGH: The caller who couldn't stay on the line basically was gonna tell us that the Internet is over. And Facebook and various other social media is taking over. But it lends itself to a very interesting question. You know, people very recently were still talking about the promise of the Internet to unite people and to unite cultures and nations and so forth. What do you see about the idea of editing in this way, editing the information that we receive personally? What does that do to the promise of the Internet?
PARISER: Well, it concerns me. I mean, I think -- you know, I grew up in a very small town in Maine with nine hundred people, and I remember sort of thinking about the Internet, and imagining the Internet as this thing that was gonna connect me with different people all over the world, and really allow me to get out of this very narrow little locality. I think that connective promise of the medium isn't showing up. It's very good at connecting us to people who are very much like us. But when you talk about sort of introducing people to new ideas and different ways of thinking, that's not what's happening. And the other way in which the Internet isn't turning out to be quite the way we thought, I think, is we have this image in our heads of people connecting directly to each other, lots of different dots connecting directly. Instead, really what's going on is you don't connect to the local business directly. You connect to it through Google. And you don't connect to your friends directly; you connect through Facebook. And what that means is these companies have an incredible amount of power to decide how that information flows through them, and which local businesses you connect with, and which friends you connect with. And we haven't really called on them to take responsibility for that power. You know, they would like to pretend that it's all just people connecting to people. But really, they have a very important role to play right in the middle of it.
CAVANAUGH: Now, some people still complain about the fact that the Internet is much more commercial than it used to be. But this seems to add another layer onto that complaint, in that it's not just commercial interests, as you point out, but these ever-growing, powerful search engines and social media sites.
PARISER: Right. And it is commercial in a way. Google is doing this because they know that if they provide more relevant results, they'll do better. And personal relevance, giving you the links that Google thinks you want, is a great way to do that. Larry Page, now the CEO of Google and one of the founders, says that he wants Google to be a company that eventually gives you just one search result, which is the right result. And for some searches, that sounds great. When I want my dentist's number, you know, that's what I want. But when I want to find out about the world, and when I'm querying something like climate change, it would actually be really dangerous to have a tool that just gave you the one result on that query.
CAVANAUGH: That the algorithm thinks is right for you.
PARISER: That the algorithm thinks is right for you, yeah. And I got into this conversation with a Google PR person where I was saying, if I'm a 9/11 conspiracy theorist and I Google 9/11, is it your job to show me the link that I'm most likely to click on, which is gonna be another conspiracy link, or is it your job to show me that Popular Mechanics article that debunks a lot of that stuff? And being a PR guy, he said, well, we gotta get back to you on that. But that's the concern: that what is most useful is not necessarily always what is most clickable.
CAVANAUGH: Let's take a call. Steve is calling from San Diego. Good morning, Steve, welcome to Midday Edition.
NEW SPEAKER: Thank you. My question is real simple. I was wondering if Google and the other search engines, other than Facebook where you have to log in, track by your IP address and thus alter your results by IP address, or do they only track when you log in?
PARISER: Yes, they do. It was one of the most surprising pieces of this investigation for me. One Google engineer told me that there are actually 57 different variables that they can track even if you're not logged into Google. So your IP address, which is kind of your computer's address on the Internet; what browser you're using; what kind of computer you're using. And you can imagine that people who use a Mac, for example, may be interested in different kinds of search results from someone who uses a PC. So even if you're not logged in, there's a lot of ability to tailor your results.
CAVANAUGH: Now, I'm wondering, is there any way around this? Is there any way that a person -- let's say Google and Facebook don't take your good advice and they continue doing this. Is there any way for an individual to stop that from happening to them?
PARISER: Yeah, there are, you know, unpersonalized search engines like duckduckgo.com. And I have a list of tips on the book website, if you want to opt out of some of this tracking and data collection. It's at thefilterbubble.com. But the challenge is, really for most people, it really comes down to a handful of websites that define what they do on a daily basis. It would be very hard to replicate the functionality that people get out of Facebook anywhere else.
CAVANAUGH: My last question to you, Eli, is if indeed the Google people and the Facebook people don't decide that you have the right idea about this, would you support any kind of legislation, or any way to force these companies to stop tracking and stop editing all of your likes and dislikes into a Facebook page or a Google search result?
PARISER: Well, I think the right place to start is with resetting some of the laws around personal information and how it's used. Because the laws that we have for that are 40 years old. They just don't contemplate the idea that one click on a website somewhere, I mean, websites didn't even exist, that one click on a website would reveal something about you that would be useful to some other website that's all just kind of -- was not part of the package. And so giving people a sense of transparency about when and how their personal information is used and giving them some control over that would be a good start in helping adjust how these companies do this kind of filtering.
CAVANAUGH: I've been speaking with Internet organizing pioneer Eli Pariser. His new book is called "The Filter Bubble." He's speaking tonight at seven at the Revelle Forum at the Neurosciences Institute. Thanks so much, Eli.
PARISER: Thanks so much.