Capitalisn't

Two Billion 'Truman Shows'

Episode Summary

Tristan Harris, a former design ethicist at Google and “the closest thing Silicon Valley has to a conscience,” warns Kate & Luigi about targeted digital advertising that creates individual, orchestrated experiences dictated by nothing more than an algorithm.

Episode Notes

Tristan Harris, a former design ethicist at Google and “the closest thing Silicon Valley has to a conscience,” warns Kate & Luigi about targeted digital advertising that creates individual, orchestrated experiences dictated by nothing more than an algorithm.

Episode Transcription

Tristan Harris: We live right now in two billion Truman Shows. Two billion individual, personalized, orchestrated experiences where an algorithm is deciding exactly what we personally will see.

Kate: Hi, I’m Kate Waldock from Georgetown University.

Luigi: And I’m Luigi Zingales from the University of Chicago.

Kate: You’re listening to Capitalisn’t, a podcast about what’s working in capitalism today.

Luigi: And most importantly, what isn’t.

Tristan Harris: If I talk about persuasion right now, in this room, we’re probably all thinking, “Well, we’re the smart ones here at the University of Chicago. If we talk about persuasion, we’re only talking about those people over there, those manipulatable people. They’re so gullible.” But actually, persuasion works on everyone. A chimpanzee doesn’t get to choose whether a banana is seductive to its instincts; that’s just how we work.

Kate: All right, so what we’re actually going to talk about on today’s episode is how tech manipulates our brains. We had the opportunity to talk to Tristan Harris, who is known for being the closest thing Silicon Valley has to a conscience. Now he actually started by founding a company called Apture, which was acquired by Google, and he worked as a digital designer for Google for a number of years. Then, he went off to start his own nonprofit called Time Well Spent. He spends a lot of time thinking about, as well as researching, this idea that tech companies have the ability to manipulate our perception.

Luigi: Tristan told us that it was very important for him that he started out as a magician; at least as a kid, he was a magician.

Tristan Harris: It’s super important. Yeah. When I was a kid, seeing the world through the eyes of a magician flips your whole worldview around, because instead of looking at choice as an authoritative thing that human beings are doing, basically you’re looking at reverse engineering and breaking down the entire foundation of this thing that we tend to think of as being in a secure enclave, called a mind. A mind does this secure thing called choice making. Being a magician and having that worldview is all about flipping that completely inside out, using all evolutionary instincts, physiology, attention, psychology, against the spectator or subject, and seeing if you can control, shape choices. Make them believe or see things they don’t see, control attention. It teaches you that those things are highly influenceable.

Luigi: Yeah, but how is that related to your concern about the power of digital platforms today?

Tristan Harris: Well, because essentially what we’ve done is we’ve created a channel by which essentially you have direct access to the human skin of two billion people, meaning you can buzz something in their pocket. You can then use colors, social cues. Basically everything you see on a screen is designed by engineers who people never meet, who basically know a set of persuasive techniques to engage or hook you. That’s the addiction layer.

This problem has two layers. The first is, can we hook two billion people so that they check their phones 150 times a day, from the moment they set their alarm when they go to bed at night, to the moment when they unset their alarm when they wake up in the morning? The answer is yes. That first layer is like, “Can we establish a matrix by which two billion people are jacked in to an environment controlled by two or three companies?” The second problem is that you then create these advertising models where basically anyone has access if they pay the guy at the front door of the control room, aka Facebook or Google, to directly target thoughts and influence to any vulnerable population that they want. I think both these things set up huge externalities and a society that we don’t want to live in, and that’s why we have to change it.

Luigi: You use the term we. What part did you have in that?

Tristan Harris: Well, so I was a tech entrepreneur. I should say that too, in the sense that I know the system from the inside out. I used to have a start-up called Apture. We got acquired by Google. I was friends with a lot of people who made this stuff, so we is ... I went to Stanford. I had a computer science degree. The we is the people in the industry. People don’t often understand how this stuff really works. These are not just products and services we’re trying to build to help serve people. That is a motivation, but the main thing is can you get it to work? Can you get it to grow? Can you get users? Can you get usage out of those users? Can you get them to come back tomorrow? Those incentives mean that we, the people, the engineers, the designers, are really shaping culture. We’re shaping politics. We’re shaping public health, and mental health, and loneliness. These are all externalities of a system like this.

Kate: Was there anything about your time at Google that made you uncomfortable in your work? Did that lead to you leaving and starting Time Well Spent?

Tristan Harris: Yeah. It wasn’t something that Google specifically was doing, just to be really clear. That’s not just to be diplomatic. I mean, I was within Google. They acquired our company. I was a product manager on Gmail. It’s not like Gmail’s goal was to hook people with slot machine-like rewards, where you pull to refresh to see if you got new email, as an intentional, deliberate, “Let’s turn this thing into Vegas” kind of thing. But I was uncomfortable that, in the one room in the world where there should have been people who cared about email’s impact on the stress, mental health, well-being, anxiety, and distraction of the people it was influencing, I didn’t get a strong enough sense from that team that we had that responsibility, even though email was where knowledge workers spent a third of their day.

I got concerned about this, and I made a presentation at Google. It was a slide deck. I sent it to 10 people, basically asking for feedback. The slide deck basically said, “Never before in history have 50 engineers in Silicon Valley shaped what two billion people will think and do with their time every day. We have a moral responsibility as Google to get this right, from the perspective of someone who understands how people’s minds work, and cognitive biases.”

This presentation spread throughout Google virally. I got a meeting with Larry Page. It was really more about the industry overall. But I thought Google ... The reason I stayed is that Google is one of the few companies that can do something differently. YouTube can’t. It’s stuck in this maximizing-watch-time model. But Android and Chrome are kind of the gateways between a human being’s brain and all these things competing for their attention. So the opportunity was, can we make Android and Chrome better defenders of human agency? Unfortunately, I couldn’t get much to happen while I was inside of Google.

Kate: I think it’s worth taking a step back and talking a little bit about how companies like Google and Facebook make money. We’re used to the product end of the platform. We’re used to using Gmail, or using Facebook and Instagram to post pictures or to tell our friends about our status. But those aren’t in and of themselves ways that the companies make money. They make money by collecting a ton of information about what we do: from the way that our faces look, to the words that we use when we describe our days, to the words that we use in our emails.

They aggregate that information, and they try to use it in the best way that they can, to target advertising toward us. The way that they’re actually making money is by charging people who are willing to advertise on their platforms, to be able to target ads to us specifically. Let’s say I own a store that sells clothing to women, specifically Radiohead band t-shirts, and I want to target my ads to women between the ages of 30 and 45 who live in Pennsylvania and like to listen to Radiohead. I can go to Google or Facebook and say, “Target my ads specifically to these people.” They have the power to do that.

Not only do they have the power to do that, but they can target it even more specifically to people who have been interested in those types of ads before, or who, when scrolling down their news feeds, have stopped, paused, and looked at ads similar to mine before. They have this ability partially just through data, but also partially through machine learning and AI, to target things very specifically.
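To make the mechanics Kate describes concrete, here is a minimal sketch of the kind of audience filter an advertiser effectively expresses. This is not any platform’s actual API; the attribute names, the `matches_campaign` rule, and the engagement threshold are hypothetical, purely for illustration.

```python
from dataclasses import dataclass


@dataclass
class UserProfile:
    # Hypothetical attributes a platform might infer from activity and profile data.
    age: int
    gender: str
    state: str
    interests: set[str]
    past_ad_engagement: float  # e.g., fraction of similar ads this user paused on or clicked


def matches_campaign(user: UserProfile) -> bool:
    """Advertiser-specified audience: women aged 30-45 in Pennsylvania who like Radiohead."""
    return (
        user.gender == "female"
        and 30 <= user.age <= 45
        and user.state == "PA"
        and "radiohead" in user.interests
    )


def eligible_audience(users: list[UserProfile], min_engagement: float = 0.0) -> list[UserProfile]:
    """Optionally narrow further to people who have engaged with similar ads before."""
    return [u for u in users if matches_campaign(u) and u.past_ad_engagement >= min_engagement]
```

The advertiser never sees the underlying profiles; they only describe the audience, and the platform does the matching on its own data.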

Luigi: Kate, you’re absolutely right. But I think it would be useful to remind our listeners what is unique about digital platforms, because we keep using this term. Besides the digital part, what is important is the platform component. Google and Facebook are what economists call two-sided markets. Two-sided markets, and I will explain in a second what they are, are unique in many respects. Let’s start with an example from the 20th century, so even older listeners can appreciate it. Think about the market for classified ads that used to exist in traditional newspapers. If you want to buy a house or a refrigerator, you want to go to the place where most people are advertising, posting their refrigerators or their homes. Vice versa, if you are trying to sell your refrigerator or your home, you want to be in the place where most potential buyers are.

You have both the buyers and the sellers trying to go to the same place. The platform is the place where they meet. In the old days it was a newspaper. Today it is Facebook, or to some extent Google. What is unique about digital platforms is that you have kind of two clients at the same time. You have the buyers and you have the sellers. You want to have the highest number of buyers so you can attract the sellers, and vice versa. So you’re often trying to subsidize one side in order to attract the other.

In the particular case of Facebook or Google, they subsidize us as consumers, because we get the product for free. But they charge, and sometimes they charge very high prices, to the advertisers. So when we think that we get the product for free, we don’t really get the product for free, for two reasons. One is that we pay a higher cost of goods, because the advertisers pass the high price they pay for advertising on to us. Second, we pay with our data. When you quickly sign on to, or agree to, the terms of use of Facebook or Google, you are giving away a lot of confidential data that Google and Facebook use to target those ads.
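As a stylized, textbook-style illustration of the subsidy logic Luigi describes (not a model of Google’s or Facebook’s actual pricing), suppose the platform charges users a price $p_U$ and advertisers a price $p_A$, and the number of advertisers it attracts, $n_A$, rises with the number of users, $n_U$:

```latex
\[
\pi \;=\; n_U\,p_U \;+\; n_A(n_U)\,p_A \;-\; C(n_U, n_A),
\qquad \frac{\partial n_A}{\partial n_U} > 0 .
\]
```

Because each extra user brings in additional advertiser revenue of roughly $p_A\,\partial n_A/\partial n_U$, the profit-maximizing user price can be zero: the “free” product is the subsidy, and the platform recovers its costs on the advertiser side (and, as Luigi notes, in data).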

Kate: Yeah, you’re totally right. There are plenty of qualms that you can have with the fact that Facebook and Google do target advertising so specifically. But just in the context of whether or not they actually sell data, I think it’s important to make three distinctions. The first is that they definitely target advertising based on the information that they have. I wouldn’t necessarily call that selling data, but you can decide on your own whether or not you think that’s problematic.

The second way that they use data is that some companies or apps interact with the data if you want them to. For example, if I have the Tinder app on my phone, the easiest way for me to connect to Tinder is to let Tinder use all of my information on Facebook. It’s just easier to set up. That way Tinder can quickly import all my friends and know who my network consists of, so it’s easier for me to get matches that I think I would like. I’m OK with that too. I mean, of course there’s a question of whether or not that’s OK. But there is some agency in it, in the sense that I’m allowing my Tinder app to use my Facebook data.

Then finally, there’s this question of companies like Cambridge Analytica, to whom Facebook just handed over some data. The way that it was set up was an opt-out process that no one really knew about. So by signing up for Facebook, you were giving Facebook the right to turn over to Cambridge Analytica whatever it wanted. If you really wanted to prevent that from happening, you would have to figure out how to opt out, which was pretty complicated.

Luigi: Tell us more about the Persuasive Technology Lab at Stanford. What does it do, and what does it teach?

Tristan Harris: Well, so back to the magician’s metaphor, we tend to think that people cannot be persuaded. If you put on a VR headset, you don’t get to choose whether or not all your millions of years of evolution tell you, “Do not take that step forward, because you’re about to walk off a ledge.” I can know with my mind that there is no ledge in front of me. But if I put on a VR headset and that ledge looks like it’s right there, I’ve got too much evolutionary instinct driving me to say, “I cannot take that step off.”

Persuasion is understanding what these immutable, deep-rooted features of the human mind and our motivational system are. The lab really was teaching engineers to think about the world in that way. It wasn’t this sort of diabolical manipulators’ lab that was trying to teach you to ruin people, to addict people, or to manipulate anyone. In fact, the whole point was, could you use it for good? Could you help people go to the gym if they wanted to go to the gym, or floss, or establish social norms that they wanted? But there was always a danger. In fact, the final class was actually about the ethics, and the future of the ethics, of persuasive technology.

One group came up with the idea: what if, in the future, you had a perfect profile of what would persuade every single mind uniquely? With your mind, are you more motivated by hearing that the New York Times says it’s true, so that an authority figure says it’s true? Or are there certain people, out of all the people that you trust, where if I could tell you that they think it’s true, then you’re much more likely to believe it’s true? What if in the future we had this perfect map, so that for every single mind, you knew exactly which kinds of things persuaded it? That’s exactly what Cambridge Analytica is 10 years later. Cambridge Analytica is a metaphor for an entire system that is basically also what Facebook intrinsically is. It’s not really about Cambridge Analytica. It’s about systems whose business model is coupled with how best I can service the advertiser, the manipulator, to successfully influence your thoughts.

There’s a huge contradiction built into Facebook. One of two things is true. Either it’s true that it’s a neutral platform, it’s just a tool, users are responsible for their own choices, and they get what they want—in which case, advertising is not effective, and they’re deceiving their advertisers—or it’s true that advertising is really effective, and they’re selling advertisers on that, and it is the best tool in the world to influence and manipulate every audience, which is what their entire business is based on, and which I think is also more likely to be the truth. But so far these platforms have tried to claim that they’re just these neutral objects. They’re just sitting there waiting to be used. You choose who your friends are. You choose who you like. You choose what comments you make.

That’s kind of like a magician. If I say, “Well at the end of a trick, did you choose whether it was a face card or a number card? Yes you did. Did you choose whether it was a red card or a black card? Yes you did. So you made the choice independently, did you not? I didn’t influence your choice in any way.” You think, you nod your head like, “Yeah, I did make that choice.” But of course, I the magician stacked the deck in my favor many steps ago, and you just didn’t see how that happened.

Luigi: I understand that the power of technology now is much stronger. But the fact that advertisers manipulate customers, or try to persuade them one way or another, is not new. I read many years ago, in an interesting book, that Nestle, for example, introduced in Japan some cookies that had the flavor of coffee, because in Japan coffee was not a taste people had ever experienced. In order to sell coffee in Japan, you need to develop this acquired taste, and so they did it. They put in a lot of sugar, because we know that sugar ... and McDonald’s puts sugar in our hamburgers because we like sugar more, and so on and so forth.

Tristan Harris: Correct.

Luigi: In what way is this different?

Tristan Harris: This is so important, because this comes up all the time. The number one rebuttal to this whole thing that we’re talking about is, “We’ve always had manipulation. We’ve always had marketing. … I don’t see why we should think about anything new.” There are a few distinct characteristics. One is the level of intimacy at which this persuasion can happen. Just think about access, first of all. We check these devices 24/7, 150 times a day if you’re a millennial. That was never true of any other medium. You drove past a billboard. You happened to see an ad on TV, but not ... I’ve got something up against your skin, causing you to reach for something in your pocket, thinking that it buzzed when it didn’t. I have intimate access to your moment-to-moment thoughts. Even when you’re not looking at a phone, and you go off and do something, you’re still thinking things that were driven by what was in the phone.

Luigi: Actually I read somewhere that 20 percent of the people check their messages even when they are making love, so that’s-

Tristan Harris: Yeah, exactly. We’re addicted to these things, and everybody is. That’s the amazing thing.

Kate: That can’t possibly be true. Oh my God, that’s so sad.

Tristan Harris: It is pretty sad. It is kind of where we are right now. But one thing is this intimacy. The second thing, beyond intimacy, is social intermediation. Before, I didn’t have direct access to manipulate the way in which you relate to other people, or all the channels by which you communicate with them. In other words, any time Person A communicates with Person B, we’ve introduced Person C, who can manipulate the terms of that relationship. I don’t mean in certain advertisements, I mean like Snapchat.

I give this example all the time; I gave it again today. Snapchat actually uses a technique to manipulate kids called streaks. It shows, between two children, the number of days in a row that they’ve sent a message back and forth. They introduced that, so right there in your contacts list you see your most recent messages, and next to each person’s name is the number of days in a row you’ve exchanged messages with that kid. Now if that kid is your best friend, you’ve got this going for about 150 days. So what they’re doing is they’re socially manipulating the terms of your relationship. The currency of your friendship is, can I keep this streak going. That is a totally new form of manipulation.

Also, for the advertising: the fact that Facebook knows, or would make available to advertisers, the keywords that you are ... they don’t sell this data, by the way, but they’ll let you target on it. When you talk to other friends on Facebook, and you talk about anything, that’s open and available now to manipulators. That’s a totally new thing, and the scale is unprecedented. The last thing, the third thing, so first was intimacy, the second was social manipulation ...

Luigi: Can I, just so I understand: is it the difference between being able to show you a picture, or paint you a room, versus The Truman Show, in which I manipulate your entire life?

Tristan Harris: That’s right, and we actually use that metaphor. We live right now in two billion Truman Shows, two billion individual, personalized, orchestrated experiences, where an algorithm is deciding exactly what we personally will see. Think about it: say you and I both have Facebook accounts, and we have the exact same 200 friends. You would think that if I open up the news feed on both phones, we would see the exact same set of material, because we have the same set of friends. But that’s not how it works. It specifically ranks whatever it wants to show you. There are thousands of things it could show you. It selects from that, and orders them based on what has worked on you specifically in the past, what has engaged you and hooked you. That’s also part of it: this is a new, unprecedented form of manipulation because it’s personalized.

The fourth one is that it’s powered by AI, and it’s self-improving. If you think about how AI works, the fact that this is actually getting better and better over time at doing this, through automation, is a huge new feature of the system.
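A minimal sketch of the personalized ranking Tristan describes: two users with the exact same friends see different feeds, because the same candidate pool is ordered by each user’s own predicted engagement. The scoring function and the affinity fields here are hypothetical illustrations, not Facebook’s actual algorithm.

```python
from dataclasses import dataclass, field


@dataclass
class Post:
    author: str
    topic: str
    text: str


@dataclass
class UserModel:
    # Hypothetical per-user signals learned from past clicks, pauses, and likes.
    topic_affinity: dict[str, float] = field(default_factory=dict)
    author_affinity: dict[str, float] = field(default_factory=dict)


def predicted_engagement(user: UserModel, post: Post) -> float:
    """Score one candidate post for one specific user."""
    return (user.topic_affinity.get(post.topic, 0.0)
            + user.author_affinity.get(post.author, 0.0))


def rank_feed(user: UserModel, candidates: list[Post], k: int = 10) -> list[Post]:
    """Same candidate pool, different order for every user: sort by that user's predicted engagement."""
    return sorted(candidates, key=lambda p: predicted_engagement(user, p), reverse=True)[:k]
```

The “self-improving” part corresponds to updating those affinity signals after every click or pause, so the ranking keeps getting better at engaging that particular person.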

Luigi: Thank you for this explanation; this is very helpful. In terms of moral responsibility, I think that other businesses have moral responsibility, too. Going back to food companies, they certainly don’t care too much about our waistlines, or about our health. What you are trying to suggest is that digital companies should be held to a higher moral standard than the rest.

Tristan Harris: Yeah. I mean, we have a name for this relationship. It’s called a fiduciary relationship, where one party ... if you walk into a lawyer’s office, they know way more about the law than you do, so they could just exploit the hell out of you. And because they have so much asymmetric power over you, and they can exploit you, we don’t just say, if you do get exploited, “Well, that was your fault for going with that lawyer.” It’s like, “No, no, no. This is a party that has asymmetric information,” in the same way that a surgeon has asymmetric power over a patient, or a psychotherapist does. But now think about if you stack those up: doctor-patient, attorney-client. How much asymmetric power do they have in those situations? Now let’s put Facebook right next to them, side by side.

Facebook is like a psychotherapist who knows more about every single secret of your life. They know who you’re clicking on, the old romantic partners you click on at two in the morning when you feel lonely. They know every single word choice you use around political topics. If you use the word immigration, you always use the same adjective next to it. They know which button colors light up your brain. They know things about your mind that you don’t know about your mind. Now they have this asymmetric access to information.

The very first step, we should say, “Whoa, that’s a whole new species of asymmetry. We’ve never seen that one before. That’s huge.” Now, you ask about the business model: who pays that person? In the psychotherapist’s territory, they have asymmetric power over you and you’re revealing all your secrets to them, but you’re paying them, so at least they’re in some relationship with you. Now add to this huge asymmetry the fact that the business model for Facebook is actually selling that personal, intimate information about how your mind works, and everything about you. They sell that to a third party. They don’t sell the data, I mean, they sell access to manipulate you on that intimate ground, to a third party. That is an unprecedented, and dangerous, level of influence.

We were advising Congress when the November 1st hearings on Russia happened, and that is a huge part of this too. We’re not talking about this just because it’s a fun philosophical debate. There are very serious geopolitical consequences emerging from this asymmetry.

Luigi: What Tristan is saying is that Google and Facebook are maximizing our addiction to those gadgets, and it’s almost like the evil empire is conspiring against us. They’re doing it just to maximize profits, but I’m not so sure that this is right, given that they’re really exploiting our weaknesses, our psychological and biological weaknesses.

Kate: I’m sort of playing the devil’s advocate here, because I see your points, and I agree with your and Tristan’s point to some extent. But I do think that it’s important to point out the difference between something that’s addictive and harmful, versus something that’s just purely addictive. If Facebook knows that I like the color pink, and so they show me a bunch of ads that feature pink more prominently, I mean yeah, they’re trying to get me to click on those ads. But how is that necessarily harming me, just because I like the color pink? Maybe I want to see those ads more. It’s not gonna give me lung cancer.

Luigi: That’s true, but in a sense, this is where Tristan’s idea of fiduciary responsibility is important, because they do know much more about you than you know about yourself. While showing me pink is not a problem, if I am somebody who is addicted to alcohol and they keep showing me advertising for booze, I think that’s a problem.

Kate: You’ve spoken a lot about Facebook, but ironically one of the people who has really bought into your ideas of ethical persuasion has been Mark Zuckerberg. He’s used a lot of your language to influence the public image of Facebook, as well as the morale of people working at Facebook. How do you feel about that?

Tristan Harris: You’re totally right. Did someone brief you on that beforehand? You must have some asymmetric access to information-

Kate: Perhaps-

Tristan Harris: ... on me that I don’t know about. How do you know this?

Luigi: You don’t know who she’s clicking on at night.

Tristan Harris: I know. I know, exactly. Yeah, this Time Well Spent thing isn’t just a phrase. It was a concept that we introduced at this TED Talk in 2014. Basically we were saying back in 2014 that there’s this huge problem with a time-spent-maximizing system. We introduced Time Well Spent. Zuckerberg recently co-opted that phrase, and made it the new design goal for the whole company. It was a surprise to me when I heard that call on November 1st, his earnings call, the same day they were testifying before Congress, saying the new design goal was to make time spent on Facebook time well spent.

It has been used exactly to sort of re-moralize or re-engage the employees: we have a new goal, we’re fixing the problem. But it really isn’t authentic, because they’re only interested in making sure that your time on Facebook is time well spent, when the whole premise of the work was how to go beyond the advertising-driven, time-spent-maximizing model overall. Meaning, Facebook can’t be in the business of just enabling you to make choices off the screen: go on hiking trips with your friends, be in nature, take the cooking classes you want, whatever your goals are. They can’t just be in that business, and they could be. Yeah, they have co-opted that phrase, and they have co-opted our concept. They haven’t really given much credit over to us in the process either.

Luigi: They don’t pay a royalty?

Tristan Harris: No. I think honestly this business model is going to come tumbling down. I think that in the future we’ll look back and say, “Why would we have ever given someone this amount of power and access? Why would we have ever allowed them to have a business model which directly incentivizes them to do things that are not in the interest of the people that they’re serving?” Now the question is, what is a new accountability model that does guarantee that they are ... even with psychotherapists, they can still not be in your interest, even though you’re paying them. So how do we guarantee that trust? In nature and biology, the reason that we can trust a parent who controls the food supply of the child is genetics. The fact that the child has the genes of the parent means that a parent is intrinsically motivated. Everything in their biology makes them want to be aligned with the party they’re serving. But we don’t have that in technology.

How could you replicate that? How could you create a system by which something that impacts two billion people treats the agents, not really the agents, the people that it’s serving, with the same level of care and true, genuine, sort of it-can’t-be-any-other-way compassion as we have in biology?

Luigi: Fantastic.

Kate: Tristan, it’s been really great having you.

Tristan Harris: Thank you for having me.