I’m Alan Alda and this is Clear and Vivid, conversations about connecting and communicating.
Jaron: The bad thing is the manipulation. The problem is that these platforms, and I mean specifically things like YouTube, Facebook, and Twitter, can only make money when somebody is paying them out of the belief that they’ll be able to manipulate the users. It’s actually a pretty simple system at its core. It’s one that is not survivable. If it just keeps on running the way it is, it’ll destroy us.
Jaron Lanier was there at the beginning. He helped build the Internet, and he was one of the creators of virtual reality. But now, he thinks something has gone very wrong with our new digital world. Actually, something very, very wrong. And this at a time when we communicate with one another, it seems, more through social media than actual socialization.
Alan: 00:00:00 Jaron, this is really great to be talking to you. You’ve written this provocative and scary book, and the provocative part is right on the cover: Ten Arguments for Deleting Your Social Media Accounts Right Now. The scary part is when you open the book and actually read what your arguments are; they’re truly scary. They’re watching us all the time, and when I say ‘they’ I don’t mean people in these corporations. What I get from your book is that there are algorithms watching everything we do online.
Jaron: 00:00:41 Yeah. It’s the peculiar feature of our times. We have these programs that are gathering data about us relentlessly. They’re created by people who don’t really have the means to fully understand the implications of what they’re doing because everybody’s compartmentalized.
Alan: 00:01:07 So they didn’t start out to be evil about watching us; I think you make that point. Or did they?
Jaron: 00:01:17 I was there when the whole system started up and my assessment of the people I knew is that it was entered into by mistake and innocently. I believe that Google started sweetly and with good intentions, but here’s the thing. There’s a fellow named Sean Parker, who was the first president of Facebook who has said publicly, “Yeah, we knew we were using techniques that were engineered to be addictive that had been meticulously crafted and refined. We knew we were doing all the things.” I knew him at that time. I really don’t think they did know and I think he’s imagining himself in retrospect to be more of an evil Bond villain than he really was. [crosstalk 00:01:58]
Alan: 00:01:59 You quote them in the book as saying, “We have to give them, the user, a shot of dopamine every once in a while,” dopamine being a reward hormone: we feel good when we get an indication that somebody read our posting and liked it, that we were liked by somebody. I presume the point of that is to keep us online. Is that the main point?
Jaron: 00:02:30 If we’re speaking in non-scientific terms, we should say that there’s a lot about the brain that isn’t understood. The dopamine hit is a common term in Silicon Valley for the reward function in behavior modification regimes, but the exact function of the different neurotransmitters is still … there’s still a lot to be understood.
Alan: 00:02:52 So it’s shorthand for a reward.
Jaron: 00:02:55 Yeah. It’s a shorthand, it’s a techy way of talking about some very old science of behaviorism. But the thing is these semi-periodic rewards that are so successful in behaviorism occur in the midst of punishment. That’s the crazy thing.
Alan: 00:03:15 So you get positive and negative feedback at the same time?
Jaron: 00:03:19 Just to recall the history of behaviorism, one of the earliest behaviorists, and probably the first celebrity one, was Pavlov. And one of his famous experiments was first conditioning a dog to get the reward, which was a treat, when a bell rang, and then ringing the bell and seeing that the dog salivated even without the food. What he proved is that you could use a symbol as the reward when you’re doing training or behavior modification.
A lot of addictive technologies use symbols because they don’t have any other choice; gambling, for instance, does exactly that. You might say, “Well, it’s real money.” Well, it is. But online gambling actually is more like playing a video game, and video game addiction is definitely symbolic for the most part.
Alan: 00:04:21 I play words with friends and they just downloaded a whole new version of it that is full of all kinds of childish rewards, badges and things.
Jaron: 00:04:42 Yeah.
Alan: 00:04:43 I had cut a swath through it and turned off so many examples of that. What are they trying to get me to do? Play more often?
Jaron: 00:04:58 Almost anybody who designs a game will incorporate some of the techniques of symbolic addiction creation to get you interested, but that in itself is not necessarily terrible. It becomes bad when it becomes manipulative and damaging. The thing is that in classical behaviorist experiments, the reward has to be punctuated: if the candy is always available, then it’s nothing. It has to be separated by gaps where the candy isn’t available, or the bell, or whatever the symbol for the reward would be.
In fact, one of the most interesting and surprising results is that let’s say you have a dog that’s trying to get a reward. If the reward is not entirely reliable, if it comes late sometimes or sometimes it just doesn’t come, that kind of ambiguity seems to create more fascination within the brain. The brain is trying to understand the pattern and when it’s elusive, the brain latches on more and more. Imperfect feedback actually is more addictive than perfect feedback.
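The intermittent reinforcement Jaron describes can be sketched as a tiny simulation. This is purely an illustrative sketch, not anything from the conversation itself; modeling reward delivery as coin flips, and the particular probability used, are assumptions made here for illustration.

```python
import random

def variable_ratio_rewards(pulls, p, rng):
    """Simulate an unpredictable reward schedule: each 'pull' pays off
    with probability p, and you can never tell which one will."""
    return [rng.random() < p for _ in range(pulls)]

# A perfectly predictable schedule: every pull is rewarded. The brain
# quickly learns the pattern and, per the behaviorists, loses interest.
predictable = [True] * 10

# An unpredictable schedule, like a slot machine payout or a 'like'
# notification: the elusive pattern is what keeps the brain latched on.
unpredictable = variable_ratio_rewards(10, 0.3, random.Random(42))

print(sum(predictable), "of 10 predictable pulls rewarded")
print(sum(unpredictable), "of 10 unpredictable pulls rewarded")
```

The point of the sketch is only the contrast: the second list carries exactly the ambiguity (late or missing rewards) that the behaviorist experiments found most habit-forming.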
Alan: 00:06:08 This sounds exactly like what happens when you’re in front of a slot machine. You don’t … the motion of pulling the lever becomes as important as winning, but you can’t predict when it’s going to come up, and you’re constantly saying things like, “This hasn’t given me a payoff in a thousand tries, it’s due for one now.”
Jaron: 00:06:31 That is the process of psychological addiction right there in one sentence.
OK, so I get how slot machines and video games can become addictive. But I was curious how Jaron took the leap from there to saying that social media like Facebook and Twitter deliberately manipulate me when I use them – along with millions of other users. What techniques do they use? And is it really that bad?
Jaron: 00:14:38 Addictive techniques you can find in gambling and in video games and in just retail designs. There are all kinds of people trying to get you addicted to whatever they do. That’s not necessarily terrible. But here’s where it gets really bad. The bad thing is the manipulation. The problem is that these platforms, and I mean specifically things like YouTube, Facebook, and Twitter, can only make money when somebody is paying them out of the belief that they’ll be able to manipulate the users. That’s their only economic incentive. There’s nothing else.
Alan: 00:15:23 They’re not just making money out of putting ads up there. The company, let’s say Facebook, is selling its ability to change my behavior. Is that right?
Jaron: 00:15:36 There’s a line that’s been crossed. I think this is a clear red line, at least from my point of view. Advertising per se, getting a message out there in the hopes of persuading people: you can find it annoying, you can dislike it, but it’s not that bad, and it’s probably overall helped people adjust to modernity. On the biggest balance sheet, it’s probably even positive. What’s going on now is something that we shouldn’t call advertising, even though the companies do call it advertising, because what we’re doing is taking in data from people in real time, and then algorithms are testing which variations of the experience given back to people will have an effect on a person’s behavior, and then they incrementally adjust the experience a person is given to find some way to adjust that person’s behavior. So you’re locked into this relentless, incremental, algorithmic approach to modifying your behavior in a way that you can’t really be aware of and that nobody understands. There’s no scientist tracking these things to understand exactly why they work. It’s just pure correlation.
Alan: 00:16:49 Is the behavior that they’re changing just the behavior involved in buying a product? Are there other behaviors that they’re selling the ability to change?
Jaron: 00:17:01 Well here is where it gets dark. If this were strictly only about getting you to buy one brand of coffee instead of another brand of coffee, honestly, I would not be worked up about that. However, it’s this huge red carpet, this huge invitation to bad actors to come and do exactly the same thing whether they become official customers or not. A very typical thing that will happen is a bad actor will try to repress a vote in certain areas by getting people cynical, doubtful, just dismissive or too angry to trust anyone. That’s a very common technique that’s been used all over the world.
Another thing that the platforms can be used for is to stir up and amplify ethnic or other divisions to destabilize a society or harm a minority. That’s happened all over the world, especially the developing world and always reliably after these systems show up.
Alan: 00:18:15 You remind me that the UN blamed Facebook for the genocide against the Rohingya.
Jaron: 00:18:24 Yeah. There’s a very similar pattern in parts of rural India and also in parts of Africa, where somebody realizes, “Wow! You can use these platforms really effectively to stir up trouble, and in an irrational way.” If somebody sees it in their interest, then, since it’s inexpensive and readily available, they do it.
Alan: 00:18:49 One of the things I thought was interesting about this, in a personal way: I occasionally plow through some of those privacy alerts they ask you to read and attend to, which nobody does, and I usually don’t either. But as soon as I feel fairly confident that they’re not targeting me personally, that they’re not identifying me by name in the information and data they collect, I feel a little reassured and I click OK. But it’s not okay, now that I’ve read your book, because they’re putting me in a basket with other people who have responded to certain stimuli, who have attributes similar to mine. I don’t have to be targeted personally; I’m still targeted along with a thousand or a hundred thousand other people. Am I on the right track in describing what you say?
Jaron: 00:19:56 Yeah, that’s exactly right. It’s a giant statistical machine, and it’s important to understand that. It’s not so much that each person is specifically targeted, but rather that a person is targeted as part of groups that are detected as being similar, because that’s the only way they can perform enough experiments for the algorithms to adapt. In other words, performing a thousand experiments on you alone, to figure out what would get you to do something differently, could take months. But if they have 10,000 people who appear to be similar in enough ways, they can do them all simultaneously and move much faster.
Algorithms are sampling the results of their little micro-experiments, and a micro-experiment might be something like changing the color in an ad, or just how soon you see one kind of thing after another.
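The micro-experiment loop Jaron describes is essentially a cohort-level A/B test. Here is a minimal sketch of that idea; the cohort size, the variant names, the engagement rates, and the function names are all hypothetical, invented for illustration, and nothing in the code comes from the platforms themselves.

```python
import random

def run_micro_experiment(cohort_size, variants, engages, rng):
    """Show each user in a cohort of 'similar' users a random variant of an
    experience (say, an ad color) and track which variant gets more engagement."""
    hits = {v: 0 for v in variants}
    shown = {v: 0 for v in variants}
    for user in range(cohort_size):
        v = rng.choice(variants)
        shown[v] += 1
        if engages(user, v):
            hits[v] += 1
    # Keep whichever variant correlated with more engagement.
    # Note: pure correlation; nothing here explains *why* it worked.
    return max(variants, key=lambda v: hits[v] / max(shown[v], 1))

rng = random.Random(0)
# Hypothetical engagement model: the "blue" variant nudges behavior
# slightly more often (11% vs. 10%), a deliberately slight effect.
engages = lambda user, variant: rng.random() < (0.11 if variant == "blue" else 0.10)
winner = run_micro_experiment(10_000, ["blue", "red"], engages, rng)
print("variant rolled out to the cohort:", winner)
```

A one-percentage-point edge is invisible to any individual user, which is exactly why the system needs thousands of similar people: only at cohort scale does the slight effect become detectable and exploitable.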
In some ways, these correlative algorithms can notice things about you that you don’t know about yourself. They might notice the onset of a disease that you haven’t been informed of. They might notice all kinds of things. If you go to a session where Silicon Valley companies are selling their abilities, they still make extraordinary claims: “We can tell where a woman is in her cycle and use that to get her to buy things.” That’s a very routine claim, pretty, pretty wild stuff. Another one is that they can detect pregnancies before the people know.
Alan: 00:34:12 That’s real? It’s true?
Jaron: 00:34:15 Well, here’s the thing. All of these things are statistical, so there are incidents that are real. My suspicion is that it’s only barely true. A lot of these effects are slight. For instance, let’s say that the predictive, manipulative ability is just a percent above random. If you have a 1% ability, then you can get the people who are your customers to pay more into you, because they’re afraid that that 1% will start to be leached away from them. And so you gradually start to accumulate more and more power and influence over time, and the society as a whole becomes more chaotic and dark.
Alan: 00:24:51 One of the things I don’t get is, if it’s so evil, why isn’t it outlawed? It sounds like it’s really possible to cheat people out of their money with this. For instance, on this podcast I love it that more and more people are listening. It makes me feel great. If I bought the services of some company that offered fake downloads, downloads by machine where no human was listening but the machine downloaded it and it got registered as a download, then the people who advertise on this would, through their ad buys, really be supporting the Center for Communicating Science at Stony Brook. I’d be like Robin Hood, but I’d nevertheless be robbing the advertisers, by virtue of chances to do this that, as far as I can tell, now exist.
Jaron: 00:26:52 Yeah. That style of fraud is universal at this point and has become the core of our society. Fake people run everything now. Fake people determine what’s popular, who gets elected. I don’t mean that fake people are voting; I mean that fake people create social perception that then affects who gets elected. There’s something to say about this which is slightly geeky, if that’s okay.
Alan: 00:27:21 Sure. But if it’s too geeky I’ll get you to explain it to me.
Jaron: 00:27:24 This is geeky not in a science way or a tech way but actually in a legal and finance way. When Facebook went public, they somehow talked the SEC into accepting these new kinds of metrics for the value of a company, based on just how much people are looking. Then subsequent companies like Twitter that went public were forced to accept the same metrics. What that means is that the companies have a direct cash incentive not to purge themselves of fake people. Whenever they do a big purge of fake people, they lose money. Their value gets dinged, and they have every incentive in the world not to know how many fake people there are, and so nobody actually knows.
Alan: 00:28:36 It sounds like you’re saying they can actually engineer the algorithm so that what’s happening is hidden even from the company itself.
Jaron: 00:28:47 Well, let’s say, for instance, that a company like Facebook was really serious about never having a fake person. They could demand all kinds of authentication. It might be expensive, but then it would be very real, and the type of fraud you talked about wouldn’t happen. In fact, it’s very easy to start fake accounts on any of these. I’ve never had a real account on any of them, and yet there are constantly fake versions of me on Twitter and all the rest of them.
Alan: 00:29:18 Anybody thinks they’re hearing from you on Twitter, they’re not.
Jaron: 00:29:22 They’re not. But the various fake versions of me on Twitter are convincing. I’ve had Microsoft, which sponsors my research, tweet out, “Oh, here’s Jaron’s Twitter account.” Even they were fooled, because the fakes are good and the fakes are created automatically in a way that’s effective. Now, could the companies really detect these fakes and get rid of them? There might be some regulations that suggest they should be doing a better job of getting rid of all the fakes, but the direct monetary reward created by these valuations is definitely pro-fake at this point.
Everything about it, to its core, has been like this from the start, and this … oh god, this has a long history. When we originally did the Internet’s architecture, and I say ‘we’ because I was involved back then, the decision was made to make it very bare-bones, to create room for entrepreneurs to build businesses. That was a humongous mistake. It didn’t have any way of representing people at the start, and that function had to be filled by companies like Facebook, which then turned it into a monopolistic, manipulative, society-destroying thing. It’s horrible.
Alan: 00:55:08 Wasn’t your own personal experience one of suffering at one point from some of this addictive behavior with social media?
Jaron: 00:55:21 Well, there was a really interesting formative experience for me back in the 80s, when we were just starting to experiment with what we call social media today, where people could get together and post on forums for the first time. What was remarkable is that even then, before there was a commercial motive, before this whole manipulation machine, even just in the most bare-bones forum, it did seem to bring out the worst in people. It did seem to bring out what I call the inner troll. It seems to bring out a troll aspect in a lot of people, and I found it in myself, much to my amazement. I have some theories about why it’s there and what it is that I can tell you about, but I think the key point is that it was a horrible thing, and it made not only me but a lot of people swear off the stuff until we could figure out a better design that wouldn’t bring out the worst in people. Companies like Facebook and Google have essentially monetized that monster. Instead of trying to design away from it, they have said, “Okay, this is great. This is a thing we can make money from, and more and more money.”
Alan: 00:56:31 I don’t want to invade your privacy, but what did you go through that you can tell me about?
Jaron: 00:56:41 No, no, no, it wasn’t anything. There’s nothing to keep secret here. I noticed myself becoming a creep, and I hated it. I’m sure this is an experience that’s familiar to a lot of people. You’re in an online forum and suddenly you start saying horrible, vindictive things about somebody else; you try to trap them in some way that humiliates them, or you get into a spat over nothing. I think the first time it happened to me, it might’ve been about brands of pianos or something.
Alan: 00:57:16 There’s something important that you had to say about pianos.
Jaron: 00:57:19 Listen, I care. I am a piano snob.
Alan: 00:57:25 What do you mean?
Jaron: 00:57:27 If you don’t accept that Mason and Hamlins are the best American pianos, we’re going to have a problem. The thing is, if you and I are talking about piano makes now, we would do it civilly and it would be fun, but online, somehow, it can become weirdly cranky and vindictive and nasty.
I have a theory about it. I think people can function, broadly speaking, in one of two frameworks: we can either be lone wolves or wolves in a pack. You could substitute for ‘wolf’ any of a number of other species where individuals can sometimes function socially and sometimes function individually.
The thing is, when you’re functioning as a member of the pack, your whole way of thinking about the world is different. The most important thing in the world to you becomes politics. You’re thinking about who’s competing with me in the pack, who’s above me in the hierarchy, who’s below me. The thing you have in common that bonds you to other people is opposition to other packs, and so politics is reality.
If you’re a lone wolf, you’re responsible for your own food and shelter. You have to assess the world; you become a scientist. Reality becomes dominant instead of politics. To use the fancy word, it’s a totally different epistemology. I think what’s happening is that when you’re thrust into these online groups, where there are these other people and you’re a little disconnected from reality, there’s nothing to gain but mind games. The only thing available is politics. And so this inner pack psyche comes out, and you become hyper-political, and that’s when you become nasty.
So, if all this manipulation is making us weirdly cranky with one another and if it can make us even hate one another – what can we do about it? We’ll find out right after this break…
Jaron seemed to feel that one of the main problems we face on the web is anonymity. So I wondered…
Alan: 00:31:32 Do you think it would be better if we were all personally identified on the Internet, so that you couldn’t have the situation like the New Yorker cartoon where the dog says that on the Internet they don’t know you’re a dog? They’d have to know that you’re Jaron Lanier.
Jaron: 00:31:49 Yeah. That New Yorker dog cartoon was from very early on, when we had just come out of those debates. At the time the Internet was designed, there was this very strong distrust of the government. It came from the left because of the Vietnam War and the pot laws and the draft and all kinds of things; it came from the right because of the speed limits that Jimmy Carter had imposed and the way that people used anonymity on CB radio to evade the speed limits. At any rate, all young people thought the thing to do was to hide from the government, and so the feeling was that the best way to protect liberty and make the world good would be to not have people represented online, so people could be anonymous and slip about.
I think that was a structural mistake, a core mistake. There should be a way for people to be authenticated but with protections where that can’t be abused. That’s actually doable, whereas if you have a world where everything is fake that’s very hard to work with.
Alan: 00:35:18 That really leads me to ask you about the provocative title of the book. Should we do what you suggest in the book and delete our social media accounts right now? If we do, what effect would it have? I doubt all of us would do it especially if we’re addicted.
Jaron: 00:35:40 There’s no way I’m going to get everybody to leave at once, and the reason why is both the addiction and also what we call the network effect, which is that if everybody else is on Facebook, then it feels like leaving Facebook disconnects you from everybody and everything.
Alan: 00:35:57 So we’re stuck, what do we do?
Jaron: 00:37:38 Yeah. My belief … in fact, my certainty, is that companies like Google and Facebook can function to the benefit of society instead of to the detriment of society. What they have to do is change their business model. Right now, anytime there’s any connection between people online, it’s financed by other people who want to manipulate the people who are connecting. That’s a recipe for a society based on manipulation, almost by definition, and it’s almost impossible to get out from under that to something that’s clear and sane.
Alan: 00:38:31 It sounds like you’re saying that these benefits that we believe we’re getting from Facebook, the community with other people, for instance, will be served in the future when things are more positive because we’ll pay for them rather than people paying for them who simply want to manipulate us.
Alan: 00:40:46 Does that mean that as that multiplies there’s an unseen, presently unpredicted problem that those with more money will have more access to information and the sense of community that you’d get from being on Facebook? Will we excommunicate those with not enough money to keep up with the rest of us?
Jaron: 00:41:15 Right. Well, to some degree, it’s hard to know what the overall effects of a societal design will be until you actually try it, which is unfortunate. The other piece, maybe even more important, is that I think people should be paid for their contributions.
Jaron: 00:40:26 If somebody adds value to Facebook by adding a great post that people like, they should get paid. If somebody uploads a video to YouTube that increases the value of YouTube, that person should get paid and not just because YouTube handpicks a few people, like a communist central planner, but just as a matter of course, people who add value should get paid.
So the plan I’m talking about wouldn’t be utopia and it wouldn’t be perfect, but it ought to create a broader economy and ought to support more people. It ought to create a broader distribution of dignity and self-determination, and yeah, it would probably leave some people out and we’d have to adjust for that. The problem is I don’t know a perfect plan. I don’t think there’s such a thing as a perfect societal design or a perfect economy.
Alan: 00:42:40 Every new advance will probably always give us, I would imagine, things to deal with that we don’t know how to deal with until we realize how complicated it is, and we’ve never had anything as complicated as the algorithms that run the Internet. Would you say?
Jaron: 00:43:33 That’s probably true, although complexity is a bit in the eye of the beholder and the manipulation system I’ve described, once you see it in those terms actually becomes rather simple.
Alan: 00:44:11 Simple in what way?
Jaron: 00:44:13 Just the motives and the way it works is all very clear. Third parties come along in the hopes of gaining power by manipulating people over a universal information system that’s designed to addict and manipulate them via algorithms and money concentrates as a result. It’s actually a pretty simple system at its core. It’s one that is not survivable. If it just keeps on running the way it is, it’ll destroy us.
Alan: You say in the book, charmingly, that we should all be more like cats than dogs. What did you mean by that?
Jaron: 00:12:20 Believe me, I had to work this through very carefully with my dog-lover friends. Let us admit for a second that cats are different because they’re only semi-domesticated. You could put a cat out in the world, in the wild, and it would survive. The cat will figure out how to manage. There are strong arguments that cats domesticated themselves. It wasn’t a matter of a human going out and finding a cat and saying this would be a useful animal; they just showed up, and they’re here on their own terms.
That gives them an independence. They’re not as reliably trainable. If you look at the history of behavioral experiments, let’s just say there are a lot more with dogs than with cats, because dogs actually respond to this stuff and cats don’t. When people do try to train cats, the wonderful thing is that they’re not reliably trained. They can learn the tricks, but whether they do the trick is a little unpredictable.
I think that quality of integrating into a high-tech society while still being yourself, still being independent, is the thing that’s so charming about cats and why they’re so popular online. My theory is that this obsession with cats in videos and whatnot online is people longing for their own receding independence. It’s a longing for us not to lose our cat natures.
Alan: 00:44:44 You talk about, in the book, in your 10th argument, the threat to us spiritually, which is different in severe ways from the threat to privacy or our own decision-making. I get the impression you feel that’s the most important argument against what we’ve been talking about.
Jaron: 00:45:11 The spirituality argument is maybe the closest to my heart, because I feel that what’s happened in the tech culture, around this gigantic power concentration that’s come about since we’ve run these networks, is essentially a new religion, and it’s a medieval religion. It’s a religion in which you believe that the world’s going to end, and that those who are true believers, who can get on the inside of the right circle, will gain immortality, and that those who aren’t will die.
Alan: 00:46:55 Let me try to simplify, so I understand it. Are you saying that we’ve created a god-like entity, that we expect will save us from our own depravity?
Jaron: 00:47:23 I think it’s worse than that.
Alan: 00:47:27 That’s pretty bad already.
Jaron: 00:47:32 If you talk to a lot of the key people, my friends in the tech world, like the folks at Google, for instance, they’ll say the reason they’re gathering all this data is that ultimately what Google is, is not an advertising empire but an AI empire, and that what it’s doing is building the super creature, the super AI that will inherit the earth… and then either the true believers will be able to upload themselves into the giant computer in the sky and live forever, or, though some believe that mankind will perish, it’ll be for the better because there’ll be a higher form of life that we’ve brought about. It’s participating in that machine that really is the ritual of this new religion. It’s really something new and distinct. It’s a new peril.
Alan: 00:59:52 Divorcing yourself from social media, never, never having engaged with them or married them, you have given yourself the chance to turn to one of your loves, which is music, which is soul-stirring instead of soul-stealing. I notice in my personal life, if I need to downgrade the tension I feel, I’ll sometimes play a card game on my iPad. My wife, on the other hand, will go to the piano and make music. For somebody like you, with 1,500 instruments in your house, it sounds to me like you gravitate toward music for solace in a way that is much healthier than those of us who go to the card games on the screen.
Jaron: 01:00:53 Healthy? You’d have to talk to my wife and daughter about that, because I think five instruments is healthy, maybe 50. 1,500? I think there should at least be a question asked.
Alan: 01:01:07 What are they? What range? They’re not all pianos, right? You don’t have that big a house.
Jaron: 01:01:12 No. They’re not all pianos. I just go through these different periods of becoming fascinated with some instrument that I haven’t played, wanting to learn all about it and learn to play it, and traveling to wherever it’s from and meeting the people who are involved in it. I’ve been doing that now for decades, and after a while you do end up with a lot of instruments.
Alan: 01:01:34 And then you have ouds. How many ouds do you think you have?
Jaron: 01:01:38 Listen. If you know how many ouds you have, you don’t have enough ouds. The first rule of the oud.
Alan: 01:01:53 Do you have Chinese string instruments, anything like that? Are they all weird? Are they all unfamiliar to Americans?
Jaron: 01:02:00 No, not all of them. I love a lot of traditional American instruments too. So, my current obsession … I have a few current obsessions; one of them is pedal steel, which is-
Alan: 01:02:58 What does that mean?
Jaron: 01:02:59 A pedal steel. Have you ever noticed in country music, like older country music, there’ll often be this horizontal device with somebody seated at it and a bunch of pedals, and it plays this incredibly luscious, angelic, continuous sound that’s just wafting through the music?
Alan: 01:04:18 Great, great. Have you ever seen or played, or do you own, a Franklin glass organ?
Jaron: 01:04:24 Yeah, those are called glass armonicas.
Alan: 01:04:27 That’s what I meant, yeah.
Jaron: 01:04:28 Yeah. I do play that, and it’s an exceptionally wonderful instrument that’s purely American. Oh, the stories about that. Benjamin Franklin was in Paris and saw someone playing music on wineglasses by moving their fingers around the rims, and had this idea for an invention where he could turn the glasses on their side and nest them inside each other on a rod, so that you could play it like a keyboard, with the glasses rotating against your fingers. He made this thing and it sounded angelic and stunning, something completely new in the world. It had remarkable effects on the world. One is that there was this crazy guy named Mesmer, who decided to use its angelic sounds as an early form of hypnosis; that’s where ‘mesmerize’ comes from.
Alan: 01:05:20 Yeah.
Jaron: 01:05:21 He also-
Alan: 01:05:24 I didn’t know he used the Franklin invention for that.
Jaron: 01:05:28 Yeah, that was his device of hypnosis.
Alan: 01:05:29 I think history crisscrosses in so many interesting ways.
Our conversations on this podcast often touch on empathy, because it seems to me that’s central to good communication and to good relations with other people. I was interested to see in your book that you feel these platforms we’re talking about inhibit the ability of people to have empathy for one another. Am I right about that?
Jaron: 01:10:43 Yeah, I did make that claim.
Alan: 01:10:46 How did it happen?
Jaron: 01:10:48 The problem is that when everybody is being given different experience feeds, and those feeds are calculated to certain ends or to manipulate them, then it follows almost by definition that people will have fewer common experiences with each other.
Alan: 01:11:10 We don’t hear from anybody who doesn’t share our point of view so we don’t have the opportunity to take on the point of view of another person, which is one of the functions of empathy.
Jaron: 01:11:20 We don’t know what the other people have experienced. We haven’t been in a common environment and perceived it differently; we’ve been in different environments that are invisible to each other. That circumstance makes it exceptionally hard to gain a sense of sympathy or empathy for anyone else. I should say there’s an interesting history to the word empathy, especially in this context, because it was originally invented by psychologists about a century ago, almost in anticipation of virtual reality. The original meaning of the term was that you could imagine yourself in any place in the universe. You could imagine what it would be like to be a mountain or a leaf.
Then, in the 80s when we were starting virtual reality, I started to use it as a suggestion for the betterment of humankind, that using virtual reality maybe we could project ourselves into the shoes of others to get more of a sense of what their experience was, to understand where they were coming from.
Alan: 01:13:06 There are artistic examples in our culture that go back a long time, like plays and stories, where we are introduced to the experiences of other people and are invited to see the world through their eyes. It sounds like that’s what you had as a hope for virtual reality. Is it still possible?
Jaron: 01:13:34 Well, if you had interviewed me in the 80s, you would’ve heard me, I think, speak quite persuasively and passionately about the potential for virtual reality to foster empathy. I still think it’s possible, but any possibility of that kind is currently overwhelmed by the horrible incentives where sneaky manipulation is the only thing that makes money. In that context, it’s very, very hard to have anything like genuine empathy appear.
Alan: 01:19:51 You’re making me think twice about the tweet I’m about to write after we finish our conversation about the latest podcast we just posted yesterday.
Jaron: 01:20:02 I’ll tell you what, Alan. Instead of just tweeting, start to develop some other methods, and have a goal of not having to tweet in five years or something. Like if you have the pot … as I point out in the book, the podcast is one of the least corrupted media forms on the Internet so far. It could be destroyed.
Alan: 01:20:33 Of all the podcasts, this is the least corrupt.
Jaron: 01:20:37 Of course, of course. You can reach people through neutral email systems, all kinds of ways that aren’t part of the manipulation engine. Maybe if you start to work towards that, it would be of some small benefit.
Alan: 01:20:50 Okay, we’ll start researching that. I so much appreciate your bringing us these ideas: very stimulating, very provocative, and very scary. Thank you, thank you so much, Jaron. Before we end, we always ask our guests if it’s okay to answer seven quick questions with seven quick answers. Are you up for that?
Jaron: 01:06:04 I’ll give it a try.
Alan: 01:06:06 All right, it’s not embarrassing.
Jaron: 01:06:07 It’s the quick answers part that’s going to be the challenge.
Alan: 01:06:13 Okay, here’s the first one. What do you wish you really understood?
Jaron: 01:06:23 Right now, I wish I understood space time. I’m really going crazy trying to understand gravitation and I have a bunch of friends also interested in it.
Alan: 01:06:35 What do you wish other people understood about you?
Jaron: 01:06:47 I feel pretty well-understood actually. I don’t think I’ve created too much of an artifice of myself. Yeah.
Alan: 01:06:54 Okay, what’s the strangest question anyone has ever asked you?
Jaron: 01:07:04 That’s it.
Alan: 01:07:08 Okay. How do you stop a compulsive talker?
Jaron: 01:07:16 I’ve occasionally had that flaw in my own character. Fireworks.
Alan: 01:07:23 Fireworks. Carry a firecracker around, a Roman candle you light at the oddest moment.
Jaron: 01:07:30 Pull the plug, I don’t know.
Alan: 01:07:32 Okay. Is there anyone for whom you just can’t feel empathy?
Jaron: 01:07:41 Yeah, this is a poor question to ask for a quick answer. There are horrible people in the world. There are people who are appalling, who we just have to exclude from our empathy, Nazis and so on. The problem is they’re getting to be numerous enough that if we exclude them, we risk narrowing the world to the point where we can’t do anything with it, so it’s hard, it’s really hard. This is what I keep coming back to: our technologies are bringing out the worst in people. I think if we can try to not bring out the worst in people, at least we can make that question less difficult to answer.
Alan: 01:08:27 How do you like to deliver bad news? In person, on the phone, or by carrier pigeon?
Jaron: 01:08:46 As I think about an answer, I have to say I feel stunningly fortunate not to have been tested in that way as much as many people I know, so far in my life, although of course that won’t last forever.
Alan: 01:09:21 Sounds good and lucky to me. Last question, what if anything would make you end a friendship?
Jaron: 01:09:30 Yeah, this happens rarely, but it has happened. It’s usually a loss of trust, a feeling that trust can never be regained. At that point, you become stuck. It’s a horrible thing to come to that spot, but I’ve experienced coming to it.
Alan: 01:09:52 This has been a really fun, interesting conversation with you. Thank you so much for going to the trouble and [crosstalk 01:10:00].
Jaron: 01:10:00 Thank you. Really, it’s been great fun.
We recorded my conversation with Jaron last fall, and I was curious to find out if he thought any of the bad publicity Facebook in particular has received since then has given him reason to hope things might be changing for the better. For instance, Facebook claims it has removed millions of fake accounts. So we called him up and asked for an update.
This has been Clear + Vivid, at least I hope so.
My thanks to the sponsors of this episode. All the income from the ads you hear goes to the Center for Communicating Science at Stony Brook University. Just by listening to this podcast, you’re contributing to the better communication of science. So, thank you.
I haven’t deleted my Facebook account, yet. But … I have to say, this interview with Jaron has made me rethink everything about social media.
Jaron is one of the most fascinating intellects of our time and he’s an enlightened, engaged person who clearly has a handle on the ethical ramifications of modern technology – for better and for worse.
You can learn more about Jaron on his web site: www.jaronlanier.com. And, as you might imagine, there are no links to a Twitter or Facebook account. But there are links to his many fascinating books about technology and AI, including his latest one, “Ten Arguments for Deleting Your Social Media Accounts Right Now.”
This episode was produced by Graham Chedd with help from our associate producer, Sarah Chase. Our sound engineer is Dan Dzula, our Tech Guru is Allison Coston, our publicist is Sarah Hill.
You can subscribe to our podcast for free at Apple Podcasts, Stitcher or wherever you listen.
For more details about Clear + Vivid, and to sign up for my newsletter, please visit alanalda.com.
You can also find us on Facebook and Instagram at “Clear and Vivid” and I’m on Twitter @alanalda.
Thanks for listening. Bye bye!
Next in our series of conversations I talk with Christian Picciolini…
Christian: I can sit across the table from a neo-Nazi, whether he’s wearing khakis and a polo, or has swastika tattoos on his face, and I can let all the ideological talk just kind of fly by me. Sometimes it means sitting with my fists in a ball under the table and being really angry internally. But what I do is I introduce them to the people that they think that they hate. I can tell you that every single time I’ve done that, I’ve never had a bad experience, and everybody’s always walked away different.
At a time when there’s an awful lot of hate in the air, Christian Picciolini has helped hundreds of neo-Nazis turn away from bigotry and violence. He knows them well, because he was once a leading neo-Nazi organizer. Listen to our fascinating conversation next time on Clear + Vivid.