Sherry Turkle on How We’re Losing Touch With One Another and What We Can Do About It

I’m Alan Alda and this is Clear and Vivid, conversations about connecting and communicating.
Sherry: What I’m interested in and what the computer can never do, the robot can never do, these empathy machines that are being built can never do, is give you the feeling that I’m here for the duration to keep trying because I’ve had human experiences too and loss and pain and I’m gonna try to relate mine to yours. I’m not anti-technology, I’m pro conversation.
Sherry Turkle knows her stuff when it comes to conversation. She has a joint doctorate in sociology and personality psychology from Harvard University. She’s an expert on culture and therapy, mobile technology, social networking, and sociable robotics. She’s an advocate for more and better conversation. And she’s an awfully good conversationalist herself.
Alan: 00:00:00 This is so great to be talking to you, because you know that I talk about all the stuff that you spent decades studying. So I want to hear what goes into all these things. I’ll tell you what, the main reason that I’m so happy to be talking with you is that we never call these interviews. We call them conversations. And you’re the queen of conversation right now in our culture.
Sherry: 00:00:25 I’m very excited to hear you say that.
Alan: 00:00:28 Well, you really stand up for conversation. You show us, through the ways you’ve studied the problem, how important conversation is. We’re surrounded by devices that hold back conversation.
Sherry: 00:00:43 I think so. I think that’s the biggest danger of these devices at the end of the day.
Alan: 00:00:48 I forgot my iPhone today on my way here. I had a moment of panic. I had to choose between being late or being on time without the iPhone. I almost chose being late to get the iPhone.
Sherry: 00:01:03 I know. I’ve had that experience too where I’m going some place where I know I don’t need the iPhone and I don’t have it and then all of a sudden, I need to have it unless I go, “Why do you need to have it? You need to have it because you’re used to having it.” But in fact, I think moments like that are good because they force us to realize that you can do without it and to also accept our dependence on something that we really don’t need and we do better if we-
Alan: 00:01:38 But we feel like we’ve come to need it. I describe it as a moment of panic and it was a moment of panic. It’s like, I have so much of my life in that and I almost never make a phone call. It’s ironic that it’s called a phone.
Sherry: 00:01:56 Yes. Well, I see this as an arc. An arc that’s moving very quickly. It’s very early days with this technology and at first, there’s the infatuation. Everybody has to have it. We’re at this point where we need it, everything’s on it, we can’t do anything without it. You see that in Silicon Valley: people hire nannies to watch their children to make sure they have zero time on the phone, zero time on the screen.
Alan: 00:01:56 And the nannies are supposed to be there without their phones.
Sherry: 00:02:40 And the nannies can’t have their phones. So the most sophisticated people are now on a zero tolerance, treating it as though it’s crack cocaine, it’s a drug, it’s the worst thing in the world.
Alan: 00:02:50 The very people who are bringing us this technology.
Sherry: 00:04:00 Yes. Well, that’s why I think it’s an arc: first comes the moment of infatuation, then comes the moment when it’s the biggest danger in the world, and the place we’re at now is to realize that this is something we just need to get control over. Because, in fact, to give it to children as the go-to object, so that they don’t develop the capacity for boredom but instead run to their phones, that is a real danger. That’s not a made-up danger. That’s not a fake thing to be nervous about.
Alan: 01:12:37 One of the things that I learned talking with scientists on the television show, Scientific American Frontiers, was this idea of the default state of the brain. Apparently, boredom is very useful because we tend to go into the default state where things … You can describe this better than I can, I’m sure, but the idea is that creative impulses thrive in this default state where we think we’re doing nothing, we think we’re thinking of nothing, but the brain is roaming and roving and it seems to be a very important part of being alive and coming up with solutions or coming up with new ways of thinking and behaving.
Sherry: 01:13:38 Actually, the default mode network does more than that. It’s the time, it’s the process during which the brain lays down a kind of stable sense of the autobiographical self.
Alan: 01:13:52 What did you mean by that?
Sherry: 01:13:55 Who was I, who am I now, what’s my future, kind of lays down …
Alan: 01:14:02 So this is conscious or unconscious?
Sherry: 01:14:05 Unconscious. In other words, that default work doesn’t just include creativity. Part of that creativity is also the creativity of your selfhood.
Alan: 01:14:19 Right. So in a way, getting in touch with yourself.
Sherry: 01:14:22 Yes.
Alan: 01:14:58 And if we reach for the phone when we sense, “Oh, here comes boredom,” we’re denying ourselves apparently this very valuable, age old human function that the brain wants to exercise in those moments that seem like nothing’s happening.
Sherry: 00:05:09 Absolutely. And to be able to put down the phone so that you can have a genuine moment of solitude, where you learn to look into yourself and get to know yourself. There’s a wonderful expression in the psychoanalytic literature: if you don’t teach your children to be alone, they’ll only know how to be lonely. In other words, you need to have that capacity to be with yourself in order to look to another person as a separate person, as somebody different that you can have a relationship with.
Alan: 00:06:05 Yeah. We often hear you have to love yourself. But first you have to find yourself interesting.
Sherry: 00:06:12 Exactly.
Alan: 00:06:12 And you don’t find yourself interesting unless you spend time with yourself.
Sherry: 00:06:16 Exactly.
Alan: 00:06:17 But what about the idea that we’re addicted to these devices? Are we addicted in your opinion or not? What about my moment of panic that I left my phone at home?
Sherry: 00:06:50 You know what? Did you leave your phone at home?
Alan: 00:06:53 I don’t have it.
Sherry: 00:06:54 Well, good for you. You don’t look so bad to me. You’re looking pretty good. You’re talking to me. If this is withdrawal … And there we’re answering the question.
Alan: 00:07:07 I’ll be okay as long as you ding every few seconds.
MUSIC BRIDGE
Alan: 00:09:10 I remember so many wonderful conversations with my grandkids when they were little in the backseat of the car. Both of us are talking into the air, but we’re making contact in an unusual way that we wouldn’t if we were just sitting in a living room together. There was a freedom to free associate, to talk about what’s out the window, and things would come up that were surprising and fun. This was before everybody had a phone to play games on.
Sherry: 00:09:48 And there’s that sense of shared space. I think that’s one of the things that’s come out so poignantly in my research: when you go to your phone, you’re basically saying, “I’m leaving the shared space.” When you asked me, before we began the interview, if I wanted to put on headphones, and I said, “What’s the difference?”, you said that if you put on the headphones, you have a little more of a sense of being in the bubble of the conversation with me. And I thought that was kind of interesting.
Alan: 00:10:20 Is it true? Does it happen that way?
Sherry: 00:10:21 Yeah, I think so. When you take out a phone, you aggressively leave the common space of the people you’re with. So there’s a wonderful experiment, this was a turnaround experiment for me, where if you’re at a table, could be a kitchen table, could be a lunch table, a business meeting, and you put down your phone, turned off and face down on the table, two things happen to the conversation. It becomes more trivial in its content, and the empathic communication between the two people becomes less. Even if the phone is turned off and turned face down.
Alan: 00:11:03 That’s sort of amazing. Just the presence of the phone is some kind of symbol to your unconscious.
Sherry: 00:11:13 Of all the elsewheres you could be, if you’re out of the bubble of presence to the [crosstalk 00:11:15].
Alan: 00:11:15 The opposite of what we have now with these headphones. We’re tuned into each other. But the other thing, empathy. Really, we talk a lot about empathy in our conversations on this show and I’ve heard you say some really amazing things about empathy.
Sherry: 00:12:29 There was this study over a 30-year period that included the years of the introduction of phones. And in that time, there was a 40% decline in standard measures of empathy. And most of the decline in empathy took place in the final 10 years, which were the years of the introduction of the technology.
And the principal researcher was so upset by this. I had a conversation with her, a great researcher named Sara Konrath. And she immediately set herself the task of writing empathy apps for the iPhone.
Alan: 00:14:08 Can you do it? That’s the question. Can you increase someone’s empathy with an app?
Sherry: 00:14:12 And sort of almost philosophically, if you feel like the problem is the kids are looking at screens.
Alan: 00:14:19 Yeah. And losing empathy, can you fix that with an empathy … It’s a good question.
Sherry: 00:14:25 It’s a good question and I think this really is kind of the place that we’re at and the conversation we need to have because I teach at MIT where the philosophical position is that if there’s a problem that technology has created, technology will find the solution. Let’s look to technology to find the solution.
Alan: 00:14:52 So far though, I agree with what I think I heard you say, which is that the ultimate empathy app is another person.
Sherry: 00:15:00 Yes. My belief is that we are the empathy app. Talking like this, this is the empathy app.
MUSIC BED

OK, so maybe apps are not so good for teaching empathy. But I asked Sherry what if we are confronted with a much more sophisticated thing, like a social robot – something that seems to have empathy…
Sherry: 00:30:46 Well, this is something I feel strongly about, and I think the burden of proof here is on the side of the people who want us to be intimate with objects that have no empathy, these pretend empathy machines. So it’s a little bit like where we were with marketing iPhones, where the people who were building the phones knew that they were creating devices that would be addictive, or building things into the devices that would be addictive, and did it anyway, knowing that this really would not be good for people.
Alan: 00:31:49 It was business. It’s why Coca-Cola is said to have put cocaine in its early products.
Sherry: 00:31:56 Exactly. So then there was a cycle and it was found out and you had to walk it back. So now there’s a cycle and the tech companies are trying to walk it back a little. But instead of going through that cycle with this notion that we’re going to have dolls for children that pretend that they are real people, that have real emotions and real empathy to offer, and that have pretend relationships with children, why not show a little common sense from the start and ask the question: what is the universe in which it is good for a child to have a pretend empathic relationship with an object?
Alan: 00:32:43 Well, don’t they anyway?
Sherry: 00:32:44 No, they don’t. It’s not the same as having a relationship with a doll. When a child has a relationship with a doll that they see as an imaginary friend, the object is inert. The doll is inert and the child projects onto the doll their own needs, their own fantasies.
Alan: 00:33:09 Right. So you don’t get fake feedback.
Sherry: 00:33:11 Exactly. So a little girl who has just broken her mother’s crystal will put her Barbies in detention. When she punishes the Barbies, she’s working through her own feelings of guilt and wanting punishment, punishing the Barbies and talking about why she did it. And in fact, this is the basis of child psychotherapy. Therapy with children is having them play with dollhouses and dolls and letting their imaginations play.
That’s very different from a robot Barbie that comes out of the box and says to the child, “Hi, I’m Barbie. I have a mom and a sister and I am a little mad at my sister and my mom. Are you mad at your mom? Are you mad at your sister?”
Alan: 00:33:11 Well, what does that do? How do we know that’s bad?
Sherry: 00:34:03 Well, first of all, it puts the child into a play space of what I’m calling an “As If” relationship. It’s never good for any of us to be in relationships that are simulations, in a sense, of real relationships, where there is no real emotion and real feeling and real connection happening, because there is no real mother, there is no real sister.
Alan: 00:34:43 Sounds a little like emotional pornography. Stands in for something that you can’t do.
Sherry: 00:34:52 You can’t go any place with it. So if a child says, “Yeah, my mom doesn’t understand, duh, duh, duh,” the barbie has nothing to offer since the barbie has no … So then the child is in this halfway zone between nothing and nothing. And to what purpose? To sell a doll?
Alan: 00:35:17 Yeah. Well, to sell the doll without checking into whether it’s destructive, before you have a bad effect on millions of children, is not a good idea.
Sherry: 00:35:32 And then what are you going to do when the child says to the barbie, “You know, actually, I’m mad at my father because he touched me in a bad place.”
Sherry: 00:35:52 And then the child feels as though they’ve said something important and that needs a response. The child has shared something that desperately needs a response. [crosstalk 00:36:02]
Alan: 00:36:01 And Barbie’s not yet ready to …
Sherry: 00:36:04 Barbie’s just this robot. This is how I spend my days, playing through this scenario and trying to make it come out in a happy place, and I cannot make this come out in a happy place.
Alan: 00:36:27 No. Well, I can imagine, on the one hand thinking about this a lot is exciting, and on the other hand, depressing because you’re talking to the ocean to a certain extent. It’s a vast body of water that’s been set in motion by industries worth billions and billions.
Sherry: 00:36:47 I don’t feel that way because look at how quickly we’ve gone from everybody needs to have a phone, every eight year old needs a phone, every five year old needs a phone to phones are bad for kids.
Alan: 00:37:01 Yeah.
Sherry: 00:37:02 I think that actually, what you need to have in this business is tenacity and the right question, and I think the right question is: what human purpose does this serve? What is the human end game? If you have that question, you just keep asking it: love the technology, you’re a genius, it’s very smart, it’s very clever, but what is the human end game?
Sherry: 00:23:42 I’m thinking of a demo at MIT many years ago where they were showing how great this new Internet of Things was. The demo showed how, using the Internet of Things, you could order a coffee, like a cappuccino, a frappuccino, really a complicated coffee, and on your phone you could get to the Starbucks, let’s say, without passing your ex-spouse, ex-girlfriend, anybody you’ve ever had a fight with, because your phone would know who you’re trying to avoid. It could map a route through Cambridge to get you to the Starbucks.
And it occurred to me that the technological culture I was living in had this value system where it was just obvious that vulnerability and conflict, emotional conflict, were something that, definitionally, you were trying not to have.
Alan: 00:23:42 Not experience. Yeah.
Sherry: 00:24:52 Not experience. And that’s what technology was for. It was to create this friction-free life. And the expression that was used was that the Internet of Things would give you a friction-free emotional life.
Alan: 00:25:08 Yeah. A friction-free life means, for one thing, that as soon as you experience some friction, you’re devastated by it [crosstalk 00:25:15] and you don’t get the other advantages of friction.
Sherry: 00:25:22 Yes. But I was very struck by how the technology’s values reveal themselves in these ways when they present you with a product. Everybody was saying, “Oh my God, it’s so cool. It’s like Harry Potter’s invisibility cloak. I can get to where I want to go and I never have to have a vibe that’s negative.” And then I thought, “Well, what is this model of life where you never have to have a negative thought, you’re never going to have to confront a person, you never have to learn how to say I’m sorry, you never have to learn how to apologize, you never have to …”
Alan: 00:25:19 And then meanwhile, your refrigerator is spying on you.
I tried desperately to think of some kind of an app that would allow us to connect better with one another. And Sherry sort of liked one… sort of… when we come back from this break.
MIDROLL
This is Clear + Vivid, and now back to my conversation with Sherry Turkle.

Alan: 00:37:38 There are companies, I think, trying not so much to engender empathy, to build up your empathy, but to point you toward a social interaction. Does that sound like a good idea to you?
Sherry: 00:37:56 No. Because there’s no social interaction there. Saying, “Oh, it’s not an empathic interaction, it’s just a social interaction,” that’s not how human beings work. When human beings have social interactions that aren’t empathic interactions, we say they’re autistic. When people come in and go through robotic moments of social interaction without making eye contact, without feeling something for the other person, we say that there’s something wrong with them. So is that what we want to train? We want to train people who can have social interactions without there being human connection? We can’t.
Alan: 00:38:39 If you have it … I guess all I’m trying to say is, if there’s an app of some kind whose sole purpose is not to get you to use the app more but to actually get you to relate to another person in a human way, that sounds like it might be a good idea, a way to help wean us off the devices.
Sherry: 00:39:02 So the point of the app is to wean you off the app?
Alan: 00:39:07 Maybe.
Sherry: 00:39:08 Well, I’d have to see it. I’ve seen some apps that are interesting. Not every app that has a “social interaction” with a human being makes me want to run screaming from the room. For example, there are apps that run people through simulations of job interviews. I think those are very interesting.
Alan: 00:39:08 That does sound interesting.
Sherry: 00:39:38 That’s interesting. So you’ve had a job interview 50 times before you actually sit down for the job interview. Or situations, particularly for people who haven’t had a lot of social interactions, where really it’s just the experience of … And they correct you. Just the experience of what’s expected: the hello, what’s expected in this, what’s expected in that. Social interactions that I would say are fairly stereotypical, not the interactions where we’re looking for complex connection or deep connection, but fairly stereotypical ones where you’re encouraged to share things about your work and your resume and do a self-presentation.
Alan: 00:41:00 However, the more we talk about it, the more I am reminded of the fact that saying the right things is often not nearly as important as how you present yourself and your regard toward the other person.
Sherry: 00:41:20 Absolutely. What I’m interested in, and what the computer can never do, the robot can never do, the empathy app can never do, these empathy machines that are being built can never do, is give you the feeling that I’m here for the duration to keep trying, because I’ve had human experiences too, and loss and pain, and I’ve gone through illnesses and mothers and fathers. I’ve gone through some things and I’m gonna try to relate mine to yours.
Alan: 00:48:00 In addition to that, it seems to me empathy happens when I have a feeling for what’s going on inside you, when I’m not just listening to the words you say. I’m getting under that. And partly I get under that by looking you in the eye. I see these subtle changes in your face as you listen to me. And I have a view of you that’s much more from your point of view than it was before I started looking so carefully.
Sherry: 00:49:47 It’s best done in person, it’s best done in conversation, it’s best done by looking … I think it’s best done in the presence of the body. I think that empathy is something that works best, again, when we have our bodies in the game.
MUSIC BRIDGE
Alan: 00:50:20 So what about connecting with each other on FaceTime or Skype?
Sherry: 00:50:26 It’s great if you put it in its place. I’m not anti-technology. I like to say I’m not anti-technology, I’m pro conversation.
Alan: 00:50:35 Right. But can we have a conversation on the screen that’s as valuable as a conversation where we’re body to body in the same room?
Sherry: 00:50:46 No. It’s not the same, but it doesn’t mean that it’s bad. I think we have to get out of this … At some point, we got into this: is it as good, is it better, is it 90%? It doesn’t have to … We don’t have to judge it. We have all of it.
Sherry: 00:51:18 It’s just a different experience.
Alan: 00:51:24 You talk about these digital experiences being considered good enough. What’s an example of a good enough experience?
Sherry: 00:51:33 My favorite one is I did a study of people and their robot dogs.
Alan: 00:51:46 I don’t know much about robot dogs. How do they work?
Sherry: 00:51:49 Well, they’re programmed so that first they’re a puppy, and then you teach them and they learn, and they’re really sophisticated artificial intelligences. So you can teach them tricks and they learn to recognize you. They go from puppified to dogified; then they’re more mature dogs, they know a lot of tricks, they recognize you as the owner.
Alan: 00:52:12 But they don’t require you to train them not to pee on the rug?
Sherry: 00:52:16 No, but they pretend to pee. They do all kinds of stuff.
So I did this study of people in nursing homes with robot dogs, and of families buying robot dogs for their grandparents. Anyway, it began with people saying things like, “Well, I’m gonna get my grandmother a robot dog because it’s better than nothing. She can’t have a real dog because she’s allergic.”
Alan: 00:52:46 Yeah.
Sherry: 00:52:46 Then it was, “Well, it’s better than something in some ways because it doesn’t pee and she doesn’t have to take it walking, she doesn’t have to go outside with it.” And then all of a sudden, it’s better than anything because it will never die and make her sad.
The first time I saw this pattern, I was like, “Hold on a second.” Then all of a sudden, I realized that this in fact is a pattern that we have with technology. It starts off with the argument, “Listen, this robot dog, it’s just better than nothing. This robot babysitter is better than nothing. This robot that will raise your child is better than nothing.” Then all of a sudden, it becomes better than anything that a human being can provide. And that’s where I think it’s very important to recognize this tendency we have to glorify the digital, to glorify the technology, until the technology becomes better than anything we can provide for each other.
Alan: 00:54:01 Is that ever going to be possible? The question of whether machines can ever feel has been debated since machines first started to learn. But that doesn’t sound to me like the important question. The important question is: can we actually relate to a machine as though it were feeling, in a way that didn’t hurt us but helped us?
Sherry: 00:55:18 I think it’s more a psychoanalytic question for us. Why are we so stuck on this? Why, in this very dark time for the human race, when we’re showing such an inability to care for the earth, such an inability to care for each other and be kind to each other, why can’t we focus on our human questions and think about how we can use machines to increase our empathic potential toward each other, and work on that? What machines can help us increase our capacity to care for each other, as opposed to worrying about whether we can get machines that will love us and whether we can love ourselves?
I think that there’s some kind of displacement of where our real problems are now, and that fascinates me, as a sociologist, as a psychologist, but I don’t think it’s really where we should … It’s not about AI. It’s about how we’re using artificial intimacy not to think about human intimacy.
Alan: 00:57:11 AI now meaning artificial intimacy.
Sherry: 00:57:12 Right. The new AI is artificial intimacy. Artificial intelligence is so yesterday. Now we’re into artificial intimacy.
Alan: 00:57:21 So it sounds like you’re saying that you’re studying how conversation can be an antidote to that.
Sherry: 00:57:31 I think conversation is the talking cure, as Freud would have said. Because in conversation, you realize … In this conversation, you realize even in just a simple conversation between two people who are trying to understand each other, get to a shared space together, you realize, “God, this is great.”
Alan: 00:57:56 Yeah. As soon as you hook up with another person, you think, “Well, how long has this been going on?”
Sherry: 00:58:02 Yeah, this is amazing. This is excellent.
Alan: 00:58:08 Why was I avoiding this?
Sherry: 00:58:10 Yeah. Why don’t we teach this to our kids in school? Why are all these kids in schools being issued screens, spending all day looking at them? Can’t we get them doing this?
Alan: 00:58:19 Yeah, that’s why I want to spend some more time introducing the idea of improvisation training in schools because the essence of that is connecting to the other person.
Sherry: 00:58:32 Absolutely.
Alan: 00:58:33 And even short of that, just getting kids to converse. We’re talking about conversation as maybe the defense against these bad uses of technology. How would you, in really simple terms, how would you describe good conversation? What makes for good conversation?
Sherry: 00:58:56 Listening.
Alan: 00:58:58 Yeah. Yeah.
Sherry: 00:58:59 Just listening. That’s why I think the association with improvisation is so excellent, because when you watch people improvise, the entire exercise is listening.
Alan: 00:59:13 One of the basic ideas in improvisation is the notion of “Yes, and,” which is now entering the culture, and I’m glad to hear it. Because when you say no to whatever the other person has said, that’s canceling them out. Whereas “Yes, and” is to say, “Yes, I get what you’re saying, and this maybe goes on top of it or maybe it doesn’t, but it adds to what you’re saying rather than dismissing it.” That seems to me to be essential. Because how am I going to learn where you are in your thinking and your feelings unless I acknowledge what I’m hearing from you?
Sherry: 01:00:09 Absolutely.
Alan: 01:00:10 And then together, we can make the “and.” Because if I do an “and” to you, you can do a “yes, and” back to me, and we’re getting to some place that neither of us thought we were going to go.
Sherry: 01:00:20 Right. I think this is what children should learn in school. Because what they learn from screens, no matter how brilliant the content, and again, I’m not criticizing the wonderful content that people are putting on screens, the imagination of it. It’s awesome. But it doesn’t substitute for the deep humanity of what kids get when they study history and then talk about it.
Alan: 01:00:52 Yeah. Yeah. And expose their responses, their reactions to one another.
Sherry: 01:00:59 Right.
Alan: 01:00:59 And begin to learn about one another and begin to learn how to have a constructive exchange.

MUSIC BRIDGE
Alan: 01:03:26 One of the most touching stories I’ve ever heard is the one you tell about the father who bathed his infant and then, years later, had another child to bathe. How did that go? I forget.
Sherry: 01:03:42 Well, I was talking to people about how the change in technology had affected their lives, and one guy said, “You know, I think you’re right. I have these two daughters, and one was in the pre-iPhone years, and I used to love giving her baths. She used to have these little toys, her little guys in the bath, and we used to sit and we used to tell stories. Bath time was a time for conversation.
And now I have a two-year-old and I give her a bath too. That’s my job in the distribution of labor in the house. I put her in the bath and I make sure she’s safe, and I put down the seat on the toilet and I take out my iPhone and I just do my email while she takes her bath.”
Alan: 01:04:34 It takes the breath out of me to hear that. To have a little kid that you can get a world of pleasure from and you take out your iPhone. But does he at least recognize that that’s something to move away from?
Sherry: 01:04:46 Well, in other words, it was an unfolding conversation, one that did not begin with him on my side. It was more like, “Uh, Sherry [inaudible 01:04:58] again. Are you back? Are you back pitching your woo?” And then he said, “Well, your method of interviewing is pretty conversational.” I said, “Well, just what’s an example from your life?” And then he had no trouble.
Alan: 01:05:18 He came right up with it.
Sherry: 01:05:19 Came right up with it.
Alan: 01:05:20 So did you get any indication he saw his behavior differently?
Sherry: 01:05:23 Well, absolutely. He said, “I think you’re right.” But it was interesting. He said the damage is to me and my relationship with her.
Alan: 01:05:23 Yeah.
Sherry: 01:05:32 It’s not just what I’m doing to her. It’s what I’m missing, because I remember that those hours we spent together, with my daughter before the iPhone, really are the basis of our relationship.
Then there was another dad, who was divorced, and the thing he wanted to do most was go on school trips with his daughter, until he realized that on the bus, on the school trip, all he did was either take pictures of her and put them on Instagram or take pictures of her and upload them to Facebook. His entire school bus trip and his entire time with her …
Alan: 01:06:20 He wasn’t experiencing it.
Sherry: 01:06:22 No. So it has to do with presence and I think that what the phone does at its worst is take us away, give us an alternative to presence.
Alan: 01:06:37 It’s so interesting you say that, because while you were talking, I was thinking I have so much enjoyed our conversation and it wasn’t so much the words we said. It was the presence that we experienced in one another. I see it in your face and it makes me happy to see it and I wouldn’t get that in a text message. I’m not gonna do any more conversations by text. Before we go, we do seven quick questions at the end of every …
Sherry: 01:07:05 Like the interview? Sure. Yeah.
Alan: 01:07:07 If it’s okay with you.
Sherry: 01:07:08 Yeah, of course.
Alan: 01:07:09 Seven quick answers. They’re basically in some way, related in some way to communication. First question, what do you wish you really understood?
Sherry: 01:07:31 The relationship between attention and empathy.
Alan: 01:07:37 What do you wish other people understood about you?
Sherry: 01:07:44 That I’m not a Luddite. How much I love my iPhone.
Alan: 01:07:52 What’s the strangest question anyone has ever asked you?
Sherry: 01:07:55 Would you want to have sex with a robot?
Alan: 01:08:01 Was this a robot speaking at the time?
Here’s one directly related to communication. How do you stop a compulsive talker? Are you doing it now?
Sherry: 01:08:25 I’m thinking. This is a toughie.
Alan: 01:08:32 Do you stop? Do you try to stop a compulsive talker?
Sherry: 01:08:41 In my personal life, yes. In my professional life, not so much. In my personal life, I say, “Let me tell you something about my day.”
Alan: 01:08:51 Just right out there. Frank.
Sherry: 01:08:53 Yeah. Sometimes I think they need to be slapped down a little bit. But in my professional life, I’ve got to say, sometimes I listen to my interviews and I do let them go on.
Alan: 01:09:03 Yeah. Is there anyone for whom you just can’t feel empathy?
Sherry: 01:09:17 I can feel empathy and deep hatred. I think that you can get into an empathic state where you understand someone and can walk in their shoes, and then deeply disapprove and hate.
Alan: 01:09:39 So saying there’s no one you can’t feel empathy for doesn’t mean you agree with them …
Sherry: 01:09:43 No. Hitler, I mean. I think you get the picture.
Alan: 01:09:45 Exactly. Yeah. I can see that.
Sherry: 01:09:48 So to me, empathy isn’t a feel good. Maybe other people see it that way, but to me, empathy is not a feel good. At the end of it, you’re not in a happy place.
Alan: 01:09:58 It’s not touchy feely.
Sherry: 01:09:58 It’s not a happy place at the end. You can feel empathy and say, “And then I despise this. I despise it. I denounce it.” To me, empathy is a methodology.
Alan: 01:10:11 I read a story once about a prizefighter who was winning fights well into his 50s because he was figuring out what the other person was thinking and about to do next, and he used that knowledge not to make friends with him, but to knock him out. Knowing where the other person is doesn’t mean you agree with them or lessen your struggle against them, I would think.
How do you like to deliver bad news? In person, on the phone, or by carrier pigeon? Bad news?
Sherry: 01:10:42 Definitely in person is best.
Alan: 01:10:45 We all know that. How do you like to do it?
Sherry: 01:10:48 Everybody wants to do it by text. But in person is the only way.
Alan: 01:10:53 Okay. Last question.
Sherry: 01:10:56 Everybody wants to do it by text; that’s the human desire. But what I write about is: pull yourself together and at least make a call.
Alan: 01:11:07 Yeah. I thought I knew what the answer to that would be. Last question. What, if anything, would make you end a friendship?
Sherry: 01:11:17 Betrayal. Significant betrayal.
Alan: 01:11:23 Well, I’ve had a wonderful time talking with you.
Sherry: 01:11:23 Me too.
Alan: 01:11:25 Thank you. That was so much fun. Thanks for coming in.
Sherry: 01:11:29 My pleasure.
Sherry: 01:11:41 Oh no, I think I have a better answer for the end.
Alan: 01:11:44 Oh good. Okay. So the last question?
What, if anything, would make you end a friendship?
Sherry: 01:11:54 If I found out the person was a robot.
Alan: 01:11:59 If you just find out at the last minute, maybe they’ve been pretty successful.
Sherry: 01:12:02 Maybe, but it’d just be a deal breaker.
Alan: 01:12:05 I’ve met a few robots in my life.

This has been Clear + Vivid, at least I hope so.

My thanks to the sponsors of this episode. All the income from the ads you hear goes to the Center for Communicating Science at Stony Brook University. Just by listening to this podcast, you’re contributing to the better communication of science. So, thank you.

Sherry Turkle has spent the last 30 years researching the psychology of our relationship with technology. She is the Abby Rockefeller Mauzé Professor of the Social Studies of Science and Technology at MIT, and she’s the founding director of the MIT Initiative on Technology and Self, a center of research and reflection on the evolving connections between people and artifacts.

Her New York Times best-seller, “Reclaiming Conversation: The Power of Talk in the Digital Age” focuses on the importance of conversation in digital cultures. Her previous book, “Alone Together: Why We Expect More from Technology and Less from Each Other” describes technology’s influence on relationships between friends, lovers, parents and children, and new instabilities in how we understand privacy, community, intimacy and even solitude.

You can find her TED Talk on YouTube and learn more about Sherry on her website at sherryturkle.com.

This episode was produced by Graham Chedd with help from our associate producer, Sarah Chase. Our sound engineer is Dan Dzula, our Tech Guru is Allison Coston, our publicist is Sarah Hill.

You can subscribe to our podcast for free at Apple Podcasts, Stitcher or wherever you listen.

For more details about Clear + Vivid, and to sign up for my newsletter, please visit alanalda.com.

You can also find us on Facebook and Instagram at “Clear and Vivid” and I’m on Twitter @alanalda.

Thanks for listening. Bye bye!