The “Inmost” Self

Sherry Turkle is one of the most articulate and knowledgeable critics of AI as a source of friendship and support.  Her book, Reclaiming Conversation, is a wake-up call and a tightly reasoned argument. I agree with her concerns, but I am worried about the rationale.  I would really like a stronger one, and I am not sure there is one.

She says that chatbots do not “have” empathy; they “perform” empathy.  That seems right to me, but is it a criticism worth making?  Human society, as I experience it and as generations of sociologists have understood it, is a web of performances.  We greet each other with a courtesy we may not feel at the time, but we “perform” it because it is expected.  We ask after each other’s health, each other’s children, whether the vacation was all our friends had hoped.  We may sometimes experience the feelings from which these questions would naturally arise, but the questions will be asked anyway because they are necessary.

They are performed interest, at the very least, and are performed so as to suggest “empathy.”  So what kind of criticism are we making when we say that the chatbots only “perform” empathy?  The only solid criticism I can think of is that they cannot feel the feelings we feel—cannot, by definition, “have” empathy.

That means that we can take for granted that the performance of an emotion by a chatbot is “insincere.”  It is “inauthentic.”  On the other hand, the performance is very good, and if that is what matters to us most, it may be something we will come to prefer.

Certainly we are saying that chatbots do not have the same feelings we have.  Having those feelings would be empathy.  But are we saying that matters if the performance is good?  Turkle is worried that we may not continue to choose actual humans.  I am too, but I wonder what strong rationale there is for continuing to choose the variable performance of humans over the reliably competent vacuity of chatbots.

Here is Turkle’s reflection about a visit from psychologist Erik Erikson.

“I was a young faculty member at MIT in the late 1970s when the psychoanalyst Erik Erikson visited to talk about engineering education. After his presentation, he asked me what I was doing as a humanist at an engineering school. I told him I was studying how computers change people’s ideas about themselves, and he made this comment: ‘Engineers, they’re not convinced that people have an interior. It’s not necessary for their purposes.’”

And Turkle summarizes, “They see the complexity of inner life not as a feature but as a bug.”

According to Turkle, we need to continue to choose interactions with humans not because empathy is guaranteed, but because it is possible.  With the bots, it is not possible by definition.  Some argue that the illusion of human feelings—compassion, anxiety, pleasure—is good enough.  Here is Turkle’s argument that it is not good enough:

“I argued for this assertion of agency in 2015, and now I argue ever more fervently. There is more than a threat to empathy at stake; there is a threat to our sense of what it means to be human. The performance of pretend emotion does not make machines more human. But it challenges what we think makes people special. Our human identity is something we need to reclaim for ourselves.”

She uses powerful words to sketch in what is at stake.  She says that “what it means to be human” is at stake.  OK, what does it mean to be human?  We know what it has meant, but is that what it fundamentally means?  How would we know?

She says it challenges “what we think makes people special.”  Are we right in thinking that the exchange of authentic emotions is “what makes people special”?  In the superficial sense, of course, it does.  If the bots cannot, even in principle, experience empathy, then human beings are “special” by definition.  But surely Turkle means more than that.

Turkle never talks about souls.  And there is no reason why she should.  I don’t talk about them either, and I suspect it is for the same reason. [1] But I do think that is where her logic will lead her.  If the superficial features of humans and bots are similar, then humans and bots will have to be distinguished by the authenticity of those superficial expressions.  Do they, in other words, express genuine feelings?  I think she and I would both say that bots don’t have “genuine feelings” no matter how effectively they perform them.

But if we cannot reliably say, based on our own experiences, which expressions are authentic and which are not, then we will have to continue on into the interior to make our case.  And what else is there?  Will we have to argue that humans are “better” because we have souls and the bots do not?

That is where I see the argument heading.  If we want to continue to prefer humans to bots, and if we can no longer—or not much longer—distinguish the performance of emotion from the expression of an inner feeling, then what is left?  Souls are the next entity to invoke, though they are even more elusive than “authenticity.”  We have built our societies on the performance of empathy, requiring authenticity only of our most intimate relationships.  Is the next step to grant “authenticity” to the bots, and if it is, what is left that they cannot have but that we can?

I think it is souls.  I am not happy about that.

Turkle’s actual program is unobjectionable.  In fact, I think it is crucially important. She says that we need to pay attention to what we are doing.  We need to consistently prefer humans, even when the immediate experience is not as pleasant as the reliable “camaraderie” of socially competent bots.  I think she is right.  Nothing but daring to prefer what only human friends can give us will keep us from morphing slowly into the bots’ most reliable accessory.

[1] My reason is that I would not know what I was talking about if I claimed some particular virtue for a “soul.”
