Back in the old days, when there began to be science fiction fantasies about robots taking over the world, reassurance came with the line, “You can always just unplug them.” And, with a quibble occasioned by robotic security measures, that is still true. But what if you didn’t want to unplug them? Then what?
I’m going to be working with some of the implications of Sherry Turkle’s superb book, Alone Together: Why We Expect More from Technology and Less from Each Other. It’s the “expecting less from each other” part that first caught my interest, and it is still the part I find most worrying. I began to find it worrying well before I got to the part about Roxxxy, the sex robot.
Here are some fragments of a conversation she had with “Bruce.”
In 1983, thirteen-year-old Bruce talked about robots and argued for the unique “emotionality” of people. Bruce rested his case on the idea that computers and robots are “perfect,” while people are “imperfect,” flawed and frail. Robots, he said, “do everything right;” people “do the best they know how.” But for Bruce it was human imperfection that made for the ties that bind. Specifically, his own limitations made him feel close to his father (“I have a lot in common with my father…we both have chaos.”)
Twenty-five years later, Howard, fifteen, compares his father to the idea of a robot confidant and his father does not fare well in the comparison.
Howard thinks the robot would be better able to grasp the intricacies of high school life. “Its database would be larger than Dad’s. Dad has knowledge of basic things, but not enough of high school.” In contrast to Bruce’s sense that robots are not qualified to have an opinion about the goings-on in families, Howard hopes that robots might be specially trained to take care of “the elderly and children”—something he doesn’t see the people around him as much interested in.
From Bruce to Howard, Turkle says in summary, “human fallibility has gone from being an endearment to a liability.” What’s going on here?
Part of it, the least scary part, is that the people Turkle has been talking to feel that robots have less of a downside than humans. Here are Harry, a 42-year-old architect, and Jane, a 36-year-old teacher. Both are “relating to” a robot named AIBO, seen at the left.
Harry knows that AIBO is not aware of him as a person but says, “I don’t feel bad about this. A pet isn’t as aware of me as a person might be…Dogs don’t measure up to people…Each level of creature simply does their best. I like it that he [AIBO] recognizes me as his master.” Jane says she turns to AIBO for “amusement,” but also for “companionship.” Jane looks forward to its company after a long workday. Jane talks to her AIBO. “Spending time” with AIBO means sharing the events of her day, “like who I’m having lunch with at school, which students give me trouble.” Her husband, says Jane, is not interested in these topics. It is more comfortable to talk to AIBO than to force her husband to listen to stories that bore him.
These functions of the robots match the criterion Turkle calls “better than nothing.” Someone, we don’t know who, is not regarding Harry as “the master” (her master?) in the way he wants and AIBO is “better than nothing.” Jane’s husband is not interested in who Jane had lunch with at school or which students are giving her trouble, so she tells AIBO. That is, after all, “better than nothing.”
I’ve always said that if there is only one horse in the race, you know who is going to win. But I am not sure these examples are like that. Harry wants to be someone’s (something’s?) master but has apparently not found anyone who wants him as his/her/its master. AIBO is not going to help Harry wonder what it is about himself that wants to be a master.
Jane wants “someone” to tell the day’s stories to. Her husband isn’t interested, so she tells them to AIBO—“sharing,” she says, the events of her day.[1] She gets to tell the stories without watching her husband be bored. The husband gets off without having to hear the stories at all. But why does the husband not want to hear the stories? Why does Jane not tell the stories in a way that interests her husband? These stories are about her, after all. Does she tell too many stories? Do the stories crowd out some things her husband would like to say if he had the chance? What is going on in this marriage? For all those questions, AIBO is a step in the wrong direction.
Which brings us to Roxxxy, the sex robot.
Roxxxy cannot move, although it (she) has electronically warmed skin and internal organs that pulse. It (she) does, however, make conversation. The robot’s creator, Douglas Hines, helpfully offers, “Sex only goes so far—then you want to be able to talk to the person.” So, for example, when Roxxxy senses that her hand is being held, the robot says “I love holding hands with you” and moves into more erotic conversations when the physical caresses become more intimate. One can choose different personalities for Roxxxy, ranging from wild to frigid.
If Harry and Jane didn’t make you want to stop and think, how about Roxxxy? Again, the first justifications come from the down side of the equation. Here’s Wesley.
Ex-wives have told him he is too moody. He sees himself as a “pressure” on a woman and he feels pressure as well because he has not been able to protect women he cared for from his “ups and downs.” He likes the idea of a robot because he could act naturally—it could not be hurt by his dark moods. Wesley considers the possibility of two “women,” one real and the other artificial: “Maybe I would want a robot that would be the perfect mate—less needs—and a real woman. The robot could take some of the pressure off the real woman. She wouldn’t have to perform emotionally at such a high level, really an unrealistic level…I could stay in my comfort zone.”
Here we have the “better than nothing” defense again, this time mixed with a “real woman.” Wesley thinks he might be able to live with a real woman if he had a robot who could “take some of the pressure off.” Here’s Roxxxy and a friend. It looks like it is the friend who is having the drink.
Turkle worries—and now that I have read her, I worry too—about the transition from the “better than nothing” defense to the “better than anything” defense. I’d like to take a look at that next time, but there is a darker side to even this first step. The Furbies, children’s robots of the 1998 holiday season, were built so they would shut down if they were “abused.” Scholars who work in Turkle’s field debated at the time whether shutting down was the best response for a robot (it does not provide reinforcement to the abuser) or whether it would be better for the abuser to experience the simulated distress a Furby was perfectly capable of producing. It’s hard to know.
But when it comes to sex robots, it is even harder to know. Let’s say this sex robot—bound and gagged in duct tape—is Wesley’s “other woman.” What happens when his human woman objects to doing for Wesley what Roxxxy doesn’t seem to mind?
Uncomfortable yet? Me too.
[1] I have been wary for some time now of the perversion of the word sharing. Sharing is “a good thing” because the thing being shared is good. Sharing food with people who need food is good; sharing warmth with people who need warmth is good. Sharing the stories of the day might be good if someone wanted to share them, but Jane doesn’t have anyone who wants to share them. Telling them to her husband is not likely to seem like “sharing” to the husband. Telling them to AIBO is not, in fact, “sharing” them at all, but Jane feels better for having told the stories even when they are not “shared.”
Are you thinking that this is a widespread phenomenon, or will be? I honestly think that this is all theoretical at this point, since I can’t imagine “normal” people–about 99 percent of us–would find current robot technology adequate companionship.
There have always been those people who find substitutes for human interaction. For some it’s plants, for others it’s dolls that look eerily like a baby, for Tom Hanks it was a volleyball. I think Ms. Turkle is looking down the road here. She may be right that there is a storm a-brewin’ but I don’t think it’s anywhere close to being defined enough to worry about.
There may come a day when robots can approximate human behavior closely enough to suffice as a surrogate for a fairly wide range of tasks, but I seriously doubt even your grandchildren will see that day.
It sounds to me like you’re fearful of people choosing the company of robots over people, which sure sounds like something worth being fearful about. But if we get to a point where it’s hard to tell the difference, not just aesthetically but by interaction, will that still be a problem? Haven’t you, in essence, just created another human?
Well, look at that. I buried the lead.
-Doug
Love your “buried the lead” reference to Broadcast News. The perfect touch.
Turkle has been working this beat for thirty years or so. She says that in that time, she has seen a substantial difference in how people–children and elders, particularly–react to robots. The two boys, whose interviews I excerpted to begin the post, represent the direction of the difference. The first boy felt that the inconsistency of humans endears them to us and helps us to identify with them. On behalf of parents everywhere, I say thank you. The second felt that parents have “too small a database” to really “understand the world I live in.” The attention of that second teenager goes to the size of the “database.” Wow.
Turkle says, in a part of the book I am not going to deal with, but which would probably be more interesting to you, that we have today parents who were raised by parents who spent the family dinnertime texting their friends. Given that we learn our parenting styles by remembering how we were parented (and by vowing not to repeat all those mistakes), we are looking now at parents whose memory of “the family dinner” features THEIR parents texting. Not only is this effect not “on down the road”; it is at least a generation in our past.
Finally, to borrow a line from the movie I, Robot, “I like robots because they are safe!” This is Bridget Moynahan reproving Will Smith who, in all fairness, really IS not safe in this movie. The robots Turkle talks about give the “illusion of intimacy” but without any of the responsibilities that ordinarily accompany it.
I think we are THERE.