I had never heard of the Foundation for Responsible Robotics (FRR) until I read “Bad Day for Human Dignity” in This Week magazine, [1] but there is such a foundation and I’d like to explore today what it might propose for us.
I am sure they mean to cue up the contrast between responsible and irresponsible robotics and to say that they are against “irresponsible robotics,” whatever that is. But of course, that raises the real question: does “responsible robotics” mean anything? Can it mean anything?
It is typical of American culture to put sexual ethics first among our ethical concerns. And that being the case, we would expect a foundation interested in “responsibility” to pass by our responsibility to our senior citizens and to children and to pass by ethical considerations of violent behavior and focus, instead, on sex. And they have.
What happened to Samantha
Let’s start with why Greg Nichols is lamenting the “bad day for human dignity.” The incident happened at the Ars Electronica Festival in Linz, Austria. Sergi Santos, an engineer from Barcelona, Spain, was showing off a robotic doll he calls Samantha. The interactive robot is reportedly programmed to respond to “romance.” (The quotation marks are in the original article, suggesting that the author isn’t quite sure what the word means in this context.)
Not much romance happened. Instead, “the men at the festival treated Samantha like, well, an object.”
“The people mounted Samantha’s breasts, her legs, and arms. Two fingers were broken. She was heavily soiled,” Santos told Britain’s Metro. “People can be bad.”
Yes, we can. And I have no wish to minimize the bad behavior of the men who did this to Mr. Santos’s toy. If they had done it to his wife or his daughter, we would all know how to feel about it. The way it happened, you have to dig a little to say why it was wrong. Or whether it was wrong. Or whether it was just boorish. How do we even ask questions like that?
Fortunately, I have a good guide for questions like that. Sherry Turkle is the author of Alone Together, a study of the human-robot connection. She and a group of her students gave robots with a limited vocabulary, a rudimentary ability to “learn,” and big blue eyes to a bunch of grade-school kids. In doing that, she gave the kids a real dilemma. They all knew this robot was not a person. It was a machine. They knew it wasn’t alive—not, at least, in the ordinary sense of the word.
But they invested their emotions in it anyway and they attributed to it whatever traits were necessary to make sense of that investment. If they said they loved the little robot, then they said that it loved them back or that it enjoyed being loved by them. Eventually they came up with a formula. It wasn’t “alive” exactly, but it was “alive enough” for the relationship they had built up with it.
Alive enough. Really?
Very similar experiments with seniors in senior residences produced the same dilemma. The old people, mostly women, started their time with the cute little robots knowing firmly that they were “not real.” But the robots were designed to elicit “actions of caring,” and they did. It then became the old person’s job to invent a rationale in which it was not ridiculous to behave the way she was behaving. They began with ambiguous justifications, but, just as you would expect, when the experimenters probed those justifications, they reacted in different ways. Some gave it all up and confessed that they were just pretending. Some doubled down and insisted that the little robots had really learned to love them and that they were only reciprocating.
Only reciprocating? Really?
The Foundation for Responsible Robotics
The Foundation for Responsible Robotics (FRR) wants to “promote the responsible…implementation…of robots imbedded in our society.” That’s from their mission statement. Is the little robot that forced the children to invent a standard like “alive enough” a responsible implementation of embedded robots? Is the display of a sex robot who responds to “romance” a responsible implementation? Is the production of such a robot better or worse than a pack of men going rogue and abusing it?
These are hard questions, I think. No one I have read has any idea what to do with the genuine feelings of love (the children) or lust (the purchasers of sex robots) that are projected onto the robots. Let me say it the other way: the feelings of love or of lust that the robots “elicit.”
How is that different? We commonly say that people have “feelings” and that they “project” these feelings onto various objects. That’s how men can become completely infatuated with women they have never met and can imagine that these women would welcome romantic initiatives from them. You can imagine a mechanism in which the fantasies are inside the men and are projected onto the women the way a movie is projected onto a screen.
This isn’t like that.
The cuddly toys and the “rapable” dolls are designed to elicit certain feelings in the children and the men, respectively. [2] The company “designs, develops, and implements” a doll that “has” a personality of a certain kind. She leads you on—by what means I don’t really know—and then starts being unresponsive. At that point you unleash on her behaviors that, if she were a woman, would be called “rape.” She has just done what she was designed to do, and so have you.
She is following her programming and so are you. But she has no choice. Her programming is the only operating system she has. We have a “here’s what I want to do” system, but we also have a “here’s what is right to do” system. The human programming, the part of you that “responds” to the romantic cues from the robot, is the “here’s what I want to do” part of you. Presumably the Foundation for Responsible Robotics is interested in the other part.
Are robots different?
From the standpoint of the men in question, the objects of their sexual desires don’t need to be robots. They can be real women, provided that the real women have very little choice in how to respond. Here’s an example.
I want to introduce you to a character in one of Ursula K. Le Guin’s novellas. Here is the story the young woman Rakam tells.
We were sent across to the men’s side most nights. When there were dinner parties, after the ladies left the dinner room we were brought in to sit on the owners’ knees and drink wine with them. Then they would use us there on the couches or take us to their rooms. The men of Zeskra were not cruel. Some liked to rape, but most preferred to think that we desired them and wanted whatever they wanted. Such men could be satisfied, the one kind if we showed fear or submission, the other kind if we showed yielding and delight. [3]
I want to pause for just a moment here to remember what we are doing. We are trying to explore whether some notion of “responsible robotics” can be formulated and we are using sex dolls, like Samantha, as our test case.
It is easy to see the differences between the sex robots and the slave women. But is there a difference in the two situations so far as the men are concerned? I don’t think so. The men desire to express their sexuality in some way or another—two modes are illustrated in Rakam’s story—and they want the object of their desires to respond “appropriately.” That means “fear and submission” in the one case and “yielding and delight” in the other case. These “responses” are performed by the sex dolls in the one setting and by the slave women in the other.
Philosopher Charles Ess says, “Since the machines are incapable of real emotions, they are simply ‘faking it’, no matter how persuasively.” I’m sure he is right, but the slave women are “faking it” too, and so are the sex workers who make their living by faking it night after night. None of these considerations takes into account whether something really dreadful is, as I fear, being done to the men, who really know better and have the experience only by pretending they don’t.
I’d like to close by offering you some of the questions the Foundation for Responsible Robotics thinks are worth asking.
1. Could robots help with sexual healing and therapy?
2. Could intimacy [sic] with robots lead to greater social isolation?
3. What kind of relationship could we have with a sex robot?
4. Will sex robots change societal perceptions of gender?
First, I want to say that I don’t think they are stupid questions. If “interactive” robots are coming, we need to have someone thinking about what kinds of effects they will have.
But, second, these all seem to me superficial questions. They are public policy questions, which are important in their own right, but there are also more personal, more meaningful questions. Is there going to be a hollowing out of people whose lives are more and more based on fantasies they know, at some level of consciousness, are not real? Will these relationships become the dream we can’t wake up from?
I really don’t want to find out, but I am afraid we are going to.
[1] “Bad Day for Human Dignity,” This Week, October 13, 2017, p. 6.
[2] You can buy a Roxxxy TrueCompanion robot for just under $10,000 and set her “personality” to reject your “advances.” At which point, you could just go ahead and “rape” her. The “personality” you would have chosen for her is called “Frigid Farah” by the company that sells it. See the lead paragraph of Laura Bates’s recent New York Times op-ed.
[3] From “A Woman’s Liberation,” one of the stories that make up Le Guin’s Four Ways to Forgiveness, p. 223. These “strategies” that Rakam and the other women employ are “settings” for the sex dolls.