Interview: Should sex robots be able to say no?

You don’t want a refrigerator that says “No, I won’t open” every now and again. That would be an infringement of your autonomy. But an alcohol-activated immobilizer in the car of a person who likes to drive under the influence is a good idea. And a sex robot? Should it be able to say no? According to Pim Haselager, associate professor of Theoretical Cognitive Science at the Donders Institute, and PhD candidate Anco Peeters, that depends on the context. “Technology shapes our society and our psyche. We have to reflect on what that means.”

The article ‘Designing Virtuous Sex Robots’ by Pim Haselager and Anco Peeters caused some commotion. “People wrote things like ‘They want to let sex robots loose on college campuses’, but often they hadn’t really read the article.” Haselager can laugh about it. “This wasn’t unexpected. After all, we weren’t talking about bicycles. Often, people don’t get any further than the word ‘sex’.” The one-sided reaction to the article underscores the importance of the debate that Haselager and Peeters want to promote on the societal and psychological consequences of social robots. “Health-care robots and sex robots are coming, and they’re going to have an influence on the way we perceive attachment, love and sex. We want people to think about this: what do they want, and what do they perhaps not want at all? What do we think is possible and acceptable? And what about the differing opinions on the subject?”

Relationships with robots

Robots are more than just appliances. “We already know that we build up emotional attachments with technology,” says Peeters. Think of Furbies and Tamagotchis, or soldiers who bond with their military robots. There are already people who have emotional relationships with sex dolls. It is plausible that people could develop strong feelings for sex robots. Haselager adds, “People bond easily, and robots can be designed to make that emotional bond as strong as possible. It is very likely that this bonding will become even stronger if sex is involved. What do we want to do about this? And how can we approach it? That is a relevant discussion.”

The discussion is not just about sex robots, but also about the advantages and disadvantages of all sorts of technology, including robotics. As a result, the pair like to use examples that have nothing to do with sex. Haselager explains, “Suppose you have a robot that clears the table. It really matters whether the robot does this in an empty room, a room with adults, or during a children’s party. A children’s party is chaotic, and the chance that mistakes will be made is greater. Then the robot would have to give signals like ‘I’m not sure’ so that people can take action. Sometimes, a robot even needs to be able to say no. If children give it an instruction, it mustn’t just heedlessly obey. Robots are machines that can sometimes cause damage. Saying no is a safety module.”
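To make the idea of saying no as a safety module concrete, here is a minimal, hypothetical Python sketch. Nothing in it comes from the researchers’ work: the names (`Context`, `decide`) and the thresholds are illustrative assumptions only.

```python
# A hypothetical sketch of "saying no as a safety module".
# All names and thresholds are illustrative, not from the article.

from dataclasses import dataclass

@dataclass
class Context:
    """Rough description of the robot's surroundings."""
    chaos_level: float          # 0.0 = empty room, 1.0 = children's party
    instruction_from_child: bool

def decide(confidence: float, ctx: Context) -> str:
    """Respond to a 'clear the table' instruction.

    The robot refuses outright when obeying would be unsafe, and it
    signals uncertainty so that nearby people can take over.
    """
    if ctx.instruction_from_child:
        return "no"                     # don't heedlessly obey children
    if confidence < 0.5 or ctx.chaos_level > 0.8:
        return "I'm not sure"           # hand control back to the humans
    return "ok, clearing the table"

# An empty room: the robot simply complies.
print(decide(0.9, Context(chaos_level=0.0, instruction_from_child=False)))
# A children's party: the robot flags its own uncertainty.
print(decide(0.9, Context(chaos_level=0.9, instruction_from_child=False)))
```

The point of the sketch is that refusal and uncertainty signals are design decisions, made in advance, about when the machine should defer to people.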

‘In order to build robots we can trust, we have to build robots that distrust themselves,’ states Pim Haselager at TEDxRadboudU.

Technology shapes us

Peeters stresses that we must realise that our dealings with technology shape our character and our habits. “The way you perceive your interaction with technology shapes the way in which you deal with it, and eventually your view of the world.” Again, Peeters illustrates this with a simple example. “Think of the way we write a text. You get a different text when writing with pen and paper than when writing on a computer. Research has been done on this. With pen and paper, you’re less concerned with individual words and you think more at the sentence level. With a computer, you’re constantly editing, because the computer allows you to do so. It’s a mistake to think that these instruments are subservient to your will; they influence the process. The same is true of sex robots. They shape the way you think about love and sex.”

The article by Haselager and Peeters describes how sex robots not only shape our thinking but can be addictive as well. For example, they see parallels between sex robots and video games. However, Peeters points out, “Robots are different from games in one important respect. Robots demand a different kind of interaction because they resemble people. This means that robots elicit reactions that we normally reserve for other people. Robots can thus create a pattern of expectations that can influence the interaction between robots and people.” Haselager explains further, “Suppose that boys were to have early sexual experiences with robots. How would that influence their capacity for dealing with rejection? How would that influence their ability to make advances in a socially acceptable manner? This could cause a social handicap to develop.”

Moral character

Both Haselager and Peeters have a background in philosophy and artificial intelligence. They view social robots through the lens of virtue ethics. The word ‘virtue’ is not especially appealing, but Peeters puts this in perspective: “It has a patriarchal undertone, but for the ancient Greeks, what we now call ‘virtue’ meant ‘excelling in something’. This could also involve sports or playing an instrument.” Virtue ethics is about the development of moral character. It studies how judgements are weighed in a particular situation and asks questions such as “How does technology influence our social skills?” At the other end of the spectrum lie duty ethics and consequentialism. These ethical approaches examine duties and the effects of actions. Can a robot track someone in a store? Can it intervene if you’re driving too fast?

“Duty ethics and consequentialism are easier to apply in robotics. At first glance, it seems relatively straightforward to program them into a system,” says Haselager, “but this doesn’t mean you’ll always get the results you’re hoping for.” Virtue ethics, by contrast, focuses on the context in which the action takes place. “Deciding whether something is good or bad in moral terms is dependent on the situation. And the situations in which robots find themselves are always different,” the researchers explain. “It’s not easy to implement virtues in a robot, but that doesn’t mean that we should ignore them. It’s risky not to think about this before sex robots appear on the market. Societal and ethical discussions take time,” Haselager warns.
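Haselager’s point about programmability can be made concrete with a small, hypothetical sketch, using the article’s own speeding example. The speed-limit rule and the emergency exception below are illustrative assumptions, not anything proposed in the paper: a duty-style rule reduces to a fixed check, while a virtue-style judgement would have to weigh a context whose relevant features cannot all be enumerated in advance.

```python
# A hypothetical contrast between a duty-style rule and a
# context-sensitive judgement. Values are illustrative only.

SPEED_LIMIT_KMH = 100  # assumed limit, for illustration

def duty_based_intervene(speed_kmh: float) -> bool:
    """Duty ethics as a rule: intervene whenever the limit is exceeded."""
    return speed_kmh > SPEED_LIMIT_KMH

def virtue_based_intervene(speed_kmh: float, context: dict) -> bool:
    """Virtue ethics resists this treatment: whether speeding is wrong
    depends on the situation (say, rushing someone to a hospital), and
    this is only one exception among indefinitely many."""
    if context.get("medical_emergency"):
        return False
    return speed_kmh > SPEED_LIMIT_KMH

print(duty_based_intervene(120))                                   # True
print(virtue_based_intervene(120, {"medical_emergency": True}))    # False
```

The first function is trivial to write; the difficulty the researchers describe lies entirely in the second one, where the list of morally relevant circumstances never closes.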

Therapeutic uses

In short, the researchers are wondering, “Could we use sex robots to cultivate virtues?” Haselager and Peeters take this further than mere theoretical discussion. One of the possibilities they envision is therapy with sex robots. Their paper provides an impetus for contemplating this idea in cooperation with psychologists and sexologists. For example, Haselager and Peeters suggest that social interactions concerning love and sex, such as saying yes or no, can be practised with a robot. “In a therapeutic context, that could be useful. Or maybe not, but that’s exactly what we need to discuss.”

Image by DJ Johnson via Unsplash. This interview first appeared on 20 December 2019 on Radboud Recharge: https://www.radboudrecharge.nl/en/article/should-sex-robots-be-able-to-say-no
