
If you could have your service robot gendered, would you want it to be female or male?

I believe I can predict your answer to this question when it comes to service robots that help users in their homes. If you are like the participants in a study I recently conducted, I predict that you want your service robot to be female gendered. Most likely, this is a function of your stereotypical beliefs about who is best suited – males or females – to take care of things at home.

Producers of service robots and other non-human agents often design them with gender-related attributes, such as a female or male name and voice. For example, the personal digital assistant in my phone speaks with a female voice, and IKEA once had a virtual customer-service agent called Anna. Theories about human-to-machine interaction suggest that this may be a viable approach, because humanlike attributes increase the perceived similarity between a non-human agent and a human. And when this similarity increases, we typically feel more comfortable in our interactions with non-human agents. We also like the agent more if it is similar to us humans. Indeed, “what-is-humanlike-is-good” is a general reaction pattern when we encounter non-human agents.

Several studies have been conducted on robot gendering, and most of them indicate that only a few cues are needed for participants to perceive a robot as male or female. In one such study, putting pink earmuffs on a robot was enough to make participants perceive it as more female than male. Several studies also show that once a robot is perceived as having a particular gender, it is also perceived as having more or less of other characteristics. For example, a robot gendered as male is likely to be seen as having more agency than a robot gendered as female, while a female robot is likely to be seen as friendlier than a male robot.

Much research in this area has been experimental: the researcher manipulates robot gender and examines what happens when participants are exposed to either a male or a female version of the same robot. (Typically, such manipulations include only male and female robots; alternative identities, such as non-binary, are yet to be studied.)

Existing research, however, has not produced results indicating a strong general pattern of preference for one particular robot gender. Some studies show that robot gender does not influence perceptions of the robot’s other characteristics; some show that preferences are conditioned by the type of task the robot performs and by the participant’s gender. A male robot may be preferred for stereotypically male tasks, such as guarding a house and transporting goods, while a female robot may be preferred for stereotypically female tasks (e.g., providing care). Some studies also indicate that people prefer robots of the opposite gender, while others show that we prefer robots of the same gender.

I had a chance to examine some robot gendering issues with an alternative approach. I showed 300 participants (150 males and 150 females) a description of an advanced and versatile robot for use in the home. This robot can do many things: it can vacuum, read literary fiction to you, be your personal trainer, and much more. The participants were informed that the robot could be customized by the user in terms of a set of gender-signaling attributes, and they were asked to indicate whether they preferred the robot’s name, voice, body shape, and face to be male or female.

The results show that the participants preferred the robot’s name, voice, and face to be female rather than male. I also examined whether there were differences between the male and female participants – and there were. The male participants preferred a female-gendered robot more strongly than the female participants did. So, overall, a robot with female attributes was preferred over a robot with male attributes – and this preference for what is female was even more pronounced among male participants.

Why is this so? The data suggest that the participants’ preferences were influenced by their stereotypical occupational beliefs about a human service employee doing what the robot could do. That is to say, the more they believed that a female human employee is better suited than a male employee for doing what the robot could do, the more they preferred that the robot have female attributes. Thus, gender stereotypes, which are prevalent in the context of human-to-human interactions, were applied to a non-human agent.

The implication for customer-oriented producers of domestic service robots, then, seems straightforward: robots should be given female attributes. Or should they? Critics would say that this is likely to reinforce stereotypical beliefs about male and female characteristics. But it does not have to be that way. One day, in the not-so-distant future, we will be exposed to many service robots, and prolonged exposure to them may affect our social beliefs – particularly if there is some general theme in robotic behavior that repeats itself. So there is an opportunity for change here: robot producers concerned with the societal implications of what they do may challenge our gender beliefs by deliberately giving robots counter-stereotypical attributes.
