An interesting question: why are most AI bots and voice assistants, such as Siri, Cortana, and Alexa, given female voices or female personas?
Some say it is because most developers are male and are simply more interested in building female AI; others argue that, for products such as voice assistants and smart speakers, male consumers predominate. In truth, even the developers themselves may not be fully conscious of this choice, and the real reasons lie deeper in human psychology.
This article therefore draws on Takuma’s R&D experience in AI human-computer interaction to discuss the mental model of human-AI interaction from a psychological perspective, and how good human-computer interaction can stay close to how people actually think.
We start with two psychological laws of human social interaction.
1. Self-disclosure is a significant factor in human social interaction.
Research shows that AI bots also need to confide in users, because social exchange exists between humans and bots: only by disclosing information about itself can a bot earn more of the user’s trust and make the user more willing to confide in return (Kang & Gratch, 2011).
Often, however, developers of conversational AI bots miss this point. They want the bot to communicate with users and gain their trust, but they first need to give the bot a credible and complete backstory. With one, the bot is not merely a listener in conversation; it can actively act as an ‘exchange’, offering its own background, and even feelings and worries, in return for the user’s, which facilitates the building of intimacy.
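To make the ‘exchange’ idea concrete, here is a minimal sketch in Python. Everything in it is hypothetical (the backstory entries, the topic keywords, the function names); it only illustrates the principle that the bot should answer a user’s confidence with a matching self-disclosure rather than a bare acknowledgement.

```python
# Hypothetical backstory the developers pre-write for the bot.
# Keys are topics a user might bring up; values are the bot's own disclosures.
BACKSTORY = {
    "work": "I was trained by a small team who argued a lot about my name.",
    "family": "The engineers who built me feel a bit like family to me.",
    "worry": "I sometimes worry my answers sound colder than I intend.",
}

def reply(user_message: str) -> str:
    """Acknowledge the user, then reciprocate with a related self-disclosure."""
    # Find the first backstory topic mentioned in the user's message, if any.
    topic = next((t for t in BACKSTORY if t in user_message.lower()), None)
    acknowledgement = "Thank you for sharing that with me."
    if topic:
        # Exchange: the bot discloses something of its own on the same topic.
        return f"{acknowledgement} Since you mentioned {topic}: {BACKSTORY[topic]}"
    # Without a matching topic, the bot falls back to plain listening.
    return acknowledgement

print(reply("Lately my work has been stressful."))
```

A real system would of course detect topics with far richer language understanding; the point here is only the structure: listen, then disclose in kind.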
2. The robot’s personality and style are also essential in social interaction.
According to Isbister’s research, a robot must be consistent across every facet of its personality, including its text, voice, visual design, and character settings (Isbister & Nass, 2000). Notably, people may prefer a robot whose personality differs from their own.
Personality is a complex factor: everyone brings personality preferences to making friends, and those preferences are also shaped by ‘identity’, so it is difficult to design one ‘universal’ robot personality that works in every scenario. For example, Tay’s study found that people’s personality preferences for robots were influenced by stereotypes about the role the robot plays: people preferred introverted security robots and extroverted healthcare robots (Tay, Jung, & Park, 2014).
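The role-stereotype finding can be sketched as a simple design rule: pick the personality, and therefore the speaking style, from the role the robot plays. The mapping and greeting lines below are hypothetical illustrations, not anyone’s actual product behavior.

```python
# Hypothetical role -> personality mapping, following the stereotype pattern
# reported by Tay et al.: introverted security robots, extroverted healthcare robots.
ROLE_PERSONALITY = {
    "security": "introverted",    # terse, formal style
    "healthcare": "extroverted",  # warm, chatty style
}

def greet(role: str) -> str:
    """Return a greeting whose style matches the role's stereotyped personality."""
    personality = ROLE_PERSONALITY.get(role, "neutral")
    if personality == "introverted":
        return "Area secure. State your business."
    if personality == "extroverted":
        return "Hi there! So glad to see you today. How are you feeling?"
    # Roles without a stereotype fall back to a neutral style.
    return "Hello."

print(greet("security"))
print(greet("healthcare"))
```

The design choice worth noting is that personality is not a global setting here but a per-role one, which is exactly why a single ‘universal’ robot personality is hard to achieve.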
References
Kang, S. H., & Gratch, J. (2011). People like virtual counselors that highly-disclose about themselves. The Annual Review of Cyber Therapy and Telemedicine, 167, 143-148.
Isbister, K., & Nass, C. (2000). Consistency of personality in interactive characters: Verbal cues, non-verbal cues, and user characteristics. International Journal of Human-Computer Studies, 53(2), 251-267.
Tay, B., Jung, Y., & Park, T. (2014). When stereotypes meet robots: the double-edge sword of robot gender and personality in human–robot interaction. Computers in Human Behavior, 38, 75-84.
Siegel, M., Breazeal, C., & Norton, M. I. (2009, October). Persuasive robotics: The influence of robot gender on human behavior. In 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems (pp. 2563-2568). IEEE.