Artificial Intelligence and Gender

From last November until next June 6, at the Futures Festival organized by the Smithsonian Institution, an event showcasing innovations destined to change the world, visitors can hear a very special voice: Q.

Introduced in 2019 as the first genderless virtual voice, Q was created for use by virtual assistants and to spark a debate about gender in Artificial Intelligence.

As Ryan Sherman, one of the project's co-creators, explains: "Q was designed to start a conversation between industry insiders and the public about why Artificial Intelligence technology, genderless by nature, was gendered." To design Q, a team of linguists, sound engineers and creatives collaborated with non-binary individuals, sampling their voices to build a unique vocal range that is neither female nor male.

Does gender in Artificial Intelligence encourage stereotypes and discrimination?

When Q was announced several years ago, it was hailed as "the genderless digital voice that the world needs right now," in recognition of the potential harm that today's virtual assistants, quintessentially feminine in their presentation, perpetuate by promoting misogynistic stereotypes and representing women as executors of orders with no decision-making autonomy.

Project Q was highlighted and recognized in a UN report on gender divides in digital skills. "Almost all assistants have been feminized in name, voice, speech patterns and personality," reads the report, entitled I'd Blush If I Could, after the answer Siri originally gave to users who addressed her with vulgar or sexist epithets.

Today's context: how the relationship with virtual assistants is changing

Fortunately, the situation is changing: in early 2021 Apple eliminated the default "female" voice for Siri and now asks users to choose among voices identified only by number (in Italy there are two, 1 and 2). Only at the end of February 2022 did Apple add another voice, number 5: this one, the company said, was recorded by a member of the LGBTQ+ community and sounds far more gender-neutral. Bear in mind, of course, that updates of this kind are often not rolled out in the same way or at the same time around the world, so even today the scenario differs from country to country.

But overcoming gender stereotypes takes more than simply adding a male or gender-neutral vocal timbre. Indeed, even the idea of a "genderless" voice reveals some of the misconceptions we still harbor when we look for ways to avoid reinforcing those very stereotypes.

Returning to Q, for example, its use could reinforce the stereotype that non-binary individuals are neither men nor women but "something in between," rather than "outside" the binary altogether. The point, in fact, is not to fight for "neutrality" but to begin a much deeper reflection, one that reaches the very roots of how the relationship between human beings and digital technology is constructed.

A step in this direction was taken by Yolande Strengers, associate professor at Monash University and co-author, with Jenny Kennedy, of The Smart Wife: Why Siri and Alexa Need a Feminist Reboot.

In the book, the authors argue that the solution is not to remove gender from the Artificial Intelligence equation entirely, because "this oversimplifies the way these devices treat gender, which concerns not only the voice, but also the kinds of things they say, their personality, their form and their purpose."

Queering: a possible solution to the problem

Strengers and Kennedy therefore propose to "queer the smart wife." What does that mean?
Queering the smart wife means giving virtual assistants a range of personalities that more accurately represent the many versions of femininity and masculinity that exist around the world, as opposed to the pleasant, servile personality that many companies have chosen for their assistants.

One example could be Jibo, a robot introduced in 2017 that uses male pronouns and was marketed as a social robot for the home. Jibo is characterized by a "sweet and effeminate" masculinity: it answers questions politely, with a flirtatious air, and often twirls and approaches people in a flamboyant way.

Queering virtual assistants can also mean fighting stereotypes with irony. This is the case of Eno, Capital One's banking bot, launched in 2019, which, when asked about its gender, jokingly replies: "I'm binary. I don't mean I'm both, I mean I'm actually just ones and zeros."
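To make the idea concrete, here is a minimal sketch of how an ironic, gender-deflecting reply could be wired into a rule-based bot. It is purely illustrative: the pattern, wording and fallback reply are our assumptions, not Capital One's actual implementation.

```python
import re

# Hypothetical sketch in the spirit of Eno's reply; not Capital One's code.
# Matches common ways users ask a bot about its gender.
GENDER_QUESTION = re.compile(
    r"are you (a |an )?(male|female|man|woman|boy|girl)|what('s| is) your gender",
    re.IGNORECASE,
)

def reply(message: str) -> str:
    """Deflect gender questions with irony instead of claiming a human gender."""
    if GENDER_QUESTION.search(message):
        return "I'm binary. I don't mean I'm both, I mean I'm actually just ones and zeros."
    return "Happy to help with your account. What do you need?"

print(reply("Are you a man or a woman?"))
```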

"A bot is a bot is a bot": giving Artificial Intelligence a specifically robotic identity

Another, perhaps bolder, approach is that of Kai, a chatbot developed by Kasisto, a company that builds Artificial Intelligence software for online banking. Kai abandons human characteristics altogether and instead assumes a specifically robotic identity.

When asked whether it is a real person, Kai responds, "A bot is a bot is a bot. Next question, please," signaling its non-humanity to users without ever claiming otherwise, a point reinforced by the playful absurdity of the answer itself.

That is also why Jacqueline Feldman, the bot's designer, says she built Kai to deflect and shut down harassment. If a user repeatedly harasses the bot, for example, it responds with a phrase such as: "I'm imagining white sand and a hammock, please try later!"
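One possible shape for this behavior is sketched below: the bot keeps a per-user count of abusive messages and, past a threshold, disengages with a Kai-style deflection. The keyword list, threshold and wording are illustrative assumptions on our part, not Kasisto's actual logic.

```python
# Hypothetical sketch of deflect-and-disengage; not Kasisto's actual logic.
ABUSIVE_WORDS = {"stupid", "idiot", "useless"}  # placeholder lexicon
DEFLECTION = "I'm imagining white sand and a hammock, please try later!"

class HarassmentGuard:
    def __init__(self, threshold: int = 2):
        self.threshold = threshold
        self.strikes: dict[str, int] = {}  # abusive-message count per user

    def check(self, user_id: str, message: str) -> str | None:
        """Return a deflection if the user keeps being abusive, else None."""
        if any(word in message.lower() for word in ABUSIVE_WORDS):
            self.strikes[user_id] = self.strikes.get(user_id, 0) + 1
            if self.strikes[user_id] >= self.threshold:
                return DEFLECTION  # disengage instead of rewarding abuse
            return "Let's keep things friendly. How can I help with your banking?"
        return None  # not abusive: let the normal pipeline answer

guard = HarassmentGuard()
print(guard.check("u1", "you are stupid"))  # first strike: polite redirect
print(guard.check("u1", "useless idiot"))   # second strike: deflection
```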

The Kai example also points to another problem with humanized bots. According to Feldman, when companies make their bots so human that users struggle to tell whether they are talking to a machine or a person, friction arises with the often frustrating experiences those bots actually deliver. Simply put, if you believe you are talking to a human and the experience falls short, a negative reaction from the user is inevitable.

Google Duplex, for example, a technology now integrated into Google Assistant, imitates a human voice to carry out tasks such as booking a restaurant table or making a haircut appointment; it has often been criticized as misleading to users.

For this reason, in 2019 California became the first state to require bots to identify themselves as such. Although the law has been described as imperfect, it is certainly a first step on a new path for the rules governing communication between humans and machines.
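In practice, compliance can be as simple as disclosing the bot's nature at the start of every session. The sketch below shows one minimal way to do it; the class, wording and session handling are our own illustrative assumptions, not language mandated by the California statute.

```python
# Minimal sketch of a disclosure-first bot; wording and design are illustrative.
class DisclosingBot:
    DISCLOSURE = "Hi! Just so you know, I'm an automated assistant, not a person."

    def __init__(self):
        self.disclosed: set[str] = set()  # sessions already shown the notice

    def respond(self, session_id: str, message: str) -> str:
        answer = self._answer(message)
        if session_id not in self.disclosed:
            self.disclosed.add(session_id)
            return f"{self.DISCLOSURE}\n{answer}"  # disclose before first answer
        return answer

    def _answer(self, message: str) -> str:
        # Stand-in for the bot's real dialogue logic.
        return f"Here's what I found for: {message}"

bot = DisclosingBot()
print(bot.respond("s1", "opening hours?"))  # includes the disclosure
print(bot.respond("s1", "thanks"))          # disclosure not repeated
```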

Conclusions

To redefine the future of the relationship between digital assistants and human beings, companies and society alike must be willing to engage in deep reflection, with significant impacts on both professional and private life.

These are reflections we at Neosperience are also pursuing with regard to the identity of our virtual assistant, Sofia. Although the name is feminine, we chose it out of a desire to evoke Σοφία, wisdom in its ancient sense, derived from the adjective saphés ("clear," "manifest," "evident," "true"). Sofia the wise and the true, then.

Sofia is also fully customizable in appearance and personality, and can adopt any register and style of communication, depending on the wishes and needs of users and customers. Moreover, although the virtual avatar is extremely convincing, the voice is deliberately more mechanical than that of other virtual assistants, so as not to deceive the interlocutor about Sofia's real nature.
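For readers curious how a voice can be nudged toward the mechanical on purpose, the sketch below flattens prosody using standard SSML controls (a W3C specification accepted by most text-to-speech engines). It is a generic illustration of the technique, not Sofia's actual voice pipeline, whose parameters are not public.

```python
# Illustrative only: generic SSML prosody flattening, not Sofia's real pipeline.
def robotic_ssml(text: str) -> str:
    """Wrap text in SSML that narrows pitch variation and steadies the rate."""
    return (
        "<speak>"
        '<prosody pitch="-10%" range="x-low" rate="95%">'  # flat, even delivery
        f"{text}"
        "</prosody>"
        "</speak>"
    )

# The resulting markup can be sent to any SSML-capable TTS engine.
print(robotic_ssml("Hello, I am a virtual assistant."))
```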

Our intention, though, is to keep evolving Sofia so that she is suitable for everyone, at all times, without causing discomfort or offending anyone's sensibilities.

Only through the diversity and commitment of the organizations developing advanced Artificial Intelligence will we be able to deliver truly inclusive digital products, starting from a correct representation of gender.

Progress comes only from the acceptance and appreciation of differences, and the technological field is no exception.