UN: virtual smart assistants reinforce gender stereotypes

A UNESCO study finds that submissive and flirtatious responses from voice assistants to many queries, including offensive ones, reinforce the idea of women as subordinates.

The UNESCO report is titled I'd Blush if I Could. The authors took the title from Siri's former response when a user told her, "Hey Siri, you're a b****." In April 2019, Siri's creators changed the response to "I don't know how to respond to that."

"The speech of most voice assistants is female, which can suggest that women are helpful, obedient and eager to please. After all, these assistants are available at the touch of a button or with a blunt voice command such as 'Hey' or 'OK'," the report says. "The assistant has no authority of its own. It executes commands and answers requests regardless of how its 'master' treats it. In some societies, this reinforces widely held gender biases that women are willing to tolerate poor treatment."

According to UNESCO Director for Gender Equality Saniye Gülser Corat, more attention needs to be paid to how and when artificial intelligence technologies become gendered and, most importantly, who genders them.

The report urges technology firms not to make digital assistants female by default and to explore developing a gender-neutral voice, in order to discourage gender-based abuse. It also recommends that companies not present a smart virtual assistant as a person.

At the Russian company Yandex, the virtual assistant likewise has a female name, Alice, and a female voice.
