UN: AI voice assistants fuel stereotype women are ‘subservient’
A report from the UN claims AI voice assistants like Alexa and Siri are fuelling the stereotype that women are ‘subservient’.

Published by UNESCO (the United Nations Educational, Scientific and Cultural Organization), the 146-page report, titled “I’d blush if I could”, highlights that the market is dominated by female voice assistants.

According to the researchers, the almost exclusive use of female voice assistants fuels stereotypes that women are “obliging, docile and eager-to-please helpers”.

The researchers also see a problem in the lack of courtesy required when speaking to current virtual assistants. They claim that, because an assistant will respond to a request no matter how rudely it is phrased, the technology reinforces the idea in some communities that women are “subservient and tolerant of poor treatment”.

Similarly, the fact that virtual assistants can be summoned with just a touch of a button or a blunt voice command like “hey” or “OK” makes it appear that women are available on demand.

Most virtual assistants use female voices by default but offer a male option. Technology giants such as Amazon and Apple have in the past said consumers prefer female voices for their assistants, with an Amazon spokesperson recently describing these voices as more “sympathetic and pleasant”.

The report highlights that virtual assistants are predominantly created by male engineering teams. In some cases, assistants were even found “thanking users for sexual harassment”, and sexual advances from male users were tolerated more than those from female users.

Siri was found to respond “provocatively” to requests for sexual favours from male users, with phrases such as “I’d blush if I could” (hence the report’s title) and “Oooh!”, but responded this way less often to women.

The inability of a female voice assistant to defend itself from sexist and hostile insults “may highlight her powerlessness”, claims the report. Such coding “projects a digitally encrypted ‘boys will be boys’ attitude” that “may help biases to take hold and spread”.

In a bid to help tackle the issue, the UN believes gender-neutral and non-human voices should be used. The researchers point to Stephen Hawking’s famous robotic voice as one such example.

Alexa, Google Assistant, and Cortana all use female voices by default. Siri uses a male voice in Arabic, British English, Dutch, and French.

deepgeniusai.com/">AI & Big Data Expo events with upcoming shows in Silicon Valley, London, and Amsterdam to learn more. Co-located with the IoT Tech Expo, , & .

The post UN: AI voice assistants fuel stereotype women are ‘subservient’ appeared first on AI News.

]]>
https://news.deepgeniusai.com/2019/05/22/un-ai-voice-assistants-stereotype-women/feed/ 0