Digital assistants like Alexa and Google Assistant spread sexism, says UN

Author: HANNAH DAWSON FOR THE DAILY MAIL

‘Alexa’ is a common cry in many homes, and smart speakers have become a household staple, alongside Apple’s Siri on phones.

But AI-powered voice assistants with female voices may not be the norm in the future, as they encourage harmful gender biases.

A UN study revealed that the voices used by smart speakers reinforce ideas that women are ‘subservient’ as they are portrayed as ‘obliging and eager to please.’

It also criticized the way female AIs respond to gender-based insults with ‘deflecting, lacklustre or apologetic responses.’

The Unesco report, called ‘I’d Blush if I Could’, calls for technology firms to stop making voice assistants female by default and to hire more women to work on them.

The title is borrowed from a standard response from Siri, Apple’s first mobile assistant: it is what the automated voice said when called a ‘b**ch’.

The 146-page report said: ‘Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminized digital assistants to greet verbal abuse with catch-me-if-you-can flirtation.

‘Because the speech of most voice assistants is female, it sends a signal that women are docile helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’.

‘The assistant holds no power of agency beyond what the commander asks of it. It honors commands and responds to queries regardless of their tone or hostility.

‘In many communities this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.’

Approximately 100 million smart speakers were sold globally in 2018, according to research firm Canalys.

These voice assistants manage one billion tasks a month, from playing music and telling you about the weather to providing recipes for a home-cooked meal.

The report calls on developers to create a neutral machine gender for voice assistants and goes as far as to suggest that voice assistants be programmed to respond to and discourage gender-based insults.

Research firm Gartner predicts that by 2020 some people will have more conversations with their voice assistant than with their spouse.

The report therefore suggested that, at the outset of every interaction, the device should announce that it is non-human.

Scientists, sound designers and linguists are currently working to create a genderless digital voice named Q.

On their website, the creators said: ‘As society continues to break down the gender binary, recognizing those who neither identify as male nor female, the technology we create should follow.’

The Unesco report concluded that more women are needed in technology and in the development of smart speakers, which it said would help prevent machines from responding playfully to abuse or insults in the future.

Originally published May 21, 2019 by Daily Mail. 
