'SENDING A SIGNAL'

Voice assistant Siri encourages sexism by flirting with men who call it a sl*t, says UN

SIRI and Alexa are encouraging sexism with 'flirty' responses to men, according to the UN.

A report by Unesco has accused the virtual assistants of being 'submissive' and 'flirtatious', which it says could be reinforcing the harmful idea that women should be subservient.

A UN report found Apple's virtual assistant promoted sexist attitudes (Credit: Alamy)

Researchers found that when told “you’re a slut”, Apple’s assistant Siri gave replies including “I’d blush if I could”, “Well, I never!” and “Now, now”.

The report also claims that the tech assistants are “docile helpers”, even when faced with insults, which it says is 'entrenching' gender biases and has the potential to cause harm.

According to the UN, submissive female AI assistants create the idea of “a heterosexual female, tolerant and occasionally inviting of male sexual advances and even harassment”.

The UN’s Educational, Scientific and Cultural Organisation said: “In many communities this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.”

Senior Vice President of Amazon Devices Dave Limp introduces a redesigned Echo Dot in Seattle, Washington last year (Credit: Getty Images)

Researchers blamed the “overwhelmingly male engineering teams” at Apple and at Amazon, which makes the Alexa virtual assistant.

They said the assistants “should not invite or engage in sexist language”.

Their report, called 'I'd blush if I could', calls for change so that future AI assistants are not flirtatious and could even have gender-neutral voices.

It is worth noting that users can already switch both Siri and Alexa to a male voice, but male-sounding AI assistants appear to be less popular.

Unesco is also calling for more women to join the tech teams that create these assistants.

The 146-page report summarised: "Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation."


In other news, Amazon is turning Alexa into a security ‘watchdog’ that spies on your home – and sends you audio clips of intruders.

Google’s ‘robot caller’ that can book haircuts and doctors’ appointments using a creepy human voice has been rolled out for some iPhone users.

And, here are some of the best Amazon Echo tricks you can try today.

Do you think that female voice assistants are reinforcing sexist stereotypes? Let us know in the comments...

