'SENDING A SIGNAL'

Voice assistant Siri encourages sexism by flirting with men who call it a sl*t, says UN

SIRI and Alexa are encouraging sexism with 'flirty' responses to men, according to the UN.

A report by Unesco has accused the voice assistants of being 'submissive' and 'flirtatious', which it thinks could be reinforcing the harmful idea that women should be subservient.

A UN report found Apple's virtual assistant promoted sexist attitudes. Credit: Alamy

Researchers found that when told “you’re a slut” Apple’s assistant Siri replied “I’d blush if I could,” “Well, I never!” and “Now, now.”

The report also claims that the tech assistants act as "docile helpers", even when faced with insults, which it says is 'entrenching' gender biases and has the potential to cause harm.

According to the UN, submissive female AI assistants create the idea of “a heterosexual female, tolerant and occasionally inviting of male sexual advances and even harassment”.

The UN’s Educational, Scientific and Cultural Organisation said: “In many communities this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.”

Senior Vice President of Amazon Devices Dave Limp introduces a redesigned Echo Dot in Seattle, Washington, last year. Credit: Getty Images

Researchers blamed the "overwhelmingly male engineering teams" at Apple and at Amazon, which makes the Alexa virtual assistant.

They said the assistants “should not invite or engage in sexist language”.

Their report, called 'I'd blush if I could', calls for change so that future AI assistants are not flirtatious and could even have gender-neutral voices.


It is worth noting that users can already switch both Siri and Alexa to a male voice, but male-sounding AI assistants appear to be less popular.

Unesco is also calling for more women to join the tech teams that create these assistants.

The 146-page report summarised: "Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation."


In other news, Amazon is turning Alexa into security ‘watchdog’ that spies on your home – and sends you audio clips of intruders.


Google’s ‘robot caller’, which can book haircuts and doctor’s appointments using a creepily human voice, has been rolled out for some iPhone users.

And, here are some of the best Amazon Echo tricks you can try today.

Do you think that female voice assistants are reinforcing sexist stereotypes? Let us know in the comments...


