
Is Siri sexist? UN cautions against biased voice assistants

11:48 AM May 23, 2019


NEW YORK — Are the female voices behind Apple’s Siri and Amazon’s Alexa amplifying gender bias around the world?
The United Nations thinks so.

A report released Wednesday by the UN’s culture and science organization raises concerns about what it describes as the “hardwired subservience” built into default female-voiced assistants operated by Apple, Amazon, Google and Microsoft.




The report, titled “I’d Blush If I Could” after an answer Apple’s Siri gives when users direct sexist insults at it, says it is a problem that millions of people are growing accustomed to commanding female-voiced assistants that are “servile, obedient and unfailingly polite,” even when confronted with harassment from humans.

The agency recommends tech companies stop making digital assistants female by default and program them to discourage gender-based insults and abusive language.


TAGS: Apple, gender issues, Siri, technology, United Nations, voice assistant

