
UN warns on gender bias in artificial intelligence

Responsible Business | Jun 05, 2019


Digital voice assistants, such as Apple’s Siri and Amazon’s Alexa, could reinforce existing gender biases within societies, according to a new report from the United Nations Educational, Scientific and Cultural Organization (UNESCO). The technology, which is becoming increasingly widespread, typically projects the assistants as female. Because the assistants are designed to be submissive and subservient, even in the face of verbal abuse and sexualised requests, they risk deepening already entrenched inequalities, the report warns.

Many researchers have warned that artificial intelligence, which ‘learns’ from existing data and practices, could encode existing prejudices. Women make up only 12 per cent of AI researchers and just 6 per cent of software developers, the UN said.

Saniye Gülser Corat, UNESCO’s director for gender equality, said: “The world needs to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”

The Weekly Briefing is delivered to you by Responsible Business, with exclusive news, insights and content offering practical solutions to global challenges. Sign up here to receive weekly updates and stay up to date.
