Digital voice assistants, such as Apple’s Siri and Amazon’s Alexa, could reinforce existing gender biases within societies, according to a new report from the United Nations Educational, Scientific and Cultural Organisation (UNESCO). The technology, which is becoming increasingly widespread, typically projects the assistants as female. Because the assistants are designed to be submissive and subservient, even in the face of verbal abuse and sexualised requests, the report warns that they risk deepening already entrenched inequalities.
Many researchers have warned that artificial intelligence, which ‘learns’ from existing data and practices, could encode existing prejudices. Women make up only 12 per cent of AI researchers and only 6 per cent of software developers, the UN said.
Saniye Gülser Corat, UNESCO’s director for gender equality, said: “The world needs to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”