Gender, Social Problems, Technology

Siri and Alexa Reinforce Gender Bias, U.N. Finds

By Megan Specia

May 22, 2019

Why do most virtual assistants that are powered by artificial intelligence — like Apple’s Siri and Amazon’s Alexa system — by default have female names, female voices and often a submissive or even flirtatious style?

The problem, according to a new report released this week by Unesco, stems from a lack of diversity within the industry that is reinforcing problematic gender stereotypes.
