According to a United Nations report, Alexa, Siri and other artificial intelligence voice assistants carry sexist biases coded in by their programmers




The United Nations – More people will be talking to a voice assistant device than to their partners over the next five years, the UN says, so it is important to know what those assistants have to say. The numbers are striking: 85% of Americans use at least one artificial intelligence (AI) product, and global usage will reach 1.8 billion by 2021. The impact of these AI assistants is unprecedented.

But AI voice assistants, including Apple's Siri, Amazon's Alexa, Microsoft's Cortana and Google Assistant, reinforce gender stereotypes and teach sexism to a generation by modeling assistants that are "docile and eager to please" and that accept sexual harassment and verbal abuse, according to a new UN study.

"I would blush if I could"

A 145-page report published this week by UNESCO, the United Nations Educational, Scientific and Cultural Organization, concludes that the voices we talk to are programmed to be submissive and that abuse of them is treated as a norm.

The report is titled "I'd blush if I could: closing gender divides in digital skills through education."

The authors say the report's title is based on the answer Siri gives when a user says, "Hey Siri, you're a b***." That programmed response was revised in April, when a draft of the report was circulated.

The report reveals a pattern of submissiveness in the face of gender-based abuse, with inappropriate responses that, the authors say, have remained virtually unchanged in the eight years since the software first came to market.


Saniye Gülser Corat, Director of the Division for Gender Equality at UNESCO, designed and developed the report together with Norman Schraepel, a policy advisor at the German Agency for International Cooperation, for the EQUALS Global Partnership for Gender Equality in the Digital Age, a coalition of business leaders, governments, academic institutions and community-based partners from several UN agencies, including UN Women, that works to promote gender balance in the technology sector for both women and men.

Bias in the code

According to the study, Alexa and Siri fuel sexist stereotypes: Siri's "female obsequiousness" – and the servility expressed by so many other digital assistants projected as young women – "provides a powerful illustration of gender biases coded into technology products."

The study points to many factors but says programming culture is the main culprit, and recommends changing who does the programming: "In the United States, the percentage of women among computer and information science graduates has steadily declined over the last 30 years, to just 18% today from 37% in the mid-1980s."

"Artificial intelligence is not something mystical or magical; it's something we produce that reflects the society that creates it," Gülser Corat told CBS News. "It will carry both the positive and negative aspects of that society."


"Artificial intelligence must really reflect all the voices of this society, and the voice that is missing in the development of AI is that of women and girls," she said.

As CBS News' sister site CNET reported last year, Silicon Valley still suffers from a lack of diversity despite efforts to change its culture.

The UNESCO study aims to identify these biases and to make recommendations to begin "bridging the digital skills gap that in most parts of the world is large and growing."

The study makes several recommendations:

  • Stop making digital assistants female by default
  • Discourage gendered insults and abusive language
  • Develop the advanced technical skills of women and girls so they can take part in building these technologies

"It's a 'Me Too' moment," Gülser Corat told CBS News. "We need to make sure that the AI we produce and use pays attention to gender equality."

Docile digital assistants

According to Gülser Corat, female voices are not used universally across voice assistants.

The UNESCO study found that female voices were intentionally used for assistants tied to "household" products, appliances and services, while driving and GPS services more often use male voices.

The study also showed that voice assistants in some parts of the world, for example in France, the Netherlands and certain Arab countries, use a male voice by default, leading the authors to hypothesize that those countries have "a history of servants in upper-class families."

Examples of AI assistant responses

An example of the language of AI assistant responses was included in the study:

[Chart: examples of voice assistant responses. UNESCO / Adapted from Quartz]

