
How to put an end to gender biases in internet algorithms

Scopus-indexed articles for various gender-related terms. Credit: Algorithms (2022). DOI: 10.3390/a15090303

Endless questions have been raised about whether the internet algorithms we constantly interact with suffer from gender bias, and a little searching is all it takes to find out for yourself.


But according to the researchers behind a new study that seeks to settle the matter, “so far, the controversy has not involved any scientific analysis.” The new article, by a multidisciplinary team, proposes a fresh way to approach the question and suggests some solutions to prevent these deviations in the data and the discrimination they imply.

Algorithms are being used more and more to decide whether to grant a loan or accept applications. As the range of uses of artificial intelligence (AI), as well as its capabilities and importance, increases, it becomes increasingly necessary to assess any potential biases associated with these processes.

The researchers, whose study was published open access in the journal Algorithms, focused primarily on gender bias in various fields of artificial intelligence.


Such biases can have a significant impact on society: “Biases affect everything that is discriminated against, excluded or associated with a stereotype. For example, a gender or a race may be excluded in a decision-making process or, simply, certain behavior may be assumed because of a person’s gender or skin color,” explained the study’s lead researcher, Juliana Castañeda Jiménez, an industrial doctorate student at the Universitat Oberta de Catalunya (UOC) under the supervision of Ángel A. Juan, of the Universitat Politècnica de València, and Javier Panadero, of the Universitat Politècnica de Catalunya.

According to Castañeda, “it is possible for algorithmic processes to discriminate on the basis of gender, even when programmed to be ‘blind’ to that variable.”
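To make this concrete, here is a minimal, self-contained sketch of how that can happen. It is not from the study: the dataset, feature names, and model choice are all invented for illustration. A model that never receives a gender column can still produce gendered outcomes when another input correlates with gender in historically biased training data:

```python
# Sketch of "proxy" discrimination: the model never sees the gender
# column, yet reproduces a gendered outcome because another feature
# (career_gap) is correlated with gender in the training data.
# All names and numbers here are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
gender = rng.integers(0, 2, n)          # 0 = man, 1 = woman (synthetic)
# Proxy feature correlated with gender (e.g. career breaks).
career_gap = rng.binomial(1, np.where(gender == 1, 0.6, 0.1))
income = rng.normal(50, 10, n)
# Historical approvals penalized the proxy, encoding past bias.
approved = (income - 15 * career_gap + rng.normal(0, 5, n)) > 45

# Train "gender-blind": the gender column is deliberately excluded.
X = np.column_stack([income, career_gap])
model = LogisticRegression().fit(X, approved)

pred = model.predict(X)
print("approval rate, men:  ", pred[gender == 0].mean())
print("approval rate, women:", pred[gender == 1].mean())
# The gap persists even though gender was never an input feature.
```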

The research team, which also includes the researchers Milagros Sáinz and Sergi Yanes, both of the Gender and ICTs (GenTIC) Research Group of the Internet Interdisciplinary Institute (IN3), Laura Calvet, of the Salesian University School of Sarrià, Assumpta Jover, of the Universitat de València, and Ángel A. Juan, illustrates this with a number of examples: the case of a well-known recruitment tool that favored male over female applicants, or that of certain credit services that offered women less favorable terms than men.


“If old, unbalanced data are used, you’re likely to see negative conditioning in relation to black, gay and even female demographics, depending on when and where the data are from,” Castañeda explained.

Science for boys and arts for girls

To understand how these patterns affect the different algorithms we interact with, the researchers analyzed previous work that identified gender biases in data processes in four kinds of AI: those underpinning applications in natural language processing and generation, decision management, speech recognition, and facial recognition.

Overall, they found that all the algorithms identified and classified white men better. They also found that the algorithms reproduce false beliefs about the physical traits that supposedly define a person depending on their biological sex, ethnic or cultural background, or sexual orientation, and that they create stereotyped associations linking men with the sciences and women with the arts.
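One common way such associations are quantified in the fairness literature is with embedding-association tests in the spirit of WEAT (Caliskan et al., 2017): comparing cosine similarities between gendered words and attribute words. The sketch below, which is not from the study, uses tiny fabricated vectors rather than real pretrained embeddings, so the numbers are meaningless; only the procedure is illustrative:

```python
# WEAT-style association test on word vectors (toy version).
# Real analyses use pretrained embeddings (e.g. word2vec, GloVe);
# the 3-d vectors below are fabricated stand-ins for illustration.
import numpy as np

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Fabricated "embeddings" in which a gender/stereotype axis leaks in.
vecs = {
    "he":      np.array([1.0, 0.1, 0.0]),
    "she":     np.array([-1.0, 0.1, 0.0]),
    "science": np.array([0.8, 0.9, 0.1]),
    "math":    np.array([0.7, 0.8, 0.0]),
    "art":     np.array([-0.8, 0.9, 0.1]),
    "poetry":  np.array([-0.7, 0.8, 0.0]),
}

def association(word, attrs_a, attrs_b):
    # Mean similarity to attribute set A minus attribute set B.
    sim_a = np.mean([cosine(vecs[word], vecs[a]) for a in attrs_a])
    sim_b = np.mean([cosine(vecs[word], vecs[b]) for b in attrs_b])
    return sim_a - sim_b

science, arts = ["science", "math"], ["art", "poetry"]
print("he  -> science vs. arts:", round(association("he", science, arts), 3))
print("she -> science vs. arts:", round(association("she", science, arts), 3))
# A positive score for "he" and a negative one for "she" mirror the
# man-science / woman-arts stereotype the study describes.
```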

Many of the procedures used in image and voice recognition are also based on these stereotypes: cameras find it easier to recognize white faces, and sound analysis has problems with higher-pitched sounds, which primarily affect women.

The cases most likely to suffer from these problems are those whose algorithms are built on the basis of analysis of real-life data related to a specific social context. “Some of the main reasons are the underrepresentation of women in the design and development of AI products and services, and the use of datasets with a gender bias,” noted the researcher, who argued that the problem stems from the cultural environment in which they are developed.

“An algorithm, when trained with biased data, can detect hidden patterns in a society and, when run, reproduce them. So if men and women are unequally represented in a society, the design and development of AI products and services will exhibit gender biases.”

How do we put an end to this?

The many sources of gender bias, as well as the peculiarities of each particular type of algorithm and data set, mean that eliminating this bias is a very difficult, though not impossible, challenge.

“Designers and everyone else involved in their design need to be informed of the possibility of biases associated with an algorithm’s logic. What’s more, they need to understand what measures are available for minimizing potential biases as far as possible, and implement them so that they do not occur, because if they are familiar with the types of discrimination occurring in society, they will be able to identify when the solutions they develop reproduce them,” Castañeda suggested.
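In practice, one concrete safeguard of this kind is to audit a trained model’s outputs across groups before deployment. The sketch below computes two standard fairness metrics from the literature, the demographic-parity difference and the equal-opportunity difference; the data, variable names, and warning threshold are illustrative placeholders, not anything prescribed by the paper:

```python
# Simple post-hoc bias audit: compare a model's outcomes across groups.
# The metric names are standard in the fairness literature; the data
# and the 0.1 threshold are illustrative placeholders.
import numpy as np

def demographic_parity_diff(pred, group):
    # Difference in positive-prediction rates between the two groups.
    return pred[group == 0].mean() - pred[group == 1].mean()

def equal_opportunity_diff(pred, label, group):
    # Difference in true-positive rates (recall) between the two groups.
    tpr0 = pred[(group == 0) & (label == 1)].mean()
    tpr1 = pred[(group == 1) & (label == 1)].mean()
    return tpr0 - tpr1

# Toy 0/1 predictions from some upstream model, true labels, and group.
rng = np.random.default_rng(1)
group = rng.integers(0, 2, 1000)
label = rng.integers(0, 2, 1000)
pred = np.where(group == 0,
                rng.binomial(1, 0.6, 1000),
                rng.binomial(1, 0.4, 1000))

dp = demographic_parity_diff(pred, group)
eo = equal_opportunity_diff(pred, label, group)
print(f"demographic parity difference: {dp:+.3f}")
print(f"equal opportunity difference:  {eo:+.3f}")
if abs(dp) > 0.1:  # the acceptable gap is a design choice
    print("warning: outcome rates differ substantially between groups")
```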

This work is innovative because it was carried out by professionals in different fields, including a sociologist, an anthropologist, and experts in gender and statistics. “The team members provided a perspective that goes beyond the mathematics of algorithms in isolation, helping us to view them as complex socio-technical systems,” said the study’s principal investigator.

“If you compare this work with others, I think it is one of only a few that present the issue of bias in algorithms from a neutral point of view, highlighting both the social and the technical aspects in determining why an algorithm might make a biased decision,” the researcher concluded.

More information:
Juliana Castañeda et al, Dealing with Gender Bias Issues in Data-Algorithmic Processes: A Social-Statistical Perspective, Algorithms (2022). DOI: 10.3390/a15090303

Provided by Universitat Oberta de Catalunya (UOC)

Citation: How to put an end to gender biases in internet algorithms (2022, November 23), retrieved November 24, 2022 from https://techxplore.com/news/2022-11-gender-biases-internet-algorithms.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for informational purposes only.
