IT developers are just men




Anyone who has ever googled images of "CEO" knows the result: many, many faces of men. Old, young, bald, world-famous. Only when you scroll down do a few women appear. Not Merkel-style matrons, no, but attractive power women. Of course, you think, there are many more male CEOs. And women who make it to the top like to present themselves as dynamic and young. So where is the problem?

Google's results are not reality. By now, several studies have shown that internet users are manipulated by search engines. The results do not just suggest false facts; they also reinforce prejudices without users noticing. In other words, Google's hits are more sexist and racist than today's society actually is.

In the case of the CEO images, women make up only 11 percent of Google's results, even though their actual share among US CEOs is 27 percent, according to a study from the University of Washington. Google's results for call-center employees were similarly distorted: 64 percent of the images showed women, although in real life as many men as women work the phones.

How so? It is down to the self-learning algorithms that determine which search results we see. Contrary to popular assumption, such mathematical code is anything but objective and neutral.

Men in the kitchen are classified as women

They learn from billions of images, texts and videos on the internet how our world works. If an algorithm finds many images and texts in which "woman" and "kitchen" appear together, it recognizes a pattern. The trouble is: the internet also contains historical data that distorts the result if we take it as a picture of today's conditions. The outcome: the search engine cements clichés.
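
How such a pattern emerges can be sketched in a few lines of code. The following is a minimal illustration, not any real system: the caption counts are invented toy data, and the "model" is nothing more than a majority vote over co-occurrences, but it shows how a skew in the training material turns directly into a skewed prediction.

```python
from collections import Counter

# Toy training set: (gender, scene) labels from image captions.
# The skew toward woman+kitchen pairs is invented for illustration.
captions = ([("woman", "kitchen")] * 70 + [("man", "kitchen")] * 30 +
            [("man", "garage")] * 80 + [("woman", "garage")] * 20)

# "Learning" here is just counting how often each gender co-occurs
# with each scene -- the statistical pattern a real model picks up.
cooccurrence = Counter(captions)

def predict_gender(scene):
    """Return the gender most often seen with this scene in the data."""
    return max(("woman", "man"), key=lambda g: cooccurrence[(g, scene)])

print(predict_gender("kitchen"))  # -> woman, purely because of the skew
print(predict_gender("garage"))   # -> man
```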

Unlike a human, a program is unable to recognize the prejudices it has learned, let alone correct them. Only its developers can do that. But they too are biased. The vast majority of IT developers are white or Asian men. Of course, they are not all sexist or racist. But they automatically build their own reality of life into what they program. And that also means they are blind, without noticing it, to concerns that do not affect them but that are important to others.

Two years ago, a computer science professor at the University of Virginia noticed by chance that his image recognition software kept associating cooking with women. Was it sexist? The case would not let him go, so he launched an investigation with colleagues. The finding: the image collections often used to train self-learning algorithms, notably those from Microsoft and Facebook, show a significant distortion. Scenes of cooking, shopping and washing feature mainly women; sports and shooting, mainly men. The software adopted these stereotypes without question and amplified them ad absurdum: even when men appeared in the kitchen photos, the software classified them as women.
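
The amplification the researchers describe can be made concrete with a back-of-the-envelope comparison. The figures below are hypothetical stand-ins, not the study's data; the point is only that a model amplifies a stereotype when the skew in its predictions exceeds the skew it saw during training.

```python
# Hypothetical counts, for illustration only (not the study's numbers).
train_women, train_men = 66, 34   # cooking images in the training data
pred_women, pred_men = 84, 16     # the model's labels on new images

train_skew = train_women / (train_women + train_men)
pred_skew = pred_women / (pred_women + pred_men)

# If the model merely reproduced its data, the two ratios would match;
# a larger prediction skew means the stereotype was amplified.
print(f"training data: {train_skew:.0%} of cooking scenes show women")
print(f"model output:  {pred_skew:.0%} of cooking scenes labelled 'woman'")
print(f"amplification: {pred_skew - train_skew:+.0%}")
```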

Google automatically completes stereotypes

Stupid algorithms make for good entertainment. Take the lapse in 2015, when Google's photo service confused dark-skinned people with gorillas. But "stupid" algorithms are not really a laughing matter. They are fatal, because they determine how the world comes to us. They decide which Facebook news feeds and purchase suggestions we receive. Search engines choose for us the information they deem relevant. Yes, they even complete our search queries: type "women need", and Google adds "to wear skirts". The problem with that: no matter how clichéd, something always sticks unconsciously. And the more often such assertions and false news are repeated, the more credible they appear.

Some scientists have recognized the implications of this distortion effect. One of them is Safiya Umoja Noble. She researches at the University of Southern California how the world opens up to us via the internet. In her latest book, Algorithms of Oppression, she concludes that it is precisely the search engines, now among our most important sources of information, that reinforce racism and sexism.

A lot of pornographic content appears for "black girl"

"Google search is very useful if you want to get simple information, for example . But as soon as you look for insights into complex relationships loaded with valuations and prejudices, Google fails miserably, "says Noble in his university's newsletter. For example, she found that white women on Google are represented differently than black women. The term "white girl" produced differentiated results, while "black girl" and "Asian girl" had mainly pornographic content. The algorithm simply reflects what it finds most frequently. He does not notice that the result is morally and morally wrong. And those on Google that should notice, it does not seem to care.

Noble finds it particularly explosive that these prejudices remain hidden on the internet. "If an algorithm classifies me as uncreditworthy because some friends in my social media network have debts, I will never find out," says Noble. In other words, without a person's knowledge and without any check against reality, the algorithm has placed them in a category that harms them. The scale of these hidden decisions takes on almost eerie dimensions. For example, Google shows men better-paid job offers than it shows women, according to researchers at Carnegie Mellon University in Pittsburgh.

In their 2015 study, they showed, among other things, that the algorithm had learned from user behavior that men search for well-paying jobs more often than women, and so it showed men more of them. The software makes a preselection that cannot be contested at all. In real life, you can defend yourself against such discrimination; you can sue. "But at present we have no legal means to bring a lawsuit against algorithms when they sort us into erroneous categories," Noble says.
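
One way to see how such a preselection sustains itself is a toy feedback loop. The simulation below is an invented sketch, not the Carnegie Mellon setup: an imaginary ad system shows a job ad to each group in proportion to that group's past clicks. Even though both groups click at exactly the same rate, the initial imbalance never corrects itself; the system simply locks it in.

```python
import random

random.seed(1)

# Invented starting point: a small historical imbalance in clicks
# on a high-paying job ad.
clicks = {"men": 55, "women": 45}

def serve_round(clicks, impressions=1000, click_rate=0.05):
    """Show the ad in proportion to past clicks. Both groups are
    equally interested (identical click_rate), yet the early skew
    keeps steering who gets to see the ad."""
    total = sum(clicks.values())
    for group in clicks:
        shown = int(impressions * clicks[group] / total)
        clicks[group] += sum(random.random() < click_rate
                             for _ in range(shown))

for _ in range(20):
    serve_round(clicks)

total = sum(clicks.values())
for group, n in clicks.items():
    print(f"{group}: {n / total:.0%} of all clicks so far")
# The 55/45 split persists (and can drift further) although interest
# was identical -- the "preference" is an artifact of the loop.
```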

Women should be involved in IT from the beginning

Some researchers even believe that algorithms divide us into winners and losers. For example in hiring, which is increasingly driven by algorithms. If these are more likely to associate the profession "programmer" with men, they will also be more likely to surface men when searching online for suitable candidates. Such discrimination can, however, also be corrected after the fact by optimizing the data and the programs. And this happens constantly. If you entered "Thailand" into Google a few years ago, pictures of young Thai prostitutes appeared. Today you see tourist landscapes, as for other countries.
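
The association itself can be probed directly in pretrained word embeddings. Here is a small sketch using the gensim library and its downloadable glove-wiki-gigaword-50 vectors (an assumption on my part; any pretrained embedding with a similarity measure would do, and the word list is freely chosen):

```python
import gensim.downloader

# Pretrained GloVe vectors learned from Wikipedia and news text;
# the biases of that corpus are baked into the vector geometry.
vectors = gensim.downloader.load("glove-wiki-gigaword-50")

for job in ("programmer", "engineer", "nurse", "homemaker"):
    to_he = vectors.similarity(job, "he")
    to_she = vectors.similarity(job, "she")
    leaning = "male" if to_he > to_she else "female"
    print(f"{job!r} leans {leaning} "
          f"(sim to 'he': {to_he:.2f}, to 'she': {to_she:.2f})")
```

A recruiting tool built on top of such vectors would inherit exactly these leanings unless its developers measure and correct for them.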

This correction process would be faster, however, if there were more diversity among IT developers. Yet according to career-choice statistics, still only few women are interested in these jobs. A mistake, finds Karin Frick, trend researcher at the Gottlieb Duttweiler Institute in Rüschlikon ZH. "As long as women leave a professional field to men, they can only complain after the fact that search engines are sexist." It would be much more effective if women were involved in the development process from the beginning.

. "Every toaster needs to be approved today, but we do not have an independent body that tests self-learning algorithms for their ethical relevance," Frick says. . Especially in relevant areas such as work or education, developers should prove that their programs are fair. Because as long as the internet serves us the so-called most modern day-to-day technology, real life equality efforts remain largely ineffective.

(SonntagsZeitung)

Created: 15.07.2018, 00:19
