False news: the algorithms in the dock




At the heart of the spread of false news, the algorithms used by websites and social networks often steer users toward false or manipulated information, with heavy consequences.

– What role do algorithms play?

These computer programs, designed to make decisions on their own, occupy an essential but invisible place. They search for patterns, imperceptible to humans, in huge amounts of data, which allows them to manage financial transactions, customize the cost of insurance or perform medical diagnoses.

In the media and on social networks, they rank results on search engines, manage the Facebook news feed, filter out unwanted content (racism, pornography, violence…), and recommend videos or articles.

Complex tasks, sometimes sensitive ones, are thus delegated to these increasingly autonomous systems: "black boxes" that develop their artificial intelligence from the data they are fed.

– A biased vision of the world?

19659002] "Algorithms can guide us through the mbad of information available on the internet," Margrethe Vestager, the European Commissioner for Competition, said in June. But "the risk is that we only see what these programs – and the companies that use them – choose to show us," she notes.

By reorganizing online content, algorithms create what are known as "filter bubbles", in which divergent opinions disappear.
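To make the mechanism concrete, here is a minimal, purely illustrative sketch (all names, stances and data are invented, not taken from any real platform) of a recommender that only surfaces content matching opinions a user has already engaged with:

```python
# Illustrative "filter bubble": recommend only items whose stance matches
# what the user already read. Divergent opinions never appear in the feed.
def recommend(user_history, catalog):
    liked_stances = {item["stance"] for item in user_history}
    return [item for item in catalog if item["stance"] in liked_stances]

catalog = [
    {"title": "Op-ed A", "stance": "pro"},
    {"title": "Op-ed B", "stance": "anti"},
    {"title": "Op-ed C", "stance": "pro"},
]
history = [{"title": "Earlier read", "stance": "pro"}]

print([i["title"] for i in recommend(history, catalog)])
# The "anti" op-ed is filtered out entirely.
```

Even this toy version shows the dynamic the article describes: nothing in the code is malicious, yet the reader's past choices silently narrow what they will see next.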

During the 2016 US election, Facebook was accused of promoting Donald Trump's candidacy by circulating false partisan information, often anti-Clinton, locking his supporters into a world where everyone thinks as they do.

Algorithms also tend to make extreme views "more visible than ever," according to Lorena Jaume-Palasi, founder of the NGO Algorithm Watch. Their impact on readers' opinions is nonetheless difficult to measure, she adds: the rise of nationalism in Europe, for example, cannot be attributed to algorithms alone.

– Propagation of false news

The main mission of social network algorithms is to circulate the most popular content, without judging its veracity. They can therefore amplify the spread of false news.
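A hypothetical sketch of that ranking logic (the scoring weights and post data are invented for illustration) shows how accuracy can be invisible to the system even when it is known:

```python
# Illustrative feed ranker: score by engagement only.
# The accuracy flag exists in the data but is never consulted.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    shares: int
    comments: int
    is_accurate: bool  # known to fact-checkers, ignored by the ranker

def engagement_score(post: Post) -> int:
    # Popularity only: veracity never enters the score.
    return 2 * post.shares + post.comments

def rank_feed(posts):
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Verified report", shares=120, comments=40, is_accurate=True),
    Post("Viral hoax", shares=900, comments=300, is_accurate=False),
])
print([p.text for p in feed])
# The hoax outranks the verified report because it gets more engagement.
```

The point of the sketch is that the bias needs no intent: optimizing for engagement alone is enough to push the more sensational item to the top.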

On YouTube in particular, the thousands of videos defending conspiracy theories are "recommended" far more than verified content, according to Guillaume Chaslot, a former engineer at the platform. Such polemical content, claiming for example that the Moon landings or global warming are a hoax, provokes stronger reactions from internet users, who then spend more time on the platform, while traditional media are discredited, says Chaslot.

– More ethical algorithms?

For many observers, the algorithms can be reprogrammed "in the service of human freedom".

NGOs demand above all more transparency. "These companies should allow audits of their products (their code), without necessarily revealing their formulas," as is done for Coca-Cola, says Lorena Jaume-Palasi.

The CNIL, the French privacy watchdog, recommended at the end of 2017 state oversight of algorithms, as well as genuine education of their users, which should "allow every human to understand the workings of the machine."

The new European data-protection regulation also addresses algorithms in general: a person can now challenge a decision made by an algorithm and "obtain human intervention" in case of dispute.

The major web platforms have begun to take action in recent months: Facebook is seeking to automatically identify and label false news, while YouTube has strengthened human review of the videos it offers to children.

Silicon Valley "repentants", gathered in the NGO Center for humane technology, say, however, that we can not wait for change to come from companies that rely on + the economy of attention + like YouTube, Facebook, Snapchat, or Twitter because "it goes against their business model."

© 2018 AFP. All rights of reproduction and representation reserved. All information reproduced in this section (news, photos, logos) is protected by intellectual property rights held by AFP. Therefore, none of this information may be reproduced, modified, reposted, translated, exploited commercially or reused in any way without the prior written consent of AFP.
