Facial recognition is a perfectible system. The American Civil Liberties Union (ACLU) offered fresh proof this week with a study in which it asked Amazon's software, Rekognition, whether photos of members of Congress matched any of 25,000 portraits.
These images, known as "mugshots", are taken by the police and recorded in a public database. The test cost $12.33.
And surprise: 28 of the 535 politicians were matched by the machine, an error rate (these are all false matches) of about 5%. The algorithm thus seems fairly accurate, except that black representatives account for 39% of the false positives, i.e. 11 individuals, far more than their share of Congress. In concrete terms, this means these people could be stopped or even taken to a police station because of such identifications.
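The arithmetic behind these figures is straightforward; the short sketch below simply reproduces it from the counts reported in the article (28 false matches out of 535 members, 11 of them black representatives).

```python
# Reproducing the study's headline figures from the counts in the article.
members_of_congress = 535
false_matches = 28          # members wrongly matched to a mugshot
false_matches_black = 11    # of those, black representatives

overall_error_rate = false_matches / members_of_congress
share_black_among_errors = false_matches_black / false_matches

print(f"Overall false-match rate: {overall_error_rate:.1%}")            # ~5.2%
print(f"Black members among false matches: {share_black_among_errors:.1%}")  # ~39.3%
```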
Amazon's ambition is to sell Rekognition to the authorities, and the system is already used by Washington County, Oregon. The sheriff's office compares images from surveillance cameras against a database of 300,000 mugshots.
Fears confirmed
As if to prove them right, six of the elected officials that Rekognition wrongly identified are members of the Congressional Black Caucus. The group wrote a letter to Amazon's CEO, Jeff Bezos, in June to voice its concerns, fearing that facial recognition would have "unwanted and negative consequences for African Americans, illegal immigrants and political opponents."
"We are very concerned that bad decisions will be made because of biased databases produced by police practices that we consider unfair, and sometimes unconstitutional," the letter read.
Because artificial intelligence infers its rules from data on past arrests, it can very well amplify discrimination in police stops if those stops were already carried out in a discriminatory way by the authorities. The data itself is biased.
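This feedback loop can be made concrete with a toy simulation (the numbers are entirely hypothetical, not drawn from any real dataset): two neighborhoods have the same underlying offense rate, but one is patrolled three times as heavily, so it generates roughly three times as many arrest records. A model trained on those records alone would conclude it is "riskier".

```python
import random

random.seed(0)

# Toy model: two neighborhoods with the SAME underlying offense rate,
# but neighborhood B is patrolled three times as often (invented numbers).
offense_rate = 0.05
patrols = {"A": 1000, "B": 3000}

arrests = {
    area: sum(random.random() < offense_rate for _ in range(n))
    for area, n in patrols.items()
}

# A system trained on arrest counts alone sees B as "riskier" and would
# direct even more patrols there, producing even more skewed data.
print(arrests)  # B records roughly three times the arrests of A
```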
Amazon shareholders have also written a letter of their own calling on Jeff Bezos to stop marketing Rekognition: "We are concerned that this technology would be used unfairly and disproportionately to monitor people of color, immigrants and associations."
But the problem highlighted by the ACLU is above all that the error rate is much higher for black people than for white people. Work by MIT researcher Joy Buolamwini had already shown that this figure was on the order of 1% for identifying a white man, but 35% for black women.

Since the ACLU study, three elected officials have sent a letter to the head of the American firm asking him to shed light on how the algorithm was tested and on the contracts already signed with police forces.

Amazon has so far responded that "while 80% is an acceptable confidence threshold for photos of hot dogs, chairs, animals, or other social media use cases, it would not be appropriate for identifying individuals with a reasonable level of certainty. When using facial recognition for law enforcement, we invite our customers to set a higher threshold of at least 95% or above." A threshold that is not reached for certain categories of the population.
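Amazon's point about confidence thresholds can be illustrated with a short sketch (the names and similarity scores below are invented for illustration, not actual Rekognition output): raising the acceptance threshold from 80% to 95% discards the weaker, more error-prone matches.

```python
# Hypothetical similarity scores from a face-matching system
# (invented values; real systems report a confidence per candidate match).
matches = [
    ("person_1", 0.97),
    ("person_2", 0.88),  # counts as a match at 80%, rejected at 95%
    ("person_3", 0.82),
    ("person_4", 0.61),
]

def accepted(matches, threshold):
    """Keep only candidate matches at or above the confidence threshold."""
    return [name for name, score in matches if score >= threshold]

print(accepted(matches, 0.80))  # ['person_1', 'person_2', 'person_3']
print(accepted(matches, 0.95))  # ['person_1']
```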
Rémy Demichelis