Orlando police give up on Amazon's facial recognition platform a second time




Amazon's controversial Rekognition platform, its artificial intelligence-based facial recognition software, is no longer being used by Orlando law enforcement, ending the city's second attempt to pilot the technology in central Florida. The reason: the city did not have the equipment or bandwidth needed to make the system work properly and was never able to test it live.

The news, reported today by Orlando Weekly, marks another major setback for Rekognition, which has been dogged by criticism over its use in policing, unlawful surveillance, and racial profiling, as well as over the clandestine way Amazon sold it to police while it was still in active development.

Police use of Rekognition came to light last May only thanks to documents obtained and made public by the American Civil Liberties Union of Northern California. At the time, Amazon was selling the cloud-based facial recognition platform to police departments in Orlando and Washington County, Oregon, but it had not disclosed its work with law enforcement and had in fact taken measures such as non-disclosure agreements to keep it quiet.

Since then, Amazon has received substantial pushback from the AI community, activists, and civil rights organizations, who fear that the system's inherent flaws will contribute to unlawful surveillance and other harmful practices, such as racial profiling and wrongful arrests. Research has shown that Amazon's system can return a significant number of false matches and has a harder time accurately identifying darker-skinned individuals and women.

Amazon has remained firm in its defense that Rekognition is meant to be used only as an auxiliary policing tool, and that officers are instructed to trust it only when it identifies a match with 99 percent accuracy. But it is unclear to what extent Amazon actively monitors participating agencies for violations of its terms of service, which the company says allow it to suspend or ban organizations and individuals that use Rekognition in illegal or unethical ways. The company said last year that it would continue selling the software to US law enforcement, despite extensive criticism from outside and within the company, including from employees, shareholders, and leading AI researchers.

Under that pressure, Orlando appeared to let its contract with Amazon expire at the end of June last year, The New York Times reported. But the pilot program was later restarted: Orlando Weekly reported last October that police had tried to run the system with four cameras around police department headquarters and one camera outside a community recreation center.

About 10 months later, the program has gotten the ax again. According to local police, the setup was too costly and far too cumbersome to install. Amazon employees were unable to help the city obtain reliable live feeds to run the software in real time. The company reportedly offered to supply its own cameras, but the city declined to rely on Amazon hardware.

"At that time, the city was not able to devote resources to the pilot project to enable us to make significant progress in completing the necessary configuration and testing," the city's chief administrative office wrote in a memo to the city council, adding that the Orlando Police Department has "no immediate plans to explore this type of facial recognition technology with future pilots." Rosa Akhtarkhavari, the city's chief information officer, told Orlando Weekly of the second test phase: "We have not even created a stream today. We're talking about more than a year later." Akhtarkhavari said the system was never tested on a live image, not even once.

Matt Cagle, a technology and civil liberties attorney at the ACLU who helped bring Amazon's work with law enforcement to light, said in a statement to The Verge: "Congratulations to the Orlando Police Department for finally figuring out what we have long warned: Amazon's surveillance technology doesn't work and is a threat to our privacy and civil liberties." Cagle added, "This failed pilot program demonstrates precisely why these decisions should be made by the public through their elected leaders, not by companies secretly pressuring police officials to deploy dangerous systems against the public."

This is far from the end of Rekognition, though. The software is still in use in Washington County, Oregon; an April article in The Washington Post said it had "supercharged" police efforts in the state. That implementation, which consists mainly of a database that can match uploaded photos of faces against known criminal databases, appears to be less invasive than a real-time video stream running facial recognition on unsuspecting citizens.

Nevertheless, US cities are beginning to push back against the unregulated use of facial recognition, with Oakland, California joining its Bay Area counterpart San Francisco in voting yesterday to ban government use of the technology. Somerville, Mass., is the third and only other city with a law prohibiting police use of the software. More cities are expected to push back against facial recognition software in the future, even as Amazon strives to offer it to agencies across the country. The company was not immediately available for comment.
