Microsoft says its facial recognition tools now identify people with darker skin tones more accurately than before, according to a company blog post published today. Error rates were reduced by up to 20 times for men and women with darker skin, and by nine times for all women.
The company says it has trained its AI tools on larger and more diverse datasets, which it credits for the improvement. "If we train machine learning systems to mimic decisions made in a biased society, using the data generated by that society, then these systems will necessarily reproduce its biases," said Hanna Wallach, a senior researcher at Microsoft.
In February, a report from the MIT Media Lab tested facial recognition systems from Microsoft, IBM, and China's Megvii and found that up to 35 percent of darker-skinned women were misidentified by the systems. The report only confirmed what many have suspected for years: facial recognition systems can be biased by limited datasets and other factors, including systemic racism. In 2015, Google's photo service labeled a software engineer's Black friends as "gorillas" and had to apologize for the mistake.
Yet even if Microsoft's announcement today signals a reduction in racial bias in its facial recognition system, it remains unclear how the improved tool might affect people of color if law enforcement adopts it. Microsoft is a partner of US Immigration and Customs Enforcement (ICE), and its facial recognition tool is offered to government agents as a resource. Microsoft CEO Satya Nadella said last week that "Microsoft is not working with the US government on plans to separate children from their families at the border," but he did not explain what role facial recognition could play in that work.