The Credit Protection Service (SPC) announced yesterday that it is offering facial recognition technology to merchants.
Merchants will be able to install the system, which will record facial features and validate the buyer's identity. The data will be stored in the SPC database along with other information about the person.
In addition to confirming identity, the technology will allow the store owner to refine queries about the payer's information, including the credit score (an index of the probability of repayment based on the person's credit history).
This type of analysis is expected to improve if the "positive register" law (which makes the sharing of credit data mandatory, without the need for consent) is approved.
In a statement, the SPC justified the measure by arguing that the solution protects the merchant, by mitigating losses, and the consumer, by preventing someone from obtaining an advantage using stolen personal information, such as a credit card number.
The adoption of this kind of technical solution is an example of the spread of facial recognition and detection mechanisms in Brazil and around the world.
Read also: Understand what changes with the law that protects your data online
In April of this year, ViaQuatro, the company that holds the concession for Line 4 of the São Paulo subway, installed a system in the train cars that detects the reactions of people viewing the advertisements shown on screens. The goal, according to the company, is to gather feedback in order to better target the messages displayed on the panels.
According to the concessionaire's press office, the system performs face detection, not facial recognition. The former maps reactions by reading facial features, while the latter identifies whether the camera is filming a specific person.
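To make that distinction concrete, here is a minimal illustrative sketch in Python. It assumes the opencv-python and numpy packages; the function names, threshold and enrolled database are hypothetical and are not drawn from the SPC or ViaQuatro systems. Detection only reports that a face is present; recognition goes further and matches the face against enrolled identities.

```python
# Illustrative only: face DETECTION vs. face RECOGNITION.
# Assumes opencv-python and numpy; not based on the SPC or ViaQuatro systems.
import cv2
import numpy as np

def detect_faces(image_bgr):
    """Detection: find where faces are in an image. Says nothing about who they are."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

def recognize_face(embedding, enrolled_db, threshold=0.6):
    """Recognition: match a face embedding (a numeric vector produced by some
    face-embedding model, hypothetical here) against a database of enrolled people."""
    best_name, best_dist = None, float("inf")
    for name, reference in enrolled_db.items():
        dist = float(np.linalg.norm(embedding - reference))  # distance in embedding space
        if dist < best_dist:
            best_name, best_dist = name, dist
    # Only claim an identity if the closest match is close enough.
    return best_name if best_dist < threshold else None
```

A system like the one described on the subway would stop at something like detect_faces plus expression analysis, while the SPC proposal necessarily involves a step like recognize_face, since it ties a face to an identity stored in a database.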
Popularization
The use of facial recognition technologies has become popular in Brazil and around the world. This process has been accelerated by the creation of a wide variety of applications for the technology. In addition to this diversification, advances in artificial intelligence techniques have increased accuracy, both in recognizing people and in mapping different expressions.
Another factor driving this diffusion is the falling cost of these systems. One example is the SAFR ("safe and accurate facial recognition") platform, launched by RealNetworks this month. The system provides real-time facial recognition tools based on artificial intelligence for schools and other environments, and is available as a free download for schools in the United States and Canada.
According to the company, the tool can monitor millions of faces with an accuracy of 99.8%. In promotional materials, the product is presented as a solution for monitoring and combating internal and external threats, such as the presence of unregistered people. More than simply recognizing people, the system also identifies emotions and reactions by monitoring facial expressions. Platform officials say they want to make schools safer, especially in the face of recurring episodes of armed attacks on educational institutions.
In China, a tool called SenseVideo went on sale last year with face and object recognition features. But the most controversial initiative has been the use of cameras to monitor citizens' actions and movements with the aim of establishing a "social score" for each person, which can be used for different purposes, in particular to differentiate access to services.
In Russia, the FindFace application was also questioned last year for allowing people to be located based on their profile on Vkontakte, a popular social network in the country. It incorporated the ability to map emotions and reactions by reading facial features. This surveillance capacity raised concerns that this type of solution could be used during this year's World Cup, although there was no specific confirmation to that effect.
Concerns
Even as opportunities grow to use them as surveillance alternatives, facial recognition and detection technologies are beginning to cause concern among civil society organizations, academics, policy makers and even members of the technology industry itself.
Microsoft President Brad Smith issued a statement in which he advocated public regulation of the issue and corporate responsibility measures. According to him, the evolution of this technology and its large-scale adoption by business and government set off an alarm.
"Facial recognition technologies raise issues that affect the protection of fundamental human rights such as privacy and freedom of expression. "
These tools could be used, for example, to monitor political opponents at a protest." Because of these risks, the executive defended that the government initiate a regulatory process backed by a commission of experts in the matter.
The statement is also connected to a concern for the company's image. Microsoft was questioned earlier this year over an alleged contract with the US immigration service to monitor people entering the country illegally.
The US nonprofit organization Electronic Frontier Foundation (EFF) released a report in February of this year criticizing the adoption of these tools, especially by the state, for alleged security reasons. "Without limits, it could be relatively easy for governments and private companies to build databases of images of the vast majority of people and use these databases to identify and track people in real time as they move from one place to another in their daily lives," the organization said in the document.
In the evaluation of a researcher at the Digital Humanities Laboratory of the Brazilian Institute of Information in Science and Technology (Ibict), "this can be great for public safety, in the search for missing persons and fugitive criminals, or for identifying criminals in flagrante delicto, but there is a need for transparency and accountability in this process, so as not to become a contributor [...]".
In the evaluation of Rafael Zanatta, digital rights coordinator at the Institute of Consumer Protection (Idec), the two Brazilian examples violate the law. In the case of ViaQuatro, the face detection tools violate the Consumer Code, because the system amounts to an abusive practice and imposes monitoring on a person who does not understand how this data collection is carried out. They also violate the Public Service Users' Code by promoting a sort of "forced opinion poll" unrelated to the service provided, public transport.
In the case of the Credit Protection Service (SPC) initiative, Zanatta adds, there is also illegality. "There is collection of sensitive information and the assignment of a unique identifier, gathered without consent, improperly, without transparency. The Federal Supreme Court's understanding is that the collection of images without consent can only take place when there is no profit motive, or when the person is not the central element of the image collection; in this case, it is people's images being used for commercial purposes."
One of the targets of both legal action and concerned scrutiny is Facebook. The platform began adopting facial recognition last year. Unlike the photo-tagging tool, the new feature identifies the user in any picture and alerts them when an image is posted or shared.
"We want people to feel comfortable when they post a e photo of themselves on Facebook. We do this to prevent people from going through others, "said Joaquim Candela, director of the company's Machine Learning, in a statement issued in December
The privacy organization EPIC filed a complaint with the US Federal Trade Commission (FTC) in April. According to the organization, "the scanning of facial images without affirmative and express consent is illegal and should be prohibited." The platform is also the subject of another lawsuit, brought by citizens of the state of Illinois, which could result in billions of dollars in damages.
L & # 39; Facebook tool was also polled in Europe, which won new protection legislation data in May of this year.The General Regulation (RGPD) establishes as a requirement for the collection of a given consent, which must be obtained from specific way which is not respected by the platform system.
Discrimination
Another problem with facial recognition and detection systems concerns failures in identifying individuals, particularly differences in accuracy across racial and ethnic groups.
In February of this year, two researchers from the Massachusetts Institute of Technology (MIT) and Stanford University, Joy Buolamwini and Timnit Gebru, tested these systems and found that error rates varied sharply with skin color and gender: around 0.8% for white men, but from 20% to 34% for black women.
The researchers also identified that the databases used to "train" some systems were made up mostly of white, male faces. The article raises concerns about how these technologies are built and how these biases can have problematic impacts, such as in the identification of suspects.
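To illustrate how such disparities are measured, here is a hypothetical sketch in Python with invented data (it is not the researchers' code or dataset): an audit of this kind essentially computes the error rate separately for each demographic group.

```python
# Hypothetical sketch: per-group error rates from labeled test results.
# The records below are invented for illustration; they are not the study's data.
from collections import defaultdict

# Each record: (demographic group, predicted label, true label)
results = [
    ("lighter-skinned man", "male", "male"),
    ("lighter-skinned man", "male", "male"),
    ("darker-skinned woman", "male", "female"),   # a misclassification
    ("darker-skinned woman", "female", "female"),
]

counts = defaultdict(lambda: [0, 0])  # group -> [errors, total]
for group, predicted, actual in results:
    counts[group][1] += 1
    if predicted != actual:
        counts[group][0] += 1

for group, (errors, total) in counts.items():
    print(f"{group}: {errors}/{total} wrong = {errors / total:.0%} error rate")
```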
Public Debate
For Eduardo Magrani, a law professor at the Getulio Vargas Foundation (FGV), there is a need for a public debate, before these technologies are introduced, that discusses how these resources relate to the kind of society people want.
"When this kind of discussion occurs, technologies are already being implemented without people arguing if they want to live in a society of constant vigilance, and their face is constantly identified by algorithms that can go wrong. "
In the professor's assessment, a fundamental element of the debate is legislation, which exists in general terms in the Brazilian case but could take a major step forward with the General Data Protection Law, passed by Congress last month and now before President Michel Temer for sanction.