Facebook and Google steer users with "dark patterns"




Companies like Facebook and Google use "dark patterns" to nudge their users into decisions that negatively affect their own privacy. That is the conclusion of the study "Deceived by Design", prepared by a Norwegian watchdog group, reports the website TechCrunch.

The technology industry is more scrutinized and questioned than ever before. Although the most publicized case is the one that brought Facebook CEO Mark Zuckerberg before the US Congress, the truth is that other forces are operating in a "not very positive" way. This became evident during the rollout of Europe's GDPR.

During the GDPR consultation process, dismaying evidence emerged about the companies that collect and profit from user data as their main source of income. "Deceived by Design" details how these companies create the illusion that users control their data while nudging them towards the choices that limit that control.

Although these brands meet the requirements imposed by the GDPR, there are still many ways to mislead their users. After going through a set of privacy pop-up windows published by Facebook, Google and Microsoft, the report found that the first two rely on "dark patterns, techniques and features of interface design meant to manipulate users (…) used to nudge users towards privacy-intrusive options".

These are subtle but effective behaviours that guide people towards the result the designers want. For example, in the privacy flows of Facebook and Google, the most private options are simply disabled by default. A distracted user will never discover that they exist, and activating them requires a fairly long process.

Disguised Abuse

In the particular case of Facebook, the social network goes so far as to suggest, in its privacy pop-up window, that users have pending messages and notifications they will supposedly lose access to, even when these do not exist, says TechCrunch.

Meanwhile, for options such as disabling facial recognition, Facebook warns of a series of consequences for users who decide not to share their information. It says that "we will not be able to use this technology if a stranger uses your photo to impersonate you", but it avoids mentioning that the same technology is also used for ad targeting and for automatically matching users with photos uploaded by third parties.

With Google the story is not very different. If a person decides to disable the advertising giant's ad personalization, they will no longer be able to mute certain ads in the future. People who do not understand this mechanism may fear being unable to mute an ad on another topic and end up agreeing to share their information.
