A study shows how Facebook and Google push users toward less privacy




28-06-2018
The study found that the two companies that dominate today's Internet use deceptive interface design templates, techniques, and features to manipulate users.

A study shows that Facebook and Google manipulate their users into accepting options that negatively affect their privacy.

According to the report, produced by the Norwegian Consumer Council, these companies distract users with unclear choices that end up pushing them to accept that their personal information be shared.

The study was conducted following the implementation of the new General Data Protection Regulation (GDPR). The law requires companies to clarify the different options for the use of personal data and to obtain users' unambiguous consent.

This is the case for many social networks and free platforms, such as Facebook. The findings of the report show that these companies were quick to present themselves as GDPR-compliant, launching new options that give their users an illusion of control over their data and privacy.

In practice, they confuse Internet users with hidden and inconvenient options, pushing them to accept conditions that limit their control over their own data.

The authors of the study analyzed the privacy pop-up windows that appeared in May on Facebook, Google, and Microsoft pages. The results are particularly striking for the first two.

They discovered that Facebook and Google used dark patterns: techniques and interface design features intended to manipulate users.

These "dark patterns" rely on subtle but effective choices in the design of the pop-up windows to steer users' decisions. For example, in the privacy settings of Facebook and Google, the options offering greater privacy are disabled by default, so most users do not even know they exist.

Anyone who wants to activate these options faces a long, cumbersome process requiring up to 13 clicks.

The study also reveals that Facebook displayed false notifications just behind the privacy pop-up window, diverting users' attention by making them believe they had new messages when that was not always true.

Another example outlined in the study is the strategy used to get users to accept facial recognition. If anyone tried to disable this option, the system showed a message such as "we cannot use this technology if a stranger uses your photo to impersonate you", scaring them into accepting it. What it does not mention, however, is that Facebook could use their image for ad targeting or for automatic matching in photos taken by others.

Google's options for disabling personalized ads are similarly confusing. The system warns that if the option is disabled, the user will no longer be able to block some ads in the future.

A user who does not clearly understand the warning may fear that intrusive ads will keep appearing, so they end up accepting the option initially proposed, which is much less protective of their privacy.

In short, the report warns that users who want to protect their data are the ones most penalized by these new options.

A false sense of control is created over the information we share, but it turns out that these controls are deliberately designed to undermine effective control over our data.
