Facebook under new political pressure as UK watchdog calls for "ethical pause" on political ads – TechCrunch




The UK privacy watchdog revealed yesterday that it intends to fine Facebook the maximum possible penalty (£500,000) under the country's 1998 data protection regime for violations related to the Cambridge Analytica data scandal.

But regulatory missiles are now also being directed at the platform and its ad-targeting methods more broadly – and, indeed, at the corrosive erosion of individual rights those methods entail.

In parallel with yesterday's update on the Facebook-Cambridge Analytica data scandal, the Information Commissioner's Office (ICO) released a policy report – entitled Democracy Disrupted? Personal Information and Political Influence – in which it lays out a series of policy recommendations relating to how personal information is used in modern political campaigns.

In the report, it directly calls for an "ethical pause" around the use of micro-targeting advertising tools for political campaigns – to allow key actors – government, parliament, regulators, political parties, online platforms and citizens – to reflect on their responsibilities in the use of personal information in the era of big data before there is any further expansion in the use of these technologies.

The watchdog writes [emphasis ours]:

Rapid social and technological developments in the use of big data mean that there is limited knowledge of – or transparency around – the "behind the scenes" data processing techniques (including algorithms, analysis, data matching and profiling) used by organisations and businesses to target individuals. What is clear is that these tools can have a significant impact on people's privacy. It is important that there is greater transparency about the use of these techniques to ensure that people have control over their own data and that the law is upheld. When the purpose of using these techniques is related to the democratic process, the arguments for high standards of transparency are very strong.

Engagement with the electorate is essential to the democratic process; it is therefore understandable that political campaigns are exploring the potential of advanced data analysis tools to help win votes. The public have the right to expect that this takes place in accordance with the law as it relates to data protection and electronic marketing. Without a high level of transparency – and therefore trust among citizens that their data is being used appropriately – we risk developing a system of voter surveillance by default. This could have a long-term detrimental effect on the fabric of our democracy and our political life.

It also points to a number of specific concerns related to Facebook's platform and its impact on democratic rights and processes.

"An important finding of the ICO investigation is the finding that Facebook has not been transparent enough to allow users to understand how and why they might be targeted by a political party. campaign, "he writes." While these concerns about Facebook's advertising model generally exist in relation to its commercial use, they are exacerbated when these tools are used for political campaigns. The use by Facebook of interest categories relevant to targeted advertisements and categories of partner services are also of concern. Although the service has ceased in the EU, the OIC will examine these two areas, and in the case of categories of partners, will initiate a new, broader survey. "

The ICO says that its discussions with Facebook for this report focused on "the level of transparency on how Facebook user data and third-party data are used to target users, and the controls available to users over the advertisements they see."

Among the concerns it raises about the platform's intricate ad-targeting model are the following [emphasis ours]:

Our investigation found significant fair-processing concerns, both in terms of the information available to users about the sources of the data used to determine what advertisements they see and the nature of the profiling taking place. There were further concerns regarding the availability and transparency of the controls offered to users over the advertisements and messages they receive. The controls were difficult to find and were not intuitive to users wanting to control the political advertising they received. While users were informed that their data would be used for commercial advertising, it was not clear that political advertising would take place on the platform.

L & # 39; ICO also found that despite Overall, information and controls on the protection of personal information were not communicated to users about the possible uses of their personal information . In particular, more explicit information should be available at the first level of the privacy policy. The user tools available to block or delete advertisements were also complex and were not clearly accessible to users from the main pages they accessed. Controls were also limited in relation to political advertising .

The company has been criticized for years for complex and confusing privacy controls. But during the investigation the ICO also says it was unable to obtain satisfactory information from the company to understand the process used to determine which interest segments individuals are placed in for ad-targeting purposes. Despite the content of users' posts not being used to derive targeting categories or advertisements, it was difficult to understand how the different "signals", as Facebook called them, were built up to classify individuals into categories, it writes.

A parliamentary committee that is conducting an inquiry into fake news and online misinformation has also pressed Facebook over its responses to information requests related to political ads; in April, the committee's chairman accused Facebook of failing to provide the information sought.

The ICO is not alone in finding that Facebook's responses to specific information requests have lacked the specific information sought. (Mark Zuckerberg also annoyed the European Parliament with highly evasive answers to its highly detailed questions this spring.)

Meanwhile, a European media investigation in May revealed that Facebook's platform allows advertisers to target individuals based on interests linked to political beliefs, sexuality and religion – categories that are considered sensitive information under regional data protection law – suggesting that this targeting is legally problematic.

The investigation found that Facebook's platform enables this type of targeting in the EU because it makes inferences about users – inferred interests including communism, social democracy, Hinduism and Christianity. And the company's defense against accusations that it is violating regional law is that these inferred interests are not personal data.

However, the ICO report sends a very cold wind blowing over that fig leaf, noting that in offering such interest categories Facebook has processed sensitive personal information – and, in particular, data about political opinions.

It further writes [emphasis ours]:

Facebook made it clear to the ICO that it does not target EU users on the basis of sensitive personal data… The ICO agrees that indicating that a person is interested in a topic is not the same as formally placing them into a special category of personal data. However, there is a clear risk that advertisers will use core audience categories in a way that targets individuals based on sensitive personal information. In the context of this investigation, the ICO is particularly concerned that such categories can be used for political advertising.

The ICO believes this is part of a broader issue about the processing of personal information by online platforms in the use of targeted advertising; it goes beyond political advertising. It is clear from academic research conducted by the University of Madrid on this subject that a significant privacy risk can arise. For example, advertisers were using these categories to target individuals on the assumption that they are, for example, homosexual. The effect was therefore that individuals were singled out and targeted on the basis of their sexuality. This is deeply concerning, and it is the ICO's intention, as a concerned authority under the GDPR, to work through the one-stop-shop system with the Irish Data Protection Commission to see whether there is scope to undertake a wider examination of online platforms' use of special categories of data in their targeted advertising models.

So, essentially, the regulator is saying it will work with other EU data protection authorities to scrutinize online ad-targeting platforms that classify users based on inferred interests – and certainly where those platforms allow targeting against special categories of data (data relating to racial or ethnic origin, political opinions, religious beliefs, health data, sexuality and so on).

Another concern the ICO raises that relates specifically to Facebook's business is transparency around its so-called "partner categories" service – an option for advertisers to use third-party data (i.e., personal data collected by third-party data brokers) to create custom audiences on its platform.

In March, ahead of a major update to the EU's data protection framework, Facebook announced that it would wind down this service.

"A preliminary inquiry into the service raised serious concerns about the transparency of the use of the service [partner categories] for political advertising and more general concerns about the legal basis for the service, including the Facebook claim that it acts only as a processor for third-party data providers, "he wrote." Facebook announced in March 2018 that it would terminate this service over a six-month period, and we understand that it has already stopped in the EU. The OIC has also initiated a further investigation into the service under the 1998 DPA (to be concluded at a later date) as we believe it is in the public interest to do so. "

Concluding on Facebook, the regulator says the company was not "sufficiently transparent to enable users to understand how and why they might be targeted by a political party or campaign."

"Individuals can withdraw from special interests, which may reduce the number of ads they receive on political issues, but it will not block them completely," he says. he . " These concerns about transparency are at the heart of our investigation.While these concerns about Facebook's advertising model exist in general terms and its use in the commercial field, concerns are increased when these tools are used for the campaign. policy. "

The regulator also examined political campaigns' use of three other ad platforms – Google, Twitter and Snapchat – although Facebook takes the lion's share of the report, given that the platform has also attracted the lion's share of UK political parties' digital spending. ("Electoral Commission figures show that political parties spent £3.2m on direct advertising via Facebook in the 2017 general election, versus £1.3m in the 2015 general election," it notes.)

The ICO recommends that all online platforms providing advertising services to political parties and campaigns include experts within their sales support teams who can give parties and campaigns specific advice on transparency and accountability in how data is used to target users. Social media companies have a responsibility to act as trustees of people's information, it writes, because citizens increasingly live their lives online.

The ICO also says it will work with the European Data Protection Board and the region's data protection authorities to ensure that online platforms comply with the new EU data protection framework (GDPR) – and in particular to ensure that users "understand how personal information is processed in the targeted advertising model and that effective controls are available".

Facebook's use of dark-pattern design and A/B-tested social engineering to obtain users' consent for the processing of their data, while obscuring its intentions for that data, has been a long-standing criticism of the company – but the point the ICO makes here is that the practice is now very much on the regulatory radar in the EU.

Expecting new laws – as well as many more GDPR lawsuits – therefore seems prudent.

The regulator is also pushing the four online platforms to urgently roll out planned transparency features for political advertising in the UK. Facebook, for its part, has been developing policies around political ad transparency – amid a series of data-related scandals in recent years that have intensified political pressure on the business. But self-regulation seems very unlikely to go far enough (or fast enough) to repair the very real risks now being raised at the highest political levels.

"We opened this report by asking whether democracy has been disrupted by the use of data, analysis, and new technologies." Throughout this investigation, we have seen evidence that this was beginning to have a profound effect whereby asymmetry of information between different groups of voters begins to emerge, "writes the ICO. "We are at a crucial time when trust in the integrity of our democratic process is likely to be compromised if an ethical break is not taken." The recommendations made in this report – if they are actually implemented – will change the behavior and compliance of all actors in the political campaign space. "

Another major policy recommendation in the ICO's report urges the UK government to legislate at the earliest opportunity to introduce a statutory code of practice, under the new data protection law, for the use of personal information in political campaigns.

The report also calls out all the UK's political parties for data protection failings – failings evidently supercharged by the rise of accessible and powerful online platforms, which have allowed parties to combine (and thereby enrich) the electoral register data they are legally entitled to with all kinds of online information gathered by Facebook and others.

Hence the ICO's concern about "developing a system of voter surveillance by default". In other words, without great responsibility being exercised over people's information, online ad platforms like Facebook risk becoming the layer that breaks democracy and corrodes civil society.

The particular concerns the ICO attaches to the activities of political parties are: the purchase of marketing lists and lifestyle information from data brokers without sufficient due diligence; a lack of fair processing; and the use of third-party data analytics companies with insufficient checks around consent. Information Commissioner Elizabeth Denham trailed the report's findings when she told a British parliamentary committee that she would recommend a code of conduct for the political use of personal data. She told MPs: "We need transparent information, otherwise we push people into little filter bubbles, where they have no idea what other people are saying and what the other side of the campaign is saying. We want to make sure that social media is used well."

The ICO now says it will work closely with the government to determine the scope of the Code, and it also wants the government to review regulatory gaps.

We have contacted the Cabinet Office for a government response to the ICO's recommendations. Update: a Cabinet Office spokesman referred us to the Department for Digital, Culture, Media and Sport – and a DCMS spokesman told us the government would wait to consider the ICO's full report.

A spokesman for Facebook declined to answer specific questions related to the report – instead sending us this short statement, attributed to its chief privacy officer, Erin Egan: "As we have said before, we should have done more to investigate claims about Cambridge Analytica and take action. We have been working closely with the ICO in their investigation of Cambridge Analytica, just as we have with authorities in the US and other countries. We are reviewing the report and will respond to the ICO soon."

Here is a summary of the ICO's ten recommendations:

1) Political parties must work with the ICO, the Cabinet Office and the Electoral Commission to improve transparency around the use of personal data in political campaigning.

2) The ICO will work with the Electoral Commission, the Cabinet Office and the political parties to launch a version of its successful Your Data Matters campaign before the next general election. The goal will be to increase transparency and build trust and confidence among voters about how their personal data is used in political campaigns.

3) Political parties must carry out due diligence when sourcing personal information from third-party organisations, including data brokers, to ensure that appropriate consent has been sought from the individuals concerned and that individuals are effectively informed in line with the transparency requirements of the GDPR. This should form part of the data protection impact assessments conducted by political parties.

4) The government should legislate at the earliest opportunity to introduce a statutory code of practice under the DPA 2018 for the use of personal information in political campaigns. The ICO will work closely with the government to determine the scope of the Code.

5) Third-party audits should be carried out after the conclusion of referendum campaigns to ensure that personal data held by the campaigns is deleted.

6) The Centre for Data Ethics and Innovation should work with the ICO and the Electoral Commission to conduct an ethical debate, in the form of a citizens' jury, to better understand the impact of the use of personal data in political campaigning.

7) All online platforms providing advertising services to political parties and campaigns should include expertise within the sales support team that can provide specific advice to political parties and campaigns on transparency and accountability in how data is used to target users.

8) The ICO will work with the European Data Protection Board (EDPB) and the relevant data protection authorities to ensure that online platforms comply with the GDPR, that users understand how personal information is processed in the targeted advertising model, and that effective controls are available. This includes greater transparency around privacy settings and the design and prominence of privacy notices.

9) All platforms covered in this report should urgently roll out political advertising transparency features in the UK. This should include consultation with, and evaluation of these tools by, the ICO and the Electoral Commission.

10) The government should examine regulatory gaps relating to the content, provenance and jurisdictional scope of online political advertising. This should include considering requirements for archiving digital political advertising in an open data repository to allow scrutiny and analysis of the data.
