On October 27, 2012, Facebook CEO Mark Zuckerberg wrote an email to his then director of product development. For years, Facebook had allowed third-party apps to access data on the unwitting friends of their users, and Zuckerberg was weighing whether giving away all that information was risky. In his email, he suggested it was not: "I'm generally skeptical that there is as much data leak strategic risk as you think," he wrote at the time. "I just can't think of any instances where that data has leaked from developer to developer and caused a real issue for us."
If Zuckerberg had a time machine, he might have used it to go back to that moment. Who knows what would have happened if, back in 2012, the young CEO could have imagined how it all might go wrong? At the very least, he could have spared Facebook the devastating year it has just lived through.
But Zuckerberg could not see what lay ahead of him, and neither, really, could the rest of the world, until March 17, 2018, when a pink-haired whistleblower named Christopher Wylie told The New York Times and The Observer about a company called Cambridge Analytica.
Cambridge Analytica had bought Facebook data on tens of millions of Americans without their knowledge to build a "psychological warfare tool," which it then aimed at US voters to help elect Donald Trump to the presidency. Just before the news broke, Facebook had banned Wylie, Cambridge Analytica, its parent company SCL, and Aleksandr Kogan, the researcher who had collected the data, from the platform. But those moves came years too late and could not stem the outrage from users, lawmakers, privacy advocates, and media pundits. Almost immediately, Facebook's stock price fell and boycotts began. Zuckerberg was called to testify before Congress, and a year of contentious international debates over consumers' online privacy rights followed. On Friday, Kogan filed a defamation suit against Facebook.
Wylie's words caught fire, even though much of what he said was already a matter of public record. In 2013, two University of Cambridge researchers published a paper explaining how they could predict people's personalities and other sensitive details from their freely accessible Facebook likes. Those predictions, the researchers warned, could "pose a threat to an individual's well-being, freedom, or even life." Cambridge Analytica's predictions were based largely on this research. Two years later, in 2015, Guardian writer Harry Davies reported that Cambridge Analytica had collected data on millions of American Facebook users without their permission and used their likes to create personality profiles for the 2016 US election. But in the heat of the primaries, with so many polls, news stories, and tweets to dissect, most Americans paid no attention.
The difference was that when Wylie told the story in 2018, people knew how it ended: with the election of Donald J. Trump.
That is not to say the backlash was, as former Cambridge Analytica CEO Alexander Nix has asserted, a politically motivated conspiracy of anti-Trumpers dissatisfied with the election result. There is more than enough evidence of unscrupulous business practices at the company to justify all the scrutiny it has received. But it is also true that politics can be destabilizing, like transporting nitroglycerin. Despite the theories and assumptions that had long circulated about how data could be misused, for many people it took Trump's election, Cambridge Analytica's loose ties to it, and Facebook's role in the affair to give this immaterial, intangible thing called privacy concrete consequences.
Cambridge Analytica may be the perfect poster child for the misuse of data. But the Cambridge Analytica scandal, as it is known, was never only about that one company and its work. In fact, the Trump campaign has repeatedly insisted that it never used Cambridge Analytica's information, only its data scientists. And some academics and political practitioners doubt that personality profiling is anything more than snake oil. Instead, the scandal and its fallout have heightened awareness of the ways companies, including but hardly limited to Facebook, take more data from their customers than they need, and give away more of it than they should, often without ever asking at all.
A year after the story made front pages, Cambridge Analytica executives are still being summoned before Congress to answer for their actions in the 2016 election. Yet the conversation about privacy has largely moved on from the company, which closed its offices last May. That is a good thing. While Cambridge Analytica burned, bigger questions were smoldering, such as how Facebook could give device makers special access to user data, or why Google tracks people's locations even after they have turned location tracking off.
There is a growing recognition that businesses can no longer be trusted to regulate themselves, and some states have begun to act. Vermont has implemented a new law requiring data brokers that buy and sell third-party data to register with the state. In California, a law expected to take effect in January will give residents the ability to opt out of the sale of their data. Several other states have introduced similar bills in recent months alone. On Capitol Hill, Congress is examining the contours of a federal privacy law, though progress is slow, as always in Washington.
These scandals and missteps have badly damaged Facebook, and probably the broader technology industry. If Zuckerberg had trouble seeing the "risk" of sloppy privacy protections in 2012, he surely knows it now. The company faces the prospect of a hefty fine from the Federal Trade Commission. This week, news broke that it is under criminal investigation over its data-sharing practices.
At the same time, the fallout from Cambridge Analytica has pushed Facebook to change its ways, at least in some respects. Last week, in a controversial blog post, Zuckerberg declared that Facebook's future lies in privacy. He said Facebook would add end-to-end encryption to Facebook Messenger and Instagram Direct as part of a broader plan to build a new social network for private communication.
Critics wondered whether Zuckerberg had finally come around or was really motivated by more mercenary interests. Either way, encrypting those conversations would instantly improve the privacy of billions of people's personal messages around the world. Of course, it could also do a great deal of damage, creating even more dark corners of the internet where misinformation can spread and criminal activity can fester. Last week, one of Zuckerberg's most trusted deputies, Facebook chief product officer Chris Cox, announced he was leaving the company, a decision reportedly tied in no small part to these concerns.
A year after the Cambridge Analytica story broke, none of these privacy questions has an easy answer for the businesses, regulators, or consumers who want the internet to remain convenient and free while also keeping control of their information. But at least the ordeal has forced these conversations, once the province of academics and privacy nerds, into the mainstream.
If only the world had seen it coming sooner.