The GDPR refers to European legislation that aims to give consumers control over the personal data collected by technology companies. It came into effect in May, just weeks after the Cambridge Analytica data abuse scandal engulfed Facebook – and raised the profile of data protection as a consumer concern.
The scandal also prompted governments around the world to finally consider taking action in an often neglected area of law. But US lawmakers are seen as lagging behind their European peers.
Apple's CEO welcomed the "successful implementation" of the GDPR on Wednesday. And, in a thinly veiled message to US technology giants, Cook stressed that American companies should not fear tougher privacy regulation.
"This crisis is real – it is neither imaginary nor exaggerated, nor foolish – and those of us who believe in the potential of technology must not back down at this time," Cook said.
"At Apple, we fully support a comprehensive federal privacy law in the US It should be rooted in four fundamental rights," said Cook: the right to minimize personal data, the right to privacy knowledge, the right of access and the right to security, he said.
Josh Lipton contributed to this report.
What follows is the full text of Cook's keynote.

Hello.
It is an honor to be here with you today in this grand hall … a room that represents what is possible when people of different backgrounds, histories, and philosophies come together to build something bigger than themselves.
I am deeply grateful to our hosts. I want to thank Ventsislav Karadjov for his service and leadership. And it is a real privilege to be introduced by his co-host, a statesman I admire greatly, Giovanni Buttarelli.
Now, Italy has produced more than its share of great leaders and public servants. Machiavelli taught us how leaders can get away with evil deeds … and Dante showed us what happens when they get caught.
Giovanni has done something very different. Through his values, his dedication, and his thoughtful work, Giovanni, his predecessor Peter Hustinx – and all of you – have set an example for the world. We are deeply grateful.
We need you to keep making progress – now more than ever. Because these are transformative times. Around the world, from Copenhagen to Chennai to Cupertino, new technologies are driving breakthroughs in humanity's greatest common projects: from preventing and fighting disease … to curbing the effects of climate change … to ensuring that everyone has access to information and economic opportunity.
At the same time, we see clearly – painfully – how technology can harm rather than help. Platforms and algorithms that promised to improve our lives can actually magnify our worst human tendencies. Rogue actors and even governments have taken advantage of user trust to deepen divisions, incite violence, and even undermine our shared sense of what is true and what is false.
This crisis is real. It is not imagined, or exaggerated, or "crazy". And those of us who believe in the potential of technology for good must not shrink from this moment.
Now, more than ever – as heads of government, business decision-makers and citizens – we must ask ourselves a fundamental question: what kind of world do we want to live in?
I am here today because we hope to work with you as partners to answer this question.
At Apple, we are optimistic about technology's awesome potential for good. But we know that it will not happen on its own. Every day, we work to infuse the devices we make with the humanity that makes us. As I have said before, "Technology is capable of doing great things, but it does not want to do great things – it does not want anything. That part takes all of us."
That is why I believe our missions are so closely aligned. As Giovanni puts it, "We must act to ensure that technology is designed and developed to serve humankind, and not the other way around."
At Apple, we believe that privacy is a fundamental human right. But we also recognize that not everyone sees it that way. In a way, the desire to put profits over privacy is nothing new.
As early as 1890, future Supreme Court Justice Louis Brandeis published an article in the Harvard Law Review arguing for a "right to privacy" in the United States.
He warned: "Gossip is no longer the resource of the idle and of the vicious, but has become a trade."
Today, that trade has exploded into a data-industrial complex. Our own information – from the everyday to the deeply personal – is being weaponized against us with military efficiency.
Every day, billions of dollars change hands and countless decisions are made on the basis of our likes and dislikes … our friends and family … our relationships and conversations … our wishes and fears … our hopes and dreams.
These scraps of data, each one harmless enough on its own, are carefully assembled, synthesized, traded, and sold.
Taken to the extreme, this process creates an enduring digital profile and lets companies know you better than you know yourself. Your profile is then run through algorithms that can serve up increasingly extreme content, hardening our harmless preferences into firm convictions. If green is your favorite color, you may find yourself reading a lot of articles – or watching a lot of videos – about the insidious threat posed by people who love orange.
In the news almost every day, we bear witness to the harmful, even deadly, effects of these narrowed worldviews.
We should not sugarcoat the consequences. This is surveillance. And these stockpiles of personal data serve only to enrich the companies that collect them.
This should make us very uncomfortable. It should unsettle us. And it illustrates the importance of our shared work and the challenges still ahead.
Fortunately, this year you have shown the world that good policy and political will can come together to protect the rights of everyone. We should celebrate the transformative work of the European institutions tasked with the successful implementation of the GDPR. We also celebrate the new steps being taken, not only here in Europe but around the world. In Singapore, Japan, Brazil, New Zealand, and many other nations, regulators are asking hard questions and crafting effective reforms.
It is time for the rest of the world, including my home country, to follow your lead.
At Apple, we fully support a comprehensive federal privacy law in the United States. There – and everywhere – it should be rooted in four essential rights.

First, the right to have personal data minimized. Companies should challenge themselves to strip identifying information from customer data – or not to collect it in the first place.

Second, the right to knowledge. Users should always know what data is being collected and what it is being collected for. This is the only way to empower users to decide what collection is legitimate and what is not. Anything less is a sham.

Third, the right to access. Companies should recognize that data belongs to users, and we should all make it easy for users to get a copy of, correct, and delete their personal data.

And fourth, the right to security. Security is foundational to trust and to all other privacy rights.
Now, there are those who would prefer I had not said all of that. Some oppose any form of privacy legislation. Others will endorse reform in public, and then resist and undermine it behind closed doors.
They may tell you that "our companies will never achieve technology's true potential if they are constrained by privacy regulation." But this notion is not just wrong, it is destructive.
Technology's potential is – and must always be – rooted in the faith people place in it … in the optimism and creativity it stirs in the hearts of individuals … in its promise and capacity to make the world a better place.
It is time to face facts. We will never achieve technology's true potential without the full faith and confidence of the people who use it.
At Apple, respect for privacy – and a healthy suspicion of authority – have always been in our bloodstream. Our first computers were built by misfits, tinkerers, and rebels – not in a lab or a boardroom, but in a suburban garage. We introduced the Macintosh with a famous TV ad channeling George Orwell's 1984 – a warning of what can happen when technology becomes a tool of power and loses touch with humanity.
And as early as 2010, Steve Jobs said in no uncertain terms: "Privacy means people know what they're signing up for, in plain language, and repeatedly."
It is worth remembering the foresight and courage it took to make that statement. When we designed this device, we knew it could hold more personal data than most of us keep in our homes. And Steve and Apple came under enormous pressure to set our values aside and share that information freely. But we refused to compromise. In fact, we have only deepened our commitment in the decade since.
Hardware breakthroughs that encrypt fingerprints and faces securely – and only – on your device … simple, powerful notifications that make clear to every user what they are sharing and when they are sharing it.
We are not absolutists and we do not claim to have all the answers. Instead, we always try to come back to this simple question: what kind of world do we want to live in?
At every stage of the creative process, then and now, we engage in an open, honest, and robust ethical debate about the products we make and the impact they will have. That is simply part of our culture.
We do not do it because we have to, we do it because we ought to. The values behind our products are as important to us as any feature.
We understand that the dangers are real – from cyber-criminals to rogue nation-states. We are not willing to leave our users to fend for themselves. And we have shown that we will defend these principles when they are challenged.
These values … this commitment to thoughtful debate and transparency … will only become more important. As progress speeds up, these things should continue to ground us and connect us, first and foremost, to the people we serve.
Artificial intelligence is an area I think a lot about. Clearly, this is also of concern to many of my peers.
At its core, this technology promises to learn from people individually in order to benefit everyone. Yet advancing artificial intelligence by collecting huge personal profiles is laziness, not efficiency. For artificial intelligence to be truly smart, it must respect human values, including privacy.
If we get this wrong, the dangers are profound.
We can achieve both great artificial intelligence and great privacy standards. This is not just a possibility, it is a responsibility.
In the pursuit of artificial intelligence, we should not sacrifice the humanity, creativity, and ingenuity that define our human intelligence.
And at Apple, we never will.
In the mid-nineteenth century, the great American writer Henry David Thoreau grew so weary of the pace and change of industrial society that he moved to a cabin in the woods by Walden Pond.
Call it the first digital cleanse.
Yet even there, where he hoped to find a bit of peace, he could hear the distant rumble and whistle of a passing steam locomotive. "We do not ride on the railroad," he said. "It rides upon us."
Those of us who are lucky enough to work in technology have a huge responsibility.
That responsibility is not to please every grumpy Thoreau out there. That is an unreasonable standard, and we will never meet it.
But we are responsible for recognizing that the devices we make and the platforms we build have real, lasting, even permanent effects on the individuals and communities who use them.
We must never stop asking ourselves … What kind of world do we want to live in?
The answer to that question should not be an afterthought; it should be our primary concern.
At Apple, we can – and we do – give our users the best while treating their most personal data like the precious cargo it is. And if we can do it, then everyone can.
Fortunately, we have your example before us.
Thank you for your work … for your commitment to the possibility of human-centered technology … and for your firm belief that our best days are still ahead of us.
Thank you so much.