Over the past six years, Tristan Harris has forced us to think differently about the digital devices and services we use every day.
First as a product manager at Google, then as an outside critic of the technology sector, he has drawn attention to what is called the attention economy: the way our phones, apps, and web services are built to distract us and hijack our attention.
His criticism took years to catch on. But catch on it has.
The news that Russia-linked provocateurs had hijacked Facebook and other social media sites to spread propaganda during the 2016 election helped boost his profile. So did reports that social media use is linked to a slight increase in depression among children. Since then, Harris has found an attentive audience ranging from ordinary citizens to heads of state who want to better understand how technology companies manipulate, or are used to manipulate, their customers.
Harris, who co-founded the Center for Humane Technology to develop and promote ideas for reforming the technology sector, has already shaped the industry. Features such as Apple's Screen Time, which iPhone owners can use to set limits on the use of their devices and apps, are a direct result of the criticisms he has leveled.
And more may be on the way. For the first time, policymakers in the United States and around the world, many of whom have consulted Harris and his colleagues, are seriously considering regulations aimed at rebalancing the relationship between technology companies, their customers, and society as a whole. On Monday, for example, the United Kingdom's Information Commissioner's Office told the BBC that it was considering significantly restricting the amount of data social networks can collect about children through a series of measures, including limits on their use.
Business Insider recently spoke with Harris about what inspired him to launch his movement and what he thinks he has accomplished so far. This interview has been edited for length and clarity.
Harris felt that the industry was heading in the wrong direction
Troy Wolverton: You have been trying to draw attention to the abuses of what you call the attention economy, and to convince big tech companies to remedy them, since you first circulated your presentation in 2013. How do you think you have affected the industry or the debate?
Tristan Harris: At first, people did not necessarily want to admit that there was a problem. I mean, the slide deck at Google went viral, and [it resonated with] people. But there was no action. There was just a lot of denial, a lot of "oh, people are addicted to lots of things, cigarettes, alcohol ... isn't it just capitalism?"
And it's like, guys, we have created a very specific form of psychological manipulation and influence that we, the technology industry, are responsible for fixing. And getting people to admit that took a long time. We had trouble getting people to acknowledge that there was a problem to be solved.
And I think what has changed now is that people know, because they have been forced to know, that there is a problem. So now people are talking about what we are actually going to do about it.
What I've heard recently is that, for the first time, friends of Facebook's leaders are now turning on them and asking, "Which side of history do you want to be on?"
And now, I think that since enough members of the public have convinced friends at the top of these companies, they realize that something structural has to change.
Wolverton: What prompted you to put together the slide show in 2013?
Harris: I felt that, basically, there was something wrong with the direction the industry was taking, which is a really frightening thought when you see an entire industry heading in the wrong direction. Because until then, I had thought technology was great.
This is not an anti-technology movement. But what I was really beginning to realize was ... that what my most talented friends and engineers were doing, more and more, was getting better at playing tricks on the human mind to keep people glued to their screens.
I just felt that everyone I knew was no longer doing the kind of creative thinking people did in the '90s and early 2000s, and that it had become more of a race to manipulate the human mind.
Wolverton: But was there a moment that triggered this awareness, an epiphany?
Harris: I had a bit of an epiphany. I spent a weekend in the Santa Cruz Mountains with my friend Aza Raskin, who is now a co-founder of CHT. I came back from that weekend, after reconnecting with nature, and something deep hit me. I don't really know what came over me.
I just felt that I had to say something. I felt bad. I had the feeling that no one else was going to say anything.
I am not the kind of person who starts revolutions or speaks out. It's something I had to learn to do.
Heads of state have knocked on his door
Wolverton: How has your understanding of the magnitude of the problem changed since writing your 2013 presentation?
Harris: I was CEO of a very small Web 2.0 technology company called Apture. I had an academic background in cognitive science, computer science, linguistics, user interface design, human-computer interaction, and so on. I was trained to think about building technology products and about the human mind.
Since I left Google, and especially after Cambridge Analytica and my partnership with [Silicon Valley venture capitalist] Roger [McNamee], when these questions took off, my understanding of the scope and stakes of the issue has grown by multiple orders of magnitude.
The scope of the issue has gone from how a product designer thinks about attention, notifications, home screens, and the economics of app stores (that's how I started) to 12-dimensional geopolitical chess: seeing how Iran, North Korea, Russia, and China use these platforms in a worldwide information war. [And it goes from there] to the daily social pressures and mental lives of adolescents.
Governments around the world are knocking on our door because they want to understand these problems. Briefing heads of state: I never would have imagined doing that. It has been wild, and it says a lot about the scale and severity of the problem.
I knew, conceptually, in 2013 that this issue would affect everything. But I hadn't grounded that understanding the way I have over the past year and a half, when you actually meet people from countries whose elections are at stake. Or you meet and talk to groups of parents, children, and teachers who struggle with these problems daily. So it literally affects everyone and everything. This is the choke point; it holds the pen of human history, and that's what I think people underestimate.
Wolverton: If you imagine this process as a curve, from identifying a problem to properly addressing it, where do you think we are?
Harris: Still the opening innings, I think. I think we are in the early stages of a reckoning.
[Companies such as Facebook and YouTube] will come to be seen like fossil fuel companies, because in the attention economy they drill deeper and deeper, in a race to the bottom of the brain stem, to capture people's attention.
[They're] now, wherever pressure exists on them, trying to correct the worst of the damage, but only because civil-society research groups, usually underpaid or unpaid, stay up until 3 a.m. scraping Facebook and YouTube, mapping the recommendation systems and disinformation campaigns, and then take it ... to The New York Times ... and then Facebook or YouTube might, if there is enough pressure, after a congressional letter from [Rep.] Adam Schiff or Senator [Mark] Warner or [Sen. Richard] Blumenthal, start to do something.
I think, looking back, we're going to say, "Oh my god, we're so glad we woke up from that nightmare and started to design, fund, and structure our technology so that it is jointly owned by the users and the constituencies it affects most. It is not on an infinite-growth treadmill. It is designed with humane business models that take into account human sensitivities and vulnerabilities."
Technology companies have taken only modest steps up to now
Wolverton: You mentioned that companies such as Facebook, Apple, and Google have taken what you call small steps to address these issues, for example, by letting users limit the time they spend on their devices. How significant are those steps?
Harris: They are celebrated baby steps. I just want to be clear: I'm happy they are doing it, because it triggers a race to the top.
I mean, one of the leaders of a big tech company, whom you might know, said next to me on stage at a private event that the whole industry is now in a race toward time well spent. I mean, that's wild. We have managed to reverse the trend: from a race to the bottom, where whoever could best steal attention by pulling on our Paleolithic puppet strings won, to a race to the top. [Companies are now vying to] prove that they care more about the well-being of the individual and, hopefully, of society and civilization as a whole.
But that's why the baby steps matter. They have coincided with all these companies starting to run in this direction, and we must keep going in this direction.
Wolverton: With all this emphasis on how devices and apps demand our attention, I wondered how much time you spend on your phone these days.
Harris: Well, this is one of the most important problems the world has ever faced, and I, along with our organization, play such an important role in it that, unfortunately, I am constantly working on this problem, which involves constant use of technology.
I could check my Screen Time app for you if you like. I can now answer that question thanks to features that now exist on a billion phones.
Let me see. Screen time, over the last seven days, averages 3 hours and 2 minutes per day.