“Turn off notifications and don’t look at your mobile every five minutes”




John Hennessy (New York, 1952) and David Patterson (Illinois, United States, 1947) are not just any two figures in computing. Their work in the 1970s on so-called computer architecture (the way these devices are built) made it possible to standardize their manufacture and improve their efficiency, paving the way for the technological boom of recent decades. Hennessy is also the current chairman of Alphabet, Google's parent company, and served as president of Stanford University. Patterson, for his part, was a professor at UC Berkeley for 40 years, until 2016. The two shared the 2017 Turing Award for their contributions to computer science.

This week, the BBVA Foundation announced that it had awarded both men the Frontiers of Knowledge Award in Information and Communication Technologies for "having established computer architecture as a new scientific field: the discipline that designs the brain of any computing system, its central processor." Together, in the 1980s, they created RISC, an acronym for Reduced Instruction Set Computer, an approach still in force in today's machines (used by around 99% of products on the market, according to ACM data). "We are in a new golden age of computing," they say in this interview, conducted by videoconference.

Q. When you first started working, computer architecture was a mess, with every manufacturer doing things its own way. Is that right?

David Patterson. Computers had been designed decades earlier, and they had been developed in an ad hoc way. What was relatively new were microprocessors. Their emergence led us to the idea that things had to be done completely differently, mainly because of the so-called Moore's Law, which since 1965 has held that the number of transistors on a microprocessor doubles roughly every two years. Today, microprocessors are more powerful than the old central units (mainframe computers).
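As a rough aside (not from the interview itself), it is easy to see how quickly that doubling compounds. The sketch below uses the Intel 4004's approximate 1971 transistor count as a starting point; the figures are illustrative only.

```python
# Illustrative sketch of Moore's Law: transistor counts doubling every two years.
# The 1971 starting count (~2,300, roughly the Intel 4004) is an assumption
# for illustration, not a figure taken from the interview.

def transistors(start_year: int, start_count: int, year: int) -> float:
    """Project a transistor count assuming one doubling every two years."""
    doublings = (year - start_year) / 2
    return start_count * 2 ** doublings

# 25 doublings between 1971 and 2021 -> a growth factor of 2**25, about 33 million
growth = transistors(1971, 2300, 2021) / 2300
print(f"Growth factor over 50 years: {growth:,.0f}x")
```

Fifty years of biennial doubling multiplies the count by about 33 million, which is why Patterson calls the microprocessor's rise so transformative.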

Q. How would you explain the system you developed, RISC, to someone who has no idea about computers?

DP Well, John and I have a lot of experience with this question… When a program (software) talks to the machine (hardware), it uses a vocabulary. That vocabulary is called the instruction set. One can imagine a vocabulary made up of long, many-syllable words. Reading a novel written with those words would take longer, because it would be harder to understand. The alternative is to have many more short words, which lets you read faster even if the novel ends up longer. The question is: where is the balance between the two approaches that yields the greatest efficiency? In the end, we found that using the shorter, simpler vocabulary was four times faster. At the time it was a controversial, almost philosophical question.
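Patterson's vocabulary analogy can be sketched with a toy cost model. The instruction names and cycle counts below are invented for illustration, not real CPU figures: a single complex instruction does more work per step but costs many cycles, while a sequence of simple instructions can finish sooner overall.

```python
# Toy model of the RISC trade-off. All cycle costs are hypothetical.
CYCLES = {
    "MULT_MEM": 30,  # invented complex instruction: multiply two memory operands
    "LOAD": 2,       # invented simple, RISC-style instructions
    "MUL": 3,
    "STORE": 2,
}

def total_cycles(program: list[str]) -> int:
    """Sum the cycle cost of each instruction in a program."""
    return sum(CYCLES[op] for op in program)

cisc_program = ["MULT_MEM"]                      # one "long word"
risc_program = ["LOAD", "LOAD", "MUL", "STORE"]  # more, but shorter, "words"

print(total_cycles(cisc_program))  # 30
print(total_cycles(risc_program))  # 9
```

The longer "novel" of four simple instructions completes in fewer cycles than the single complex one, which is the intuition behind the speedup Patterson describes.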

Q. Can we say in one way or another that your work has enabled the technological development that we are experiencing?

John Hennessy. Well, we were motivated by all the changes that were happening. And I think it's a good reminder that whenever there's a disruption, like the shift to microprocessors back then, you have to step back, revisit the way you solve problems, and ask whether that method is still valid.

Q. What did computers look like at that time?

JH They were huge computers and central units. The machine we used to develop our work was the VAX-11/780 [a computer launched in 1977 by Digital Equipment Corporation (DEC), a company acquired by Compaq in the 1990s and, in turn, by Hewlett-Packard in 2002], which cost between $250,000 and $500,000 and was much slower than any smartphone today.

DP It was the size of a refrigerator. I remember saying in class, "one day a single chip will be faster than this refrigerator"… It was so big that it took a long time for electricity to reach all of its components. The students laughed; they believed the bigger the machine, the faster it was…

Q. Are we living in a golden age of computing, not only of traditional computers but also, thanks to mobile phones and other devices, of computing everywhere? Do you dare predict where we are headed, given that Moore's Law is coming to an end?

DP Absolutely. In the past, half of the progress came from advances in semiconductors, and the other half from what John and I do: how we put these devices together. With the end of Moore's Law, and people still wanting their computers to get faster and faster, more and more of the burden will fall on architecture. It will be the decade of computer architecture.

JH You can see this in the Apple M1 chip [the first the company has installed in its Macs after its break with Intel], which combines processors specialized for particular tasks. We are making the leap from general-purpose processors toward specialized components for greater efficiency. The key is specialization: building small computers that each do one thing more efficiently.

Q. What do you think of quantum computers? Will they be a viable alternative?

DP I'm curious to hear what John says about this. It's an exciting technology, but there will only be 20 or 30 things it can do, and these are also very large machines; there will be no quantum phones. These devices will live in data centers. Anything else would not be efficient.

JH I think the area we're working on is what we call noisy intermediate-scale quantum (NISQ): looking for applications that can run on smaller machines, because in the short to medium term these computers will not be able to solve the big problems. The hunt for applications is underway, but there is no killer app yet (a breakthrough application that drives a technology's adoption).

DP And then there is the matter of the extreme cold at which they must operate. Nor will they do much for machine learning, since it is difficult to get data into these devices.

Q. Do you think we are too dependent on computers and technology in general today?

JH I think we might be. But the fix is simple: turn off notifications and don't check your mobile every five minutes. That would make for a healthier lifestyle. Of course, it requires some discipline.

DP I am part of the television generation. I grew up with it, and some parents let their kids watch whatever they want, whenever they want. Not mine; they put restrictions on me. The same goes for technology: if you let your children use the internet and technology in general all the time, you are probably not raising them well. You're not going to have a balanced life, especially during the pandemic, if all you do is watch Netflix. Or play video games…

JH It's one thing to do your job, especially during the pandemic… and thanks to computers; without them, I wouldn't have a job to begin with. Technology is empowering, but it is also addictive and alluring. It's dangerous to overuse it, but I don't know what the solution is.

DP There is a famous computer visionary named Alan Kay. Forty years ago he had a revolutionary idea called the Dynabook. His definition was that "the computer will be so important that if you leave it at home, you will have to turn around and go get it." What he was describing, we now see, was the mobile phone. It's a critical technology that we use constantly. But I can't fault the people who create products that people love to use. Scientists are warned to be careful what they wish for because of the consequences, but this is more a matter of popularity.

Q. Do you think we should limit what machines are able to learn to do?

DP My colleague Stuart J. Russell is one of the great promoters of artificial intelligence and has written some of the most famous books on the subject. He is among those who began drawing up rules about what computers may do before they become sentient. And although that's not my field, I believe the danger is still far off: perhaps a century or more away. I am already in awe of what machines can do, especially driverless cars: when that technology arrives, hundreds of billions now lost to road accidents will be saved. It will be as important as the emergence of the internet.

JH That doesn't mean you shouldn't worry about technology. All technologies have good and bad uses. Artificial intelligence can become a very powerful weapon, and there should be an international agreement that it never be used that way. We need to make those deals. And, of course, there will also be economic disruptions, just as with the Industrial Revolution. But new opportunities will be created, too.
