Stephen Hawking says "superhumans" will take over, AI is a threat, and humans will conquer space – Quartz




The latest writings of the late physicist Stephen Hawking predict that a race of superhumans will take over, having used genetic engineering to surpass their fellow humans.

In Brief Answers to the Big Questions, which will be published on October 16 and is excerpted today in the UK's Sunday Times (paywall), Hawking doesn't pull punches on topics such as machines taking over, the greatest threat to Earth, and the possibilities of intelligent life in space.

Artificial intelligence

Hawking issues a serious warning about the importance of regulating AI, noting that "in the future, AI could develop a will of its own, a will that is in conflict with ours." A possible arms race around autonomous weapons should be stopped before it can begin, he writes, wondering what would happen if a crash similar to the 2010 stock-market Flash Crash occurred with weapons. He continues:

In short, the advent of super-intelligent AI would be either the best or the worst thing ever to happen to humanity. The real risk with AI isn't malice, but competence. A super-intelligent AI will be extremely good at accomplishing its goals, and if those goals aren't aligned with ours, we're in trouble. You're probably not an evil ant-hater who steps on ants out of malice, but if you're in charge of a hydroelectric green-energy project and there's an anthill in the region to be flooded, too bad for the ants. Let's not place humanity in the position of those ants.

Earth's bleak future, gene editing, and superhumans

The bad news: at some point in the next 1,000 years, nuclear war or environmental disaster will "cripple Earth." By then, however, humans will have found a way to escape Earth and so survive the disaster. The planet's other species probably won't make it.

The humans who escape Earth will likely be new "superhumans" who have used gene-editing technology such as CRISPR to surpass their fellow humans. They will do so by flouting laws against genetic engineering to improve their memory, disease resistance, and life expectancy, he says.

Hawking seems oddly enthusiastic about this last point, writing: "There is no time to wait for Darwinian evolution to make us more intelligent and better natured."

Once such superhumans appear, there will be significant political problems with unimproved humans, who won't be able to compete. Presumably, they will die out or become unimportant. Instead, there will be a race of self-designing beings who are improving themselves at an ever-increasing rate. If the human race manages to redesign itself, it will spread out and colonize other planets and stars.

Intelligent life in space

Hawking acknowledges that there are various explanations for why intelligent life either hasn't been found or hasn't visited Earth. His predictions here aren't as bold, but his preferred explanation is that humans have "overlooked" forms of intelligent life that are out there.

Does God exist?

No, says Hawking.

The question is: was the way the universe began chosen by God for reasons we can't understand, or was it determined by a law of science? I believe the second. If you like, you can call the laws of science "God," but it wouldn't be a personal God that you would meet and put questions to.

The biggest threats to Earth

The first threat is an asteroid collision, like the one that killed the dinosaurs. However, "we have no defense" against that, writes Hawking. More immediately: climate change. "A rise in ocean temperature would melt the ice caps and cause the release of large amounts of carbon dioxide," Hawking writes. "Both effects could make our climate like that of Venus, with a temperature of 250°C."

The best idea that humanity could implement

Nuclear fusion power. It would give us clean energy with no pollution and no global warming.
