China recruits high school students to build AI robots for the army



A group of some of the smartest students in China has been recruited straight out of high school to begin training as the country's youngest AI weapons scientists.

The 27 boys and four girls, all aged under 18, were selected from more than 5,000 candidates for the "Experimental Program of Intelligent Weapons Systems" at the Beijing Institute of Technology (BIT), the school said on its website.

BIT is one of the country's leading weapons research institutes, and the launch of the new program demonstrates the importance it attaches to the development of AI technology for military use.

China is competing with the United States and other countries to develop deadly applications of artificial intelligence – from nuclear submarines with self-learning chips to microscopic robots that can enter human blood vessels.

"These children are all exceptionally brilliant, but being brilliant is not enough," said a ILO professor who participated in the selection process but asked not to be named because of the subject's sensitivity.

"We are looking for other qualities such as creativity, the will to fight, perseverance in the face of challenges," he said. "The passion to develop new weapons is imperative … and they must also be patriotic."

People's Liberation Army soldiers on Tiananmen Square, Beijing, September 2015.
Reuters

According to the program brochure, each student will be mentored by two experienced weapons scientists, one from academia and the other from the defense industry.

After completing a short course of study during the first semester, the students will be asked to choose a specialty area, such as mechanical engineering, electronics or general weapons design. They will then be assigned to a relevant defense laboratory, where they can develop their skills through hands-on experience.

One of the students is Qi Yishen, from the eastern province of Shandong, who said he had been keenly interested in firearms and weapons from a very young age and liked to read books and magazines on the subject.

In addition to being offered an interview for the BIT program, he had also been offered an interview at Tsinghua University, one of China's leading centers of learning, but both interviews were scheduled for the same day.

"When I arrived in Beijing, I dragged myself to the train station for a long time, but then I went to the ILO … I could not resist this attraction," he said. the website of the institute.

He added that his decision was also influenced by his father, who wanted him to work in the defense sector.

On October 28, BIT launched the program at the headquarters of Norinco, one of China's largest defense contractors.

"We are walking in a new direction, doing things no one has done before," student representative Cui Liyuan said in an official statement.

After completing the four-year course, the students are expected to pursue a doctoral program and become the next leaders of China's AI weapons program, the institute said.

Eleonore Pauwels, a researcher in emerging cybertechnologies at the United Nations University Centre for Policy Research in New York, said she was worried by the launch of the BIT course.

"This is the first academic program in the world designed to aggressively and strategically encourage the next generation to think, design, and deploy AI for military research and use."

Paramilitary policemen march past the Great Hall of the People in Beijing in March 2016.
REUTERS / Aly Song

While the United States has similar programs, such as those run by the Defense Advanced Research Projects Agency, they operate in relative secrecy and employ only the cream of established scientists, said Pauwels.

In contrast, the BIT program appears more focused on training the next generation of students to work with AI weapons, she said. "This concept is both extremely powerful and disturbing."

The students will be taught to think of AI as an engine or force powering self-learning, intelligent and automated systems, she said.

This knowledge could also be used alongside other new and existing technologies such as biotechnology, quantum computing, nanotechnology and robotics, which would have "dramatic consequences for security and military domination," said Pauwels.

"Think of swarms of robots capable of releasing harmful toxins into food or biotech supply chains," she said.

With the undergraduate program, "you can imagine that students are starting to think about how to take advantage of the convergence of AI and genetic systems to design and deploy powerful combinations of weapons that can target, with surgical precision, specific populations," she said.

"[It] It could also lead to new forms of warfare, ranging from highly sophisticated automated cyberattacks to what might be called an "Internet of Things of the Battle", where a set of robots and sensors play a role in defense, offensive and intelligence gathering. "

Chinese President Xi Jinping.
Reuters / Pool

When asked to comment on the BIT program, the Chinese Foreign Ministry said the country was actively engaged in the development and application of AI technology in order to serve its economic, social, scientific and technological development.

At the same time, it said it was well aware of the potential problems posed by lethal autonomous weapons systems and encouraged the international community to explore preventive measures.

Indeed, AI offers China a new security arsenal as the country pursues technological progress in its drive to become a world leader.

"The fact that the national AI strategy in China relies on a doctrine of civil-military fusion means that a prototype AI for military use could be co-opted and perverted for surveillance or a harm in the civil context, "said Pauwels.

Stuart Russell, director of the Center for Intelligent Systems at the University of California, Berkeley, called the BIT program "a very bad idea."

"Machines should never be allowed to decide to kill human beings, such weapons quickly become weapons of mass destruction, and they increase the risk of war," he said.

"I hope all these students will start their class watching the movie Slaughterbots."

He was referring to a seven-minute film screened at a United Nations arms control convention in Geneva last year, which depicts a troubling future in which swarms of low-cost drones slaughter human beings like livestock using artificial intelligence technology such as facial recognition.

In April, the Chinese government submitted a position paper to the UN on the use of AI weapons.

"As products of advanced emerging technologies, the development and use of lethal autonomous weapon systems would reduce the threshold of war and the cost of war for user countries. trigger more easily and more frequently wars, "Beijing said, calling for more discussion.

"Until such discussions have taken place, there should be no pre-established premise or detrimental results that could hamper the development of AI technology," he said.
