Scientists call for ban on AI-controlled killer robots





Killer robots are coming for us all.

But instead of a T-1000 spitting out memorable one-liners, autonomous weapon systems such as drones could, given enough time, replace soldiers on the battlefield entirely.

But is it ethical to let artificial intelligence decide to take a human life?

Autonomous Weapons

Experts gathered at the meeting of the American Association for the Advancement of Science in Washington DC this week gave a clear answer: a categorical no. The group – which includes ethics professors and human rights advocates – is calling for a ban on the development of AI-controlled weapons, the BBC reports.

"We are not talking about end-robots that are walking, talking, about to conquer the world; Our concern is much more imminent: conventional weapons systems with autonomy, "said Mary Wareham, director of the Human Rights Watch outreach campaign, BBC.

Human rights

Another big question is: who is responsible when a machine decides to take a human life? Is it the person who built the machine?

"The delegation of power to kill a machine is not justified and constitutes a violation of human rights, because the machines are not moral agents and therefore can not be held to make decisions of life or death, "said Peter Asaro, associate professor at the New York School of New York says BBC.

But not everyone is willing to denounce AI-controlled weapons systems outright. The United States and Russia were among the countries that opposed a total ban on autonomous weapons after a week of negotiations in Geneva in September.

