If reading the headline made you think of the 3 laws of robotics, we can set that aside: it was an accident, and the robot did not intend to hurt the humans it was working with at the Amazon warehouse in Robbinsville, New Jersey. According to ABC News, the robot accidentally punctured a 255 g can of bear repellent containing concentrated capsaicin, the ingredient used to produce pepper spray. The fumes spread through the entire building and affected 54 people on site, of whom 30 were released after receiving treatment on the spot, while 24 had to be hospitalized, one of them in critical condition.
A spokeswoman for Amazon issued a statement: "At our Robbinsville distribution center, the safety of our employees is our top priority, so we moved all employees to a safe place, and employees with symptoms were taken to local hospitals for evaluation and treatment."
Meet Kiva Robots
Amazon.com has an army of Kiva robots operating in 15 merchandise distribution centers across the US. They are low, rectangular units weighing approximately 145 pounds and painted orange. Each Kiva robot can carry a load equivalent to the weight of a small car, and its shape allows it to slide under the racks that hold the products, carrying them from one side of the warehouse to the other.
The robots travel the aisles laden with goods at a maximum speed of 4.7 km/h and take a five-minute rest period every hour to recharge their batteries (literally). In total, there are 80,000 Kiva robots operating around the world.
Do not forget the 3 laws of robotics
The 3 laws of robotics, or principles, were devised by writer Isaac Asimov as a way of ensuring peaceful coexistence between robots and humans. The principles are meant to be used in the creation and programming of robots to control and limit their behavior. The laws were first presented in an Asimov short story titled Runaround, published in 1942.
These are:
1st Law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2nd Law: A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3rd Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
In 1986, anticipating further problems for mankind, Asimov introduced the Zeroth Law in the fifth book of the Foundation series, entitled Foundation and Earth.
The Zeroth Law takes precedence over all the others and states: "A robot may not harm humanity, or, by inaction, allow humanity to come to harm."