Back in 1947 Einstein said something like “the fourth world war will be fought by throwing rocks” (there are a few different quotes attributed to Einstein on this point and no consensus on the precise wording he used). The point he was making is that a third world war would be so catastrophic for humankind that the few survivors would be thrown back to the Stone Age.
The statement was made amid the sensation created by the atomic bomb and its potential to wipe out humanity. Today’s concerns are no longer about the atomic bomb, which after more than 60 years has proved to be such a strong deterrent that no one is willing to risk its use, but about losing control of weapons: not because they might be hijacked by some terrorist organisation, but because they are becoming “autonomous”.
There are very concrete reasons to build autonomous weapons: the reaction time of an autonomous weapon can be measured in milliseconds, whereas if you place a human being in the control loop that time balloons to many seconds at the very least (probably minutes, or even hours, if the command chain has to reach top officials or the country’s president). In that time an enemy using autonomous systems will already have destroyed your army or your country.
Hence the race to build weapons capable of autonomous decision making.
A number of scientists oppose this evolution and are calling for a worldwide ban. They point out that atomic weapons are bad, but autonomous systems might be worse. They can be equipped with any kind of weaponry, including atomic bombs, and being autonomous they are basically “uncontrollable”. One of the points is that they are uncontrollable by design, precisely so that an enemy cannot hijack them. Artificial intelligence brings with it a sort of unpredictability, which derives from its complexity and is a fundamental component of “intelligence”. Intelligence is, sooner or later, going to surprise you; it is not a mechanistic behaviour. Clearly, being surprised by a witty conversation is quite different from being surprised by a drone aiming at you!
There is also another aspect that many people find deeply disturbing: the idea that we are delegating life and death decisions to a machine.
Sometimes humans become insane, and consequently behave in an… insane way. Could a bug in a machine, or even a bullet hitting it, turn the machine insane? I don’t know the answer, but I would not rule it out.
It is not just about weapons. Hammers can cause, and have caused, damage, from hitting your own thumb to harming a co-worker. In those cases, however, the harm was the result of our own careless use of the hammer. We have only ourselves to blame.
What about exoskeletons? They multiply our strength, and to be worn seamlessly, to become a symbiotic extension of our own body, they need to have intelligence, to be, in a way, autonomous. In turn, they may also become dangerous, far more dangerous than a hammer. Who is to blame if something goes awry?
We are not there yet, but we had better start looking seriously at these issues.