A chilling statement in the Bulletin of the Atomic Scientists presents yet another example of the dangers of pursuing technology without considering its consequences. Autonomous weapons use artificial intelligence to chart their course and choose their targets. No human intervention is involved. Once deployed, the machine makes all the decisions. In the race for increasingly sophisticated weapons of war, nations, including the United States, China, and Russia, are pursuing AI-controlled drones and robots with abandon. Even Turkey is developing such weapons.
In a March 15, 2022, article about rumors that Russia has used autonomous drones in its war with Ukraine, Zachary Kallenborn observes, “Thousands of artificial intelligence researchers have also signed a pledge by the Future of Life Institute against allowing machines to take human life. These concerns are well-justified. Current artificial intelligence is particularly brittle; it can be easily fooled or make mistakes. For example, a single pixel can convince an artificial intelligence that a stealth bomber is a dog. A complex, dynamic battlefield filled with smoke and debris makes correct target identification even harder, posing risk to both civilians and friendly soldiers.”
“…allowing machines to take human life…”
Think about that. If an invading force can substantially reduce its own loss of life by letting autonomous machines make split-second decisions about who lives or dies among those it targets, humanity’s compassion and common decency are lost. With autonomous killers, the human costs of war tilt in favor of the army with the better robots and drones. Since few countries can afford to build the technology such killers require, a new arms race is in the offing. Imagine how differently Russia might have waged its invasion of Ukraine if an autonomous army could have spared it the loss of thousands of its soldiers. As it is, Russia’s leaders appear to have no regard for the lives of their own soldiers. But if they could spare themselves the political cost of body bags coming home from the front for the families of dead soldiers to see, it doesn’t take a rocket scientist to realize that autonomous killers become an attractive alternative battle plan.
“…a single pixel can convince an artificial intelligence that a stealth bomber is a dog…”
Think about that, too.
It translates to indiscriminate killing. It also means that deployment of truly autonomous weapons may still be years away, at least until the “glitches” can be ironed out. But make no mistake: very smart scientists are working on solutions every day, and they will eventually find them.
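For readers curious how a single pixel can flip a classifier’s answer, here is a deliberately tiny sketch. It is not any real targeting system or the attack Kallenborn describes; it is a toy two-class linear classifier with invented weights and an invented image, built only to show the principle that a small change to one input can change the output entirely.

```python
import numpy as np

# Invented weights for a toy classifier scoring two classes over a
# flattened 3x3 "image". Real image classifiers are deep networks,
# but the brittleness principle illustrated here is the same.
w = np.array([
    [1.0, 0.5, -0.2, 0.3, 0.8, -0.1, 0.4, 0.2, 0.1],  # class "bomber"
    [0.9, 0.5, -0.2, 0.3, 0.8, -0.1, 0.4, 0.2, 2.0],  # class "dog"
])
labels = ["bomber", "dog"]

# An invented input image (9 pixel intensities).
image = np.array([1.0, 0.2, 0.0, 0.5, 0.7, 0.1, 0.3, 0.4, 0.0])

def classify(x):
    # Pick whichever class has the higher score w @ x.
    return labels[int(np.argmax(w @ x))]

print(classify(image))  # -> bomber

# "One-pixel attack": nudge the pixel where the two classes'
# weights differ most, so a tiny change swings the scores.
diff = w[1] - w[0]            # how strongly each pixel favors "dog"
pixel = int(np.argmax(diff))  # the most sensitive pixel
attacked = image.copy()
attacked[pixel] += 0.2        # a small change to a single pixel

print(classify(attacked))  # -> dog
```

The classifier was never “wrong” by its own logic; its decision boundary was simply close enough that one well-chosen pixel pushed the input across it. Adversarial attacks on real networks exploit the same kind of sensitivity, just in far higher dimensions.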
So in the fog of war, AI weapons and autonomous soldiers should give us pause. But it doesn’t stop there. Companies throughout the world are pursuing AI applications to replace as many human tasks as possible. Autonomous cars. Autonomous assembly lines. Autonomous surgical procedures. The list goes on and on. All of this proceeds with virtually no oversight. The advances are made without serious consideration of the consequences, both intended and unintended.
In my novel Dragon on the Far Side of the Moon, the United States and China face off over the colonization of the Moon. AI plays a pivotal role in the conflict. As I read today’s headlines, I can’t help but wonder when the fictional tales I weave will become the realities we all will face.
Think about that.