Article: The Killer Robots Are Here
Another grim milestone in the evolution of warfare technology has passed, according to United Nations Security Council Report S/2021/229. This 548-page UN report, authored by a panel of experts on Libya, disclosed the first known battlefield use of fully autonomous attack drones against human beings. The waging of war has deeply shaped the human experience. From sticks and rocks thousands of years ago to guns, bombs, and missiles today, combatants have exploited the latest technology to gain an advantage in battle. Mechanization, aviation, ballistics, and more have altered the battlefield with each successive technological advance. So it stands to reason that the advent of artificial intelligence will again alter the waging of war.
The UN report to the Security Council describes an armed conflict in Libya between the national forces of the Government of National Accord and armed groups affiliated with Khalifa Haftar. Haftar has a long history in Libya as a general, warlord, and politician. The report details fighting from 2019 to 2020 and explicitly notes the following:
"Logistics convoys and retreating HAF [Haftar Affiliated Forces] were subsequently hunted down and remotely engaged by the unmanned combat aerial vehicles or the lethal autonomous weapons systems such as the STM Kargu-2 (see annex 30) and other loitering munitions. The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true 'fire, forget and find' capability." (S/2021/229)
The acronym LAWS stands for Lethal Autonomous Weapons Systems. In short, the UN report says that the Government of National Accord of Libya deployed autonomous killer drones, or LAWS, to attack HAF rebels in the field. Such a deployment represents a first in using artificial intelligence to attack humans without a human in the loop to pull the trigger. The STM Kargu-2 is a Turkish four-propeller drone that looks like the many drones used by hobbyists and photographers for aerial videos of buildings and events. However, the Kargu-2 carries weapons as well as an onboard camera. The language in the report does not indicate whether the Kargu-2s successfully eliminated HAF combatants, but it signals that AI-driven machines have crossed the line from theoretical to actual threats.
On September 12, 2018, Members of the European Parliament voted in favor of a resolution calling for an international ban on lethal autonomous weapons systems (LAWS), popularly referred to as "killer robots." They defined a lethal autonomous weapon as any machine that can take action to kill without a human approving the decision.
Many analysts and watchdog agencies have closely monitored the current war between Russia and Ukraine for the use of LAWS. For example, a recent article in the Christian Science Monitor titled "War Ethics: Are Drones in Ukraine a Step Toward Robots That Kill?" determined that the conflict has not yet employed autonomous killer robots, but the threat remains real.
Advances in artificial intelligence and robotics take armed robots out of the realm of science fiction and make them a reality today. The heavy use of drones in the war against terrorism in the Middle East still involves human operators, but the capability to make those drones independent appears imminent. The vision of a new, escalating arms race to build killer robots that can be sent into battle and fight without human intervention poses a genuine ethical dilemma. Suppose some countries choose to ban the development and use of killer robots but others do not. In that case, the sons and daughters of one country will have to fight and die in battle against invading robots while the aggressor's children stay at home, out of harm's way, letting their robots wage war.
The European Parliament's members voted to promote an international ban on lethal autonomous weapons systems (LAWS). Because of significant advances in artificial intelligence and robotics, the specter of killer robots looms large, and it does not take much to make the jump from autonomous vehicles such as cars and planes to weaponized vehicles and robots. The report of Libyan forces using the Kargu-2 autonomous drone to attack rebels under the command of warlord Khalifa Haftar signals a first in using LAWS against enemy combatants. To date, Russia, South Korea, the United States, Israel, and Turkey have indicated that they will not participate in a ban on killer robots. Until all countries come to the table and agree to ban lethal autonomous weapons, nation-states will need to prepare for a new era in warfare.
Dr. Smith's career in scientific and information research spans bioinformatics, artificial intelligence, toxicology, and chemistry. He has published a number of peer-reviewed scientific papers, and over the past seventeen years he has developed advanced analytics, machine learning, and knowledge management tools to enable research and support high-level decision making. He completed his Ph.D. in Toxicology at Cornell University and a Bachelor of Science in Chemistry at the University of Washington.
You can buy his book on Amazon in paperback and Kindle format here.