Season 1 Episode 116
“I think there’s a moral question that one has to ask in general about whether it’s appropriate for a machine to make a decision as to whether or not a human ought to live or die.”
[Editor’s Note: As observed in TRADOC Pamphlet 525-92, The Operational Environment 2024-2034: Large-Scale Combat Operations:
“The increase in the production, employment, and success of uncrewed systems means the Army can expect to encounter these systems across the breadth and depth of LSCO.”
Contemporary conflicts in Ukraine and the Middle East have witnessed the burgeoning use of autonomous weapons, empowering less powerful states (e.g., Ukraine) and non-state actors (e.g., the Houthi Movement in Yemen) to conduct asymmetric strikes against nations with more robust military capabilities (i.e., Russia and Israel, respectively). These capabilities are transforming warfighting in both the air/land and land/sea littorals, eroding and possibly negating traditional concepts of air and naval superiority. The battlefield successes achieved with these autonomous technologies have driven their rapid proliferation around the globe, with Transnational Criminal Organizations (TCOs) like the Jalisco New Generation Cartel (CJNG) effectively employing armed Unmanned Aerial Vehicles (UAVs) against their criminal competitors and the Mexican authorities alike.
In the ongoing race to develop more effective (read: more lethal) combat systems capable of overcoming kinetic and electromagnetic countermeasures, some nations are integrating Artificial Intelligence (AI) and Machine Vision (MV) with Lethal Autonomous Weapons Systems (LAWS), in essence removing human operators from in or on the observe-orient-decide-act (OODA) loop. U.S. policy on LAWS is documented in DoD Directive 3000.09, Autonomy in Weapon Systems, which includes the following statement:
“Autonomous and semi-autonomous weapon systems will be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force.”
Per the Congressional Research Service’s Defense Primer: U.S. Policy on Lethal Autonomous Weapon Systems:
“U.S. policy does not prohibit the development or employment of LAWS. Although the United States is not known to currently have LAWS in its inventory, some senior military and defense leaders have stated that the United States may be compelled to develop LAWS if U.S. competitors choose to do so. At the same time, a growing number of states and nongovernmental organizations are appealing to the international community for regulation of or a ban on LAWS due to ethical concerns.”
Today’s episode of The Convergence podcast features Dr. Mark Bailey, Department Chair, Cyber Intelligence and Data Science, National Intelligence University, exploring the tension between the rapid convergence of AI and battlefield autonomy and our national values requiring transparency and oversight in our use of lethal force.]