Ever wonder what would happen if AI had its mechanical fingers on the nuclear button? Sounds rather alarming, right? Well, it’s not out of the realm of possibility, hence the recent introduction of the “Block Nuclear Launch by Autonomous Artificial Intelligence Act of 2023”.
The bipartisan and bicameral legislation is designed to safeguard the nuclear command and control process from any future change in policy that allows artificial intelligence to make nuclear launch decisions.
Currently, the Department of Defense requires that a human be in the loop “for all actions critical to informing and executing decisions by the President to initiate and terminate nuclear weapon employment in all cases”. The new legislation would codify the Department’s existing policy and ensure that no federal funds can be used for any launch of any nuclear weapon by an automated system without meaningful human control.
Senator Ed Markey (D-MA), who introduced the Senate version of the bill, commented, “As we live in an increasingly digital age, we need to ensure that humans hold the power alone to command, control, and launch nuclear weapons – not robots.”
Rep. Ken Buck (R-CO), who co-sponsored the House version of the bill, stated, “While U.S. military use of AI can be appropriate for enhancing national security purposes, use of AI for deploying nuclear weapons without a human chain of command and control is reckless, dangerous, and should be prohibited.”
To read the complete Senate version of the bill, click HERE. For the House version, click HERE.