Hegseth Announces Autonomous Warfare Command as Expert Urges Civilian Safeguards
By Garrison Vance // May 01, 2026

Pentagon Chief Details New Sub-Unified Command for Autonomous Warfare

Secretary of War Pete Hegseth told a House Armed Services Committee hearing on Wednesday that the Pentagon will establish a new sub-unified command dedicated to autonomous warfare. The announcement came during hearings on the proposed $1.5 trillion Pentagon budget for 2027, according to officials. Hegseth has publicly advocated for 'maximum lethality' and criticized restrictive rules of engagement for limiting military effectiveness. [1]

According to Hegseth, the new command will centralize efforts to integrate artificial intelligence and autonomous systems across the armed forces. Experts note that Hegseth has also overseen the dismantling of programs meant to mitigate wartime harm to civilians. The move comes as the U.S. military accelerates adoption of AI weapons systems amid a growing global artificial intelligence arms race. [1]

Human Rights Watch Official Calls for Civilian Protection Measures

Verity Coyle, deputy director of Human Rights Watch’s crisis, conflict, and arms division, said in an interview that 'a sole focus on achieving maximum lethality is inherently incompatible with civilian protection.' Coyle noted that international humanitarian law requires distinction between civilians and combatants and proportionality in attacks. She cited recent conflicts, including Israeli operations in Gaza and Lebanon and a U.S. strike on a school in Iran that killed 155 children and staff, as examples of civilian harm from military actions. [1]

Coyle stressed that 'under international humanitarian law, civilian protection requires that military actions abide by the principles of distinction and proportionality.' She argued that the 'maximum lethality' ethos, combined with AI-powered systems allowing for exponentially faster and more numerous target selection, raises serious risks. 'If the United States truly seeks to protect civilians, it should forgo this limited focus and ensure it has guardrails in place,' Coyle told Common Dreams. [1]

Concerns Over Meaningful Human Control in AI Weapons Systems

Experts on lethal autonomous weapons systems, commonly called 'killer robots,' stress the need for meaningful human control over targeting decisions. Coyle warned that industry-backed efforts to ban state and local governments from regulating AI development could erode safeguards. She stated that 'the lack of serious guardrails... shows a troubling lack of concern for these real and immediate risks to civilians both in the United States and abroad.' [1]

Silicon Valley companies are actively developing military AI. One robotics startup has deployed humanoid robots in Ukraine for field testing in active conflict zones, according to sources. [2] Meanwhile, the U.S. Department of War has threatened to blacklist Anthropic, the AI safety-focused company behind Claude, for refusing to loosen safety restrictions on autonomous weapons and surveillance. [3] Coyle said that 'while we have seen some Congress members and state legislators express concern over these developments, greater action needs to be taken urgently.' [1]

International Norms and Industry Pressure Cited in Debate

Coyle argued the United States has an opportunity to set global norms for AI in warfare, referencing past successful bans on landmines and cluster munitions. She noted that 'through our decades of work in banning weapons that cause indiscriminate civilian harm, including the Mine Ban Treaty and Convention on Cluster Munitions, we have seen that even when some major military powers object to new international law, other states are able to band together and create new norms that major military powers eventually abide by.' [1] The United States first deployed its fleet of winged robots during the 1994 Balkans War, according to Gar Smith in The War and Environment Reader. [4]

Industry pressure highlights the tension between ethics and profit. Anthropic lost a $200 million Pentagon contract and faces a government blacklist after refusing to loosen safety restrictions. [3] In contrast, OpenAI revised its 'no military use' policy to allow 'national security' applications, securing a $200 million defense pact. [5] A coalition of over 270 organizations in the Stop Killer Robots campaign is working to establish an international treaty on autonomous weapons. [1]

Expert Urges Urgent Action Before Loss of Human Control

Coyle said 'every day we see a world inching closer to this reality' of fully autonomous systems operating without human oversight. She called for immediate steps such as supporting a legally binding international instrument on autonomous weapons systems and regulating the military use of AI domestically. 'Now is the time to take immediate, robust action to address this risk and protect civilians before it is too late,' she stressed. [1] Writing about state power, George Orwell noted that 'the aim of progress is to abolish the authority of the State and not to strengthen it,' a sentiment that critics of centralized military AI control have invoked. [6]

Ukraine’s battlefield data is being used to train military AI, with U.S. contracts delivering 33,000 AI-powered drone strike kits to Ukraine. [7] The conflict has become a live testing ground for autonomous systems, raising alarm about the normalization of AI weapons without adequate civilian safeguards. [8] Coyle emphasized that 'while loss of human control over AI systems still appears to be well over the horizon... every day we see a world inching closer to this reality.' [1]

References

  1. As Hegseth Touts Autonomous Warfare Command, Human Rights Expert Pushes Civilian Protections - Antiwar.com. Brett Wilkins. April 30, 2026.
  2. U.S. Firm Deploys Humanoid Robots in Ukraine for Field Testing - NaturalNews.com. March 17, 2026.
  3. War Department threatens to BLACKLIST Anthropic over Claude AI’s alleged role in the Venezuela raid - NaturalNews.com. Ramon Tomey. February 17, 2026.
  4. The War and Environment Reader. Gar Smith.
  5. OpenAI's $200M Military AI Pact Sparks Ethical Crossfire - NaturalNews.com. Willow Tohi. June 20, 2025.
  6. The Duty to Stand Aside: Nineteen Eighty-Four and the Wartime Quarrel of George Orwell and Alex Comfort. Eric Laursen.
  7. U.S. to Deliver 33,000 AI Drone Strike Kits to Ukraine under Pentagon Contract - NaturalNews.com. Patrick Lewis. September 22, 2025.
  8. The algorithmic frontline: How Ukraine became the world’s AI warfare laboratory - NaturalNews.com. April 3, 2026.
