<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Topic: Gaza conflict &#8212; Global Security Review</title>
	<atom:link href="https://globalsecurityreview.com/subject/gaza-conflict/feed/" rel="self" type="application/rss+xml" />
	<link>https://globalsecurityreview.com/subject/gaza-conflict/</link>
	<description>A division of the National Institute for Deterrence Studies (NIDS)</description>
	<lastBuildDate>Tue, 31 Mar 2026 10:44:13 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://globalsecurityreview.com/wp-content/uploads/2023/10/cropped-GSR-Banner-LogoV2-32x32.png</url>
	<title>Topic: Gaza conflict &#8212; Global Security Review</title>
	<link>https://globalsecurityreview.com/subject/gaza-conflict/</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Lethal Autonomous Weapon Systems: A New Battlefield Reality</title>
		<link>https://globalsecurityreview.com/lethal-autonomous-weapon-systems-a-new-battlefield-reality/</link>
					<comments>https://globalsecurityreview.com/lethal-autonomous-weapon-systems-a-new-battlefield-reality/#respond</comments>
		
		<dc:creator><![CDATA[Jawad Ali Shah]]></dc:creator>
		<pubDate>Tue, 31 Mar 2026 12:10:24 +0000</pubDate>
				<category><![CDATA[AI & Deterrence]]></category>
		<category><![CDATA[Archive]]></category>
		<category><![CDATA[Emerging Threats]]></category>
		<category><![CDATA[Strategic Adversaries]]></category>
		<category><![CDATA[AI]]></category>
		<category><![CDATA[algorithmic bias]]></category>
		<category><![CDATA[artificial intelligence]]></category>
		<category><![CDATA[autonomous weapons]]></category>
		<category><![CDATA[civilian casualties]]></category>
		<category><![CDATA[confidence-building measures]]></category>
		<category><![CDATA[emerging military technologies]]></category>
		<category><![CDATA[false nuclear alarms]]></category>
		<category><![CDATA[full autonomy]]></category>
		<category><![CDATA[Gaza conflict]]></category>
		<category><![CDATA[global peace and security]]></category>
		<category><![CDATA[human out of the loop]]></category>
		<category><![CDATA[IHL]]></category>
		<category><![CDATA[international humanitarian law]]></category>
		<category><![CDATA[international verification mechanisms]]></category>
		<category><![CDATA[LAWS]]></category>
		<category><![CDATA[Lethal Autonomous Weapon Systems]]></category>
		<category><![CDATA[Meaningful Human Control]]></category>
		<category><![CDATA[MHC]]></category>
		<category><![CDATA[Military Technology]]></category>
		<category><![CDATA[moratorium]]></category>
		<category><![CDATA[Nuclear Deterrence]]></category>
		<category><![CDATA[Resolution A/RES/79/62]]></category>
		<category><![CDATA[South Asian nuclear deterrence]]></category>
		<category><![CDATA[strategic stability]]></category>
		<category><![CDATA[strategic stability dynamics]]></category>
		<category><![CDATA[Ukraine conflict]]></category>
		<category><![CDATA[UN CCW]]></category>
		<category><![CDATA[UN Convention on Certain Conventional Weapons]]></category>
		<category><![CDATA[UN General Assembly]]></category>
		<guid isPermaLink="false">https://globalsecurityreview.com/?p=32504</guid>

					<description><![CDATA[<p>Published: March 31, 2026. Technological advances and rising military expenditures in recent years have accelerated the development of Lethal Autonomous Weapon Systems (LAWS). Though this technology is still in its infancy, it has already transformed modern warfare. LAWS, when fully evolved, will provide means for the precise and independent selection and engagement of targets without exposing soldiers to [&#8230;]</p>
<p><a href="https://globalsecurityreview.com/lethal-autonomous-weapon-systems-a-new-battlefield-reality/">Lethal Autonomous Weapon Systems: A New Battlefield Reality</a> was originally published on <a href="https://globalsecurityreview.com">Global Security Review</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p><i><span data-contrast="auto">Published: </span></i><span data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}"> March 31, 2026</span><span data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}"> </span></p>
<p><span data-contrast="auto">Technological advances and rising military expenditures in recent years have accelerated the development of Lethal Autonomous Weapon Systems (LAWS). Though this technology is still in its infancy, it has already transformed modern warfare. LAWS, when fully evolved, will provide means for the precise and independent selection and engagement of targets without exposing soldiers to battlefield dangers. A 2025 Congressional Research Service report titled </span><a href="https://www.congress.gov/crs-product/IF11150"><span data-contrast="none">Defense Primer: U.S. Policy</span></a><span data-contrast="auto"> on LAWS classifies them as “a special class of weapon systems that use sensor suites and computer algorithms to independently identify a target and employ an onboard weapon system to engage and destroy the target without manual human control of the system.” The US Department of Defense </span><a href="https://www.esd.whs.mil/portals/54/documents/dd/issuances/dodd/300009p.pdf"><span data-contrast="none">Directive 3000.09, Autonomy in Weapon Systems (2023)</span></a><span data-contrast="auto">, defines LAWS as systems that, once activated, “can select and engage targets without further intervention by a human operator.” This concept, known as “human out of the loop” or “full autonomy,” involves target selection and engagement based on inputs from artificial intelligence (AI), big data analytics, and sensor-based identification.</span><span data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}"> </span></p>
<p><span data-contrast="auto">According to </span><a href="https://www.datamintelligence.com/research-report/autonomous-weapons-market"><span data-contrast="none">Data M Intelligence</span></a><span data-contrast="auto">, the global autonomous weapons market reached USD 14.2 billion in 2024 and is expected to grow to USD 33.47 billion by 2032, a compound annual growth rate of 11.39 percent over 2025-2032. At the same time, global civil society initiatives are advocating a ban on fully autonomous systems. In October 2012, Amnesty International launched the </span><a href="https://www.stopkillerrobots.org/stop-killer-robots-x-amnesty-international/"><span data-contrast="none">Stop Killer Robots</span></a><span data-contrast="auto"> campaign, an alliance of over 180 organizations across 65 countries, calling for an international law on autonomy in weapon systems to ensure that machines are not allowed to make life-and-death decisions.</span><span data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}"> </span></p>
<p><span data-contrast="auto">Concerns have arisen over unsupervised use and the potential for system errors that can cause unintended civilian casualties, escalate conflicts, and threaten global peace and security. The increasing integration of autonomous weapon systems in combat has already been highlighted by their reported use in the Ukraine conflict and in Gaza. A February 2025 </span><a href="https://media.setav.org/en/file/2025/02/deadly-algorithms-destructive-role-of-artificial-intelligence-in-gaza-war.pdf"><span data-contrast="none">report</span></a><span data-contrast="auto"> by the Foundation for Political, Economic and Social Research titled </span><i><span data-contrast="auto">Deadly Algorithms: Destructive Role of Artificial Intelligence in Gaza War</span></i><span data-contrast="auto"> revealed that Israel employed AI-based systems, Lavender and Habsora, to identify and attack human targets. The report states that Lavender can approve targets within 20 seconds, often without substantive human review. Since October 2023, the system has compiled a list of 37,000 individuals labeled as potential Hamas members without verifying their military status.</span><span data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}"> </span></p>
<p><span data-contrast="auto">Since 2014, the United Nations Convention on Certain Conventional Weapons (UN CCW) has debated the regulation of LAWS. In May 2024, Arms Campaign Director Steve Goose of Human Rights Watch </span><a href="https://www.dailymail.co.uk/news/article-13374209/chinese-russian-ai-nukes-ww3-fears-missiles-america.html"><span data-contrast="none">warned</span></a><span data-contrast="auto"> that “the world is approaching a tipping point for acting on concerns over autonomous weapons systems,” underscoring the urgency of an international legal instrument. On 2 December 2024, the UN General Assembly adopted </span><a href="https://documents.un.org/doc/undoc/gen/n24/391/35/pdf/n2439135.pdf"><span data-contrast="none">Resolution A/RES/79/62</span></a><span data-contrast="auto"> on LAWS by 166 votes in favor, 3 against, and 15 abstentions. The resolution marked a decisive step in acknowledging global concerns over autonomous weapon systems, affirmed the applicability of international humanitarian law (IHL), and called for further consultations in 2025. The </span><a href="https://www.stopkillerrobots.org/news/dynamic-consultations-demonstrate-a-clear-need-for-all-states-to-have-a-seat-at-the-table/"><span data-contrast="none">first UNGA meeting</span></a><span data-contrast="auto"> on autonomous weapons, held on 12-13 May 2025 and attended by 96 countries, along with representatives of the International Committee of the Red Cross (ICRC) and civil society, reinforced momentum to prohibit and regulate LAWS. On that occasion, UN Secretary-General António Guterres advocated for a legally binding instrument to </span><a href="https://www.stopkillerrobots.org/news/un-secretary-general-calls-for-new-international-law-to-regulate-and-prohibit-killer-robots-by-2026/"><span data-contrast="none">ban LAWS by 2026</span></a><span data-contrast="auto">, describing them as “politically unacceptable and morally repugnant.”</span><span data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}"> </span></p>
<p><span data-contrast="auto">Despite global concerns, progress on a legally binding treaty on LAWS remains elusive due to the divergent strategic interests of major powers. The </span><a href="https://www.reuters.com/world/progress-rules-lethal-autonomous-weapons-urgently-needed-says-chair-geneva-talks-2026-03-03/"><span data-contrast="none">US continues to resist</span></a><span data-contrast="auto"> codification of a new binding framework, citing the adequacy of national weapons review mechanisms and seeking to preserve strategic and technological flexibility. While the US maintains that it does not currently possess LAWS, senior military leaders have acknowledged that Washington may be compelled to develop them if adversaries do so. </span><a href="https://www.reuters.com/world/progress-rules-lethal-autonomous-weapons-urgently-needed-says-chair-geneva-talks-2026-03-03/"><span data-contrast="none">Russia has opposed</span></a><span data-contrast="auto"> any binding treaty, while </span><a href="https://un.china-mission.gov.cn/eng/chinaandun/disarmament_armscontrol/202510/t20251024_11739691.htm"><span data-contrast="none">China</span></a><span data-contrast="auto"> supports negotiations within the CCW framework and the development of norms “when conditions are ripe.” </span><a href="http://www.eeas.europa.eu/delegations/un-new-york/eu-statement-%E2%80%93-united-nations-1st-committee-thematic-discussion_en"><span data-contrast="none">The European Union</span></a><span data-contrast="auto">, in contrast, advocates for a legally binding international instrument, emphasizing Meaningful Human Control (MHC) and compliance with IHL. The EU’s approach seeks to differentiate between systems that incorporate human oversight and those that operate without it.</span><span data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}"> </span></p>
<p><span data-contrast="auto">The integration of artificial intelligence into weapon systems also presents a growing challenge to nuclear deterrence and strategic stability. For instance, on the sidelines of the Asia-Pacific Economic Cooperation (APEC) Summit in Peru in November 2024, then-US President Joe Biden and Chinese President Xi Jinping jointly </span><a href="https://www.reuters.com/world/biden-xi-agreed-that-humans-not-ai-should-control-nuclear-weapons-white-house-2024-11-16/"><span data-contrast="none">pledged not to integrate AI</span></a><span data-contrast="auto"> into nuclear command-and-control systems, recognizing the catastrophic risks of automation in nuclear decision-making. However, as AI rapidly improves surveillance, missile guidance, and targeting systems, it is unclear whether this restraint will hold.</span><span data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}"> </span></p>
<p><span data-contrast="auto">The integration of AI into nuclear forces may destabilize deterrence dynamics by compressing decision-making time and amplifying the risk of algorithmic bias in early-warning systems, raising the threat of false nuclear alarms. Cold War history reminds us that human judgment, central to nuclear stability, has averted catastrophe. During the Cuban Missile Crisis, </span><a href="https://nsarchive.gwu.edu/briefing-book/russia-programs/2022-10-03/soviet-submarines-nuclear-torpedoes-cuban-missile-crisis"><span data-contrast="none">the B-59 submarine incident</span></a><span data-contrast="auto"> of 27 October 1962 brought the two superpowers close to a nuclear exchange when a Soviet submarine commander considered launching a nuclear-tipped torpedo under the mistaken belief that hostilities had commenced. The refusal of Vasily Arkhipov to authorize the attack prevented a potential nuclear war. Similarly, in 1983, Stanislav Yevgrafovich Petrov, a lieutenant colonel in the Soviet Air Defense Forces, chose to disregard a false early-warning alert indicating an incoming US nuclear strike, </span><a href="https://www.armscontrol.org/act/2017-10/news-briefs/man-who-saved-world-dies-77"><span data-contrast="none">preventing</span></a><span data-contrast="auto"> a potential global nuclear disaster. Such decision-making underscores the indispensable role of human rationality in nuclear command-and-control systems.</span><span data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}"> </span></p>
<p><span data-contrast="auto">As LAWS present multifaceted threats to international peace and security, states need to consider negotiating a legally binding instrument that ensures MHC over autonomy in weapon systems. Enhanced transparency, accountability, and rigorous weapons reviews are essential to prevent destabilization and to ensure that technological progress does not outpace the human element in the use of force. Confidence-building measures, such as transparency in military AI, the establishment of international verification mechanisms, and a moratorium on the development and deployment of LAWS, could help mitigate future dangers.</span><span data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}"> </span></p>
<p><i><span data-contrast="auto">Jawad Ali Shah is a Research Officer at the Center for International Strategic Studies Sindh (CISSS), Pakistan. He holds a BS in International Relations from the University of Sindh, Jamshoro, Pakistan. His research areas are emerging military technologies and South Asian nuclear deterrence and strategic stability dynamics. The views expressed are the author’s own.</span></i><span data-ccp-props="{&quot;201341983&quot;:0,&quot;335559739&quot;:0,&quot;335559740&quot;:240}"> </span></p>
<p><a href="http://globalsecurityreview.com/wp-content/uploads/2026/03/Lethal-Autonomous-Weapon-Systems-A-New-Battlefield-Reality.pdf"><img decoding="async" class="alignnone wp-image-32091" src="http://globalsecurityreview.com/wp-content/uploads/2026/01/2026-Download-Button.png" alt="" width="205" height="57" srcset="https://globalsecurityreview.com/wp-content/uploads/2026/01/2026-Download-Button.png 450w, https://globalsecurityreview.com/wp-content/uploads/2026/01/2026-Download-Button-300x83.png 300w" sizes="(max-width: 205px) 100vw, 205px" /></a></p>
<p><a href="https://globalsecurityreview.com/lethal-autonomous-weapon-systems-a-new-battlefield-reality/">Lethal Autonomous Weapon Systems: A New Battlefield Reality</a> was originally published on <a href="https://globalsecurityreview.com">Global Security Review</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://globalsecurityreview.com/lethal-autonomous-weapon-systems-a-new-battlefield-reality/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
