Lethal Autonomous Weapon Systems and Controversial Weapons
The ethical implications of lethal autonomous weapons systems (LAWS), often referred to by their dramatic moniker ‘killer robots’, have long been a topic of interest. Until recently, debates about LAWS were dismissed as hypothetical, with the technology assumed to be under development and out of reach. That assumption may be due for re-evaluation, and while a firm conclusion is yet to be drawn, the key questions are worth presenting to the ESG investment community.
In March 2021, the United Nations Security Council released a report outlining significant events of the last several years in the ongoing conflict in Libya.1 It included a description of a March 2020 engagement in which ‘…lethal autonomous weapons systems (LAWS) were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true “fire, forget and find” capability.’ This claim, if true, would represent a significant evolution in the use of LAWS as offensive weapons. While there is still debate over the exact degree of autonomy of the weapons used in this instance, the report has spurred renewed interest in the existence and nature of LAWS. For Sustainalytics, it also raised questions concerning its Controversial Weapons Radar product.
How should LAWS be described, and how advanced are they? How controversial is their use, and what is being done about it? Looking at the present state of autonomous weapons systems, should they be regarded as Controversial Weapons?
Currently, there is no widely accepted definition of lethal autonomous weapons, as different organizations and governments employ their own criteria. Despite the lack of uniformity, there is general acceptance that LAWS are weapon systems capable of seeking, selecting, and engaging targets without meaningful human control or involvement. For the purposes of this article, we will use this definition.
Do LAWS Exist?
The existence of LAWS is predicated on the definition used to describe them. Ship-based, close-in weapon systems such as the Thales Goalkeeper,2 Raytheon Phalanx,3 and Aselsan Gokdeniz are all designed to target, track, and fire autonomously at incoming targets using radar.4 The Iron Dome, manufactured by Rafael Advanced Defense Systems and Israel Aerospace Industries, works on a similar principle, intercepting incoming projectiles through a network of missile launchers.5 These examples exhibit some of the characteristics attributed to LAWS; however, they are not capable of seeking out targets, and there is typically a human operator present to take control.
Loitering munitions are perhaps the closest existing weapons to LAWS.6 These are drones with the capability to hover over an area for hours. In a space between combat drones and cruise missiles, loitering munitions are designed to strike a target directly once found, earning them the nickname “suicide drones.” Several variants of this technology exist, though the two most advanced might be the Israeli Harop7 and the previously UN-cited Turkish Kargu-2,8 which can reportedly use facial recognition technology to identify individual human targets and eliminate them.9
So, do LAWS exist? The answer depends on the technicalities:
- For example, the French government has defined LAWS as “weapon systems that have no human supervision once they are activated.” This criterion is difficult to meet, because even a fully autonomous weapon may still have a kill switch.
- Greece provides a more specific but equally problematic definition: “a type of weapons that once launched or deployed (human decision) its mission cannot be terminated by human intervention.”
- Perhaps the most specific is the Netherlands, which has stated the following definition: “…(LAWS) can change their goal-function themselves or alter pre-programmed conditions and parameters, not to be under meaningful human control…” Under this definition, an autonomous weapon would need to be capable not only of choosing its targets but also of deciding what constitutes a target.
Despite their varying specificity, the Turkish Kargu-2 would fit none of these definitions, even if it acted fully autonomously.
Activism and Ethics
Several organizations have expressed concern over the moral implications of LAWS, suggesting a growing movement against their use. In a 2012 joint report issued by Human Rights Watch (HRW) and Harvard Law School’s International Human Rights Clinic, the organizations argued that LAWS would be unable to properly distinguish between civilians and military targets and would struggle to calculate proportionality in taking lethal action.10 HRW reiterated these concerns in a 2020 article,11 and in May 2021, the International Committee of the Red Cross released its position on the potential human rights implications of LAWS.12
Activism isn’t limited to existing organizations. The Campaign to Stop Killer Robots is an example of an organization founded specifically to push for the regulation of LAWS. The Campaign describes itself as “a growing global coalition of 172 international, regional, and national non-governmental organizations (NGOs) in 65 countries that is working to pre-emptively ban fully autonomous weapons.”13
Further, while concerns about the moral implications of LAWS are evident, not all arguments oppose the technology. Proponents argue that it may reduce war crimes and increase adherence to the rules of war by removing the human factor. Machines do not suffer from cognitive biases stemming from strong emotions like stress, fear, or prejudice.14 Additionally, they have no sense of self-preservation, meaning they are free from the “shoot first, ask questions later” mentality that individuals may adopt in a life-or-death situation.15 They also lack the capacity for sexual violence, a crime commonly committed against civilian populations.
Law, Regulation, and Responsible Investing
Over the past decade, many nations have expressed some degree of concern regarding the advancement and eventual use of LAWS. While several governments have outlined policy positions on the use and development of LAWS by their own armed forces, most of the pressure to regulate or ban their use has occurred at the international level.16
Presently, two methods of regulating LAWS are under consideration. One is the creation of new legislation through the UN that would deal exclusively with defining and regulating lethal autonomous weapon systems. The second is an expansion of the existing Convention on Certain Conventional Weapons (CCW). The CCW has been expanded before: a protocol restricting blinding laser weapons was added in 1995, and more recently, in 2003, a new protocol was adopted that outlines obligations and best practices for dealing with the explosive remnants of war.17 Experience suggests that a new protocol is likely to be more expedient and politically feasible than drafting new legislation.
In terms of responsible investment, a report by PAX notes that LAWS rely on artificial intelligence and that a significant share of AI-related investment is tied to military applications. Investors should therefore take a cautious approach and scrutinize the ultimate destination of these funds.
LAWS as Controversial Weapons
The classification of LAWS as controversial weapons under Sustainalytics’ methodology is predicated on the real-world impact of the technology. To qualify, LAWS would need to have a demonstrated indiscriminate or disproportionate impact in application; currently, they do not meet this bar and cannot be considered under Sustainalytics’ Controversial Weapons Radar product.
What should investors take from this? Any potential risk driven by commercial development of LAWS can be mitigated by investment practices that apply to controversial weapons in general. Perhaps the best approach is to address the issue at its core: active engagement in sector initiatives or international policy discussions to influence standard practices before LAWS development matures and a widely accepted definition emerges.
2 Thales Group (2021), “Goalkeeper - close-in weapon system,” accessed (20.08.2021) at https://www.thalesgroup.com/en/goalkeeper-close-weapon-system
3 Raytheon Missiles & Defense (2021), “Phalanx Weapon System,” accessed (20.08.2021) at https://www.raytheonmissilesanddefense.com/capabilities/products/phalanx-close-in-weapon-system
4 Aselsan (2019), “GOKDENIZ CLOSE-IN WEAPON SYSTEM (CIWS),” accessed (20.08.2021) at https://www.aselsan.com.tr/GOKDENIZ_CloseIn_Weapon_System_1178.pdf
5 BBC News (2021), “How Israel's Iron Dome missile shield works,” accessed (20.08.2021) at https://www.bbc.com/news/world-middle-east-20385306
6 Piper, K. (2019), “Death by algorithm: the age of killer robots is closer than you think,” Vox, accessed (20.08.2021) at https://www.vox.com/2019/6/21/18691459/killer-robots-lethal-autonomous-weapons-ai-war
7 Rogoway, T. (2016), “Meet Israel’s 'Suicide Squad' of Self-Sacrificing Drones,” The Drive, accessed (20.08.2021) at https://www.thedrive.com/the-war-zone/4760/meet-israels-suicide-squad-of-self-sacrificing-drones
8 Hernandez, J. (2021), “A Military Drone With A Mind Of Its Own Was Used In Combat, U.N. Says,” NPR, accessed (20.08.2021) at https://www.npr.org/2021/06/01/1002196245/a-u-n-report-suggests-libya-saw-the-first-battlefield-killing-by-an-autonomous-d?t=1629738049924
9 Iddon, P. (2020), “Turkey, Israel And Iran Have Built Some Very Lethal Loitering Munitions,” Forbes, accessed (20.08.2021) at https://www.forbes.com/sites/pauliddon/2020/07/19/turkey-israel-and-iran-have-built-some-very-lethal-loitering-munitions/?sh=13370f8459de
10 Docherty, B. (2012), “Losing Humanity: The Case against Killer Robots,” Human Rights Watch, accessed (20.08.2021) at https://www.hrw.org/report/2012/11/19/losing-humanity/case-against-killer-robots#
11 Wareham, M. (2020), “Stopping Killer Robots,” Human Rights Watch, accessed (20.08.2021) at https://www.hrw.org/report/2020/08/10/stopping-killer-robots/country-positions-banning-fully-autonomous-weapons-and
12 International Committee of the Red Cross (2021), “ICRC position on autonomous weapon systems,” accessed (20.08.2021) at https://www.icrc.org/en/document/icrc-position-autonomous-weapon-systems
14 Patterson, D. (2020), “Ethical Imperatives for Lethal Autonomous Weapons,” Belfer Center for Science and International Affairs, accessed (20.08.2021) at https://www.belfercenter.org/publication/ethical-imperatives-lethal-autonomous-weapons
15 Etzioni, A. and Etzioni, O. (2017), “Pros and Cons of Autonomous Weapons Systems,” Military Review, accessed (20.08.2021) at https://www.armyupress.army.mil/Journals/Military-Review/English-Edition-Archives/May-June-2017/Pros-and-Cons-of-Autonomous-Weapons-Systems/
Physical Climate Risks: 6 Things Portfolio Managers Need to Know
The negative physical impacts of climate change are being felt by communities and corporations globally and are likely to worsen in the coming years. The knock-on costs to economies of more frequent “once-in-a-century” climate events are likely to rise. To prepare for this looming threat, investors must forecast the asset-level effects of climate change on companies in a granular and sophisticated way. Here are six things portfolio managers should know to manage and mitigate the physical risks of climate change to their portfolios and meet a growing list of climate-focused reporting requirements.
Applying Business and Human Rights International Standards to Investor Due Diligence
Socially conscious ESG investors are interested in how to implement international business and human rights norms in their portfolios and understand the potential impacts of applying additional screening criteria within their strategy.
Telecom Network Outages, the ESG Risks of a Connected World
The telecom industry is exposed to several Material ESG Issues, including Data Privacy and Security, Business Ethics, Human Capital and Product Governance. Product Governance issues in the telecom industry include service quality, maintaining reliable, high-speed networks, and responding to customer billing concerns.
ESG Risks Affecting Data Centers: Why Water Resource Use Matters to Investors
Data centers play a critical role for many technology and telecom companies, housing the servers, digital storage equipment and network infrastructure needed for data processing and storage. Data centers require high volumes of water, directly for cooling purposes and indirectly through electricity generation. Morningstar Sustainalytics’ recent activation of the Resource Use Material ESG Issue (MEI) within its ESG Risk Ratings recognizes the water-related risks of data centers.