THE PEACE APPEAL FOUNDATION
New Challenges to Peace: Artificial Intelligence and Weapons of War

12/15/2019

By Sahana Bhagat

Technological advances are rapidly outpacing our ability to reflect on whether a
particular new technology serves the public good. Technologists, entrepreneurs,
policymakers, ethicists, and legal scholars from Microsoft to MIT are now openly questioning
how artificial intelligence and what is called "machine learning" can be designed and
regulated so that such automated systems do not do harm, whether by entrenching racial
stereotypes or other forms of discrimination in insurance, criminal justice, healthcare, or
many other applications.

Nowhere is the dystopian vision of machines increasingly taking over human agency more
frightening than in current research on lethal autonomous weapons systems, known as
"LAWS". LAWS are weapons that use artificial intelligence to locate, identify, and attack
targets without human intervention. Dubbed 'killer robots', these technologies, their critics
argue, lack human morality and judgment, and it is dangerous to assume that automating the
exercise of lethal force makes it more 'objective' than human reasoning.

While it is generally assumed that lethal autonomous weapons systems have not yet been
deployed, some existing weapons systems, particularly defensive ones, share many of the same
characteristics. A Turkish state-owned defense company, STM, recently unveiled a "kamikaze
drone" equipped with facial recognition technology. Growing military investment in artificial
intelligence, together with what are known as "loitering munitions" (weapons systems that can
"loiter" over a target area for some time before automatically identifying a target and
striking), could make LAWS a reality within the next few years.

Those who advocate for the development of LAWS cite several advantages. Because
autonomous weapons lack a control-and-communication link between system and operator,
they are seen as more secure, that is, less vulnerable to interception and attack. They can
also act without the delay between an operator's command and the system's interpretation and
execution of it. Countering critics' concerns about their use, proponents argue that because
these systems do not feel fear, they are capable of making more rational decisions than human
combatants. The argument here is that a system will not react to a threat with an intense need
for self-preservation, and will therefore be less violent and show greater restraint than a
soldier.

The weaponization of this new technology raises the question of how it should be governed
and regulated. LAWS mark a paradigm shift in warfare. They challenge long-standing views on
the morality of war and blur existing conceptions of responsibility in war. As technology
moves beyond direct automation toward systems that can adapt, learn, and adjust, their
actions become increasingly unpredictable. By definition, imbuing a system with autonomous
functions means humans cannot control how it will react. The real issue, then, is that an
unprecedented degree of autonomy now exists in weapons systems, with no legal, moral,
ethical, or technological infrastructure to support, regulate, or govern it.

At present, debates on these challenges are taking place under the United Nations Convention
on Certain Conventional Weapons in Geneva. In March of this year, UN Secretary-General
António Guterres called for their prohibition. The UN's Group of Governmental Experts (GGE),
a subsidiary body of the Convention on Certain Conventional Weapons, began meeting in 2016 to
bring together state signatories, international organizations, nongovernmental organizations,
and academic institutions for discussions on LAWS. Their most recent meeting was in November
of 2019. Though the GGE has been discussing LAWS since 2016, little has been achieved
beyond defining LAWS and outlining 'best practices' for their use. At the GGE's August 2018
meeting, 26 states advocated for a ban on fully autonomous weapons, while 12 states, including
the United States, the United Kingdom, and Russia, opposed a treaty on LAWS.

A report issued last year by Human Rights Watch argues that machines are unable to distinguish
between combatants and civilians, especially in armed conflicts where the lines between friend
and foe are unclear. In such situations, the report argues, the risk of fratricide and
civilian death is high, and the pace of such an attack would be too fast for human intervention
to stop it once it begins. From a legal perspective, the question of responsibility poses a
major challenge. How can a machine be held accountable for civilian deaths or fratricide?
Would the programmer be prosecuted, even though the machine acts autonomously?

Experts have also expressed concern over the unreliability of fully autonomous weapons and the
high risk of uncontrolled proliferation that would inevitably accompany the development of
LAWS. In 2015, a large group of AI researchers and robotics engineers released an open letter
calling for a ban on lethal autonomous weapons. As of 2018, the letter had over 20,000
signatures, including those of Elon Musk and Steve Wozniak.

Advocacy efforts are largely centered in nongovernmental organizations. The Campaign to Stop
Killer Robots, formed in October 2012, is a coalition of non-governmental organizations (NGOs)
working to ban fully autonomous weapons and thereby retain meaningful human control over the
use of force. Thirty countries and the European Parliament have signed on to a call to ban
the technology.

For more information see PAX’s report titled “Slippery Slope: The Arms Industry and
Increasingly Autonomous Weapons” published on Nov. 11, 2019.

To learn how you can become engaged in advocacy against LAWS, please visit
www.stopkillerrobots.org


Copyright 2019, Peace Appeal Foundation. All rights reserved.