The Australian National University
School of Philosophy
ANU College of Arts & Social Sciences
Toni Erskine (ANU): Flesh-and-Blood, Corporate, Robotic? Moral Agents of Restraint and the Problem of Misplaced Responsibility in War

Who – or what – are ‘moral agents of restraint’ in war?

This is a critical moment for such an enquiry. Two different movements – one in the academic sphere, another in the realm of practice – contribute to its timeliness. First, within the just war tradition, a recent rift between ‘traditionalists’ and ‘revisionists’ might be viewed, from one angle, as a debate over what form relevant moral agents, or bearers of duties, can take with respect to particular prescriptions for restraint. Second, as sophisticated weapons systems slouch towards autonomy, the question of where exactly moral responsibility lies for specific acts and omissions involving sophisticated forms of artificial intelligence is posed with increasing frequency and urgency. Identifying the different types of moral agent involved in the practice of war, and what unites and distinguishes them, is necessary for understanding – and perhaps correcting – the responsibility judgements that are made in relation to them.

In this paper, I explore three potential categories of moral agent of restraint in war: (1) individual human beings, generally considered to be ‘paradigmatic’ moral agents; (2) corporate entities, or what I call ‘institutional moral agents’; and (3) intelligent artefacts in the form of sophisticated computers, robots, and other machines, which I will tentatively label ‘simulated moral agents’. Motivating this analysis is the simple principle that any compelling attribution of moral responsibility must be informed by the specific capacities and limitations of the entity towards which it is directed. I argue that understanding the general features that define different manifestations of moral agency is a crucial step in this endeavour. As a corollary to this point, I suggest that we risk misplacing responsibility – often to calamitous effect – when these defining features are ignored or misunderstood.

Date & time

  • Mon 04 Mar 2019, 12:30 pm - 2:00 pm

Location

Coombs Ext Rm 1.04

Speakers

  • Toni Erskine (ANU)

Event Series

MSPT seminars

Contact

  • School of Philosophy